Citation: HUANG Xiaoge, WU Yuhang, YIN Hongbo, LIANG Chengchao, CHEN Qianbin. Direct Acyclic Graph Blockchain-based Personalized Federated Mutual Distillation Learning in Internet of Vehicles[J]. Journal of Electronics & Information Technology, 2024, 46(7): 2821–2830. doi: 10.11999/JEIT230976
[1] HUANG Xiaoge, YIN Hongbo, CHEN Qianbin, et al. DAG-based swarm learning: A secure asynchronous learning framework for internet of vehicles[J]. Digital Communications and Networks, 2023. doi: 10.1016/j.dcan.2023.10.004.
[2] YIN Hongbo, HUANG Xiaoge, WU Yuhang, et al. Multi-region asynchronous swarm learning for data sharing in large-scale internet of vehicles[J]. IEEE Communications Letters, 2023, 27(11): 2978–2982. doi: 10.1109/LCOMM.2023.3314662.
[3] MCMAHAN B, MOORE E, RAMAGE D, et al. Communication-efficient learning of deep networks from decentralized data[C]. The 20th International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, USA, 2017: 1273–1282.
[4] HUANG Xiaoge, WU Yuhang, LIANG Chengchao, et al. Distance-aware hierarchical federated learning in blockchain-enabled edge computing network[J]. IEEE Internet of Things Journal, 2023, 10(21): 19163–19176. doi: 10.1109/JIOT.2023.3279983.
[5] FENG Lei, ZHAO Yiqi, GUO Shaoyong, et al. BAFL: A blockchain-based asynchronous federated learning framework[J]. IEEE Transactions on Computers, 2022, 71(5): 1092–1103. doi: 10.1109/TC.2021.3072033.
[6] XIAO Huizi, ZHAO Jun, PEI Qingqi, et al. Vehicle selection and resource optimization for federated learning in vehicular edge computing[J]. IEEE Transactions on Intelligent Transportation Systems, 2022, 23(8): 11073–11087. doi: 10.1109/TITS.2021.3099597.
[7] CAO Mingrui, ZHANG Long, and CAO Bin. Toward on-device federated learning: A direct acyclic graph-based blockchain approach[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(4): 2028–2042. doi: 10.1109/TNNLS.2021.3105810.
[8] ZHENG Haifeng, GAO Min, CHEN Zhizhang, et al. A distributed hierarchical deep computation model for federated learning in edge computing[J]. IEEE Transactions on Industrial Informatics, 2021, 17(12): 7946–7956. doi: 10.1109/TII.2021.3065719.
[9] WU Chuhan, WU Fangzhao, LYU Lingjuan, et al. Communication-efficient federated learning via knowledge distillation[J]. Nature Communications, 2022, 13(1): 2032. doi: 10.1038/s41467-022-29763-x.
[10] JEONG E, OH S, KIM H, et al. Communication-efficient on-device machine learning: Federated distillation and augmentation under non-IID private data[EB/OL]. https://arxiv.org/abs/1811.11479v1, 2018.
[11] LEE G, JEONG M, SHIN Y, et al. Preservation of the global knowledge by not-true distillation in federated learning[C]. The 36th International Conference on Neural Information Processing Systems, New Orleans, USA, 2022: 2787.
[12] FALLAH A, MOKHTARI A, and OZDAGLAR A. Personalized federated learning: A meta-learning approach[EB/OL]. https://arxiv.org/abs/2002.07948, 2020.
[13] LI Tian, SAHU A K, ZAHEER M, et al. Federated optimization in heterogeneous networks[C]. Machine Learning and Systems, Austin, USA, 2020: 429–450.
[14] JIANG Yuang, WANG Shiqiang, VALLS V, et al. Model pruning enables efficient federated learning on edge devices[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(12): 10374–10386. doi: 10.1109/TNNLS.2022.3166101.
[15] WEN Dingzhu, JEON K J, and HUANG Kaibin. Federated dropout—a simple approach for enabling federated learning on resource constrained devices[J]. IEEE Wireless Communications Letters, 2022, 11(5): 923–927. doi: 10.1109/LWC.2022.3149783.
[16] NORI M K, YUN S, and KIM I M. Fast federated learning by balancing communication trade-offs[J]. IEEE Transactions on Communications, 2021, 69(8): 5168–5182. doi: 10.1109/TCOMM.2021.3083316.