Volume 46 Issue 7
Jul. 2024
HUANG Xiaoge, WU Yuhang, YIN Hongbo, LIANG Chengchao, CHEN Qianbin. Direct Acyclic Graph Blockchain-based Personalized Federated Mutual Distillation Learning in Internet of Vehicles[J]. Journal of Electronics & Information Technology, 2024, 46(7): 2821-2830. doi: 10.11999/JEIT230976

Direct Acyclic Graph Blockchain-based Personalized Federated Mutual Distillation Learning in Internet of Vehicles

doi: 10.11999/JEIT230976
Funds: The National Natural Science Foundation of China (62371082, 62001076), The General Program of Natural Science Foundation of Chongqing (CSTB2023NSCQ-MSX0726, cstc2020jcyj-msxmX0878)
  • Received Date: 2023-09-06
  • Revised Date: 2024-04-16
  • Available Online: 2024-05-12
  • Publish Date: 2024-07-29
  • Abstract: Federated Learning (FL) has emerged as a distributed training method for the Internet of Vehicles (IoV), allowing Connected and Automated Vehicles (CAVs) to train a global model by exchanging models instead of raw data, thereby protecting data privacy. To address the limitations of FL in model accuracy and communication overhead, this paper proposes a Directed Acyclic Graph (DAG) blockchain-based IoV framework comprising a DAG layer for model sharing and a CAV layer for model training. Furthermore, a DAG blockchain-based Asynchronous Federated Mutual distillation Learning (DAFML) algorithm is introduced to improve model performance, in which a teacher model and a student model distill knowledge from each other during local training. Specifically, the teacher model, with a deeper professional network, achieves higher model accuracy, while the student model, with a lightweight network, reduces the communication overhead. Moreover, to further improve model accuracy, a personalized weight based on the global epoch and model accuracy is designed to adjust the mutual distillation during model updating. Simulation results demonstrate that the proposed DAFML algorithm outperforms benchmark schemes in terms of model accuracy and distillation ratio.
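
The local update described in the abstract can be made concrete with a short sketch. Below is a minimal PyTorch-style example of one mutual-distillation step, assuming the standard softened-softmax distillation loss; the names mutual_distillation_step, alpha (standing in for the paper's personalized weight derived from the global epoch and model accuracy), temperature, and personalized_weight are illustrative assumptions, not the authors' published implementation.

    import torch.nn.functional as F

    def personalized_weight(global_epoch, accuracy, max_epoch=100):
        # Hypothetical stand-in for the paper's personalized weight,
        # which is designed from the global epoch and model accuracy;
        # the exact formula is not reproduced here.
        return min(1.0, (global_epoch / max_epoch) * accuracy)

    def mutual_distillation_step(teacher, student, x, y,
                                 opt_teacher, opt_student,
                                 alpha=0.5, temperature=2.0):
        # Forward both models on the same local batch.
        logits_t = teacher(x)
        logits_s = student(x)

        # Softened, detached targets: each model distills from a frozen
        # snapshot of the other, so the two updates do not interfere.
        soft_t = F.softmax(logits_t.detach() / temperature, dim=1)
        soft_s = F.softmax(logits_s.detach() / temperature, dim=1)

        # Teacher: supervised cross-entropy blended with a KL term
        # toward the student, weighted by the (personalized) alpha.
        loss_t = ((1.0 - alpha) * F.cross_entropy(logits_t, y)
                  + alpha * temperature ** 2 * F.kl_div(
                      F.log_softmax(logits_t / temperature, dim=1),
                      soft_s, reduction="batchmean"))

        # Student: supervised cross-entropy blended with a KL term
        # toward the teacher, with the same weighting.
        loss_s = ((1.0 - alpha) * F.cross_entropy(logits_s, y)
                  + alpha * temperature ** 2 * F.kl_div(
                      F.log_softmax(logits_s / temperature, dim=1),
                      soft_t, reduction="batchmean"))

        opt_teacher.zero_grad(); loss_t.backward(); opt_teacher.step()
        opt_student.zero_grad(); loss_s.backward(); opt_student.step()
        return loss_t.item(), loss_s.item()

Because the distillation targets are detached, the two backward passes use independent graphs and neither update disturbs the other. Under this reading of the abstract, only the lightweight student model would be published to the DAG layer, which is what reduces communication overhead relative to exchanging the full teacher network.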
