Volume 44 Issue 9
Sep.  2022
Citation: SHANG Fengjun, LI Saisai, WANG Ying, CUI Yunfan. Traffic Classification Method Based on Dynamic Balance Adaptive Transfer Learning[J]. Journal of Electronics & Information Technology, 2022, 44(9): 3308-3319. doi: 10.11999/JEIT210623

Traffic Classification Method Based on Dynamic Balance Adaptive Transfer Learning

doi: 10.11999/JEIT210623
Funds:  The National Natural Science Foundation of China (61672004)
  • Received Date: 2021-06-22
  • Accepted Date: 2022-03-10
  • Rev Recd Date: 2022-01-28
  • Available Online: 2022-03-20
  • Publish Date: 2022-09-19
Abstract: An improved adaptive transfer learning algorithm is proposed for mobile application traffic recognition. The algorithm maps the sample features of the source and target domains into a high-dimensional feature space so as to minimize the marginal and conditional distribution distances between the domains. A probabilistic model is presented to judge and quantify the difference between the marginal and conditional distributions of the two domains; it determines the degree of distinction among the classification categories and computes the balance factor $\mu$ quantitatively, which effectively solves the problem that Dynamic Distribution Adaptation (DDA) considers only the classification error rate and ignores the degree of certainty. In addition, a cliff-type drop strategy is introduced to determine the number of principal features dynamically. Compared with traditional machine learning methods, the proposed algorithm improves accuracy by about 7%. Moreover, a feature selection algorithm for application traffic recognition is proposed to address the problem of high feature dimensionality: a reverse feature self-deletion strategy combined with the Earth Mover's Distance (EMD) is introduced, in which an EMD-based correlation coefficient is used to weight the information gain. This alleviates the increase in model training time caused by invalid features and the loss of model performance and accuracy caused by irrelevant features. Simulation results show that when the training input for transfer learning uses the feature set processed by the proposed algorithm, the running time of the transfer algorithm is shortened by about 80%.
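The distribution-alignment idea described above follows the dynamic distribution adaptation line of work (reference [19]): the overall domain distance is a convex combination of a marginal term and a class-conditional term, weighted by the balance factor $\mu$. The following Python sketch is illustrative only and is not the authors' implementation; the function names are ours, and `estimate_mu` uses a simple MMD ratio as a stand-in for the paper's probabilistic model.

```python
# Illustrative sketch of a dynamically balanced domain distance
# (not the paper's implementation; all names are assumptions).
import numpy as np

def mmd_linear(Xs, Xt):
    """Squared MMD with a linear kernel: ||mean(Xs) - mean(Xt)||^2."""
    delta = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(delta @ delta)

def conditional_mmd(Xs, ys, Xt, yt_pseudo):
    """Sum of per-class MMDs, using pseudo-labels for the target domain."""
    dist = 0.0
    for c in np.unique(ys):
        Xs_c, Xt_c = Xs[ys == c], Xt[yt_pseudo == c]
        if len(Xs_c) and len(Xt_c):
            dist += mmd_linear(Xs_c, Xt_c)
    return dist

def estimate_mu(d_marginal, d_conditional):
    """Stand-in for the paper's probabilistic model: weight the term
    that currently contributes more to the overall discrepancy."""
    total = d_marginal + d_conditional
    return 0.5 if total == 0 else d_conditional / total

def dynamic_distance(Xs, ys, Xt, yt_pseudo):
    """(1 - mu) * marginal distance + mu * conditional distance."""
    d_m = mmd_linear(Xs, Xt)
    d_c = conditional_mmd(Xs, ys, Xt, yt_pseudo)
    mu = estimate_mu(d_m, d_c)
    return (1 - mu) * d_m + mu * d_c, mu
```

The feature-selection step can be sketched in the same spirit: score each feature by its information gain, down-weighted when the feature's source and target marginals differ strongly under EMD, then prune the weakest features ("reverse self-deletion"). The scoring rule and `keep_ratio` below are assumptions for illustration, not the paper's exact formula.

```python
# Illustrative EMD-weighted information-gain feature selection
# (hypothetical scoring rule; not the authors' exact method).
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.feature_selection import mutual_info_classif

def emd_weighted_scores(Xs, ys, Xt):
    """Per-feature information gain, divided by (1 + EMD) so that features
    whose marginals shift strongly across domains are penalised."""
    ig = mutual_info_classif(Xs, ys)
    emd = np.array([wasserstein_distance(Xs[:, j], Xt[:, j])
                    for j in range(Xs.shape[1])])
    return ig / (1.0 + emd)

def reverse_self_delete(Xs, ys, Xt, keep_ratio=0.2):
    """Return indices of the retained features after dropping the
    lowest-scoring ones."""
    scores = emd_weighted_scores(Xs, ys, Xt)
    n_keep = max(1, int(round(len(scores) * keep_ratio)))
    return np.sort(np.argsort(scores)[-n_keep:])
```

Training the transfer model on only the retained feature columns is what the abstract's reported reduction of roughly 80% in transfer-algorithm running time refers to.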
References
[1] ZOU Tengkuan, WANG Yuying, and WU Chengrong. Review of network background traffic classification and identification[J]. Journal of Computer Applications, 2019, 39(3): 802–811. doi: 10.11772/j.issn.1001-9081.2018071552
[2] SUN Guanglu, LIANG Lili, CHEN Teng, et al. Network traffic classification based on transfer learning[J]. Computers & Electrical Engineering, 2018, 69: 920–927. doi: 10.1016/j.compeleceng.2018.03.005
[3] LI Meng, LI Yanling, and LIN Min. Review of transfer learning for named entity recognition[J]. Journal of Frontiers of Computer Science and Technology, 2020, 15(2): 206–218. doi: 10.3778/j.issn.1673-9418.2003049
[4] WANG Jindong. Concise handbook of transfer learning[EB/OL]. https://www.sohu.com/a/420774574_114877.
[5] LI Haohao. Research and application of instance-based transfer learning[D]. [Master dissertation], Wuhan University, 2018.
[6] PAN S J, TSANG I W, KWOK J T, et al. Domain adaptation via transfer component analysis[J]. IEEE Transactions on Neural Networks, 2011, 22(2): 199–210. doi: 10.1109/TNN.2010.2091281
[7] JI Dingcheng, JIANG Yizhang, and WANG Shitong. Multi-source transfer learning method by balancing both the domains and instances[J]. Acta Electronica Sinica, 2019, 47(3): 692–699.
[8] YAO Yi and DORETTO G. Boosting for transfer learning with multiple sources[C]. 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, USA, 2010: 1855–1862.
[9] TANG Shiqi, WEN Yimin, and QIN Yixiu. Online transfer learning from multiple sources based on local classification accuracy[J]. Journal of Software, 2017, 28(11): 2940–2960. doi: 10.13328/j.cnki.jos.005352
[10] ZHANG Bo, SHI Zhongzhi, ZHAO Xiaofei, et al. A transfer learning based on canonical correlation analysis across different domains[J]. Chinese Journal of Computers, 2015, 38(7): 1326–1336. doi: 10.11897/SP.J.1016.2015.01326
[11] ZHANG Ning. Research on transfer learning based on decision tree classifier[D]. [Master dissertation], Xidian University, 2014.
[12] HONG Jiaming, YIN Jian, HUANG Yun, et al. TrSVM: A transfer learning algorithm using domain similarity[J]. Journal of Computer Research and Development, 2011, 48(10): 1823–1830.
[13] FADDOUL J B and CHIDLOVSKII B. Learning multiple tasks with boosted decision trees[P]. US, 8694444, 2014.
[14] WAN Zitong, YANG Rui, HUANG Mengjie, et al. A review on transfer learning in EEG signal analysis[J]. Neurocomputing, 2021, 421: 1–14. doi: 10.1016/j.neucom.2020.09.017
[15] TAN Ben, ZHANG Yu, PAN S, et al. Distant domain transfer learning[C]. The Thirty-First AAAI Conference on Artificial Intelligence, San Francisco, USA, 2017: 2604–2610.
[16] CHIANG K J, WEI Chunshu, NAKANISHI M, et al. Boosting template-based SSVEP decoding by cross-domain transfer learning[J]. Journal of Neural Engineering, 2021, 18(1): 016002. doi: 10.1088/1741-2552/abcb6e
[17] NIU Shuteng, HU Yihao, WANG Jian, et al. Feature-based distant domain transfer learning[C]. 2020 IEEE International Conference on Big Data (Big Data), Atlanta, USA, 2020: 5164–5171.
[18] DUAN Lixin, XU Dong, and TSANG I W H. Domain adaptation from multiple sources: A domain-dependent regularization approach[J]. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(3): 504–518. doi: 10.1109/TNNLS.2011.2178556
[19] WANG Jindong, CHEN Yiqiang, FENG Wenjie, et al. Transfer learning with dynamic distribution adaptation[J]. ACM Transactions on Intelligent Systems and Technology, 2020, 11(1): 6. doi: 10.1145/3360309
[20] JIANG R, PACCHIANO A, STEPLETON T, et al. Wasserstein fair classification[C]. The Thirty-Fifth Conference on Uncertainty in Artificial Intelligence, Tel Aviv, Israel, 2019: 862–872.
[21] PRONZATO L. Performance analysis of greedy algorithms for minimising a maximum mean discrepancy[J]. arXiv preprint arXiv:2101.07564, 2021.
[22] ZHANG Dongguang. Statistics[M]. 2nd ed. Beijing: China Science Press, 2020: 161–169.
[23] MOORE A W and ZUEV D. Internet traffic classification using Bayesian analysis techniques[J]. ACM SIGMETRICS Performance Evaluation Review, 2005, 33(1): 50–60. doi: 10.1145/1071690.1064220