Transfer Weight Based Conditional Adversarial Domain Adaptation

Jin WANG, Ke WANG, Zijian MIN, Kaiwei SUN, Xin DENG

Citation: Jin WANG, Ke WANG, Zijian MIN, Kaiwei SUN, Xin DENG. Transfer Weight Based Conditional Adversarial Domain Adaptation[J]. Journal of Electronics & Information Technology, 2019, 41(11): 2729-2735. doi: 10.11999/JEIT190115


doi: 10.11999/JEIT190115
Details
    Author biographies:

    Jin WANG: male, born in 1979, professor; research interests include machine learning and data mining

    Ke WANG: male, born in 1993, M.S. candidate; research interest is machine learning

    Zijian MIN: male, born in 1995, M.S. candidate; research interest is machine learning

    Kaiwei SUN: male, born in 1987, lecturer; research interests include machine learning and data mining

    Xin DENG: male, born in 1981, associate professor; research interests include machine learning and cognitive computing

    Corresponding author:

    Jin WANG, wangjin@cqupt.edu.cn

  • CLC number: TP391.41

Transfer Weight Based Conditional Adversarial Domain Adaptation

Funds: The National Natural Science Foundation of China (61806033), The Western Project of the National Social Science Foundation of China (18XGL013)
  • Abstract: The Conditional aDversarial domain Adaptation Network (CDAN) method does not fully exploit the transferability of individual samples, so some hard-to-transfer source-domain samples still disturb the distribution of the target-domain data. To address this problem, a Transfer Weight based Conditional aDversarial domain Adaptation Network (TW-CDAN) is proposed. First, the output of the domain discriminator is used as the main measure of each sample's transferability, so that different samples carry different transfer importance. Second, each sample's transferability is applied as a weight to the classification loss and the minimum-entropy loss, aiming to eliminate the influence that hard-to-transfer samples exert on the model. Finally, experiments on the 6 transfer tasks of the Office-31 dataset and the 12 transfer tasks of the Office-Home dataset show that the proposed method improves performance on 14 of the tasks, raising the average accuracy by 1.4% and 3.1%, respectively.
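The weighting scheme described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names and the use of the discriminator output's binary entropy as the transferability measure are assumptions, since the abstract only states that the domain discriminator's output is turned into a per-sample weight applied to the classification loss and the minimum-entropy loss.

```python
import numpy as np

def transfer_weight(d_out):
    """Per-sample transfer weight from the domain discriminator output.

    d_out holds discriminator probabilities in (0, 1); values near 0.5
    mean the discriminator cannot tell whether a sample comes from the
    source or the target domain, i.e. the sample transfers easily.
    The binary entropy of d_out, normalized to [0, 1], is used here as
    one plausible transferability measure (an assumption, not the
    paper's exact formula).
    """
    d = np.clip(d_out, 1e-7, 1.0 - 1e-7)
    h = -(d * np.log(d) + (1.0 - d) * np.log(1.0 - d))  # binary entropy
    return h / np.log(2.0)  # 1.0 at d = 0.5, near 0 for a confident d

def weighted_losses(probs, labels, d_out):
    """Apply transfer weights to the classification loss (labeled
    source samples) and the entropy-minimization loss (predictions on
    unlabeled target samples)."""
    w = transfer_weight(d_out)
    idx = np.arange(len(labels))
    cls = -np.log(np.clip(probs[idx, labels], 1e-7, None))  # cross-entropy
    ent = -np.sum(probs * np.log(np.clip(probs, 1e-7, None)), axis=1)
    return float(np.mean(w * cls)), float(np.mean(w * ent))
```

In a full training loop these two weighted terms would be combined with CDAN's conditional adversarial loss; the sketch only shows how a discriminator-derived weight down-weights hard-to-transfer samples in both losses.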
  • Fig. 1  Architecture of the TW-CDAN model

    Fig. 2  Sample images from the Office-Home dataset

    Fig. 3  Convergence comparison of the algorithms

    Fig. 4  t-SNE visualization of learned features
    Table 1  Results on the Office-31 dataset (evaluated by average accuracy, %)

    Method          A→W    D→W    W→D    A→D    D→A    W→A    Avg
    ResNet-50 [17]  68.4   96.7   99.3   68.9   62.5   60.7   76.1
    DAN [7]         80.5   97.1   99.6   78.6   63.6   62.8   80.4
    RTN [8]         84.5   96.8   99.4   77.5   66.2   64.8   81.6
    DANN [10]       82.0   96.9   99.1   79.7   68.2   67.4   82.2
    ADDA [11]       86.2   96.2   98.4   77.8   69.5   68.9   82.9
    JAN [18]        85.4   97.4   99.8   84.7   68.6   70.0   84.3
    GTA [19]        89.5   97.9   99.8   87.7   72.8   71.4   86.5
    CDAN [13]       93.1   98.6   100.0  92.9   71.0   69.3   87.5
    TW-CDAN         94.9   99.2   100.0  94.0   72.7   72.5   88.9

    Table 2  Results on the Office-Home dataset (evaluated by average accuracy, %)

    Method          Ar→Cl  Ar→Pr  Ar→Rw  Cl→Ar  Cl→Pr  Cl→Rw  Pr→Ar  Pr→Cl  Pr→Rw  Rw→Ar  Rw→Cl  Rw→Pr  Avg
    ResNet-50 [17]  34.9   50.0   58.0   37.4   41.9   46.2   38.5   31.2   60.4   53.9   41.2   59.9   46.1
    DAN [7]         43.6   57.0   67.9   45.8   56.5   60.4   44.0   43.6   67.7   63.1   51.5   74.3   56.3
    DANN [10]       45.6   59.3   70.1   47.0   58.5   60.9   46.1   43.7   68.5   63.2   51.8   76.8   57.6
    JAN [18]        45.9   61.2   68.9   50.4   59.7   61.0   45.8   43.4   70.3   63.9   52.4   76.8   58.3
    CDAN [13]       50.6   65.9   73.4   55.7   62.7   64.2   51.8   49.1   74.5   68.2   56.9   80.7   62.8
    TW-CDAN         48.8   71.1   76.7   61.6   68.9   70.2   60.4   46.6   77.9   71.3   55.4   81.9   65.9

    Table 3  Results of different transfer-weight settings on the Office-31 dataset (evaluated by average accuracy, %)

    Method       A→W    D→W    W→D    A→D    D→A    W→A    Avg
    CDAN [13]    93.1   98.6   100.0  92.9   71.0   69.3   87.5
    CDAN(S)      93.0   98.7   100.0  92.7   71.0   69.1   87.4
    TW-CDAN(E)   93.7   98.8   100.0  93.4   71.5   71.3   88.1
    TW-CDAN(C)   94.2   98.9   100.0  93.1   72.1   71.8   88.4
    TW-CDAN      94.9   99.2   100.0  94.0   72.7   72.5   88.9
References

    YOSINSKI J, CLUNE J, BENGIO Y, et al. How transferable are features in deep neural networks?[C]. Proceedings of the 27th International Conference on Neural Information Processing Systems, Montreal, Canada, 2014: 3320–3328.
    PAN S J and YANG Qiang. A survey on transfer learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2010, 22(10): 1345–1359. doi: 10.1109/TKDE.2009.191
    GEBRU T, HOFFMAN J, LI Feifei, et al. Fine-grained recognition in the wild: A multi-task domain adaptation approach[C]. Proceedings of IEEE International Conference on Computer Vision, Venice, Italy, 2017: 1358–1367.
    GLOROT X, BORDES A, and BENGIO Y. Domain adaptation for large-scale sentiment classification: A deep learning approach[C]. Proceedings of the 28th International Conference on Machine Learning, Bellevue, USA, 2011: 513–520.
    WANG Mei and DENG Weihong. Deep visual domain adaptation: A survey[J]. Neurocomputing, 2018, 312: 135–153. doi: 10.1016/j.neucom.2018.05.083
    GRETTON A, BORGWARDT K, RASCH M, et al. A kernel method for the two-sample-problem[C]. Proceedings of the 19th Conference on Neural Information Processing Systems, Vancouver, Canada, 2007: 513–520.
    LONG Mingsheng, CAO Yue, WANG Jianmin, et al. Learning transferable features with deep adaptation networks[C]. Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 2015: 97–105.
    LONG Mingsheng, ZHU Han, WANG Jianmin, et al. Deep transfer learning with joint adaptation networks[C]. Proceedings of the 34th International Conference on Machine Learning, Sydney, Australia, 2017: 2208–2217.
    GOODFELLOW I J, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial nets[C]. Proceedings of the 27th International Conference on Neural Information Processing Systems, Montreal, Canada, 2014: 2672–2680.
    GANIN Y, USTINOVA E, AJAKAN H, et al. Domain-adversarial training of neural networks[J]. The Journal of Machine Learning Research, 2016, 17(1): 2096–2030.
    TZENG E, HOFFMAN J, SAENKO K, et al. Adversarial discriminative domain adaptation[C]. Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, USA, 2017: 2962–2971.
    MIRZA M and OSINDERO S. Conditional generative adversarial nets[EB/OL]. https://arxiv.org/abs/1411.1784, 2014.
    LONG Mingsheng, CAO Zhangjie, WANG Jianmin, et al. Conditional adversarial domain adaptation[C]. Proceedings of the 32nd Conference on Neural Information Processing Systems, Montréal, Canada, 2018: 1647–1657.
    GRANDVALET Y and BENGIO Y. Semi-supervised learning by entropy minimization[C]. Proceedings of the 17th International Conference on Neural Information Processing Systems, Vancouver, Canada, 2004: 529–536.
    SAENKO K, KULIS B, FRITZ M, et al. Adapting visual category models to new domains[C]. Proceedings of the 11th European Conference on Computer Vision, Heraklion, Greece, 2010: 213–226.
    VENKATESWARA H, EUSEBIO J, CHAKRABORTY S, et al. Deep hashing network for unsupervised domain adaptation[C]. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, USA, 2017: 5385–5394.
    HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C]. Proceedings of 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, 2016: 770–778. doi: 10.1109/CVPR.2016.90.
    LONG Mingsheng, ZHU Han, WANG Jianmin, et al. Unsupervised domain adaptation with residual transfer networks[C]. Proceedings of the 30th Conference on Neural Information Processing Systems, Barcelona, Spain, 2016: 136–144.
    SANKARANARAYANAN S, BALAJI Y, CASTILLO C D, et al. Generate to adapt: Aligning domains using generative adversarial networks[C]. Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, USA, 2018: 8503–8512.
Publication history
  • Received: 2019-02-27
  • Revised: 2019-06-11
  • Available online: 2019-06-24
  • Issue published: 2019-11-01
