Ultra High-speed High-precision Analog Subtractor Applied to Always-on Intelligent Visual Sense-computing System

LIU Bo, WANG Xiangjun, NAZHAMAITI Maimaiti, ZHENG Ciyan, XIANG Fei, WEI Qi, YANG Xinghua, QIAO Fei

LIU Bo, WANG Xiangjun, NAZHAMAITI Maimaiti, ZHENG Ciyan, XIANG Fei, WEI Qi, YANG Xinghua, QIAO Fei. Ultra High-speed High-precision Analog Subtractor Applied to Always-on Intelligent Visual Sense-computing System[J]. Journal of Electronics & Information Technology, 2024, 46(9): 3807-3817. doi: 10.11999/JEIT231099


doi: 10.11999/JEIT231099
    Author biographies:

    LIU Bo: male, Associate Professor; research interests: integrated bio-sensing-computing and communication chip design and their application systems

    WANG Xiangjun: male, Master's student; research interest: intelligent visual perception integrated circuits

    NAZHAMAITI Maimaiti: male, Ph.D. candidate; research interests: ultra-low-power intelligent visual perception chip design, and energy-harvesting and energy-management architectures and circuits for visual perception

    ZHENG Ciyan: female, Associate Professor; research interest: memristor-based signal sensing and processing circuit design

    XIANG Fei: female, Associate Professor; research interests: information security and secure communication technology

    WEI Qi: male, Associate Researcher; research interest: integrated circuit design

    YANG Xinghua: male, Lecturer; research interest: approximate computing circuit and system design

    QIAO Fei: male, Associate Researcher; research interest: intelligent perception integrated circuits and systems

    Corresponding author:

    QIAO Fei, qiaofei@tsinghua.edu.cn

  • CLC number: TN911.73; TN492


Funds: The National Natural Science Foundation of China (92164203, 62334006, 61704049), The Key Research and Development Program of Xinjiang Uygur Autonomous Region (2022B01008), The Key Science and Technology Program of Henan Province (232102211066, 242102211101), The Young Teacher Talent Program of Henan Province (2020GGJS077)
  • Abstract: Always-on intelligent visual sense-computing systems place ever higher demands on the accuracy and real-time performance of image edge-feature extraction, and their hardware energy consumption surges as a result. Replacing conventional digital processing with an analog subtractor that performs sensing and edge-feature extraction simultaneously in the analog domain can effectively reduce the overall energy consumption of an integrated sensing-storage-computing system; at the same time, however, long computation times exceeding the order of 10⁻⁷ s have become the bottleneck in analog subtractor design. This paper proposes a novel analog subtraction circuit structure composed of two functional circuits in the analog domain: signal sampling and subtraction. The signal-sampling circuit consists of an improved bootstrapped sampling switch and a sampling capacitor; the subtraction is performed by a proposed novel switched-capacitor analog subtraction circuit that achieves high-speed parallel processing of three subtractions within two sampling periods. The complete analog subtraction circuit is designed in a TSMC 180 nm/1.8 V CMOS process. Simulation results show that the subtractor achieves synchronous parallel processing of signal sampling and computation in the analog domain with a parallel-processing cycle of only 20 ns, demonstrating high-speed computing capability; its computation range is as wide as –900 to 900 mV with a relative error below 1.65% (as low as about 0.1%), demonstrating high processing accuracy; and its energy consumption of 25–27.8 pJ is at a moderate, acceptable level. Overall, the proposed analog subtractor offers a good balance of speed, accuracy, and energy consumption, making it well suited to high-performance always-on intelligent visual perception systems.
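As a rough illustration of the accuracy figures quoted in the abstract, the subtractor can be modeled behaviorally as an ideal difference plus a bounded error. The following minimal Python sketch is an assumption for illustration only (not the paper's transistor-level design); the 0.5% gain error is an invented placeholder, while the ±900 mV range and 1.65% bound come from the abstract:

```python
# Behavioral sketch of the analog subtractor: ideal difference plus a
# multiplicative gain error. The 0.5% gain error is an illustrative
# assumption, not a figure reported in the paper.
def subtract_mv(v1_mv: float, v2_mv: float, gain_err: float = 0.005) -> float:
    """Modeled subtractor output (mV) for inputs v1 - v2."""
    return (v1_mv - v2_mv) * (1.0 + gain_err)

def relative_error(ideal_mv: float, measured_mv: float) -> float:
    """Relative error of the modeled output against the ideal difference."""
    return abs(measured_mv - ideal_mv) / abs(ideal_mv)

# Sweep ideal differences across the stated -900 mV to +900 mV range and
# confirm the modeled error stays below the paper's 1.65% bound.
for ideal in (-900.0, -450.0, -50.0, 50.0, 450.0, 900.0):
    out = subtract_mv(ideal, 0.0)
    assert relative_error(ideal, out) < 0.0165
```

A real characterization would replace the fixed gain error with transistor-level simulation data, as the paper does.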
  • Fig. 1  Abstract diagram of the CIS signal-reception process at the very front end of a visual perception system

    Fig. 2  Two commonly used subtraction circuit architectures

    Fig. 3  Resistive analog subtractor completing three subtractions in two sampling operations

    Fig. 4  Operation timing diagram of the resistive analog subtractor

    Fig. 5  Switched-capacitor analog subtraction circuit performing three subtractions with a single op-amp and integrated sample-and-hold

    Fig. 6  Schematic of the subtractor's three subtraction operations

    Fig. 7  Conventional gate-voltage bootstrapped sampling switch

    Fig. 8  Proposed bootstrapped sampling switch

    Fig. 9  Transient simulation results of the bootstrapped sampling switch

    Fig. 10  DFT-based spectrum and dynamic performance of the bootstrapped sampling switch

    Fig. 11  Comparison of the ideal subtraction differences with the simulated outputs of the subtraction circuit

    Fig. 12  Error sampling curves for different subtractor difference outputs

    Fig. 13  Comparison of calculated and simulated values of the overall analog-domain subtraction circuit

    Fig. 14  Error sampling curves for different subtractor difference outputs

    Fig. 15  Curve of computed value versus energy consumption of the analog subtractor

    Table 1  Comparison of metrics between this design and other referenced works

    Metric | 2023 [22] | 2022 [23] | 2021 [24] | 2020 [25] | This work
    Process (nm) | 180 | 180 | 180 | 180 | 180
    Supply voltage (V) | 1.8 | 1.8 | 1.2 | 1.8 | 1.8
    Sampling rate (MS/s) | 50 | 50 | 50 | 1 | 100
    ENOB (bit) | 16.5 | 16.5 | 13.6 | 14 | 16.79
    SNDR (dB) | 101.1 | 101.11 | 83.7 | 86.94 | 102.85
    SFDR (dBc) | 101.8 | 101.83 | 83.9 | 87.32 | 103.09
    THD (dB) | N/A | –101.2 | –83.7 | –87.3 | –103.08
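The ENOB entries in Table 1 follow from the SNDR values via the standard conversion ENOB = (SNDR − 1.76)/6.02. As a quick sanity check of this work's 102.85 dB entry:

```python
def enob(sndr_db: float) -> float:
    """Effective number of bits from SNDR in dB (standard formula)."""
    return (sndr_db - 1.76) / 6.02

print(round(enob(102.85), 2))  # -> 16.79, matching the Table 1 entry
```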

    Table 2  Comparison of the proposed analog subtraction circuit with other designs

    Work | Type | Process (nm) | Supply voltage (V) | Total computation time (μs) | Energy (nJ) | Computation error (%)
    This work | Switched-capacitor analog subtractor | 180 | 1.8 | 0.02 | 0.0278b | <1.65
    [14] | KCL current-mode subtractor | 350 | – | 0.45a | 50000b | <3c
    [15] | Switched-capacitor analog subtractor | 180 | 0.56/0.8 | 1220a | 0.0125b | –
    [19] | Memristive analog subtractor | 180 | 4 | 0.2a | 0.019b | –
    [20] | Switched-capacitor analog subtractor | 180 | 1.8 | 2 | 8.67b | <12.73
    Notes: a: processing time per frame; b: energy per frame; c: pixel mismatch error; –: not reported
  • [1] HSU Y C and CHANG R C H. Intelligent chips and technologies for AIoT era[C]. Proceedings of 2020 IEEE Asian Solid-State Circuits Conference, Hiroshima, Japan, 2020: 1–4. doi: 10.1109/A-SSCC48613.2020.9336122.
    [2] HOCKLEY W E. The picture superiority effect in associative recognition[J]. Memory & Cognition, 2008, 36(7): 1351–1359. doi: 10.3758/MC.36.7.1351.
    [3] YAO Fenglin. Digital Image Processing and Application in Engineering[M]. Beijing: Beijing Institute of Technology Press, 2014. (in Chinese)
    [4] CHOI J, SHIN J, KANG Dongwu, et al. Always-on CMOS image sensor for mobile and wearable devices[J]. IEEE Journal of Solid-State Circuits, 2016, 51(1): 130–140. doi: 10.1109/JSSC.2015.2470526.
    [5] CHIOU A Y C and HSIEH C C. An ULV PWM CMOS imager with adaptive-multiple-sampling linear response, HDR imaging, and energy harvesting[J]. IEEE Journal of Solid-State Circuits, 2019, 54(1): 298–306. doi: 10.1109/JSSC.2018.2870559.
    [6] AL BAHOU A, KARUNARATNE G, ANDRI R, et al. XNORBIN: A 95 TOp/s/W hardware accelerator for binary convolutional neural networks[C]. Proceedings of 2018 IEEE Symposium in Low-Power and High-Speed Chips, Yokohama, Japan, 2018: 1–3. doi: 10.1109/CoolChips.2018.8373076.
    [7] KRESTINSKAYA O and JAMES A P. Binary weighted memristive analog deep neural network for near-sensor edge processing[C]. Proceedings of 2018 IEEE 18th International Conference on Nanotechnology, Cork, Ireland, 2018: 1–4. doi: 10.1109/NANO.2018.8626224.
    [8] ZHANG Jintao, WANG Zhuo, and VERMA N. In-memory computation of a machine-learning classifier in a standard 6T SRAM array[J]. IEEE Journal of Solid-State Circuits, 2017, 52(4): 915–924. doi: 10.1109/JSSC.2016.2642198.
    [9] JOKIC P, EMERY S, and BENINI L. BinaryEye: A 20 kfps streaming camera system on FPGA with real-time on-device image recognition using binary neural networks[C]. Proceedings of 2018 IEEE 13th International Symposium on Industrial Embedded Systems, Graz, Austria, 2018: 1–7. doi: 10.1109/SIES.2018.8442108.
    [10] LI Ziwei, XU Han, LIU Zheyu, et al. A 2.17μW@120fps ultra-low-power dual-mode CMOS image sensor with senputing architecture[C]. Proceedings of 2022 27th Asia and South Pacific Design Automation Conference, Taipei, China, 2022: 92–93. doi: 10.1109/ASP-DAC52403.2022.9712591.
    [11] TAKAHASHI N, FUJITA K, and SHIBATA T. A pixel-parallel self-similitude processing for multiple-resolution edge-filtering analog image sensors[J]. IEEE Transactions on Circuits and Systems I: Regular Papers, 2009, 56(11): 2384–2392. doi: 10.1109/TCSI.2009.2015598.
    [12] TAKAHASHI N and SHIBATA T. A row-parallel cyclic-line-access edge detection CMOS image sensor employing global thresholding operation[C]. Proceedings of 2010 IEEE International Symposium on Circuits and Systems, Paris, France, 2010: 625–628. doi: 10.1109/ISCAS.2010.5537512.
    [13] DORZHIGULOV A, BERDALIYEV Y, and JAMES A P. Coarse to fine difference edge detection with binary neural firing model[C]. Proceedings of 2017 International Conference on Advances in Computing, Communications and Informatics, Udupi, India, 2017: 1098–1102. doi: 10.1109/ICACCI.2017.8125988.
    [14] GARCIA-LAMONT J. Analogue CMOS prototype vision chip with Prewitt edge processing[J]. Analog Integrated Circuits and Signal Processing, 2012, 71(3): 507–514. doi: 10.1007/s10470-011-9694-6.
    [15] CHIU M Y, CHEN Guancheng, HUANG Y H, et al. A 0.56V/0.8V vision sensor with temporal contrast pixel and column-parallel local binary pattern extraction for dynamic depth sensing using stereo vision[C]. Proceedings of 2022 IEEE Asian Solid-State Circuits Conference, Taipei, China, 2022: 1–3. doi: 10.1109/A-SSCC56115.2022.9980799.
    [16] LIU Liqiao, REN Xu, ZHAO Kai, et al. FD-SOI-based pixel with real-time frame difference for motion extraction and image preprocessing[J]. IEEE Transactions on Electron Devices, 2023, 70(2): 594–599. doi: 10.1109/TED.2022.3231573.
    [17] XU Han, LIN Ningchao, LUO Li, et al. Senputing: An ultra-low-power always-on vision perception chip featuring the deep fusion of sensing and computing[J]. IEEE Transactions on Circuits and Systems I: Regular Papers, 2022, 69(1): 232–243. doi: 10.1109/TCSI.2021.3090668.
    [18] JAKLIN M, GARCÍA-LESTA D, BREA V M, et al. Low-power techniques on a CMOS vision sensor chip for event generation by frame differencing with high dynamic range[C]. Proceedings of 2022 29th IEEE International Conference on Electronics, Circuits and Systems, Glasgow, United Kingdom, 2022: 1–4. doi: 10.1109/ICECS202256217.2022.9970907.
    [19] KRESTINSKAYA O and JAMES A P. Real-time analog pixel-to-pixel dynamic frame differencing with memristive sensing circuits[C]. Proceedings of 2018 IEEE SENSORS, New Delhi, India, 2018: 1–4. doi: 10.1109/ICSENS.2018.8589849.
    [20] NAZHAMAITI M, XU Han, LIU Zheyu, et al. NS-MD: Near-sensor motion detection with energy harvesting image sensor for always-on visual perception[J]. IEEE Transactions on Circuits and Systems II: Express Briefs, 2021, 68(9): 3078–3082. doi: 10.1109/TCSII.2021.3087840.
    [21] ABO A M and GRAY P R. A 1.5-V, 10-bit, 14.3-MS/s CMOS pipeline analog-to-digital converter[J]. IEEE Journal of Solid-State Circuits, 1999, 34(5): 599–606. doi: 10.1109/4.760369.
    [22] CAO Chao, ZHAO Wei, FAN Jihui, et al. A complementary high linearity bootstrap switch based on negative voltage bootstrap capacitor[J]. Microelectronics Journal, 2023, 133: 105695. doi: 10.1016/j.mejo.2023.105695.
    [23] ZHAO Wei, CAO Chao, FAN Jihui, et al. Improved complementary bootstrap switch based on negative voltage bootstrap capacitance[C]. Proceedings of 2022 IEEE 16th International Conference on Solid-State & Integrated Circuit Technology, Nanjing, China, 2022: 1–3. doi: 10.1109/ICSICT55466.2022.9963367.
    [24] WEI Cong, WEI Rongshan, and HE Minghua. Bootstrapped switch with improved linearity based on a negative-voltage bootstrapped capacitor[J]. IEICE Electronics Express, 2021, 18(7): 20210062. doi: 10.1587/elex.18.20210062.
    [25] KHAJEH M G and SOBHI J. An 87-dB-SNDR 1MS/s bilateral bootstrapped CMOS switch for sample-and-hold circuit[C]. Proceedings of 2020 28th Iranian Conference on Electrical Engineering, Tabriz, Iran, 2020: 1–5. doi: 10.1109/ICEE50131.2020.9260778.
Publication history
  • Received: 2023-10-10
  • Revised: 2024-08-24
  • Available online: 2024-08-30
  • Published: 2024-09-26
