Citation: ZHANG Sheng, ZHENG ShengNan, SHEN Jie, YIN Xinghui, XU Lizhong. Review on Olfactory and Visual Neural Pathways in Drosophila[J]. Journal of Electronics & Information Technology. doi: 10.11999/JEIT230508

Review on Olfactory and Visual Neural Pathways in Drosophila

doi: 10.11999/JEIT230508
Funds:  The National Natural Science Foundation of China (51979085)
  • Received Date: 2023-05-29
  • Rev Recd Date: 2024-04-01
  • Available Online: 2024-04-19
  • The olfactory and visual neural systems of Drosophila are highly sensitive to olfactory and visual stimuli in the natural environment, and their single-modal perception and cross-modal collaborative decision-making mechanisms offer valuable inspiration for bionic applications. Firstly, the current research status of the physiological mechanisms and computational models underlying single-modal perception and decision-making in Drosophila olfactory and visual neurons is summarized in three parts: the capture, processing, and decision-making stages of olfactory and visual signals. The physiological mechanisms and computational models of cross-modal collaborative decision-making between olfactory and visual neurons are then expounded. Next, typical bionic applications of single-modal perception and cross-modal collaboration in Drosophila are reviewed. Finally, the current challenges in studying the physiological mechanisms and computational models of the Drosophila olfactory and visual neural pathways are summarized and future development trends are discussed, laying a foundation for further research.
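As a concrete illustration of the single-neuron computational models surveyed above (for example, the adaptive and generalized integrate-and-fire models of references [10] and [12]), the following minimal leaky integrate-and-fire sketch shows how a graded olfactory or visual drive can be converted into a spike train. It is an illustrative sketch only: the function name lif_spike_train and all parameter values (tau_m, v_th, r_m, and the step stimulus) are assumptions chosen for demonstration and are not drawn from any specific cited model.

```python
import numpy as np

def lif_spike_train(input_current, dt=1e-3, tau_m=20e-3,
                    v_rest=0.0, v_reset=0.0, v_th=1.0, r_m=1.0):
    """Minimal leaky integrate-and-fire neuron (illustrative parameters only)."""
    v = v_rest
    spikes = np.zeros(len(input_current))
    for t, i_t in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating the input drive.
        v += (-(v - v_rest) + r_m * i_t) * (dt / tau_m)
        if v >= v_th:        # threshold crossing: emit a spike and reset
            spikes[t] = 1.0
            v = v_reset
    return spikes

# Example: a step of "odor concentration" yields a roughly rate-coded spike response.
stimulus = np.concatenate([np.zeros(200), 2.0 * np.ones(600), np.zeros(200)])
print(int(lif_spike_train(stimulus).sum()), "spikes")
```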
  • [1]
    CHOW D M L. Visuo-olfactory integration in Drosophila flight control: Neural circuits, behavior, and ecological implications[D]. [Ph. D. dissertation], University of California, 2011.
    [2]
    DVOŘÁČEK J, BEDNÁŘOVÁ A, KRISHNAN N, et al. Dopaminergic mushroom body neurons in Drosophila: Flexibility of neuron identity in a model organism?[J]. Neuroscience & Biobehavioral Reviews, 2022, 135: 104570. doi: 10.1016/j.neubiorev.2022.104570.
    [3]
    JOVANIC T. Studying neural circuits of decision-making in Drosophila larva[J]. Journal of Neurogenetics, 2020, 34(1): 162–170. doi: 10.1080/01677063.2020.1719407.
    [4]
    郭爱克, 彭岳清, 张柯, 等. 小虫春秋: 果蝇的视觉学习记忆与认知[J]. 自然杂志, 2009, 31(2): 63–68. doi: 10.3969/j.issn.0253-9608.2009.02.001.

    GUO Aike, PENG Yueqing, ZHANG Ke, et al. Insect behavior: Visual cognition in fruit fly[J]. Chinese Journal of Nature, 2009, 31(2): 63–68. doi: 10.3969/j.issn.0253-9608.2009.02.001.
    [5]
    张式兴. 多巴胺在果蝇惩罚性嗅觉记忆保持中的作用[D]. [博士论文], 中国科学院生物物理研究所, 2008.

    ZHANG Shixing. The effect of dopamine on the aversive olfactory memory retention in Drosophila[D]. [Ph. D. dissertation], Institute of Biophysics, Chinese Academy of Sciences, 2008.
    [6]
    章盛, 沈洁, 郑胜男, 等. 类果蝇复眼视觉神经计算建模及仿生应用研究综述[J]. 红外技术, 2023, 45(3): 229–240.

    ZHANG Sheng, SHEN Jie, ZHENG Shengnan, et al. Research review of Drosophila-like compound eye visual neural computational modeling and bionic applications[J]. Infrared Technology, 2023, 45(3): 229–240.
    [7]
    VOGT K, SCHNAITMANN C, DYLLA K V, et al. Shared mushroom body circuits underlie visual and olfactory memories in Drosophila[J]. eLife, 2014, 3: e02395. doi: 10.7554/eLife.02395.
    [8]
    张单可. 神经信息处理的简单模型研究[D]. [博士论文], 华南理工大学, 2013.

    ZHANG Danke. Researches of simple models on neural information processing[D]. [Ph. D. dissertation], South China University of Technology, 2013.
    [9]
    BACHTIAR L R, UNSWORTH C P, and NEWCOMB R D. Using multilayer perceptron computation to discover ideal insect olfactory receptor combinations in the mosquito and fruit fly for an efficient electronic nose[J]. Neural Computation, 2015, 27(1): 171–201. doi: 10.1162/NECO_a_00691.
    [10]
    LEVAKOVA M, KOSTAL L, MONSEMPÈS C, et al. Adaptive integrate-and-fire model reproduces the dynamics of olfactory receptor neuron responses in a moth[J]. Journal of the Royal Society Interface, 2019, 16(157): 20190246. doi: 10.1098/rsif.2019.0246.
    [11]
    LAZAR A A, LIU Tingkai, and YEH C H. An odorant encoding machine for sampling, reconstruction and robust representation of odorant identity[C]. 2020 IEEE International Conference on Acoustics, Speech and Signal Processing, Barcelona, Spain, 2020: 1743–1747. doi: 10.1109/icassp40776.2020.9054588.
    [12]
    TEETER C, IYER R, MENON V, et al. Generalized leaky integrate-and-fire models classify multiple neuron types[J]. Nature Communications, 2018, 9(1): 709. doi: 10.1038/s41467-017-02717-4.
    [13]
    张铁林, 李澄宇, 王刚, 等. 适合类脑脉冲神经网络的应用任务范式分析与展望[J]. 电子与信息学报, 2023, 45(8): 2675–2688. doi: 10.11999/JEIT221459.

    ZHANG Tielin, LI Chengyu, WANG Gang, et al. Research advances and new paradigms for biology-inspired spiking neural networks[J]. Journal of Electronics & Information Technology, 2023, 45(8): 2675–2688. doi: 10.11999/JEIT221459.
    [14]
    刘伟, 王桂荣. 昆虫嗅觉中枢系统对外周信号的整合编码研究进展[J]. 昆虫学报, 2020, 63(12): 1536–1545. doi: 10.16380/j.kcxb.2020.12.012.

    LIU Wei and WANG Guirong. Research progress of integrated coding of peripheral olfactory signals in the central nervous system of insects[J]. Acta Entomologica Sinica, 2020, 63(12): 1536–1545. doi: 10.16380/j.kcxb.2020.12.012.
    [15]
    PEHLEVAN C, GENKIN A, and CHKLOVSKII D B. A clustering neural network model of insect olfaction[C]. The 51st Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, USA, 2017: 593–600. doi: 10.1109/acssc.2017.8335410.
    [16]
    ZHANG Y L and SHARPEE T O. A robust feedforward model of the olfactory system[J]. PLoS Computational Biology, 2016, 12(4): e1004850. doi: 10.1371/journal.pcbi.1004850.
    [17]
    LEE H, KOSTAL L, KANZAKI R, et al. Spike frequency adaptation facilitates the encoding of input gradient in insect olfactory projection neurons[J]. Biosystems, 2023, 223: 104802. doi: 10.1016/j.biosystems.2022.104802.
    [18]
    DAS CHAKRABORTY S and SACHSE S. Olfactory processing in the lateral horn of Drosophila[J]. Cell and Tissue Research, 2021, 383(1): 113–123. doi: 10.1007/s00441-020-03392-6.
    [19]
    尹艳. 果蝇的嗅觉抉择策略是基于其记忆间的竞争[D]. [博士论文], 中国科学院生物物理研究所, 2009.

    YIN Yan. Choice strategies in Drosophila are based on competition between olfactory memories[D]. [Ph. D. dissertation], Institute of Biophysics, Chinese Academy of Sciences, 2009.
    [20]
    田月. 嗅觉信息在果蝇蘑菇体中的传递及呈现方式的探究[D]. [博士论文], 中国科学院大学, 2020.

    TIAN Yue. Transmission and representation of olfactory information in the Drosophila mushroom body[D]. [Ph. D. dissertation], University of Chinese Academy of Sciences, 2020.
    [21]
    ROHLFS C. A descriptive analysis of olfactory sensation and memory in Drosophila and its relation to artificial neural networks[J]. Neurocomputing, 2023, 518: 15–29. doi: 10.1016/j.neucom.2022.10.068.
    [22]
    KENNEDY A. Learning with naturalistic odor representations in a dynamic model of the Drosophila olfactory system[J]. bioRxiv, 2019. doi: 10.1101/783191.
    [23]
    SPRINGER M and NAWROT M P. A mechanistic model for reward prediction and extinction learning in the fruit fly[J]. eNeuro, 2021, 8(3): ENEURO.0549-20.2021. doi: 10.1523/ENEURO.0549-20.2021.
    [24]
    MOSQUEIRO T S and HUERTA R. Computational models to understand decision making and pattern recognition in the insect brain[J]. Current Opinion in Insect Science, 2014, 6: 80–85. doi: 10.1016/j.cois.2014.10.005.
    [25]
    YUE Shigang and RIND F C. A collision detection system for a mobile robot inspired by the locust visual system[C]. The 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, 2005: 3832–3837. doi: 10.1109/ROBOT.2005.1570705.
    [26]
    WANG Hongxin, PENG Jigen, and YUE Shigang. An improved LPTC neural model for background motion direction estimation[C]. 2017 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics, Lisbon, Portugal, 2017: 47–52. doi: 10.1109/DEVLRN.2017.8329786.
    [27]
    WANG Hongxin, PENG Jigen, and YUE Shigang. Bio-inspired small target motion detector with a new lateral inhibition mechanism[C]. 2016 International Joint Conference on Neural Networks, Vancouver, Canada, 2016: 4751–4758. doi: 10.1109/IJCNN.2016.7727824.
    [28]
    FU Qinbing and YUE Shigang. Modelling Drosophila motion vision pathways for decoding the direction of translating objects against cluttered moving backgrounds[J]. Biological Cybernetics, 2020, 114(4/5): 443–460. doi: 10.1007/s00422-020-00841-x.
    [29]
    BEHNIA R, CLARK D A, CARTER A G, et al. Processing properties of ON and OFF pathways for Drosophila motion detection[J]. Nature, 2014, 512(7515): 427–430. doi: 10.1038/nature13427.
    [30]
    WANG Hongxin, PENG Jigen, and YUE Shigang. A feedback neural network for small target motion detection in cluttered backgrounds[C]. The 27th International Conference on Artificial Neural Networks, Rhodes, Greece, 2018: 728–737. doi: 10.1007/978-3-030-01424-7_71.
    [31]
    FU Qinbing, BELLOTTO N, and YUE Shigang. A directionally selective neural network with separated ON and OFF pathways for translational motion perception in a visually cluttered environment[J]. arXiv: 1808.07692, 2018.
    [32]
    STROTHER J A, WU S T, WONG A M, et al. The emergence of directional selectivity in the visual motion pathway of Drosophila[J]. Neuron, 2017, 94(1): 168–182.e10. doi: 10.1016/j.neuron.2017.03.010.
    [33]
    GRUNTMAN E, ROMANI S, and REISER M B. The computation of directional selectivity in the Drosophila OFF motion pathway[J]. eLife, 2019, 8: e50706. doi: 10.7554/eLife.50706.
    [34]
    FU Qinbing and YUE Shigang. Mimicking fly motion tracking and fixation behaviors with a hybrid visual neural network[C]. 2017 IEEE International Conference on Robotics and Biomimetics, Macau, China, 2017: 1636–1641. doi: 10.1109/ROBIO.2017.8324652.
    [35]
    FU Qinbing, BELLOTTO N, WANG Huatian, et al. A visual neural network for robust collision perception in vehicle driving scenarios[C]. The 15th IFIP Advances in Information and Communication Technology, Hersonissos, Greece, 2019, 559: 67–79. doi: 10.1007/978-3-030-19823-7_5.
    [36]
    WANG Hongxin, PENG Jigen, ZHENG Xuqiang, et al. A robust visual system for small target motion detection against cluttered moving backgrounds[J]. IEEE Transactions on Neural Networks and Learning Systems, 2020, 31(3): 839–853. doi: 10.1109/tnnls.2019.2910418.
    [37]
    WU Zhihua and GUO Aike. Bioinspired figure-ground discrimination via visual motion smoothing[J]. PLoS Computational Biology, 2023, 19(4): e1011077. doi: 10.1371/journal.pcbi.1011077.
    [38]
    CLARK D A, BURSZTYN L, HOROWITZ M A, et al. Defining the computational structure of the motion detector in Drosophila[J]. Neuron, 2011, 70(6): 1165–1177. doi: 10.1016/j.neuron.2011.05.023.
    [39]
    SCHNEIDER J, MURALI N, TAYLOR G W, et al. Can Drosophila melanogaster tell who's who?[J]. PLoS One, 2018, 13(10): e0205043. doi: 10.1371/journal.pone.0205043.
    [40]
    HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C]. 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, 2016: 770–778. doi: 10.1109/cvpr.2016.90.
    [41]
    ZHAO Feifei, ZENG Yi, GUO Aike, et al. A neural algorithm for Drosophila linear and nonlinear decision-making[J]. Scientific Reports, 2020, 10(1): 18660. doi: 10.1038/s41598-020-75628-y.
    [42]
    WEI Hui, BU Yijie, and DAI Dawei. A decision-making model based on a spiking neural circuit and synaptic plasticity[J]. Cognitive Neurodynamics, 2017, 11(5): 415–431. doi: 10.1007/s11571-017-9436-2.
    [43]
    CAI Kuijie, SHEN Jihong, and WU Si. Decision-making in Drosophila with two conflicting cues[C]. The 8th International Symposium on Neural Networks, Guilin, China, 2011: 93–100. doi: 10.1007/978-3-642-21105-8_12.
    [44]
    WU Zhihua and GUO Aike. A model study on the circuit mechanism underlying decision-making in Drosophila[J]. Neural Networks, 2011, 24(4): 333–344. doi: 10.1016/j.neunet.2011.01.002.
    [45]
    FU Qinbing, HU Cheng, PENG Jigen, et al. Shaping the collision selectivity in a looming sensitive neuron model with parallel ON and OFF pathways and spike frequency adaptation[J]. Neural Networks, 2018, 106: 127–143. doi: 10.1016/j.neunet.2018.04.001.
    [46]
    GUO Jianzeng and GUO Aike. Crossmodal interactions between olfactory and visual learning in Drosophila[J]. Science, 2005, 309(5732): 307–310. doi: 10.1126/science.1111280.
    [47]
    HEISENBERG M and GERBER B. Behavioral analysis of learning and memory in Drosophila[J]. Learning and Memory: A Comprehensive Reference, 2008, 1: 549–559. doi: 10.1016/B978-012370509-9.00066-8.
    [48]
    李昊. 果蝇嗅觉学习的神经机制以及多巴胺系统的作用[D]. [博士论文], 中国科学院大学, 2011.

    LI Hao. Neural mechanism of olfactory learning in Drosophila[D]. [Ph. D. dissertation], University of Chinese Academy of Sciences, 2011.
    [49]
    LIANG Yuchen, RYALI C K, HOOVER B, et al. Can a fruit fly learn word embeddings?[C]. The 9th International Conference on Learning Representations, Austria, 2021.
    [50]
    ZHOU Mingmin, CHEN Nannan, TIAN Jingsong, et al. Suppression of GABAergic neurons through D2-like receptor secures efficient conditioning in Drosophila aversive olfactory learning[J]. Proceedings of the National Academy of Sciences of the United States of America, 2019, 116(11): 5118–5125. doi: 10.1073/pnas.1812342116.
    [51]
    RAGUSO R A and WILLIS M A. Synergy between visual and olfactory cues in nectar feeding by wild hawkmoths, Manduca sexta[J]. Animal Behaviour, 2005, 69(2): 407–418. doi: 10.1016/j.anbehav.2004.04.015.
    [52]
    ZHANG Lizhen, ZHANG Shaowu, WANG Zilong, et al. Cross-modal interaction between visual and olfactory learning in Apis cerana[J]. Journal of Comparative Physiology A, 2014, 200(10): 899–909. doi: 10.1007/s00359-014-0934-y.
    [53]
    BALKENIUS A and BALKENIUS C. Multimodal interaction in the insect brain[J]. BMC Neuroscience, 2016, 17(1): 29. doi: 10.1186/s12868-016-0258-7.
    [54]
    HARRAP M J M, LAWSON D A, WHITNEY H M, et al. Cross-modal transfer in visual and nonvisual cues in bumblebees[J]. Journal of Comparative Physiology A, 2019, 205(3): 427–437. doi: 10.1007/s00359-019-01320-w.
    [55]
    刘金莹. 基于仿生忆阻神经网络的学习与记忆电路设计及应用[D]. [硕士论文], 西南大学, 2023.

    LIU Jinying. Bioinspired memristive neural network circuit design of learning and memory and its application[D]. [Master dissertation], Southwest University, 2023.
    [56]
    THOMAS A. Memristor-based neural networks[J]. Journal of Physics D: Applied Physics, 2013, 46(9): 093001. doi: 10.1088/0022-3727/46/9/093001.
    [57]
    DAI Xinyu, HUO Dexuan, GAO Zhanyuan, et al. A visual-olfactory multisensory fusion spike neural network for early fire/smoke detection[J]. Research Square, 2023. doi: 10.21203/rs.3.rs-3192562/v1.
    [58]
    LU Yanli and LIU Qingjun. Insect olfactory system inspired biosensors for odorant detection[J]. Sensors & Diagnostics, 2022, 1(6): 1126–1142. doi: 10.1039/D2SD00112H.
    [59]
    HORIBE J, ANDO N, and KANZAKI R. Odor-searching robot with insect-behavior-based olfactory sensor[J]. Sensors and Materials, 2021, 33(12): 4185–4202. doi: 10.18494/SAM.2021.3369.
    [60]
    GARDNER J W and BARTLETT P N. A brief history of electronic noses[J]. Sensors and Actuators B: Chemical, 1994, 18(1/3): 210–211. doi: 10.1016/0925-4005(94)87085-3.
    [61]
    ALI M M, HASHIM N, AZIZ S A, et al. Principles and recent advances in electronic nose for quality inspection of agricultural and food products[J]. Trends in Food Science & Technology, 2020, 99: 1–10. doi: 10.1016/j.tifs.2020.02.028.
    [62]
    WASILEWSKI T, BRITO N F, SZULCZYŃSKI B, et al. Olfactory receptor-based biosensors as potential future tools in medical diagnosis[J]. TrAC Trends in Analytical Chemistry, 2022, 150: 116599. doi: 10.1016/j.trac.2022.116599.
    [63]
    DWECK H K M, EBRAHIM S A M, RETZKE T, et al. The olfactory logic behind fruit odor preferences in larval and adult Drosophila[J]. Cell Reports, 2018, 23(8): 2524–2531. doi: 10.1016/j.celrep.2018.04.085.
    [64]
    宋岩峰, 郝群, 曹杰, 等. 大视场仿生复眼光电成像探测技术的研究发展[J]. 红外与激光工程, 2022, 51(5): 20210593. doi: 10.3788/IRLA20210593.

    SONG Yanfeng, HAO Qun, CAO Jie, et al. Research and development of wide-field-of-view bionic compound eye photoelectric imaging detection technology[J]. Infrared and Laser Engineering, 2022, 51(5): 20210593. doi: 10.3788/IRLA20210593.
    [65]
    XU G J W, GUO Kun, PARK S H, et al. Bio-inspired vision mimetics toward next-generation collision-avoidance automation[J]. The Innovation, 2023, 4(1): 100368. doi: 10.1016/j.xinn.2022.100368.
    [66]
    HOLEŠOVSKÝ O, ŠKOVIERA R, HLAVÁČ V, et al. Experimental comparison between event and global shutter cameras[J]. Sensors, 2021, 21(4): 1137. doi: 10.3390/s21041137.