Citation: CHEN Lei, HUANG Zaichao, LIU Chuan, ZHANG Weiwei. For Electric Power Disaster Early Warning Scenarios: A Large Model and Lightweight Models Joint Deployment Scheme Based on Limited Spectrum Resources[J]. Journal of Electronics & Information Technology. doi: 10.11999/JEIT250321.
[1] GAO Jianfeng, WANG Yan, and JIN Juanhua. Fire early warning algorithm based on QPSO-BP neural network[J]. Fire Science and Technology, 2020, 39(10): 1345–1349. doi: 10.3969/j.issn.1009-0029.2020.10.004.
[2] ABDALZAHER M S, ELSAYED H A, FOUDA M M, et al. Employing machine learning and IoT for earthquake early warning system in smart cities[J]. Energies, 2023, 16(1): 495. doi: 10.3390/en16010495.
[3] DOU Jie, XIANG Zilin, XU Qiang, et al. Application and development trend of machine learning in landslide intelligent disaster prevention and mitigation[J]. Earth Science, 2023, 48(5): 1657–1674. doi: 10.3799/dqkx.2022.419.
[4] SANDERSON K. GPT-4 is here: What scientists think[J]. Nature, 2023, 615(7954): 773. doi: 10.1038/d41586-023-00816-5.
[5] LI Gang, FANG Hong, LIU Yunpeng, et al. Large-model drive technology in new power system: Status, challenges and prospects[J]. High Voltage Engineering, 2024, 50(7): 2864–2878. doi: 10.13336/j.1003-6520.hve.20240863.
[6] CHE Wanxiang, DOU Zhicheng, FENG Yansong, et al. Towards a comprehensive understanding of the impact of large language models on natural language processing: Challenges, opportunities and future directions[J]. Scientia Sinica Informationis, 2023, 53(9): 1645–1687. doi: 10.1360/SSI-2023-0113.
[7] LI Li, SHI Rongliang, GUO Xu, et al. Diagnosis of power system defects by large language models and graph neural networks[J]. Journal of Frontiers of Computer Science and Technology, 2024, 18(10): 2643–2655. doi: 10.3778/j.issn.1673-9418.2405085.
[8] DETTMERS T, LEWIS M, BELKADA Y, et al. GPT3.int8(): 8-bit matrix multiplication for transformers at scale[C]. Proceedings of the 36th Conference on Neural Information Processing Systems, New Orleans, USA, 2022: 30318–30332.
[9] XIAO Guangxuan, LIN Ji, SEZNEC M, et al. SmoothQuant: Accurate and efficient post-training quantization for large language models[C]. Proceedings of the 40th International Conference on Machine Learning, Honolulu, USA, 2023: 38087–38099.
[10] XIA Haojun, ZHENG Zhen, LI Yuchao, et al. Flash-LLM: Enabling cost-effective and highly-efficient large generative model inference with unstructured sparsity[J]. Proceedings of the VLDB Endowment, 2023, 17(2): 211–224. doi: 10.14778/3626292.3626303.
[11] HU E J, SHEN Yelong, WALLIS P, et al. LoRA: Low-rank adaptation of large language models[C]. Proceedings of the 10th International Conference on Learning Representations, Virtual Event, 2022: 3.
[12] LI Sunzhu, ZHANG Peng, GAN Guobing, et al. Hypoformer: Hybrid decomposition transformer for edge-friendly neural machine translation[C]. Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, Abu Dhabi, United Arab Emirates, 2022: 7056–7068. doi: 10.18653/v1/2022.emnlp-main.475.
[13] GU Yuxian, DONG Li, WEI Furu, et al. MiniLLM: Knowledge distillation of large language models[C]. Proceedings of the 12th International Conference on Learning Representations, Vienna, Austria, 2024.
[14] ZHANG Shubin, TONG Xun, CHI Kaikai, et al. Stackelberg game-based multi-agent algorithm for resource allocation and task offloading in MEC-enabled C-ITS[J]. IEEE Transactions on Intelligent Transportation Systems, 2025. doi: 10.1109/TITS.2025.3553487. (Volume, issue, and page numbers not yet available; please confirm.)
[15] DAI Linglong, WANG Bichai, DING Zhiguo, et al. A survey of non-orthogonal multiple access for 5G[J]. IEEE Communications Surveys & Tutorials, 2018, 20(3): 2294–2323. doi: 10.1109/COMST.2018.2835558.
[16] SHI Zhenjiang and LIU Jiajia. A novel NOMA-enhanced SDT scheme for NR RedCap in 5G/B5G systems[J]. IEEE Transactions on Wireless Communications, 2024, 23(4): 3190–3204. doi: 10.1109/TWC.2023.3306342.
[17] BYERS R. A bisection method for measuring the distance of a stable matrix to the unstable matrices[J]. SIAM Journal on Scientific and Statistical Computing, 1988, 9(5): 875–881. doi: 10.1137/0909059.
[18] KENNEDY J and EBERHART R. Particle swarm optimization[C]. Proceedings of ICNN'95 - International Conference on Neural Networks, Perth, Australia, 1995: 1942–1948. doi: 10.1109/ICNN.1995.488968.
[19] WANG Yichen, WANG Tao, YANG Zihuan, et al. Throughput-oriented non-orthogonal random access scheme for massive MTC networks[J]. IEEE Transactions on Communications, 2020, 68(3): 1777–1793. doi: 10.1109/TCOMM.2019.2957767.
[20] SHI Zhenjiang and LIU Jiajia. Massive access in 5G and beyond ultra-dense networks: An MARL-based NORA scheme[J]. IEEE Transactions on Communications, 2023, 71(4): 2170–2183. doi: 10.1109/TCOMM.2023.3244958.