NAFOSTED Conference on Information and Computer Science

Lightweight Network for Vietnamese Landmark Recognition based on Knowledge Distillation



Abstract

In the modern world, smart devices such as mobile phones and IoT devices have become the norm, driving a vast increase in demand for a smart ecosystem. Among the technologies being researched and applied, there is a trend of embedding Artificial Intelligence modules on these devices. One of the most challenging problems in embedding models on smart devices is maintaining good accuracy while reducing computational cost and inference time. State-of-the-art Deep Convolutional Neural Networks cannot run on smart devices due to a lack of resources. The need for such a model motivates our proposal of a lightweight network for landmark recognition using knowledge distillation. Our purpose is not to create a network with higher accuracy; instead, we devise a fast and light neural network that retains accuracy close to that of SOTA models by utilizing knowledge distillation. Our proposed student model achieves a decent result, with accuracy 7.33% lower than the teacher SOTA model (91.8%), while decreasing processing time by 73.04%. Our experimental results show promising potential for further exploration and research in knowledge distillation. We have also collected a dataset of Vietnamese landmarks for our experiments. This data can be used to train similar networks for Vietnamese landmark recognition or other related purposes.
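The abstract does not give the training details, but the standard knowledge-distillation recipe it refers to blends a soft-target term (the student matching the teacher's temperature-softened output distribution) with the usual hard-label cross-entropy. A minimal pure-Python sketch of that loss, assuming Hinton-style distillation with hypothetical `temperature` and `alpha` settings (the paper's actual hyperparameters are not stated here):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling; higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.7):
    """Blend a soft (teacher) KL term with a hard-label cross-entropy term.

    alpha weights the soft term; (1 - alpha) weights the hard term. The soft
    term is scaled by T^2 to keep its gradient magnitude comparable to the
    hard term, as in the standard distillation formulation.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student_soft = softmax(student_logits, temperature)
    # KL(teacher || student) over the temperature-softened distributions
    soft_loss = sum(p * math.log(p / q)
                    for p, q in zip(p_teacher, p_student_soft) if p > 0)
    # standard cross-entropy against the one-hot ground-truth label
    p_student = softmax(student_logits)
    hard_loss = -math.log(p_student[true_label])
    return alpha * (temperature ** 2) * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label term remains; a mismatched teacher distribution adds a penalty that pulls the lightweight student toward the SOTA model's behavior.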
