IEEE Transactions on Biometrics, Behavior, and Identity Science
Resolution Invariant Face Recognition Using a Distillation Approach



Abstract

Modern face recognition systems extract face representations using deep neural networks (DNNs) and give excellent identification and verification results when tested on high resolution (HR) images. However, the performance of such algorithms degrades significantly on low resolution (LR) images. A straightforward solution would be to train a DNN on high and low resolution face images simultaneously. This approach yields a definite improvement at lower resolutions but suffers a performance degradation on high resolution images. To overcome this shortcoming, we propose to train a network using both HR and LR images under the guidance of a fixed network pretrained on HR face images. The guidance is provided by minimising the KL-divergence between the output Softmax probabilities of the pretrained (i.e., Teacher) and trainable (i.e., Student) networks, as well as by sharing the Softmax weights between the two networks. The resulting solution is tested on down-sampled images from the FaceScrub and MegaFace datasets and shows a consistent performance improvement across various resolutions. We also tested our proposed solution on standard LR benchmarks such as TinyFace and SCFace. Our algorithm consistently outperforms state-of-the-art methods on these datasets, confirming the effectiveness and merits of the proposed method.
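The KL-divergence guidance described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the logits below are made-up values standing in for the Teacher's output on an HR crop and the Student's output on its LR counterpart, and the shared Softmax-weight constraint is not shown.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to a probability distribution (optionally softened by T)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions over the same classes."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical logits: the fixed Teacher sees the HR image,
# the trainable Student sees the down-sampled LR version.
teacher_logits = [4.0, 1.0, 0.5]
student_logits = [2.5, 1.5, 1.0]

p_teacher = softmax(teacher_logits)
p_student = softmax(student_logits)

# Distillation term: minimising this pushes the Student's output
# distribution toward the Teacher's.
distill_loss = kl_divergence(p_teacher, p_student)
```

In training, this term would be computed per batch and added to the usual classification loss, so the Student learns LR-robust features without drifting from the HR-pretrained Teacher's behaviour.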
