A Novel GAN-Based Network for Unmasking of Masked Face

Abstract

Recent deep learning based image editing methods have achieved promising results for removing objects from an image, but they fail to produce plausible results when removing large, complex objects, especially in facial images. The objective of this work is to remove mask objects from facial images. The problem is challenging because (1) facial masks usually cover a large region of the face, often extending beyond the actual face boundary below the chin, and (2) paired facial images with and without the mask object are not available for training. We break the problem into two stages: mask object detection and image completion of the removed mask region. The first stage of our model automatically produces a binary segmentation of the mask region. The second stage then removes the mask and synthesizes the affected region with fine detail while retaining the global coherency of the face structure. For this, we employ a GAN-based network with two discriminators: one helps learn the global structure of the face, while the other focuses learning on the deep missing region. To train our model in a supervised manner, we create a paired synthetic dataset from the publicly available CelebA dataset and evaluate the model on real-world images collected from the Internet. Our model outperforms other representative state-of-the-art approaches both qualitatively and quantitatively.
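To make the two-discriminator completion stage concrete, the following PyTorch sketch pairs a toy inpainting generator with a global discriminator (whole face) and a local discriminator (masked region only). All layer sizes, loss weights, and helper names are illustrative assumptions for this sketch, not the authors' exact architecture.

```python
# Minimal sketch of a completion network trained against two discriminators:
# one judging the whole composited face, one judging only the filled-in region.
import torch
import torch.nn as nn

class Completion(nn.Module):
    """Toy encoder-decoder that inpaints the masked face region."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 64, 4, 2, 1), nn.ReLU(),      # input: RGB + binary mask
            nn.Conv2d(64, 128, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh(),
        )

    def forward(self, img, mask):
        # Zero out the masked pixels and hand the mask to the network as a channel.
        return self.net(torch.cat([img * (1 - mask), mask], dim=1))

def patch_disc(in_ch=3):
    """PatchGAN-style critic reused for both the global and local discriminators."""
    return nn.Sequential(
        nn.Conv2d(in_ch, 64, 4, 2, 1), nn.LeakyReLU(0.2),
        nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2),
        nn.Conv2d(128, 1, 4, 1, 1),
    )

G, D_global, D_local = Completion(), patch_disc(), patch_disc()
bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

def generator_loss(img, mask):
    """Generator objective: fool both discriminators plus L1 reconstruction."""
    fake = G(img, mask)
    composite = img * (1 - mask) + fake * mask   # keep the known (unmasked) pixels
    local_fake = composite * mask                # crude stand-in for a region crop
    pred_g, pred_l = D_global(composite), D_local(local_fake)
    adv = bce(pred_g, torch.ones_like(pred_g)) + bce(pred_l, torch.ones_like(pred_l))
    return adv + 10.0 * l1(composite, img)       # weight 10.0 is an arbitrary choice

# Smoke test on random tensors; a real run would use the paired synthetic
# CelebA-based dataset mentioned in the abstract.
img = torch.rand(2, 3, 64, 64) * 2 - 1
mask = torch.zeros(2, 1, 64, 64)
mask[:, :, 24:48, 16:48] = 1.0                   # hypothetical mask-region box
print(generator_loss(img, mask).item())
```

In a full training loop the two discriminators would be updated on real versus composited faces with the usual adversarial targets; the sketch above only shows the generator side to illustrate how the global and local critics are combined.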
