Abstract
Face swapping refers to the task of changing the appearance of a face in an image by replacing it with the appearance of a face taken from another image, in an effort to produce an authentic-looking result. We describe a method for face swapping that does not require training on the faces being swapped and can easily be applied even when face images are unconstrained and arbitrarily paired. Our method offers the following contributions: (a) Instead of tailoring systems for face segmentation, as others have previously proposed, we show that a standard fully convolutional network (FCN) can achieve remarkably fast and accurate segmentation, provided that it is trained on a sufficiently rich example set. For this purpose, we describe novel data collection and generation routines that provide challenging segmented face examples. (b) We use our segmentations for robust face swapping under unprecedented conditions, without requiring subject-specific data or training. (c) Unlike previous work, our swapping is robust enough to allow for extensive quantitative tests. To this end, we use the Labeled Faces in the Wild (LFW) benchmark and measure how intra- and inter-subject face swapping affects face recognition. We show that intra-subject swapped faces remain as recognizable as their sources, testifying to the effectiveness of our method. In line with established perceptual studies, we show that better face swapping produces less recognizable inter-subject results (see, e.g., Fig. 2.1). This is the first time this effect has been demonstrated quantitatively by a machine vision method. Some of the material in this chapter previously appeared in [47].
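As a concrete illustration of the segment-then-blend idea summarized above, the following is a minimal Python sketch, not the chapter's implementation: `seg_net` stands for any pretrained binary face-segmentation FCN (a hypothetical stand-in), the source face is assumed to be already warped into the target's pose, and OpenCV's Poisson blending (`cv2.seamlessClone`) stands in for the final compositing step.

```python
# Minimal sketch of segmentation-driven face swapping. `seg_net` is a
# hypothetical pretrained binary face-segmentation FCN, not the authors'
# released model; alignment of source to target is assumed done upstream.
import cv2
import numpy as np
import torch


def face_mask(seg_net, face_bgr):
    """Run the FCN and return a uint8 mask (255 = face pixel)."""
    rgb = face_bgr[..., ::-1].copy()  # BGR -> RGB for the network
    x = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        logits = seg_net(x.unsqueeze(0))  # assumed output shape: (1, 1, H, W)
    return (logits.sigmoid()[0, 0] > 0.5).numpy().astype(np.uint8) * 255


def swap_face(seg_net, source_bgr, target_bgr):
    """Blend the segmented source face into the target image.

    Assumes source_bgr has already been warped into the target's pose
    and that both images share the same size.
    """
    mask = face_mask(seg_net, source_bgr)
    ys, xs = np.nonzero(mask)
    center = (int(xs.mean()), int(ys.mean()))  # seam center for blending
    # Poisson (seamless) cloning hides color/lighting mismatches at the seam.
    return cv2.seamlessClone(source_bgr, target_bgr, mask, center,
                             cv2.NORMAL_CLONE)
```

In the spirit of contribution (c), one quick sanity check is to embed a swapped face and its source with any off-the-shelf face recognizer and compare the embeddings: an intra-subject swap should score close to its source, while better inter-subject swaps should score lower against the original subject.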
Original language | English |
---|---|
Title of host publication | Advances in Computer Vision and Pattern Recognition |
Publisher | Springer Science and Business Media Deutschland GmbH |
Pages | 21-43 |
Number of pages | 23 |
DOIs | |
State | Published - 2021 |
Publication series
Name | Advances in Computer Vision and Pattern Recognition |
---|---|
ISSN (Print) | 2191-6586 |
ISSN (Electronic) | 2191-6594 |
Bibliographical note
Publisher Copyright: © 2021, The Author(s), under exclusive license to Springer Nature Switzerland AG.