Publication Type
Journal Article
Version
publishedVersion
Publication Date
9-2017
Abstract
We address the problem of transferring the style of a headshot photo to face images. Existing methods using a single exemplar lead to inaccurate results when the exemplar does not contain sufficient stylized facial components for a given photo. In this work, we propose an algorithm to stylize face images using multiple exemplars containing different subjects in the same style. Patch correspondences between an input photo and multiple exemplars are established using a Markov Random Field (MRF), which enables accurate local energy transfer via Laplacian stacks. As image patches from multiple exemplars are used, the boundaries of facial components on the target image are inevitably inconsistent. The artifacts are removed by a post-processing step using an edge-preserving filter. Experimental results show that the proposed algorithm consistently produces visually pleasing results.
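Illustrative sketch (not from the paper): the abstract's central operation, matching local energy between corresponding Laplacian-stack levels, can be illustrated with a short Python sketch. It assumes a single pair of aligned grayscale images; the MRF patch correspondence across multiple exemplars and the edge-preserving post-filter described above are not shown, and the function names, number of levels, sigmas, and gain bounds are illustrative assumptions rather than the authors' implementation.

    # Minimal sketch of Laplacian-stack local-energy transfer between two
    # aligned grayscale face images. Level counts, sigmas, and gain clipping
    # are assumed for illustration; they are not the paper's settings.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def laplacian_stack(img, num_levels=4):
        """Decompose an image into band-pass levels plus a low-frequency residual."""
        levels, current = [], img.astype(np.float64)
        for i in range(num_levels):
            sigma = 2.0 ** (i + 1)
            blurred = gaussian_filter(current, sigma)
            levels.append(current - blurred)   # band-pass detail at this scale
            current = blurred
        levels.append(current)                 # residual (coarse lighting)
        return levels

    def local_energy(level, sigma):
        """Locally averaged squared response of one band-pass level."""
        return gaussian_filter(level ** 2, sigma) + 1e-8

    def transfer_energy(input_img, exemplar_img, num_levels=4):
        """Scale each band of the input so its local energy matches the exemplar's."""
        in_stack = laplacian_stack(input_img, num_levels)
        ex_stack = laplacian_stack(exemplar_img, num_levels)
        out = np.zeros_like(input_img, dtype=np.float64)
        for i in range(num_levels):
            sigma = 2.0 ** (i + 1)
            gain = np.sqrt(local_energy(ex_stack[i], sigma) /
                           local_energy(in_stack[i], sigma))
            out += np.clip(gain, 0.1, 3.0) * in_stack[i]  # bounded gain for stability
        return out + ex_stack[-1]              # exemplar residual carries overall lighting

In the paper's setting, the exemplar bands at each location would come from the patches selected by the MRF over multiple exemplar subjects rather than from a single aligned image as in this sketch.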
Keywords
Style transfer, Image processing
Discipline
Information Security
Research Areas
Information Systems and Management
Publication
Computer Vision and Image Understanding
Volume
162
First Page
135
Last Page
145
ISSN
1077-3142
Identifier
10.1016/j.cviu.2017.08.009
Publisher
Elsevier
Citation
SONG, Yibing; BAO, Linchao; HE, Shengfeng; YANG, Qingxiong; and YANG, Ming-Hsuan.
Stylizing face images via multiple exemplars. (2017). Computer Vision and Image Understanding. 162, 135-145.
Available at: https://ink.library.smu.edu.sg/sis_research/7841
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1016/j.cviu.2017.08.009