Joint bilateral learning for real-time universal photorealistic style transfer
Date
2020-08-24
Authors
Xia, Xide
Zhang, Meng
Xue, Tianfan
Sun, Zheng
Fang, Hui
Kulis, Brian
Chen, Jiawen
Version
Published version
Citation
Xide Xia, Meng Zhang, Tianfan Xue, Zheng Sun, Hui Fang, Brian Kulis, and Jiawen Chen. 2020. "Joint bilateral learning for real-time universal photorealistic style transfer." In Proc. European Conference on Computer Vision (ECCV) 2020.
Abstract
Photorealistic style transfer is the task of transferring the artistic style of an image onto a content target, producing a result that could plausibly have been captured with a camera. Recent approaches based on deep neural networks produce impressive results but are either too slow to run at practical resolutions or still contain objectionable artifacts. We propose a new end-to-end model for photorealistic style transfer that is both fast and inherently generates photorealistic results. The core of our approach is a feed-forward neural network that learns local edge-aware affine transforms that automatically obey the photorealism constraint. When trained on a diverse set of images and a variety of styles, our model can robustly apply style transfer to an arbitrary pair of input images. Compared to the state of the art, our method produces visually superior results and is three orders of magnitude faster, enabling real-time performance at 4K on a mobile phone. We validate our method with ablation and user studies.
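
The photorealism constraint described in the abstract follows from the transform family itself: every output pixel is an affine function of the corresponding input pixel, so the result cannot introduce edges or textures the content image does not already contain. The sketch below (not the authors' code) illustrates the HDRnet-style bilateral-grid slicing this builds on: a low-resolution grid of per-cell 3x4 affine color matrices, assumed here to have been predicted by a network, is sampled per pixel using a guidance map and applied to the full-resolution content image. The function name, the luminance guide, and the nearest-cell lookup are illustrative assumptions; the paper's method uses a learned guidance map and trilinear slicing.

```python
# Minimal sketch of bilateral-grid slicing with per-cell affine color
# transforms. Assumptions (not from the paper's code): nearest-cell
# lookup instead of trilinear interpolation, and a caller-supplied
# guidance map rather than a learned one.
import numpy as np

def apply_bilateral_grid(content: np.ndarray, grid: np.ndarray,
                         guide: np.ndarray) -> np.ndarray:
    """content: (H, W, 3) float image in [0, 1].
    grid: (Gh, Gw, Gd, 3, 4) affine coefficients, e.g. network output.
    guide: (H, W) guidance map in [0, 1], e.g. luminance."""
    H, W, _ = content.shape
    Gh, Gw, Gd = grid.shape[:3]

    # Map each pixel to a grid cell: spatial axes from (y, x), the
    # third axis from the guidance value.
    ys = np.minimum(np.arange(H) * Gh // H, Gh - 1)
    xs = np.minimum(np.arange(W) * Gw // W, Gw - 1)
    zs = np.minimum((guide * Gd).astype(int), Gd - 1)

    # Gather one 3x4 affine matrix per pixel: (H, W, 3, 4).
    coeffs = grid[ys[:, None], xs[None, :], zs]

    # Apply the per-pixel affine transform: out = A @ [r, g, b, 1].
    rgb1 = np.concatenate([content, np.ones((H, W, 1))], axis=-1)
    return np.einsum('hwij,hwj->hwi', coeffs, rgb1)
```

Slicing with a guide is what makes the transform edge-aware: pixels on opposite sides of an edge have different guide values, land in different grid cells, and can receive different affine matrices, while pixels in smooth regions share coefficients, so no new edges are created.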