Please use this identifier to cite or link to this item: https://doi.org/10.1145/3123266.3123425
Title: Laplacian-Steered Neural Style Transfer
Authors: Shaohua Li
Xinxing Xu
Liqiang Nie
Tat-Seng Chua 
Keywords: Convolutional Neural Networks
Image Laplacian
Neural Style Transfer
Issue Date: 23-Oct-2017
Publisher: Association for Computing Machinery, Inc
Citation: Shaohua Li, Xinxing Xu, Liqiang Nie, Tat-Seng Chua (2017-10-23). Laplacian-Steered Neural Style Transfer. ACM Multimedia Conference 2017 : 1716-1724. ScholarBank@NUS Repository. https://doi.org/10.1145/3123266.3123425
Abstract: Neural Style Transfer based on Convolutional Neural Networks (CNN) aims to synthesize a new image that retains the high-level structure of a content image, rendered in the low-level texture of a style image. This is achieved by constraining the new image to have high-level CNN features similar to the content image, and low-level CNN features similar to the style image. However, in the traditional optimization objective, the low-level features of the content image are absent, so the low-level features of the style image dominate the low-level detail structures of the new image. Hence, in the synthesized image, many details of the content image are lost, and many inconsistent and unpleasant artifacts appear. As a remedy, we propose to steer image synthesis with a novel loss function: the Laplacian loss. The Laplacian matrix ("Laplacian" for short), produced by a Laplacian operator, is widely used in computer vision to detect edges and contours. The Laplacian loss measures the difference between the Laplacians, and correspondingly the difference between the detail structures, of the content image and the new image. It is flexible and compatible with the traditional style transfer constraints. By incorporating the Laplacian loss, we obtain a new optimization objective for neural style transfer, named Lapstyle. Minimizing this objective produces a stylized image that better preserves the detail structures of the content image and eliminates artifacts. Experiments show that Lapstyle produces more appealing stylized images with fewer artifacts, without compromising their "stylishness". © 2017 Association for Computing Machinery.
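For readers of this record, the following is a minimal sketch in PyTorch of the kind of Laplacian loss the abstract describes: the 3x3 discrete Laplacian kernel, the channel-wise convolution, and the weighting factor lam are common-practice assumptions for illustration, not the authors' exact implementation.

import torch
import torch.nn.functional as F

# Discrete 3x3 Laplacian operator (an illustrative choice), applied
# to each image channel independently.
LAPLACIAN_KERNEL = torch.tensor([[0., 1., 0.],
                                 [1., -4., 1.],
                                 [0., 1., 0.]]).view(1, 1, 3, 3)

def image_laplacian(img: torch.Tensor) -> torch.Tensor:
    """Apply the Laplacian operator to an (N, C, H, W) image tensor."""
    c = img.shape[1]
    kernel = LAPLACIAN_KERNEL.to(img).repeat(c, 1, 1, 1)
    # groups=c filters each channel separately.
    return F.conv2d(img, kernel, padding=1, groups=c)

def laplacian_loss(synth: torch.Tensor, content: torch.Tensor) -> torch.Tensor:
    """Sum of squared differences between the Laplacians of two images."""
    return ((image_laplacian(synth) - image_laplacian(content)) ** 2).sum()

# Hypothetical combined Lapstyle-style objective: the Laplacian loss is
# added to the usual content and style losses; content_loss, style_loss,
# and lam are placeholders, not values from the paper.
def lapstyle_objective(synth, content, content_loss, style_loss, lam=100.0):
    return content_loss(synth) + style_loss(synth) + lam * laplacian_loss(synth, content)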
Source Title: ACM Multimedia Conference 2017
URI: https://scholarbank.nus.edu.sg/handle/10635/167452
ISBN: 9781450349062
DOI: 10.1145/3123266.3123425
Appears in Collections: Elements; Staff Publications

Files in This Item:
File: Laplacian-Steered Neural Style Transfer.pdf
Description: None
Size: 6.17 MB
Format: Adobe PDF
Access Settings: OPEN
