PKUWilliamYang committed bd1a237 (1 parent: c54b2bb): Create README.md

Files changed (1): README.md added (+42 -0)
---
license: mit
library_name: pytorch
tags:
- style-transfer
- face-stylization
datasets:
- cartoon
- caricature
- anime
- pixar
- slamdunk
- arcane
- comic
---

## Model Details

This system provides a web demo for the following paper:

**Pastiche Master: Exemplar-Based High-Resolution Portrait Style Transfer (CVPR 2022)**

- Algorithm developed by: Shuai Yang, Liming Jiang, Ziwei Liu and Chen Change Loy
- Web demo developed by: [hysts](https://huggingface.co/hysts)
- Resources for more information:
  - [Project Page](https://www.mmlab-ntu.com/project/dualstylegan/)
  - [Research Paper](https://arxiv.org/abs/2203.13248)
  - [GitHub Repo](https://github.com/williamyang1991/DualStyleGAN)

**Abstract**
> Recent studies on StyleGAN show high performance on artistic portrait generation by transfer learning with limited data. In this paper, we explore more challenging exemplar-based high-resolution portrait style transfer by introducing a novel DualStyleGAN with flexible control of dual styles of the original face domain and the extended artistic portrait domain. Different from StyleGAN, DualStyleGAN provides a natural way of style transfer by characterizing the content and style of a portrait with an intrinsic style path and a new extrinsic style path, respectively. The delicately designed extrinsic style path enables our model to modulate both the color and complex structural styles hierarchically to precisely pastiche the style example. Furthermore, a novel progressive fine-tuning scheme is introduced to smoothly transform the generative space of the model to the target domain, even with the above modifications on the network architecture. Experiments demonstrate the superiority of DualStyleGAN over state-of-the-art methods in high-quality portrait style transfer and flexible style control.
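
The abstract describes the dual style paths only in prose, so the snippet below sketches the idea in PyTorch: an intrinsic style code carries the content of the input face, an extrinsic style code carries the exemplar's style, and the two are blended layer by layer so that coarse layers govern structure and fine layers govern color. This is a minimal illustrative toy, not the authors' implementation; every class name, dimension, and blending weight here is a hypothetical placeholder, and the actual StyleGAN-based model is in the GitHub repo linked above.

```python
# Conceptual sketch only (NOT the DualStyleGAN code): toy modules that show
# how an intrinsic code (content of the input face) and an extrinsic code
# (style of the exemplar) can jointly modulate a generator, layer by layer.
import torch
import torch.nn as nn


class ToyDualStyleBlock(nn.Module):
    """One synthesis block modulated by both intrinsic and extrinsic codes."""

    def __init__(self, channels: int, style_dim: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        # Separate affine layers map each style code to per-channel gains.
        self.intrinsic_affine = nn.Linear(style_dim, channels)
        self.extrinsic_affine = nn.Linear(style_dim, channels)

    def forward(self, x, w_intrinsic, w_extrinsic, blend: float = 1.0):
        # blend = 0 keeps the plain intrinsic (StyleGAN-like) behaviour;
        # blend = 1 applies the exemplar's extrinsic style at full strength.
        gain_in = self.intrinsic_affine(w_intrinsic)
        gain_ex = self.extrinsic_affine(w_extrinsic)
        gain = (1 - blend) * gain_in + blend * gain_ex
        return self.conv(x) * gain.unsqueeze(-1).unsqueeze(-1)


class ToyDualStyleGenerator(nn.Module):
    """Stack of blocks: coarse layers ~ structure, fine layers ~ color."""

    def __init__(self, num_blocks: int = 4, channels: int = 32, style_dim: int = 64):
        super().__init__()
        self.blocks = nn.ModuleList(
            ToyDualStyleBlock(channels, style_dim) for _ in range(num_blocks)
        )
        self.to_rgb = nn.Conv2d(channels, 3, 1)

    def forward(self, x, w_intrinsic, w_extrinsic, blend_per_layer):
        for block, blend in zip(self.blocks, blend_per_layer):
            x = block(x, w_intrinsic, w_extrinsic, blend)
        return self.to_rgb(x)


if __name__ == "__main__":
    gen = ToyDualStyleGenerator()
    feat = torch.randn(1, 32, 16, 16)  # stand-in for an encoded face feature map
    w_in = torch.randn(1, 64)          # intrinsic (content) code
    w_ex = torch.randn(1, 64)          # extrinsic (exemplar) code
    # Apply the exemplar strongly in coarse layers (structure) and more
    # gently in fine layers (color), mirroring the hierarchical control
    # described in the abstract.
    img = gen(feat, w_in, w_ex, blend_per_layer=[1.0, 1.0, 0.5, 0.5])
    print(img.shape)  # torch.Size([1, 3, 16, 16])
```

Varying `blend_per_layer` is the toy analogue of the flexible structure and color control that the paper demonstrates with its real extrinsic style path.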
## Citation Information
```bibtex
@inproceedings{yang2022Pastiche,
  author = {Yang, Shuai and Jiang, Liming and Liu, Ziwei and Loy, Chen Change},
  title = {Pastiche Master: Exemplar-Based High-Resolution Portrait Style Transfer},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year = {2022}
}
```