# Neural-Style-Transfer-Papers
Selected papers, corresponding code, and pre-trained models from our review paper "Neural Style Transfer: A Review".
## Citation
If you find this repository useful for your research, please cite:
```
@article{jing2017neural,
  title={Neural Style Transfer: A Review},
  author={Jing, Yongcheng and Yang, Yezhou and Feng, Zunlei and Ye, Jingwen and Song, Mingli},
  journal={arXiv preprint arXiv:1705.04058},
  year={2017}
}
```
## Pre-trained Models in Our Paper

## A Taxonomy of Current Methods
### 1. Descriptive Neural Methods Based On Image Iteration

#### 1.1. MMD-based Descriptive Neural Methods
❇️ Code:
✅ [Towards Deep Style Transfer: A Content-Aware Perspective] [Paper] (BMVC 2016)
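The MMD-based methods in this category build on the Gram-matrix style loss of Gatys et al., which can be interpreted as minimizing a Maximum Mean Discrepancy between feature distributions. A minimal NumPy sketch of that loss (function names are illustrative, not taken from any listed codebase):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map with shape (channels, height, width).

    Correlations between channel activations summarize style while
    discarding spatial arrangement.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(style_features, generated_features):
    """Squared Frobenius distance between the two Gram matrices."""
    g_s = gram_matrix(style_features)
    g_g = gram_matrix(generated_features)
    return float(np.sum((g_s - g_g) ** 2))
```

In the image-iteration setting, this loss (summed over several network layers and combined with a content loss) is minimized directly over the pixels of the generated image.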
#### 1.2. MRF-based Descriptive Neural Methods
### 2. Generative Neural Methods Based On Model Iteration
✅ [Perceptual Losses for Real-Time Style Transfer and Super-Resolution] [Paper] (ECCV 2016)
✅ [Improved Texture Networks: Maximizing Quality and Diversity in Feed-forward Stylization and Texture Synthesis] [Paper] (CVPR 2017)
❇️ Code:
✅ [A Learned Representation for Artistic Style] [Paper] (ICLR 2017)
✅ [Fast Patch-based Style Transfer of Arbitrary Style] [Paper]
## Slight Modifications of Current Methods

### 1. Modifications of Descriptive Neural Methods
✅ [Exploring the Neural Algorithm of Artistic Style] [Paper]
✅ [Controlling Perceptual Factors in Neural Style Transfer] [Paper]
### 2. Modifications of Generative Neural Methods
✅ [Instance Normalization: The Missing Ingredient for Fast Stylization] [Paper]
✅ [Depth-Preserving Style Transfer] [Paper]
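The instance normalization paper listed above replaces batch statistics with per-image, per-channel statistics in feed-forward stylization networks. A minimal NumPy sketch of the operation (a simplified stand-in, not code from the paper):

```python
import numpy as np

def instance_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Instance normalization over a batch of feature maps.

    x has shape (batch, channels, height, width). Each (batch, channel)
    slice is normalized with its own spatial mean and variance, unlike
    batch normalization, which pools statistics across the whole batch.
    """
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta
```

Because every image is normalized independently, the stylization result no longer depends on the contrast of the other images in the batch, which is the paper's motivation for the change.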
## Extensions to Specific Types of Images
✅ [Semantic Style Transfer and Turning Two-Bit Doodles into Fine Artwork] [Paper]
✅ [DeepMovie: Using Optical Flow and Deep Neural Networks to Stylize Movies] [Paper]
## Application
### Application Papers
## Blogs

## Exciting New Directions
✅ Character Style Transfer