Simplified CycleGAN Implementation in PyTorch

Code Available on GitHub –

Many thanks to Jun-Yan Zhu et al. for the CycleGAN paper. The code is adapted from the authors’ implementation but simplified into just a few files. Original project and paper –

CycleGAN: Project | Paper | Torch

Prerequisites
  • Linux or macOS
  • Python 3

Getting Started


  • Install PyTorch 0.4+ (1.0 tested) with GPU support.
  • Clone this repo:
    git clone
    cd simple_CycleGAN
  • Install the required dependencies:
    pip install -r requirements.txt

CycleGAN train/test

  • Download a CycleGAN dataset from the authors (e.g. horse2zebra):
    bash ./util/ horse2zebra
  • Train a model (different from original implementation):
    python train
  • Change training options in; all options used will be saved to a txt file

  • A new directory named after your project will be created inside the checkpoints directory

  • Inside checkpoints/project_name/ you will find:

    • checkpoints – intermediate training results
    • models – saved model weights
    • test_results – outputs from running python test on the test set
  • Test the model:

    python test
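The training step above optimizes CycleGAN's characteristic cycle-consistency objective: translating A→B→A (and B→A→B) should reconstruct the input. The toy sketch below illustrates that loss with plain Python functions standing in for the CNN generators; `G`, `F`, and the number lists are hypothetical stand-ins, and `lambda_cyc = 10` is the weight used in the paper.

```python
def l1(a, b):
    """Mean absolute error between two equal-length vectors (L1 loss)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Toy "generators" over numbers standing in for images.
G = lambda xs: [2 * x for x in xs]   # maps domain A -> B
F = lambda xs: [x / 2 for x in xs]   # maps domain B -> A

real_a = [1.0, 2.0, 3.0]
real_b = [4.0, 6.0, 8.0]

# Forward cycle: A -> B -> A should reconstruct the input; likewise backward.
cycle_a = l1(F(G(real_a)), real_a)
cycle_b = l1(G(F(real_b)), real_b)

lambda_cyc = 10.0                    # cycle-loss weight from the paper
cycle_loss = lambda_cyc * (cycle_a + cycle_b)
print(cycle_loss)  # 0.0, because the toy generators are exact inverses
```

In real training this term is added to the adversarial losses of the two discriminators; imperfect generators make it nonzero, which is what pushes the mappings toward being inverses of each other.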

Use your own Dataset

Follow the naming pattern of trainA, trainB, testA, and testB, and place the folders in datasets/your_dataset/. You can also change these directories inside
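A quick way to catch layout mistakes before training is to check for the four required subfolders. The helper below is a hypothetical sketch (not part of the repo); the folder names follow the CycleGAN convention described above, and the temporary directory is only used to demonstrate the check.

```python
import os
import tempfile
from pathlib import Path

def check_dataset(root):
    """Return the list of required subfolders missing from a dataset root."""
    required = ["trainA", "trainB", "testA", "testB"]
    return [d for d in required if not (Path(root) / d).is_dir()]

# Build a throwaway datasets/your_dataset/ layout and verify it.
with tempfile.TemporaryDirectory() as tmp:
    root = os.path.join(tmp, "datasets", "your_dataset")
    for d in ("trainA", "trainB", "testA", "testB"):
        os.makedirs(os.path.join(root, d))
    missing = check_dataset(root)
    print(missing)  # [] when the layout is complete
```

Running the check against your real datasets/your_dataset/ path before training avoids confusing data-loader errors later.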


If you use this code for your research, please cite Zhu et al.’s papers:

@inproceedings{CycleGAN2017,
  title={Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks},
  author={Zhu, Jun-Yan and Park, Taesung and Isola, Phillip and Efros, Alexei A},
  booktitle={Computer Vision (ICCV), 2017 IEEE International Conference on},
  year={2017}
}

@inproceedings{isola2017image,
  title={Image-to-Image Translation with Conditional Adversarial Networks},
  author={Isola, Phillip and Zhu, Jun-Yan and Zhou, Tinghui and Efros, Alexei A},
  booktitle={Computer Vision and Pattern Recognition (CVPR), 2017 IEEE Conference on},
  year={2017}
}

Related Projects

CycleGAN-Torch | pix2pix-Torch | pix2pixHD | iGAN | BicycleGAN | vid2vid
