The official code repository for the second edition of the O'Reilly book Generative Deep Learning: Teaching Machines to Paint, Write, Compose and Play.
https://learning.oreilly.com/library/view/generative-deep-learning/9781492041931/
https://www.amazon.com/Generative-Deep-Learning-Teaching-Machines/dp/1492041947/ref=sr_1_1
Below is an outline of the book chapters, with links to the relevant notebook folders in the codebase.
Part I: Introduction to Generative Deep Learning
- Generative Modeling
- Deep Learning
Part II: Methods
- Variational Autoencoders
- Generative Adversarial Networks
- Autoregressive Models
- Normalizing Flows
- Energy-Based Models
- Diffusion Models
Part III: Applications
- Transformers
- Advanced GANs
- Music Generation
- World Models
- Multimodal Models
- Conclusion
Many of the examples in this book are adapted from the excellent open-source implementations available through the Keras website (https://keras.io/examples/generative/). I highly recommend checking out this resource, as new models and examples are constantly being added.
To download some of the datasets for the book, you will need a Kaggle account and an API token.
Follow the instructions here:
https://github.com/Kaggle/kaggle-api
Download the JSON file that stores your username and API key.
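The downloaded file (kaggle.json) typically contains your credentials in the following form (placeholder values shown):
{"username": "<your_kaggle_username>", "key": "<your_kaggle_key>"}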
Create a file called .env in the root directory, containing the following values (replacing the Kaggle username and API key with the values from the JSON):
JUPYTER_PORT=8888
TENSORBOARD_PORT=6006
KAGGLE_USERNAME=<your_kaggle_username>
KAGGLE_KEY=<your_kaggle_key>
This codebase is designed to be run with Docker.
Don't worry if you've never used Docker before! To get set up, follow the instructions in the Docker README file in this repository. This includes a full run-through of why Docker is awesome and a description of how to interact with the codebase using Docker.
First, build the Docker image. If you do not have a GPU, run the following command:
docker-compose build
If you do have a GPU that you wish to use, run the following command:
docker-compose -f docker-compose-gpu.yml build
Once the image is built, launch the container. If you do not have a GPU, run the following command:
docker-compose up
If you do have a GPU that you wish to use, run the following command:
docker-compose -f docker-compose-gpu.yml up
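If you prefer, the build and run steps can be combined into a single command using the standard docker-compose --build flag (with the same compose files as above):
docker-compose up --build
docker-compose -f docker-compose-gpu.yml up --build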
The running notebooks will be available in your local browser, on the port specified in your .env file - for example:
http://localhost:8888
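If you change JUPYTER_PORT in your .env file (say, to 8899), point your browser at that port instead, e.g. http://localhost:8899.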
The codebase comes with a built-in data downloader helper script. Use the helper script as follows (from outside the container):
bash scripts/download.sh [faces, bricks, recipes, flowers, wines, cellosuites, chorales]
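For example, to fetch the faces dataset:
bash scripts/download.sh faces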
TensorBoard is really useful for monitoring experiments and seeing how your generative deep learning model training is progressing.
To launch TensorBoard, run the following script (from outside the container), replacing <CHAPTER> with the required chapter (e.g. 03_vae) and <EXAMPLE> with the required example (e.g. 02_vae_fashion).
bash scripts/tensorboard.sh <CHAPTER> <EXAMPLE>
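For example, to monitor the 02_vae_fashion example from chapter 03_vae:
bash scripts/tensorboard.sh 03_vae 02_vae_fashion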
TensorBoard will be available in your local browser on the port specified in your .env file - for example:
http://localhost:6006
To set up a virtual machine with a GPU on Google Cloud Platform, follow the instructions in the Google Cloud README file in this repository.
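The Google Cloud README is the authoritative guide, but as a rough sketch (the instance name, zone, machine type, image and GPU type below are placeholder assumptions, and you will still need to install the NVIDIA drivers and Docker on the VM), creating a GPU instance with the gcloud CLI looks something like this:
gcloud compute instances create gdl-gpu-vm \
  --zone=us-central1-a \
  --machine-type=n1-standard-8 \
  --accelerator=type=nvidia-tesla-t4,count=1 \
  --maintenance-policy=TERMINATE \
  --image-family=ubuntu-2204-lts \
  --image-project=ubuntu-os-cloud \
  --boot-disk-size=200GB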
