Commit 216d190

sayakpaul and pcuenca authored
Update README.md to include our blog post (huggingface#1998)

* Update README.md

Co-authored-by: Pedro Cuenca <[email protected]>

1 parent 9b37ed3 commit 216d190

File tree

1 file changed: +3 -0 lines changed

examples/dreambooth/README.md

Lines changed: 3 additions & 0 deletions

@@ -321,3 +321,6 @@ python train_dreambooth_flax.py \
 You can enable memory efficient attention by [installing xFormers](https://github.com/facebookresearch/xformers#installing-xformers) and passing the `--enable_xformers_memory_efficient_attention` argument to the script. This is not available with the Flax/JAX implementation.

 You can also use Dreambooth to train the specialized in-painting model. See [the script in the research folder for details](https://github.com/huggingface/diffusers/tree/main/examples/research_projects/dreambooth_inpaint).
+
+### Experimental results
+You can refer to [this blog post](https://huggingface.co/blog/dreambooth) that discusses some DreamBooth experiments in detail. Specifically, it recommends a set of DreamBooth-specific tips and tricks that we have found to work well for a variety of subjects.
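For context on the flag mentioned in the diff above, a minimal invocation sketch follows. The `--enable_xformers_memory_efficient_attention` flag is the one named in the README; the model name, data paths, prompt, and other arguments are illustrative placeholders, not values from this commit.

```shell
# Install xFormers first (see the linked installation instructions).
pip install xformers

# Hypothetical example invocation of the PyTorch DreamBooth script with
# memory-efficient attention enabled; paths and prompt are placeholders.
accelerate launch train_dreambooth.py \
  --pretrained_model_name_or_path="runwayml/stable-diffusion-v1-5" \
  --instance_data_dir="./instance_images" \
  --output_dir="./dreambooth-model" \
  --instance_prompt="a photo of sks dog" \
  --resolution=512 \
  --train_batch_size=1 \
  --enable_xformers_memory_efficient_attention
```

As the README notes, this flag applies only to the PyTorch script (`train_dreambooth.py`), not to the Flax/JAX variant (`train_dreambooth_flax.py`).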

0 commit comments
