I'm getting familiar with Cog, and this repo is a record of my learning journey.
- Build a simple GPT inference pipeline.
- Build a streaming GPT pipeline.
- Implement a Stable Diffusion pipeline.
- Add tracing and model warmup.
Basic inference:
```shell
cd nlp/basic
cog predict -i prompt="Hello!" -i max_length=100
```
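A minimal `predict.py` behind a pipeline like this could look roughly as follows. This is a sketch, not the repo's actual code: the `gpt2` model name and the warmup call are assumptions for illustration.

```python
from cog import BasePredictor, Input
from transformers import pipeline


class Predictor(BasePredictor):
    def setup(self):
        # Load the model once at container startup; a tiny warmup call keeps
        # the first real request from paying lazy-initialization costs.
        # (gpt2 is a stand-in model, not necessarily what this repo uses.)
        self.generator = pipeline("text-generation", model="gpt2")
        self.generator("warmup", max_length=5)

    def predict(
        self,
        prompt: str = Input(description="Text to complete"),
        max_length: int = Input(description="Maximum length of the output", default=100),
    ) -> str:
        output = self.generator(prompt, max_length=max_length)
        return output[0]["generated_text"]
```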
Streaming inference:
```shell
cd nlp/streaming
cog predict -i prompt="Count from one to twenty: One," -i max_length=100
```
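A streaming predictor returns Cog's `ConcatenateIterator` and yields text chunks as they are generated. Below is a sketch using transformers' `TextIteratorStreamer`, again with `gpt2` as a stand-in model rather than the repo's exact setup.

```python
from threading import Thread

from cog import BasePredictor, ConcatenateIterator, Input
from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer


class Predictor(BasePredictor):
    def setup(self):
        self.tokenizer = AutoTokenizer.from_pretrained("gpt2")
        self.model = AutoModelForCausalLM.from_pretrained("gpt2")

    def predict(
        self,
        prompt: str = Input(description="Text to complete"),
        max_length: int = Input(description="Maximum length of the output", default=100),
    ) -> ConcatenateIterator[str]:
        inputs = self.tokenizer(prompt, return_tensors="pt")
        streamer = TextIteratorStreamer(self.tokenizer, skip_prompt=True)
        # generate() blocks until completion, so run it in a thread and yield
        # chunks from the streamer as they arrive; Cog concatenates them.
        thread = Thread(
            target=self.model.generate,
            kwargs=dict(**inputs, max_length=max_length, streamer=streamer),
        )
        thread.start()
        for chunk in streamer:
            yield chunk
        thread.join()
```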