Improvement: use ollama's own AsyncClient instead of OpenAI AsyncOpenAI #74

Description

@MrCsabaToth

I was searching for Streamlit + Ollama parallel calls and found your blog post and video. I noticed that in the source, https://github.com/mneedham/LearnDataWithMark/blob/main/ollama-parallel/app.py, you use AsyncOpenAI. I don't know if there's a specific reason for that, but I'd like to drop the extra dependency and use ollama's own AsyncClient instead: https://github.com/ollama/ollama-python/blob/ebe332b29d5c65aeccfadd4151bf6059ded7049b/examples/async-chat-stream/main.py#L27C19-L27C30
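For reference, a minimal sketch of what the swap might look like, following the linked ollama-python streaming example. The model name `llama3` and the prompt are placeholders for illustration; the model used in app.py may differ:

```python
import asyncio

from ollama import AsyncClient


async def chat() -> None:
    # AsyncClient defaults to the local Ollama server (http://localhost:11434),
    # so no base_url/api_key boilerplate is needed, unlike with AsyncOpenAI.
    client = AsyncClient()
    messages = [{"role": "user", "content": "Why is the sky blue?"}]  # placeholder prompt
    # With stream=True, awaiting chat() returns an async generator of chunks,
    # mirroring the streaming behaviour of the AsyncOpenAI version.
    async for part in await client.chat(model="llama3", messages=messages, stream=True):
        print(part["message"]["content"], end="", flush=True)


asyncio.run(chat())
```

Running several such coroutines concurrently (e.g. with asyncio.gather) should cover the parallel-calls use case from the post.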
