Tags: coolaj86/ollama

v0.1.4

Verified

This commit was created on GitHub.com and signed with GitHub’s verified signature. The key has expired.
Merge pull request ollama#859 from jmorganca/mxyng/fix-hostname

fix: ollama host for hostname

v0.1.3

Use correct url for auto updates

v0.1.2

use lower glibc versions in `Dockerfile.build`

v0.1.1

show a default message when license/parameters/system prompt/template aren't specified (ollama#681)

v0.1.0

Update linux.md

v0.0.21

add multi line strings to final prompt

v0.0.20

add word wrapping for lines which are longer than the terminal width (ollama#553)

v0.0.19

create the blobs directory correctly (ollama#508)

v0.0.18

fix model manifests (ollama#477)

v0.0.17

treat stop as stop sequences, not exact tokens (ollama#442)

The `stop` option to the generate API is a list of sequences that should cause generation to stop. Although these are commonly called "stop tokens", they do not necessarily correspond to LLM tokens (per the LLM's tokenizer). For example, if the caller sends a generate request with `"stop":["\n"]`, then generation should stop on any token containing `\n` (and trim `\n` from the output), not just if the token exactly matches `\n`. If `stop` were interpreted strictly as LLM tokens, then it would require callers of the generate API to know the LLM's tokenizer and enumerate many tokens in the `stop` list.

Fixes ollama#295.
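The behavior described above can be sketched as a small Go helper. This is a minimal illustration, not ollama's actual implementation: `truncateAtStop` is a hypothetical function name, and it simply scans the accumulated output text for any of the stop sequences, truncating at the earliest match rather than comparing against individual LLM tokens.

```go
package main

import (
	"fmt"
	"strings"
)

// truncateAtStop scans output for any of the stop sequences and, if one
// is found, trims the output at the earliest match. Matching is done on
// the generated text itself, not on exact LLM tokens, so `"\n"` stops
// generation even when the tokenizer emits `"\n"` fused into a larger token.
func truncateAtStop(output string, stops []string) (string, bool) {
	cut := -1
	for _, s := range stops {
		if i := strings.Index(output, s); i >= 0 && (cut == -1 || i < cut) {
			cut = i
		}
	}
	if cut >= 0 {
		return output[:cut], true
	}
	return output, false
}

func main() {
	out, stopped := truncateAtStop("first line\nsecond line", []string{"\n"})
	fmt.Printf("%q %v\n", out, stopped) // "first line" true
}
```

Scanning the text stream this way means callers can pass natural stop markers like `"\n"` or `"###"` without knowing anything about the model's tokenizer.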