
Tags: frost-beta/llama3.js


v0.0.8

Enable setting max tokens

v0.0.7

Add llama3-generate

v0.0.6

Use cache friendly KV Cache

v0.0.5

Simplify the nextTick trick

v0.0.4

Give GC a chance to run
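
Note: v0.0.5 and v0.0.4 both refer to yielding back to the Node.js event loop between generation steps so that V8's garbage collector can reclaim intermediate arrays instead of being starved by a tight synchronous loop. The snippet below is a generic sketch of that pattern only, not llama3.js's actual code; nextToken() is a hypothetical stand-in for a real decode step.

  // Sketch under assumptions: generic "yield so GC can run" pattern in Node.js.
  // nextToken() is a placeholder, not llama3.js's API.
  function nextToken() {
    return Math.floor(Math.random() * 32000); // placeholder token id
  }

  async function generate(maxTokens = 16) {
    const tokens = [];
    for (let i = 0; i < maxTokens; i++) {
      tokens.push(nextToken());
      // Yield to the event loop once per token so V8 gets a chance to run
      // garbage collection between decode steps.
      await new Promise((resolve) => setImmediate(resolve));
    }
    return tokens;
  }

  generate().then((tokens) => console.log(`generated ${tokens.length} tokens`));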

v0.0.3

Update node-mlx to 0.0.6

v0.0.2

Publish via github workflow

v0.0.1

Implement chat interface