distributed load #14477
Unanswered
thunderFireSword asked this question in Q&A
Replies: 1 comment
Can llama.cpp load models in a distributed manner across multiple CPU nodes? That is, each node retains a portion of the model's computation graph.
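For reference, llama.cpp does ship an RPC backend (enabled with the `GGML_RPC` build option) that lets a model's layers be split across multiple machines: each worker runs `rpc-server`, and the main process lists the workers via `--rpc`. A minimal sketch follows; the host names `worker1`/`worker2`, the port `50052`, and `model.gguf` are placeholders, and exact flags may vary by llama.cpp version.

```shell
# On each worker node: build llama.cpp with the RPC backend,
# then start rpc-server to expose that node's local (CPU) backend.
cmake -B build -DGGML_RPC=ON
cmake --build build --config Release
./build/bin/rpc-server -H 0.0.0.0 -p 50052

# On the main node: list the workers with --rpc; layers offloaded
# via -ngl are distributed across the registered RPC backends.
./build/bin/llama-cli -m model.gguf \
    --rpc worker1:50052,worker2:50052 \
    -ngl 99 -p "Hello"
```

Note that this splits the model layer-wise across backends (each node holds a slice of the graph), rather than replicating the full model on every node; network latency between nodes typically dominates throughput.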