Misc. bug: since b4800 llama-cli does not prompt and llama-bench shows no results #13452
Comments
Do you have any files with special or non-ASCII characters in the directory?
Your question reminded me of: #11198 Anyway, there were many files in the directory and I could not see any non-ASCII characters, but to make sure I put the .gguf in a directory on its own. Unfortunately the outcome is the same.
Please try to obtain a callstack of the crash:
There is no crash; it just stays there and I am not able to input anything. Built like this:
gdb output:
So it gets stuck, but it doesn't crash or do anything else? You should still be able to get a callstack if you press Ctrl+C.
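A minimal sketch of capturing that callstack from the hung process, assuming the binary is at ./build/bin/llama-cli and that attaching by pid is acceptable (both details are assumptions, not taken from this report):

    # Attach gdb to the already-hung process and dump every thread's backtrace
    gdb -p "$(pidof llama-cli)" -batch -ex "thread apply all bt"
    # Or start it under gdb, press Ctrl+C once it hangs, then run: thread apply all bt
    gdb --args ./build/bin/llama-cli -m model.gguf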
Callstack after pressing Ctrl+C:
Also the output of llama-bench, which shows no results but exits without error.
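For reference, a way to confirm that llama-bench symptom, assuming a model at model.gguf (the path is a placeholder):

    ./build/bin/llama-bench -m model.gguf   # reportedly prints no result table
    echo $?                                 # yet exits cleanly (status 0 expected)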
The first case shows that it is waiting on …
It all started with this commit: cc473ca
No, I don't see any code there that could cause this.
Today Debian trixie updated some Mesa libs. Thanks for your support @slaren!
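Since the fix arrived via a Mesa update, a quick way to check which driver stack is in use, assuming a Debian system and the Vulkan backend (the Vulkan part is an inference from the Mesa involvement, not stated above):

    dpkg -l | grep -Ei 'mesa|vulkan'   # list installed Mesa/Vulkan packages and their versions
    vulkaninfo --summary               # from vulkan-tools: shows the active Vulkan driver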
Name and Version
Last working version:
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-cli
Command line
Problem description & steps to reproduce
Starting with b4800, llama-cli does not reach the prompt input; it stops here:
and llama-bench shows no results (also no error):
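The exact command line is not shown above; a generic interactive invocation that would exhibit the hang might look like this (model path and flags are illustrative only):

    ./build/bin/llama-cli -m model.gguf -i   # reportedly never reaches the interactive prompt on b4800+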
First Bad Commit
cc473ca
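A sketch of how a first bad commit like this can be narrowed down with git bisect; the "good" tag below is a placeholder, since the last working version is not stated above:

    git bisect start
    git bisect bad b4800        # first release tag known to show the hang
    git bisect good b4xxx       # placeholder: last release tag known to work
    # at each bisect step, rebuild and retest, then mark the result:
    cmake -B build && cmake --build build --config Release
    ./build/bin/llama-cli -m model.gguf -p "hi"
    git bisect good             # or: git bisect bad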
Relevant log output