Misc. bug: llama-cli stopped starting in release b4191 (c9b00a7) #13498
Found a Stack Overflow answer: "First std::mutex::lock() crashes in application built with latest Visual Studio 2022".
Name and Version
llama-cli
release b4191 (c9b00a7) and later
Operating systems
Windows
Which llama.cpp modules do you know to be affected?
llama-cli
Command line
Problem description & steps to reproduce
OS and CPU
Windows 10 Pro (22H2)
AMD Ryzen 7 3700X 8-Core Processor
Summary
llama-cli stops starting after upgrading to release b4191.
Tried llama-cli from llama-b4191-bin-win-avx-x64.zip, llama-b4191-bin-win-avx2-x64.zip, llama-b4191-bin-win-noavx-x64.zip, and llama-b4191-bin-win-vulkan-x64.zip; all fail the same way.
When run under WinDbg I see an access violation here:
First Bad Commit
Last working binary release b4179
First failing binary release b4191
Relevant log output