Misc. bug: [SYCL] Unexpected "setvars.sh has already been run" warning #13333
The warning comes from the oneAPI toolkit because you sourced setvars.sh manually and the example script sources it again. If we removed the source line from the example script, anyone who has already installed and tested the toolkit would run into problems in a fresh terminal unless they sourced setvars.sh themselves. If you think the docs would be improved by having a dependencies section, feel free to open a PR to improve them!
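As a minimal sketch of how a personal wrapper script could avoid the double sourcing, assuming the SETVARS_COMPLETED environment variable that setvars.sh sets after a successful run (this variable and the wrapper are illustrative assumptions, not part of llama.cpp):

```bash
#!/usr/bin/env bash
# Hypothetical wrapper: only source setvars.sh if it has not already
# run in this shell. SETVARS_COMPLETED is assumed to be set by setvars.sh.
if [ "${SETVARS_COMPLETED:-0}" != "1" ]; then
    source /opt/intel/oneapi/setvars.sh
fi

./examples/sycl/build.sh
```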
Ahh, maybe this is why I was facing an error saying: while loading shared libraries: libsvml.so: cannot open shared object file: No such file or directory. I'm not sure if I missed anything related to this in that documentation. However, I think the documentation is not very newbie-friendly and can be improved.
I would love to contribute! But the thing is, I'm running into an issue with the model output: it keeps repeating the same words. I think I should check whether there is a way to fix it; otherwise, I may have to create another issue. So far, the models I have tested are:
Yes, that's the error you encounter when setvars.sh has not been sourced.
It's easy for us to miss "obvious" things since we are used to the project.
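As a rough way to check whether the oneAPI environment is active before running the binaries (the ONEAPI_ROOT check, the ldd check, and the binary path are assumptions for illustration, not steps from the official docs):

```bash
# If ONEAPI_ROOT is empty, setvars.sh has probably not been sourced in this shell.
echo "${ONEAPI_ROOT:-<not set>}"

# Source the environment, then confirm that libsvml.so resolves for the built binary.
source /opt/intel/oneapi/setvars.sh
ldd ./build/bin/llama-cli | grep -i svml
```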
I'm not familiar with the frontend you are using. Did you try running …? Could you try …?
Can you try a modern, big enough model like Gemma 3 4B to see if this problem persists?
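As an illustration of testing outside any frontend, one could run a model directly with llama-cli; the model path and the sampling values below are placeholders, not settings suggested in this thread:

```bash
# Hypothetical direct test, bypassing the frontend.
# The model path is a placeholder; --repeat-penalty is one common knob
# for repetitive output (1.0 disables it).
./build/bin/llama-cli \
    -m models/gemma-3-4b-it-Q4_K_M.gguf \
    -p "Explain what SYCL is in two sentences." \
    --repeat-penalty 1.1 \
    -n 128
```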
Name and Version
version: 5288 (a7366fa)
built with Intel(R) oneAPI DPC++/C++ Compiler 2025.1.1 (2025.1.1.20250418) for x86_64-unknown-linux-gnu
Linux ubuntu 6.11.0-25-generic #25~24.04.1-Ubuntu SMP PREEMPT_DYNAMIC Tue Apr 15 17:20:50 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
Documentation/Github
Command line
source /opt/intel/oneapi/setvars.sh
sycl-ls
./examples/sycl/build.sh
Problem description & steps to reproduce
While trying to build llama.cpp, I got an unexpected warning saying that setvars.sh has already been run. I was following the guide for SYCL: Linux. I am getting this warning several times, for example when running:
source /opt/intel/oneapi/setvars.sh
./examples/sycl/run-llama2.sh 0
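For context, the warning is harmless: it only means the environment was already initialized in this shell. A sketch of two ways to avoid or override it, assuming the --force option that the oneAPI toolkit advertises for setvars.sh (that flag is an assumption about the toolkit, not something from llama.cpp):

```bash
# Option 1: the example script already sources setvars.sh, so in a fresh
# terminal you can run it directly without sourcing setvars.sh first.
./examples/sycl/run-llama2.sh 0

# Option 2: force setvars.sh to re-run despite the warning (assumed oneAPI flag).
source /opt/intel/oneapi/setvars.sh --force
```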
Another thing I noticed: there is no mention of the required build packages, i.e. cmake.
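For example, on a fresh Ubuntu install one would typically need something like the following before the SYCL build steps; the exact package list is an assumption (only cmake is mentioned above), not taken from the official docs:

```bash
# Assumed minimal build prerequisites on Ubuntu; adjust as needed.
sudo apt update
sudo apt install -y git cmake build-essential
```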