
Bug: nvidia-container-cli: requirement error: unsatisfied condition: cuda>=12.6, please update your driver to a newer version, or use an earlier cuda container: unknown. #9665


Closed
wencan opened this issue Sep 27, 2024 · 4 comments
Labels: bug-unconfirmed, medium severity, stale

Comments


wencan commented Sep 27, 2024

What happened?

cmd: docker run --rm -it --gpus all ghcr.nju.edu.cn/ggerganov/llama.cpp:full-cuda --version
output:

docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running hook #0: error running hook: exit status 1, stdout: , stderr: Auto-detected mode as 'legacy'
nvidia-container-cli: requirement error: unsatisfied condition: cuda>=12.6, please update your driver to a newer version, or use an earlier cuda container: unknown.

ghcr.nju.edu.cn/ggerganov/llama.cpp full-cuda 248adbb30d34

Name and Version

system:

OS: Debian GNU/Linux 12 (bookworm) x86_64 
Kernel: 6.1.0-25-amd64 
Uptime: 9 hours, 35 mins 
Packages: 2401 (dpkg), 29 (brew), 78 (flatpak), 3 (snap) 
Shell: bash 5.2.15 
Resolution: 2560x1440 
DE: GNOME 43.9 
WM: Mutter 
WM Theme: Adwaita 
Theme: Adwaita [GTK2/3] 
Icons: Adwaita [GTK2/3] 
Terminal: gnome-terminal 
CPU: AMD Ryzen 7 1800X (16) @ 3.600GHz 
GPU: NVIDIA GeForce GTX 1080 Ti 
Memory: 6155MiB / 15896MiB 

nvidia-smi

Fri Sep 27 21:44:17 2024       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.183.01             Driver Version: 535.183.01   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce GTX 1080 Ti     On  | 00000000:08:00.0  On |                  N/A |
|  0%   50C    P2              61W / 250W |   1078MiB / 11264MiB |      5%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
                                                                                         
+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A      2914      G   /usr/lib/xorg/Xorg                          402MiB |
|    0   N/A  N/A      3114      G   /usr/bin/gnome-shell                        120MiB |
|    0   N/A  N/A      4009      G   /usr/bin/nautilus                           176MiB |
|    0   N/A  N/A      4510      G   ./hiddify                                    23MiB |
|    0   N/A  N/A     17338      G   ...a68b3ebb3cc3e3b70b434329aaa71e1dc21      216MiB |
|    0   N/A  N/A     53889      G   ...guageDetectionEnabled,Vulkan,WebOTP       25MiB |
|    0   N/A  N/A     62495      G   ...erProcess --variations-seed-version      108MiB |
+---------------------------------------------------------------------------------------+

docker info

Client: Docker Engine - Community
 Version:    27.3.1
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.17.1
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.29.7
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 0
  Running: 0
  Paused: 0
  Stopped: 0
 Images: 20
 Server Version: 27.3.1
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
 Swarm: inactive
 Runtimes: runc io.containerd.runc.v2 nvidia
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: 7f7fdf5fed64eb6a7caf99b3e12efcf9d60e311c
 runc version: v1.1.14-0-g2c9f560
 init version: de40ad0
 Security Options:
  apparmor
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.1.0-25-amd64
 Operating System: Debian GNU/Linux 12 (bookworm)
 OSType: linux
 Architecture: x86_64
 CPUs: 16
 Total Memory: 15.52GiB
 Name: debian12
 ID: c5874152-b49f-46cc-a0ca-52edf0117d75
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Username: [email protected]
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Registry Mirrors:
  https://docker.m.daocloud.io/
 Live Restore Enabled: false

WARNING: bridge-nf-call-iptables is disabled
WARNING: bridge-nf-call-ip6tables is disabled

What operating system are you seeing the problem on?

Linux

Relevant log output

No response


jcuenod commented Oct 29, 2024

I'm getting this on server-cuda as well, after pulling ghcr.io/ggerganov/llama.cpp:server-cuda today.

@qixing-ai

Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running prestart hook #1: exit status 1, stdout: , stderr: Auto-detected mode as 'legacy'
nvidia-container-cli: requirement error: unsatisfied condition: cuda>=12.6, please update your driver to a newer version, or use an earlier cuda container: unknown


mstrfx commented Dec 22, 2024

I had exactly the same issue with the nvidia 550.142 driver. You should either install the latest nvidia drivers or build the docker image yourself.

To build it yourself: edit .devops/llama-server-cuda.Dockerfile and change ARG CUDA_VERSION=12.6.0 to the CUDA version that nvidia-smi reports for your driver, then run docker build -t local/llama.cpp:server-cuda -f .devops/llama-server-cuda.Dockerfile .
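For reference, the rebuild described above might look like the following sketch. It assumes the Dockerfile declares ARG CUDA_VERSION (as quoted in the comment), so the value can be overridden with --build-arg instead of editing the file; 12.2.0 matches the "CUDA Version: 12.2" shown by the reporter's nvidia-smi output.

```shell
# Clone llama.cpp and rebuild the CUDA server image against the
# CUDA version your driver actually supports (here 12.2, per nvidia-smi).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Override the CUDA_VERSION build argument instead of editing the Dockerfile:
docker build \
  --build-arg CUDA_VERSION=12.2.0 \
  -t local/llama.cpp:server-cuda \
  -f .devops/llama-server-cuda.Dockerfile .

# Then run the locally built image in place of the published one:
docker run --rm -it --gpus all local/llama.cpp:server-cuda --version
```

Note that the GPU here (GTX 1080 Ti) is supported by the 535 driver, so rebuilding against CUDA 12.2 avoids updating the driver at all.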


github-actions bot commented Feb 5, 2025

This issue was closed because it has been inactive for 14 days since being marked as stale.

@github-actions github-actions bot closed this as completed Feb 5, 2025