Mirror of https://github.com/mudler/LocalAI.git (synced 2025-05-20 10:35:01 +00:00)
Use correct CUDA container references as noted in the dockerhub overview
parent 0affde29e6
commit e65958ebe2
1 changed file with 1 addition and 1 deletion
@@ -64,7 +64,7 @@ To check what CUDA version do you need, you can either run `nvidia-smi` or `nvcc
 Alternatively, you can also check nvidia-smi with docker:
 
 ```
-docker run --runtime=nvidia --rm nvidia/cuda nvidia-smi
+docker run --runtime=nvidia --rm nvidia/cuda:12.8.0-base-ubuntu24.04 nvidia-smi
 ```
 
 To use CUDA, use the images with the `cublas` tag, for example.
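For context, here is a minimal sketch of running a CUDA-enabled LocalAI image with GPU access, as the cublas-tag note above suggests. The image tag shown (`localai/localai:latest-gpu-nvidia-cuda-12`) is an assumption for illustration and should be verified against the Docker Hub overview this commit refers to:

```
# Sketch only: the image tag below is an assumed example, not confirmed by this commit.
# --gpus all exposes the host GPUs to the container; LocalAI listens on port 8080 by default.
docker run --rm -ti --gpus all -p 8080:8080 localai/localai:latest-gpu-nvidia-cuda-12
```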
|
Loading…
Add table
Add a link
Reference in a new issue