Mirror of https://github.com/mudler/LocalAI.git, synced 2025-05-20 18:45:00 +00:00
Update README.md (#5172)
Modified README.md to separate the different docker run commands into their own code blocks, making them easier to copy into the terminal.

Signed-off-by: qwerty108109 <97707491+qwerty108109@users.noreply.github.com>
parent e587044449
commit 4fc68409ff

1 changed file with 11 additions and 7 deletions
README.md (+11 −7)
@@ -75,17 +75,21 @@ curl https://localai.io/install.sh | sh
 ```
 
 Or run with docker:
+
+### CPU only image:
 ```bash
-# CPU only image:
 docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-cpu
-
-# Nvidia GPU:
+```
+### Nvidia GPU:
+```bash
 docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12
-
-# CPU and GPU image (bigger size):
+```
+### CPU and GPU image (bigger size):
+```bash
 docker run -ti --name local-ai -p 8080:8080 localai/localai:latest
-
-# AIO images (it will pre-download a set of models ready for use, see https://localai.io/basics/container/)
+```
+### AIO images (it will pre-download a set of models ready for use, see https://localai.io/basics/container/)
+```bash
 docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
 ```
 
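Once one of these containers is up, a quick way to confirm it is reachable is to query the OpenAI-compatible API on the mapped port. This is a minimal sketch, assuming the default `-p 8080:8080` mapping from the commands above; `MODEL_NAME` is a placeholder, not a name from this commit — use a model id returned by the `/v1/models` call (the AIO images pre-download a set of models, other images need a model installed first):

```bash
# List the models the running LocalAI instance currently serves
curl http://localhost:8080/v1/models

# Send a test chat completion (replace MODEL_NAME with an id
# returned by the /v1/models call above)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "MODEL_NAME", "messages": [{"role": "user", "content": "Hello"}]}'
```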