Mirror of https://github.com/mudler/LocalAI.git, synced 2025-05-20 10:35:01 +00:00
feat(gallery): support ConfigURLs (#2012)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
This commit is contained in: parent da82ce81b5, commit b2785ff06e
5 changed files with 40 additions and 4 deletions
````diff
@@ -146,12 +146,16 @@ In the body of the request you must specify the model configuration file URL (`u
 ```bash
 LOCALAI=http://localhost:8080
 curl $LOCALAI/models/apply -H "Content-Type: application/json" -d '{
-    "url": "<MODEL_CONFIG_FILE>"
+    "config_url": "<MODEL_CONFIG_FILE_URL>"
 }'
 # or if from a repository
 curl $LOCALAI/models/apply -H "Content-Type: application/json" -d '{
     "id": "<GALLERY>@<MODEL_NAME>"
 }'
+# or from a gallery config
+curl $LOCALAI/models/apply -H "Content-Type: application/json" -d '{
+    "url": "<MODEL_CONFIG_FILE_URL>"
+}'
 ```
````
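The three request shapes differ only in which JSON key carries the reference: `id` for a gallery entry, `config_url` for a raw model configuration file, and `url` for a gallery config. As a minimal illustration, a hypothetical shell helper (`apply_body` is not part of LocalAI) can build the request body for any of them:

```shell
# Hypothetical helper (not part of LocalAI): build a /models/apply request body.
# The key is one of the three forms documented above: id | url | config_url.
apply_body() {
  key=$1    # which installation mode to use
  value=$2  # gallery id, model config file URL, or gallery config URL
  printf '{ "%s": "%s" }' "$key" "$value"
}

# Build the body for a gallery install; placeholders as in the docs above.
apply_body id "<GALLERY>@<MODEL_NAME>"
# → { "id": "<GALLERY>@<MODEL_NAME>" }
```

The result would then be passed as the `-d` payload of the `curl` calls shown above.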
An example that installs openllama can be:
````diff
@@ -159,8 +163,8 @@ An example that installs openllama can be:
 ```bash
 LOCALAI=http://localhost:8080
 curl $LOCALAI/models/apply -H "Content-Type: application/json" -d '{
-    "url": "https://github.com/go-skynet/model-gallery/blob/main/openllama_3b.yaml"
-}'
+    "config_url": "https://raw.githubusercontent.com/mudler/LocalAI/master/embedded/models/hermes-2-pro-mistral.yaml"
+}'
 ```
````

The API will return a job `uuid` that you can use to track the job progress: