Update README.md
commit d69048e0b0 (parent 827f189163)

1 changed file with 5 additions and 1 deletion
````diff
@@ -163,6 +163,10 @@ func main() {
 }
 ```

+### Windows compatibility
+
+It should work; however, you need to make sure you give enough resources to the container. See https://github.com/go-skynet/llama-cli/issues/2
+
 ### Kubernetes

 You can run the API directly in Kubernetes:
````
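For the resource note added above, a minimal sketch of what "give enough resources to the container" can look like with Docker. The `--memory` and `--cpus` flags are standard `docker run` options; the image name, tag, and port below are placeholders, not the project's published image:

```sh
# Hedged sketch: give the container explicit memory and CPU headroom
# so the model has room to load. Image name and port are assumptions.
docker run -ti --rm \
  --memory=8g --cpus=4 \
  -p 8080:8080 \
  go-skynet/llama-cli:latest
```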
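Likewise, for the Kubernetes section referenced in the hunk, one possible minimal deployment using plain `kubectl`; the deployment name, image, resource figures, and port are illustrative assumptions rather than the project's documented manifest:

```sh
# Hypothetical example: deploy the API and expose it inside the cluster.
# Name, image, resource requests, and port are assumptions for illustration.
kubectl create deployment llama-cli --image=go-skynet/llama-cli:latest
kubectl set resources deployment llama-cli --requests=cpu=4,memory=8Gi
kubectl expose deployment llama-cli --port=8080 --target-port=8080
```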
````diff
@@ -202,4 +206,4 @@ MIT
 - [llama.cpp](https://github.com/ggerganov/llama.cpp)
 - https://github.com/tatsu-lab/stanford_alpaca
 - https://github.com/cornelk/llama-go for the initial ideas
 - https://github.com/antimatter15/alpaca.cpp for the light model version (this is compatible and tested only with that checkpoint model!)
````