Mirror of https://github.com/mudler/LocalAI.git (synced 2025-05-20 10:35:01 +00:00)
chore: drop petals (#3316)

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

Parent: 1d651bbfad
Commit: 9475a6fa05

18 changed files with 3 additions and 319 deletions
```diff
@@ -150,7 +150,6 @@ The devices in the following list have been tested with `hipblas` images running
 | exllama | no | none |
 | exllama2 | no | none |
 | mamba | no | none |
-| petals | no | none |
 | sentencetransformers | no | none |
 | transformers-musicgen | no | none |
 | vall-e-x | no | none |
```
```diff
@@ -44,7 +44,6 @@ LocalAI will attempt to automatically load models which are not explicitly configured
 | `transformers-musicgen` | | no | Audio generation | no | no | N/A |
 | [tinydream](https://github.com/symisc/tiny-dream#tiny-dreaman-embedded-header-only-stable-diffusion-inference-c-librarypixlabiotiny-dream) | stablediffusion | no | Image | no | no | N/A |
 | `coqui` | Coqui | no | Audio generation and Voice cloning | no | no | CPU/CUDA |
-| `petals` | Various GPTs and quantization formats | yes | GPT | no | no | CPU/CUDA |
 | `transformers` | Various GPTs and quantization formats | yes | GPT, embeddings | yes | yes**** | CPU/CUDA/XPU |

 Note: any backend name listed above can be used in the `backend` field of the model configuration file (See [the advanced section]({{%relref "docs/advanced" %}})).
```
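The note in the hunk above mentions the `backend` field of the model configuration file. A minimal sketch of such a file, assuming the usual LocalAI YAML layout (the file name, model name, and model identifier below are hypothetical examples, not from this commit):

```yaml
# models/my-model.yaml — hypothetical example file name
name: my-model            # name the model is exposed under via the API
backend: transformers     # any backend name listed in the table above
parameters:
  model: facebook/opt-125m  # hypothetical upstream model identifier
```

After this commit, `petals` is no longer a valid value for `backend`, which is why its rows are removed from both documentation tables.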
|