fix: typos (#5376)
Signed-off-by: omahs <73983677+omahs@users.noreply.github.com>
This commit is contained in: parent 525cf198be, commit 0f365ac204
13 changed files with 15 additions and 15 deletions
@@ -9,7 +9,7 @@ ico = "rocket_launch"
 ### Build
-LocalAI can be built as a container image or as a single, portable binary. Note that the some model architectures might require Python libraries, which are not included in the binary. The binary contains only the core backends written in Go and C++.
+LocalAI can be built as a container image or as a single, portable binary. Note that some model architectures might require Python libraries, which are not included in the binary. The binary contains only the core backends written in Go and C++.
 LocalAI's extensible architecture allows you to add your own backends, which can be written in any language, and as such the container images contains also the Python dependencies to run all the available backends (for example, in order to run backends like __Diffusers__ that allows to generate images and videos from text).
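As a rough sketch of the binary build route mentioned above (this assumes the repository's Makefile exposes a `build` target and that a working Go and C++ toolchain is already installed; it is illustrative, not the full build documentation):

```
# Clone the sources and build the single portable binary
git clone https://github.com/mudler/LocalAI.git
cd LocalAI
make build
```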
@@ -189,7 +189,7 @@ sudo xcode-select --switch /Applications/Xcode.app/Contents/Developer
 - If completions are slow, ensure that `gpu-layers` in your model yaml matches the number of layers from the model in use (or simply use a high number such as 256).
-- If you a get a compile error: `error: only virtual member functions can be marked 'final'`, reinstall all the necessary brew packages, clean the build, and try again.
+- If you get a compile error: `error: only virtual member functions can be marked 'final'`, reinstall all the necessary brew packages, clean the build, and try again.
 ```
 # reinstall build dependencies
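A hedged sketch of the "reinstall all the necessary brew packages, clean the build, and try again" step (the exact package list and Makefile targets are assumptions that may differ for your setup):

```
# Reinstall the Homebrew build dependencies, then rebuild from a clean tree
brew reinstall go cmake grpc protobuf wget
make clean
make build
```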
@@ -39,7 +39,7 @@ Before you begin, ensure you have a container engine installed if you are not us
 ## All-in-one images
-All-In-One images are images that come pre-configured with a set of models and backends to fully leverage almost all the LocalAI featureset. These images are available for both CPU and GPU environments. The AIO images are designed to be easy to use and requires no configuration. Models configuration can be found [here](https://github.com/mudler/LocalAI/tree/master/aio) separated by size.
+All-In-One images are images that come pre-configured with a set of models and backends to fully leverage almost all the LocalAI featureset. These images are available for both CPU and GPU environments. The AIO images are designed to be easy to use and require no configuration. Models configuration can be found [here](https://github.com/mudler/LocalAI/tree/master/aio) separated by size.
 In the AIO images there are models configured with the names of OpenAI models, however, they are really backed by Open Source models. You can find the table below
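A minimal sketch of starting one of the AIO images (the `localai/localai:latest-aio-cpu` tag is assumed from the published image naming; GPU variants use different tags):

```
# Run the CPU all-in-one image; the API listens on port 8080
docker run -d --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
```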
@@ -7,7 +7,7 @@ ico = "rocket_launch"
 +++
-For installing LocalAI in Kubernetes, the deployment file from the `examples` can be used and customized as prefered:
+For installing LocalAI in Kubernetes, the deployment file from the `examples` can be used and customized as preferred:
 ```
 kubectl apply -f https://raw.githubusercontent.com/mudler/LocalAI-examples/refs/heads/main/kubernetes/deployment.yaml
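As an optional follow-up sketch (the deployment name `local-ai` is an assumption, not confirmed from the example manifest), you can watch the rollout and reach the API locally:

```
# Wait for the pod to become ready, then forward the service port locally
kubectl get pods -w
kubectl port-forward deployment/local-ai 8080:8080
```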
@@ -29,7 +29,7 @@ helm repo update
 # Get the values
 helm show values go-skynet/local-ai > values.yaml
-# Edit the values value if needed
+# Edit the values if needed
 # vim values.yaml ...
 # Install the helm chart
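For completeness, a sketch of the install step that the final comment leads into (the release name `local-ai` is illustrative):

```
# Install the chart using the customized values file
helm install local-ai go-skynet/local-ai -f values.yaml
```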
docs/static/install.sh vendored
@@ -647,7 +647,7 @@ install_docker() {
 $SUDO docker volume create local-ai-data
 fi
-# Check if container is already runnning
+# Check if container is already running
 if $SUDO docker ps -a --format '{{.Names}}' | grep -q local-ai; then
 info "LocalAI Docker container already exists, replacing it..."
 $SUDO docker rm -f local-ai
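After removing the old container, the script goes on to start a fresh one; the sketch below is an assumed continuation in the same style (image tag, port, and volume mount point are illustrative, not taken from the script):

```
# Illustrative continuation: recreate the container with the data volume attached
$SUDO docker run -d --name local-ai -p 8080:8080 \
  -v local-ai-data:/models \
  localai/localai:latest
```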