Mirror of https://github.com/mudler/LocalAI.git (synced 2025-05-29 06:54:59 +00:00)
**Description**

This PR syncs the `llama` backend up to use `gguf` (https://github.com/go-skynet/go-llama.cpp/pull/180). It also adds `llama-stable` to the build targets so we can still load `ggml` models. It adapts the current tests to use the `llama` backend for `ggml` and uses a `gguf` model to run tests on the new backend. In order to consume the new version of go-llama.cpp, it also bumps Go to 1.21 (images, pipelines, etc.).

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
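With both targets available, the backend a model runs on can be pinned in its LocalAI model config. A minimal sketch, assuming LocalAI's YAML model-config format (`name`, `backend`, `parameters.model`); the model names and filenames below are placeholders, not files shipped with this PR:

```yaml
# Hypothetical example: a gguf model served by the updated llama backend.
name: my-gguf-model
backend: llama
parameters:
  model: model.gguf
---
# Hypothetical example: a ggml model kept on the old code path
# via the new llama-stable target added in this PR.
name: my-ggml-model
backend: llama-stable
parameters:
  model: model.ggml
```

Pinning `backend: llama-stable` keeps existing `ggml` setups working unchanged while new `gguf` models go through the updated backend.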