🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many other model architectures. Features: text, audio, video, and image generation, voice cloning, and distributed P2P inference. https://localai.io





💡 Get help - FAQ 💭 Discussions 💬 Discord 📖 Documentation website

💻 Quickstart 🖼️ Models 🚀 Roadmap 🥽 Demo 🌍 Explorer 🛫 Examples


LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with the OpenAI API specifications (as well as Elevenlabs, Anthropic, and others) for local AI inferencing. It allows you to run LLMs and generate images, audio, and more, locally or on-prem with consumer-grade hardware, supporting multiple model families. No GPU is required. It is created and maintained by Ettore Di Giacinto.

📚🆕 Local Stack Family

🆕 LocalAI is now part of a comprehensive suite of AI tools designed to work together:

LocalAGI Logo

LocalAGI

A powerful Local AI agent management platform that serves as a drop-in replacement for OpenAI's Responses API, enhanced with advanced agentic capabilities.

LocalRecall Logo

LocalRecall

A RESTful API and knowledge base management system that provides persistent memory and storage capabilities for AI agents.

Screenshots

Talk Interface · Generate Audio
Models Overview · Generate Images
Chat Interface · Home
Login · Swarm

💻 Quickstart

Run the installer script:

curl https://localai.io/install.sh | sh

Or run with docker:

CPU only image:

docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-cpu

Nvidia GPU:

docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12

CPU and GPU image (bigger size):

docker run -ti --name local-ai -p 8080:8080 localai/localai:latest

AIO images (these pre-download a set of models ready for use; see https://localai.io/basics/container/):

docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
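The `docker run` commands above can also be expressed as a Compose service. A minimal sketch using the CPU-only image (the models volume path is an assumption and may differ by image version; check the documentation for your tag):

```yaml
services:
  local-ai:
    image: localai/localai:latest-cpu
    ports:
      - "8080:8080"
    volumes:
      # Persist downloaded models across restarts (container path may vary).
      - ./models:/models
```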

To load models:

# From the model gallery (see available models with `local-ai models list`, in the WebUI from the model tab, or visiting https://models.localai.io)
local-ai run llama-3.2-1b-instruct:q4_k_m
# Start LocalAI with the phi-2 model directly from huggingface
local-ai run huggingface://TheBloke/phi-2-GGUF/phi-2.Q8_0.gguf
# Install and run a model from the Ollama OCI registry
local-ai run ollama://gemma:2b
# Run a model from a configuration file
local-ai run https://gist.githubusercontent.com/.../phi-2.yaml
# Install and run a model from a standard OCI registry (e.g., Docker Hub)
local-ai run oci://localai/phi-2:latest
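Because the API is OpenAI-compatible, any loaded model can then be queried with a standard chat-completions request. A minimal Python sketch, assuming a LocalAI instance on localhost:8080 (as started in the Quickstart) with the llama-3.2 model from above loaded:

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Return an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

payload = build_chat_request("llama-3.2-1b-instruct:q4_k_m", "How are you?")
body = json.dumps(payload)

# With a running server, POST the body to the OpenAI-compatible endpoint:
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8080/v1/chat/completions",
#     data=body.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode("utf-8"))
print(body)
```

The same payload works unchanged against OpenAI itself; only the base URL differs, which is what makes LocalAI a drop-in replacement.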

For more information, see 💻 Getting started

📰 Latest project news

Roadmap items: List of issues

🚀 Features

🔗 Community and integrations

Build and deploy custom containers:

WebUIs:

Model galleries

Other:

🔗 Resources

📖 🎥 Media, Blogs, Social

Citation

If you utilize this repository or its data in a downstream project, please consider citing it with:

@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}

❤️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors, who support this project and cover CI expenses, and to everyone on our Sponsor list:


🌟 Star history

LocalAI Star history Chart

📖 License

LocalAI is a community-driven project created by Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto mudler@localai.io

🙇 Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

🤗 Contributors

This is a community project, a special thanks to our contributors! 🤗