diff --git a/README.md b/README.md
index a1bf607c..70e4dc85 100644
--- a/README.md
+++ b/README.md
@@ -5,44 +5,92 @@
-[](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml) [](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml) [](https://artifacthub.io/packages/search?repo=localai) [](https://discord.gg/uJAeKSAGDy)
> :bulb: Get help - [❓FAQ](https://localai.io/faq/) [💭Discussions](https://github.com/go-skynet/LocalAI/discussions) [:speech_balloon: Discord](https://discord.gg/uJAeKSAGDy) [:book: Documentation website](https://localai.io/)
+>
+> [💻 Quickstart](https://localai.io/basics/getting_started/) [📣 News](https://localai.io/basics/news/) [ 💫 Examples ](https://github.com/go-skynet/LocalAI/tree/master/examples/) [ 🖼️ Models ](https://localai.io/models/)
+
+
+[](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml)[](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml)[](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml)[](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml)[](https://artifacthub.io/packages/search?repo=localai)
**LocalAI** is a drop-in replacement REST API that's compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs (and not only LLMs) locally or on-prem with consumer-grade hardware, supporting multiple model families that are compatible with the ggml format. It does not require a GPU.
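+Because the API follows the OpenAI specification, existing OpenAI clients and plain `curl` calls can talk to a local instance. A minimal sketch, assuming LocalAI is already running on its default port 8080:
+```bash
+# List the models the local instance can serve (OpenAI-compatible endpoint)
+curl http://localhost:8080/v1/models
+```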
+Follow LocalAI
+Connect with the Creator
+Share LocalAI Repository
In a nutshell:
- Local, OpenAI drop-in alternative REST API. You own your data.
- NO GPU required. NO Internet access is required either.
- Optional GPU acceleration is available for `llama.cpp`-compatible LLMs. See the [build section](https://localai.io/basics/build/index.html) and the build sketch below.
-- Supports multiple models:
- - 📖 [Text generation with GPTs](https://localai.io/features/text-generation/) (`llama.cpp`, `gpt4all.cpp`, ... [:book: and more](https://localai.io/model-compatibility/index.html#model-compatibility-table))
- - 🗣 [Text to Audio](https://localai.io/features/text-to-audio/)
- - 🔈 [Audio to Text](https://localai.io/features/audio-to-text/) (Audio transcription with `whisper.cpp`)
- - 🎨 [Image generation with stable diffusion](https://localai.io/features/image-generation)
- - 🔥 [OpenAI functions](https://localai.io/features/openai-functions/) 🆕
- - 🧠 [Embeddings generation for vector databases](https://localai.io/features/embeddings/)
- - ✍️ [Constrained grammars](https://localai.io/features/constrained_grammars/)
+- Supports multiple models
- 📀 Once loaded the first time, it keeps models loaded in memory for faster inference
-- ⚡ Doesn't shell-out, but uses C++ bindings for a faster inference and better performance.
+- ⚡ Doesn't shell out, but uses C++ bindings for faster inference and better performance.
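+For the optional GPU acceleration mentioned above, acceleration is enabled at build time. A minimal sketch, assuming a CUDA toolchain is installed (see the build documentation for the supported `BUILD_TYPE` values):
+```bash
+# Build LocalAI with cuBLAS (CUDA) acceleration for llama.cpp-compatible models
+make BUILD_TYPE=cublas build
+```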
LocalAI was created by [Ettore Di Giacinto](https://github.com/mudler/) and is a community-driven project, focused on making AI accessible to anyone. Contributions, feedback and PRs are welcome!
Note that this started just as a [fun weekend project](https://localai.io/#backstory) in order to try to create the necessary pieces for a full AI assistant like `ChatGPT`: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
-See the [Getting started](https://localai.io/basics/getting_started/index.html) and [examples](https://github.com/go-skynet/LocalAI/tree/master/examples/) sections to learn how to use LocalAI. For a list of curated models check out the [model gallery](https://localai.io/models/).
+## 🚀 Features
+- 📖 [Text generation with GPTs](https://localai.io/features/text-generation/) (`llama.cpp`, `gpt4all.cpp`, ... [:book: and more](https://localai.io/model-compatibility/index.html#model-compatibility-table))
+- 🗣 [Text to Audio](https://localai.io/features/text-to-audio/)
+- 🔈 [Audio to Text](https://localai.io/features/audio-to-text/) (Audio transcription with `whisper.cpp`)
+- 🎨 [Image generation with stable diffusion](https://localai.io/features/image-generation)
+- 🔥 [OpenAI functions](https://localai.io/features/openai-functions/) 🆕
+- 🧠 [Embeddings generation for vector databases](https://localai.io/features/embeddings/) (see the example below)
+- ✍️ [Constrained grammars](https://localai.io/features/constrained_grammars/)
+- 🖼️ [Download Models directly from Hugging Face](https://localai.io/models/)
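+As an example of the OpenAI-compatible surface, the embeddings feature above can be exercised with a plain `curl` call. A minimal sketch, where the model name is a placeholder for whatever embedding model you have configured:
+```bash
+# Request embeddings from a local instance (model name is a placeholder)
+curl http://localhost:8080/v1/embeddings -H "Content-Type: application/json" -d '{
+  "input": "A long time ago in a galaxy far, far away",
+  "model": "text-embedding-ada-002"
+}'
+```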
-| [ChatGPT OSS alternative](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui) | [Image generation](https://localai.io/api-endpoints/index.html#image-generation) |
-|------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------|
-|  |  |
-
-| [Telegram bot](https://github.com/go-skynet/LocalAI/tree/master/examples/telegram-bot) | [Flowise](https://github.com/go-skynet/LocalAI/tree/master/examples/flowise) |
-|------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------|
- | | |
-
-## Hot topics / Roadmap
+## 🔥🔥 Hot topics / Roadmap
- [x] Support for embeddings
- [x] Support for audio transcription with https://github.com/ggerganov/whisper.cpp
@@ -55,32 +103,14 @@ See the [Getting started](https://localai.io/basics/getting_started/index.html)
- [x] 🔥 OpenAI functions: https://github.com/go-skynet/LocalAI/issues/588
- [ ] 🔥 GPTQ support: https://github.com/go-skynet/LocalAI/issues/796
-## News
-
-Check the news and the release notes in the [dedicated section](https://localai.io/basics/news/index.html)
-
-- 🔥🔥🔥 23-07-2023: **v1.22.0**: LLaMa2, huggingface embeddings, and more ! [Changelog](https://github.com/go-skynet/LocalAI/releases/tag/v1.22.0)
-
-For latest news, follow also on Twitter [@LocalAI_API](https://twitter.com/LocalAI_API) and [@mudler_it](https://twitter.com/mudler_it)
-
-## Media, Blogs, Social
+## :book: 🎥 Media, Blogs, Social
- [Create a slackbot for teams and OSS projects that answer to documentation](https://mudler.pm/posts/smart-slackbot-for-teams/)
- [LocalAI meets k8sgpt](https://www.youtube.com/watch?v=PKrDNuJ_dfE)
- [Question Answering on Documents locally with LangChain, LocalAI, Chroma, and GPT4All](https://mudler.pm/posts/localai-question-answering/)
- [Tutorial to use k8sgpt with LocalAI](https://medium.com/@tyler_97636/k8sgpt-localai-unlock-kubernetes-superpowers-for-free-584790de9b65)
-## Contribute and help
-
-To help the project you can:
-
-- [Hacker news post](https://news.ycombinator.com/item?id=35726934) - help us out by voting if you like this project.
-
-- If you have technological skills and want to contribute to development, have a look at the open issues. If you are new you can have a look at the [good-first-issue](https://github.com/go-skynet/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) and [help-wanted](https://github.com/go-skynet/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) labels.
-
-- If you don't have technological skills you can still help improving documentation or add examples or share your user-stories with our community, any help and contribution is welcome!
-
-## Usage
+## 💻 Usage
Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section. Below you will find generic, quick instructions to get ready and use LocalAI.
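+For instance, one way to get the API running is the published container image. A minimal sketch (image tag, model directory and flags are illustrative; the Getting started guide is authoritative):
+```bash
+# Start the LocalAI API on port 8080, serving models from ./models (CPU only)
+docker run -p 8080:8080 -v $PWD/models:/models -ti --rm quay.io/go-skynet/local-ai:latest --models-path /models --threads 4
+```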
@@ -117,7 +147,7 @@ curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d
}'
```
-### Example: Use GPT4ALL-J model
+### 💡 Example: Use GPT4ALL-J model
@@ -158,55 +188,13 @@ curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/jso
-### Build locally
+### 🔗 Resources
-
+- [How to build locally](https://localai.io/basics/build/index.html)
+- [How to install in Kubernetes](https://localai.io/basics/getting_started/index.html#run-localai-in-kubernetes) (see the sketch below)
+- [Projects integrating LocalAI](https://localai.io/integrations/)
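+For the Kubernetes route above, the installation boils down to adding the Helm repository and installing the chart. A minimal sketch, assuming the go-skynet Helm charts repository described in the linked guide:
+```bash
+# Add the LocalAI Helm chart repository and install the chart (names per the linked guide)
+helm repo add go-skynet https://go-skynet.github.io/helm-charts/
+helm repo update
+helm install local-ai go-skynet/local-ai
+```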
-In order to build the `LocalAI` container image locally you can use `docker`:
-
-```
-# build the image
-docker build -t localai .
-docker run localai
-```
-
-Or you can build the binary with `make`:
-
-```
-make build
-```
-
-
-
-See the [build section](https://localai.io/basics/build/index.html) in our documentation for detailed instructions.
-
-### Run LocalAI in Kubernetes
-
-LocalAI can be installed inside Kubernetes with helm. See [installation instructions](https://localai.io/basics/getting_started/index.html#run-localai-in-kubernetes).
-
-## Supported API endpoints
-
-See the [list of the LocalAI features](https://localai.io/features/index.html) for a full tour of the available API endpoints.
-
-## Frequently asked questions
-
-See [the FAQ](https://localai.io/faq/index.html) section for a list of common questions.
-
-## Projects already using LocalAI to run local models
-
-Feel free to open up a PR to get your project listed!
-
-- [Kairos](https://github.com/kairos-io/kairos)
-- [k8sgpt](https://github.com/k8sgpt-ai/k8sgpt#running-local-models)
-- [Spark](https://github.com/cedriking/spark)
-- [autogpt4all](https://github.com/aorumbayev/autogpt4all)
-- [Mods](https://github.com/charmbracelet/mods)
-- [Flowise](https://github.com/FlowiseAI/Flowise)
-- [BMO Chatbot](https://github.com/longy2k/obsidian-bmo-chatbot)
-- [Mattermost OpenOps](https://openops.mattermost.com)
-- [LocalAGI](https://github.com/mudler/LocalAGI)
-
-## Sponsors
+## ❤️ Sponsors
> Do you find LocalAI useful?
@@ -219,21 +207,17 @@ A huge thank you to our generous sponsors who support this project:
| [Spectro Cloud](https://www.spectrocloud.com/) |
| Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on lambdalabs! |
-## Star history
+## 🌟 Star history
[](https://star-history.com/#go-skynet/LocalAI&Date)
-## License
+## 📖 License
LocalAI is a community-driven project created by [Ettore Di Giacinto](https://github.com/mudler/).
-MIT
+MIT - Author Ettore Di Giacinto
-## Author
-
-Ettore Di Giacinto and others
-
-## Acknowledgements
+## 🙇 Acknowledgements
LocalAI couldn't have been built without the help of great software already available from the community. Thank you!
@@ -244,9 +228,12 @@ LocalAI couldn't have been built without the help of great software already avai
- https://github.com/EdVince/Stable-Diffusion-NCNN
- https://github.com/ggerganov/whisper.cpp
- https://github.com/saharNooby/rwkv.cpp
+- https://github.com/rhasspy/piper
+- https://github.com/cmp-nct/ggllm.cpp
-## Contributors
+## 🤗 Contributors
+This is a community project. A special thanks to our contributors! 🤗