diff --git a/README.md b/README.md
index f30f85f8..a7018457 100644
--- a/README.md
+++ b/README.md
@@ -28,7 +28,6 @@ LocalAI is a community-driven project, focused on making the AI accessible to an
 
 - [Tutorial to use k8sgpt with LocalAI](https://medium.com/@tyler_97636/k8sgpt-localai-unlock-kubernetes-superpowers-for-free-584790de9b65) - excellent usecase for localAI, using AI to analyse Kubernetes clusters.
 
-
 ## Model compatibility
 
 It is compatible with the models supported by [llama.cpp](https://github.com/ggerganov/llama.cpp) supports also [GPT4ALL-J](https://github.com/nomic-ai/gpt4all) and [cerebras-GPT with ggml](https://huggingface.co/lxe/Cerebras-GPT-2.7B-Alpaca-SP-ggml).
@@ -438,7 +437,7 @@ Feel free to open up a PR to get your project listed!
 - [x] Multi-model support
 - [x] Have a webUI!
 - [x] Allow configuration of defaults for models.
-- [ ] Enable automatic downloading of models from a curated gallery, with only free-licensed models.
+- [ ] Enable automatic downloading of models from a curated gallery, with only free-licensed models, directly from the webui.
 
 ## Star history
diff --git a/examples/chatbot-ui/README.md b/examples/chatbot-ui/README.md
index ff181cb4..75fd073f 100644
--- a/examples/chatbot-ui/README.md
+++ b/examples/chatbot-ui/README.md
@@ -19,7 +19,7 @@ cd LocalAI/examples/chatbot-ui
 wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j
 
 # start with docker-compose
-docker compose up -d --build
+docker-compose up -d --build
 ```
 
 Open http://localhost:3000 for the Web UI.