From 1bf8f996d1d6cba8bf214ac98f7facd1bb2b7dbc Mon Sep 17 00:00:00 2001
From: Ettore Di Giacinto
Date: Sat, 29 Apr 2023 14:50:22 +0200
Subject: [PATCH] docs: clarify GPT4ALL-J licensing (#120)

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 2bf92a5f..1432b596 100644
--- a/README.md
+++ b/README.md
@@ -9,7 +9,7 @@
 [![](https://dcbadge.vercel.app/api/server/uJAeKSAGDy?style=flat-square&theme=default-inverted)](https://discord.gg/uJAeKSAGDy)
 
-**LocalAI** is a straightforward, drop-in replacement API compatible with OpenAI for local CPU inferencing, based on [llama.cpp](https://github.com/ggerganov/llama.cpp), [gpt4all](https://github.com/nomic-ai/gpt4all) and [ggml](https://github.com/ggerganov/ggml), including support GPT4ALL-J which is Apache 2.0 Licensed and can be used for commercial purposes.
+**LocalAI** is a straightforward, drop-in replacement API compatible with OpenAI for local CPU inferencing, based on [llama.cpp](https://github.com/ggerganov/llama.cpp), [gpt4all](https://github.com/nomic-ai/gpt4all) and [ggml](https://github.com/ggerganov/ggml), including support GPT4ALL-J which is licensed under Apache 2.0.
 
 - OpenAI compatible API
 - Supports multiple-models