docs: Update model usage statistics and dumb terminal message

This commit is contained in:
Paul Gauthier 2025-01-10 14:04:41 -08:00 committed by Paul Gauthier (aider)
parent 62498ec867
commit 50436e3106
5 changed files with 416 additions and 410 deletions


@@ -27,6 +27,8 @@ cog.out(text)
- Prompts to help DeepSeek work better when alternating between `/ask` and `/code`.
- Streaming pretty LLM responses is smoother and faster for long replies.
- Streaming automatically turns off for models that don't support it.
- Can now switch between `/model o1` and streaming models.
- Increased max chat history tokens limit from 4k to 8k.
- Added support for custom voice format and input device settings.
- Disabled Streamlit email prompt, by apaz-cli.