Updated token error to reflect that token counts can be approximate

Paul Gauthier 2024-06-23 16:19:04 -07:00
parent 9b60243932
commit ecb3d81055
2 changed files with 17 additions and 17 deletions

@@ -67,7 +67,7 @@ To avoid hitting output token limits:
 - Ask for smaller changes in each request.
 - Break your code into smaller source files.
-- Try using a stronger model like gpt-4o or opus that can return diffs.
+- Use a strong model like gpt-4o, sonnet or opus that can return diffs.
 ## Other causes
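
The advice in the hunk above maps to how you launch aider. A minimal sketch of picking a stronger model, assuming the `--model` flag plus the `--sonnet` and `--opus` shortcut flags available in aider around this release (file path is a placeholder):

```
# Name a strong model explicitly; gpt-4o can return diffs rather than whole files.
aider --model gpt-4o path/to/file.py

# Shortcut flags (assumed present in this aider version) for Anthropic models.
aider --sonnet path/to/file.py   # Claude 3.5 Sonnet
aider --opus path/to/file.py     # Claude 3 Opus
```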