Improve token counting; account for functions #4919
Conversation
You changed AutoGPT's behaviour. The cassettes have been updated and will be merged to the submodule when this Pull Request gets merged.
Codecov Report
Patch coverage:

Additional details and impacted files

@@            Coverage Diff            @@
##           master    #4919   +/-   ##
=======================================
  Coverage   50.55%   50.56%
=======================================
  Files         116      115      -1
  Lines        4860     4818     -42
  Branches      657      638     -19
=======================================
- Hits         2457     2436     -21
+ Misses       2219     2205     -14
+ Partials      184      177      -7

☔ View full report in Codecov by Sentry.
Very nice work on figuring out the function token counting. All comments here are optional or for discussion.
Improve token counting; account for functions (#4919)

* Improvements to token counting, including functions

---------

Co-authored-by: James Collins <[email protected]>
Background
Part of #4799
When using the OpenAI Function Call API, the specified functions count towards token consumption. However, they are currently not accounted for in token calculation and compensation logic.
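To illustrate why this matters: the token budget for a request has to cover the serialized function specs as well as the messages. A minimal sketch of the idea follows — `count_request_tokens` is a hypothetical helper, not this PR's API, and the whitespace-splitting tokenizer is a stand-in for a real one such as tiktoken:

```python
import json
from typing import Callable

def count_tokens(text: str) -> int:
    # Stand-in tokenizer for illustration; a real implementation
    # would use a model-specific encoding (e.g. tiktoken's cl100k_base).
    return len(text.split())

def count_request_tokens(
    messages: list[dict],
    functions: list[dict],
    count_tokens: Callable[[str], int],
) -> int:
    # Sum tokens over message contents...
    total = sum(count_tokens(msg["content"]) for msg in messages)
    # ...and over the serialized function specs, which also consume
    # prompt tokens even though they are not part of any message.
    total += sum(count_tokens(json.dumps(fn)) for fn in functions)
    return total
```

Omitting the second loop is exactly the gap this PR closes: the budget looks fine by message count alone, then overflows once the function specs are sent.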
Changes

- `autogpt.llm.providers.openai`:
  - `get_openai_command_spec`: take a `CommandRegistry` and `Config` as arguments instead of an `Agent`
  - `count_openai_functions_tokens`
  - `format_function_specs_as_typescript_ns`
- `autogpt.llm.base`:
  - `"function"` as a valid role in a message / `ResponseMessageDict`
  - `LLMResponse`: disable unused attributes `prompt_tokens_used`, `completion_tokens_used`
- `autogpt.llm.utils`:
  - `max_tokens`
  - `token_counter` > `count_message_tokens`: allow single message argument

Documentation

x
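The TypeScript-namespace formatting reflects a community-observed detail: the OpenAI API appears to serialize function specs into a TypeScript-like namespace before tokenizing them, so counting tokens over raw JSON overestimates the cost. A hedged sketch of such a formatter — an illustration of the approach, not the PR's actual implementation; enums, required fields, and nested objects are omitted:

```python
def format_function_specs_as_typescript_ns(functions: list[dict]) -> str:
    """Render OpenAI function specs as a TypeScript-like namespace.

    Simplified sketch: each function becomes a `type` alias whose
    argument object lists the declared parameters.
    """
    lines = ["namespace functions {", ""]
    for fn in functions:
        if fn.get("description"):
            lines.append(f"// {fn['description']}")
        props = fn.get("parameters", {}).get("properties", {})
        args = ", ".join(
            f"{name}?: {spec.get('type', 'any')}" for name, spec in props.items()
        )
        lines.append(f"type {fn['name']} = (_: {{ {args} }}) => any;")
        lines.append("")
    lines.append("} // namespace functions")
    return "\n".join(lines)
```

Counting tokens over this rendered string (plus a small fixed per-function overhead) tracks the API's actual accounting much more closely than counting the JSON payload itself.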
Test Plan
CI
PR Quality Checklist