Use GPT-4 in Agent loop by default #4899
Conversation
✅ Deploy Preview for auto-gpt-docs ready!
✅ Deploy Preview for auto-gpt-docs canceled.
Force-pushed from a83739c to d2a8402
You changed AutoGPT's behaviour. The cassettes have been updated and will be merged to the submodule when this Pull Request gets merged.
Codecov Report
Patch coverage:
Additional details and impacted files

@@            Coverage Diff             @@
##           master    #4899      +/-   ##
==========================================
- Coverage   50.21%   50.17%   -0.05%
==========================================
  Files         116      116
  Lines        4809     4809
  Branches      645      645
==========================================
- Hits         2415     2413       -2
- Misses       2215     2216       +1
- Partials      179      180       +1

☔ View full report in Codecov by Sentry.
Not sure if this is still being capped for some users?
* Use GPT-4 as default smart LLM in Agent
* Rename (smart|fast)_llm_model to (smart|fast)_llm everywhere
* Fix test_config.py::test_initial_values
* Fix test_config.py::test_azure_config
* Fix Azure config backwards compatibility
Background
OpenAI just announced GPT-4 general availability: https://openai.com/blog/gpt-4-api-general-availability
UPDATE: "general availability" only applies to existing developers "with a history of successful payments" for now. :(
Changes
- Use GPT-4 as the default smart LLM in the Agent loop
- Rename smart_llm_model to smart_llm and fast_llm_model to fast_llm everywhere (with backwards compatibility); a hedged sketch of such a compatibility shim is shown after the Documentation section below

Documentation
yes
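For illustration only, here is a minimal sketch of how a backwards-compatible rename like this can be handled. The names used here (`load_llm_config`, `DEPRECATED_ENV_VARS`) are hypothetical and are not AutoGPT's actual Config implementation; the point is simply that the new variables win, the old ones still work, and users get a deprecation warning.

```python
import os
import warnings

# Hypothetical mapping of deprecated environment variables to their replacements.
DEPRECATED_ENV_VARS = {
    "SMART_LLM_MODEL": "SMART_LLM",
    "FAST_LLM_MODEL": "FAST_LLM",
}


def load_llm_config() -> dict:
    """Resolve LLM settings, honouring old variable names for backwards compatibility."""
    config = {
        # GPT-4 as the default smart LLM; GPT-3.5 Turbo as the fast LLM.
        "smart_llm": os.getenv("SMART_LLM", "gpt-4"),
        "fast_llm": os.getenv("FAST_LLM", "gpt-3.5-turbo"),
    }
    for old_name, new_name in DEPRECATED_ENV_VARS.items():
        old_value = os.getenv(old_name)
        # Only fall back to the deprecated variable if the new one is unset.
        if old_value and not os.getenv(new_name):
            warnings.warn(
                f"{old_name} is deprecated; use {new_name} instead.",
                DeprecationWarning,
            )
            config[new_name.lower()] = old_value
    return config
```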
Test Plan
CI
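The commit message mentions fixes to test_config.py::test_initial_values and test_config.py::test_azure_config. As a rough idea of what CI needs to cover after this change, here is a hedged pytest sketch written against the hypothetical `load_llm_config()` helper from the snippet above; the real tests in AutoGPT's test_config.py differ in detail.

```python
# Hypothetical pytest sketch; assumes load_llm_config() from the snippet above is importable.
def test_initial_values(monkeypatch):
    # Clear both the new and the legacy variables so the defaults are exercised.
    for var in ("SMART_LLM", "FAST_LLM", "SMART_LLM_MODEL", "FAST_LLM_MODEL"):
        monkeypatch.delenv(var, raising=False)
    config = load_llm_config()
    assert config["smart_llm"] == "gpt-4"
    assert config["fast_llm"] == "gpt-3.5-turbo"


def test_legacy_names_still_work(monkeypatch):
    # A value set via the deprecated SMART_LLM_MODEL variable should still be honoured.
    monkeypatch.delenv("SMART_LLM", raising=False)
    monkeypatch.setenv("SMART_LLM_MODEL", "gpt-4-0613")
    assert load_llm_config()["smart_llm"] == "gpt-4-0613"
```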
PR Quality Checklist