Make prompt parameters configurable #3375
Conversation
Making the OpenAI base URL and embedding dimension configurable; these are useful for integrating AutoGPT with other models, like LLaMA.
Loading prompt constraints, resources and performance evaluation from a YAML file (default: prompt_settings.yaml). Can be set from .env (PROMPT_SETTINGS_FILE) or via the command line (--prompt-settings or -P).
This pull request has conflicts with the base branch, please resolve those so we can evaluate the pull request.
Conflicts have been resolved! 🎉 A maintainer will review the pull request shortly.
I'm not really clear on what you're referring to, but it doesn't seem related.
Sorry, you're right. I meant to respond in the issue where someone is working on abstracting out the OpenAI API.
This PR exceeds the recommended size of 200 lines. Please make sure you are NOT addressing multiple issues with one PR. Note this PR might be rejected due to its size.
Please update the branch.
Done
NICE, thanks @DGdev91! BIG BIG BIG. Are you on Discord? Join us if not.
good change
this doesn't seem to cover the hardcoded prompts in …
Yes, I'm already there. DGdev91#8061
True. And there's also still some hardcoded stuff in generator.py. But I wanted to keep this PR simple, and eventually add other stuff to the new YAML config file later.
I have found another prompt that is hardcoded.
Thank you for doing this, it's really great to have the prompts configurable. All of them.
Co-authored-by: Nicholas Tindle <[email protected]> Co-authored-by: k-boikov <[email protected]>
Loading prompt constraints, resources and performance evaluation from a YAML file (default: prompt_settings.yaml).
Can be set from .env (PROMPT_SETTINGS_FILE) or via the command line (--prompt-settings or -P).
potentially resolves #3954
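To give an idea of the format, a settings file could look like the sketch below. The three top-level keys mirror the sections this PR makes configurable; the entry texts here are illustrative placeholders, not the shipped defaults in prompt_settings.yaml.

```yaml
# Illustrative prompt_settings.yaml sketch.
# Key names follow the three configurable sections; the entries are
# placeholder examples, not the actual defaults from the PR.
constraints:
  - "~4000 word limit for short term memory."
  - "No user assistance."
resources:
  - "Internet access for searches and information gathering."
  - "File output."
performance_evaluations:
  - "Continuously review and analyze your actions."
  - "Reflect on past decisions to refine your approach."
```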
Background
The main reason for proposing these changes is that they can help with different LLM models; we talked about that in #25, #567 and #2158.
Those models don't handle the prompts in the same way as GPT-3.5/GPT-4, and they often get confused. This way it becomes easy to create and share prompts made specifically for them.
It can also be useful for models made for different languages, for example https://github.com/FreedomIntelligence/LLMZoo
...Or just for people who want more control over what AutoGPT can do without having to modify the code.
Changes
Moved the hardcoded default prompt constraints, resources and performance evaluation from prompt.py to the new file prompt_settings.yaml, which will be used as the default file.
Added the new configuration variable PROMPT_SETTINGS_FILE in .env.template and modified config.py to handle it.
Added the new file autogpt/config/prompt_config.py, which contains the PromptConfig class; it is initialized with the file path and holds the data from the configuration file (see the sketch after this list).
Modified prompt.py to use the values from the PromptConfig instead of hardcoded data.
Modified cli.py, main.py and configurator.py to handle the new --prompt-settings / -P command line args.
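For concreteness, here is a minimal sketch of how such a class could load the file. This is illustrative only: it assumes PyYAML and attribute names matching the three sections, and the actual code in autogpt/config/prompt_config.py may differ in naming and validation.

```python
# Minimal, illustrative sketch of a PromptConfig-style loader; the real
# autogpt/config/prompt_config.py may differ.
import yaml


class PromptConfig:
    def __init__(self, config_file: str) -> None:
        # Parse the YAML settings file passed in via PROMPT_SETTINGS_FILE
        # or the --prompt-settings / -P command line arg.
        with open(config_file, encoding="utf-8") as file:
            config_params = yaml.safe_load(file) or {}

        # Each section is a list of strings that prompt.py can feed into
        # the prompt generator instead of the old hardcoded values.
        self.constraints = config_params.get("constraints", [])
        self.resources = config_params.get("resources", [])
        self.performance_evaluations = config_params.get("performance_evaluations", [])
```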
Documentation
The new .env variable PROMPT_SETTINGS_FILE is described in .env.template, while the new --prompt-settings/-P command line args are described both in cli.py and in usage.md.
I followed the same policy used for the ai_settings.yaml file.
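As a usage example (assuming the usual `python -m autogpt` entry point), running `python -m autogpt --prompt-settings path/to/custom_prompt_settings.yaml` (or `-P path/to/custom_prompt_settings.yaml`) should pick up a custom file, and setting `PROMPT_SETTINGS_FILE=path/to/custom_prompt_settings.yaml` in .env should have the same effect; the path here is purely illustrative.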
Test Plan
To check that the prompts have actually changed, I also used these changes in my fork (see #2594) while connecting to https://github.com/keldenl/gpt-llama.cpp
I know it isn't officially supported, but it's still a good and quick way to see what's going on, since the webservice prints the prompt on standard output.
PR Quality Checklist