-
I update my model in my config like so
...which I have verified is being read properly by changing another option and seeing that change take effect. When I press Ctrl-O to open the options it says gpt-4 (which, by the way, I now cannot change back to gpt-3.5-turbo), but when I ask "what model of ChatGPT am I using?" it always reports GPT-3. I even tried to break it by changing the model to 'gpt-5', and it still seems to use GPT-3 and doesn't break.
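For reference, a minimal sketch of what such a model override typically looks like in a ChatGPT.nvim-style setup. The `openai_params` table and its fields are assumptions about the plugin's config schema and may differ by plugin version:

```lua
require("chatgpt").setup({
  -- assumed config shape: the model for the main chat window is set
  -- via openai_params; verify against your plugin version's README
  openai_params = {
    model = "gpt-4",
    max_tokens = 300,
  },
})
```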
-
I have the same issue.
-
I have the same issue. I think the default actions specify 3.5 Turbo and override the params in the config. When I wrote a simple custom function with the model gpt-4-1106-preview, I received a reply from gpt-4 with a knowledge cutoff in early 2023.
-
Same here. Is there a way to change the model to gpt-4-1106-preview?
-
I've been sitting with the same issue for months and months.
-
How did you even get it to gpt-4? That's all I want.
-
I changed the config to this:

config = function()

Then in my custom actions.json I have the same default action, but with the model changed from 3.5 to gpt-4-1106-preview. For example:

"complete_code_gpt4Turbo": {

I tested it with an action that asked which model it is and what its cutoff is, and it said GPT-4 with a cutoff in early 2023. So this works only for actions. The main chat window is still hooked to 3.5 despite the config changes.
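As a sketch, a custom actions.json entry that overrides the model might look like this. The exact schema (the "type", "opts", and "params" keys) is an assumption here and should be checked against the plugin's bundled actions.json:

```json
{
  "complete_code_gpt4Turbo": {
    "type": "chat",
    "opts": {
      "params": {
        "model": "gpt-4-1106-preview"
      }
    }
  }
}
```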
-
This is because the plugin tries to read from a file in the home directory, depending on the type of settings. Look for .chatgpt-chat_completions-params.json and remove it from your home directory. Then the configuration should kick in.
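A quick way to clear any such cached parameter files; the `~/.chatgpt-*-params.json` glob is an assumption meant to cover the filename variants reported in this thread:

```shell
#!/bin/sh
# Remove cached ChatGPT.nvim parameter files from the home directory.
# A stale cache here can shadow the model set in the plugin config.
for f in "$HOME"/.chatgpt-*-params.json; do
  if [ -e "$f" ]; then
    rm -- "$f"
    echo "removed $f"
  fi
done
echo "cache cleanup done"
```

Restart Neovim afterwards so the configuration is re-read.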
-
I have the same issue, and I do not have the file .chatgpt-chat_completions-params.json. Are there any other suggestions?
-
For me... the file is ~/.chatgpt-chatcompletions-params.json.
-
I have the same problem using ollama as an endpoint... some windows still use gpt-3.5-turbo. EDIT: It works!
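For anyone pointing the plugin at a local ollama server, a hypothetical sketch of the endpoint override. The `api_host_cmd` option, the port, and the model name are all assumptions; check your plugin version's README before relying on them:

```lua
require("chatgpt").setup({
  -- hypothetical: route requests to a local ollama
  -- OpenAI-compatible endpoint instead of api.openai.com
  api_host_cmd = "echo -n http://localhost:11434",
  openai_params = {
    model = "llama2",  -- hypothetical local model name
  },
})
```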