- Sounds similar to: #592
- @cxar, any chance you can give an example?
-
It seems local models either don't receive the context + current file when the file is too long, or fail to acknowledge it. Smaller files and remote models work fine. For some reason it also doesn't work for highlighted selections. I believe the long context may be confusing these smaller local models.
It would be nice to have a config option to limit the maximum code context size sent to the model.
Opening this discussion to see if others have experienced this and have any workarounds.
Update: I had to set the Ollama max context to match the model's max context using a Modelfile. It is now working as expected.
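For anyone hitting the same issue, the fix above can be sketched as a minimal Modelfile. `num_ctx` is the Ollama parameter that sets the context window; the base model name and context size below are placeholders — use your own model and its documented max context:

```
FROM codellama
PARAMETER num_ctx 16384
```

Then build and use the new model variant, e.g. `ollama create codellama-longctx -f Modelfile`, and point your editor integration at `codellama-longctx` instead of the base model.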