Twinny tries ollama URL for oobabooga embeddings #336
Comments
Hey, thanks for the report. However, I'm sorry, but I cannot replicate this.
I can test more next week. Anything I should start with? What I did: I use ooba for chat and it works fine, and I do not have a FIM model configured yet.
Thanks for the detailed response. Could you please let me know which version you are using?
twinny-3.17.20-linux-x64
I had the same embedding issue with twinny <-> LM Studio.
The question is whether the default settings shouldn't be overridden anyway when I configure my own provider.
Hey, sorry about this bug. I thought I'd removed that code in a previous version. I just released version v3.17.24, which should address it. Many thanks,
I still have the problem with v3.17.24.
I think the problem may start somewhere else. When I select the embedding provider after clicking the cylinder icon, it switches back to blank after a few seconds. It tries to reach neither the ooba port nor the ollama port. I think trying to embed without a selected provider then defaults to ollama. Under what conditions is the selection of the embedding provider changed? Are there some checks that fail for the configured provider? I wonder what could be checked, because there is no HTTP request that could fail in the time between selecting the ooba provider and twinny switching back to the empty item in the select box.
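One way the fallback described above could arise is sketched below. This is purely hypothetical; none of these function or field names come from the twinny codebase, and it is only meant to illustrate how a blank selection could silently route requests to a hard-coded Ollama default.

```python
# Hypothetical sketch (NOT twinny's actual code): if the selected
# embedding provider is reset to blank by some validation check,
# a lookup miss could silently fall back to a hard-coded default.

OLLAMA_DEFAULT = "http://0.0.0.0:11434/api/embed"

def resolve_embedding_url(selected_provider, providers):
    """Return the embedding endpoint for the selected provider."""
    provider = providers.get(selected_provider)
    if provider is None:
        # Selection is blank or unknown -> fall back to the default.
        return OLLAMA_DEFAULT
    return f"http://{provider['hostname']}:{provider['port']}{provider['path']}"

providers = {
    "oobabooga": {"hostname": "127.0.0.1", "port": 5000, "path": "/v1/embeddings"},
}

# With the provider selected, the configured URL is used:
print(resolve_embedding_url("oobabooga", providers))
# → http://127.0.0.1:5000/v1/embeddings
# If the selection has reverted to blank, the Ollama default wins:
print(resolve_embedding_url("", providers))
# → http://0.0.0.0:11434/api/embed
```

If something like this is going on, the interesting question is which check clears the selection in the first place.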
Describe the bug
I configured oobabooga as the embedding provider:
Hostname: 127.0.0.1
Port: 5000
Path: /v1/embeddings
Model: all-mpnet-base-v2
When I now select the provider and click "Embed workspace documents", VS Code still requests
http://0.0.0.0:11434/api/embed
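To make the mismatch concrete, the sketch below builds the URL that the configured values above should produce and contrasts it with the URL actually observed. The payload shape follows the OpenAI-compatible /v1/embeddings format that oobabooga exposes; the variable names are my own.

```python
import json

# Compose the endpoint from the configured provider values above.
HOSTNAME = "127.0.0.1"
PORT = 5000
PATH = "/v1/embeddings"

expected_url = f"http://{HOSTNAME}:{PORT}{PATH}"
observed_url = "http://0.0.0.0:11434/api/embed"  # Ollama default, per the bug

# OpenAI-compatible embeddings payload that oobabooga's endpoint accepts.
payload = json.dumps({
    "model": "all-mpnet-base-v2",
    "input": "some workspace document text",
})

print(expected_url)  # → http://127.0.0.1:5000/v1/embeddings
print(observed_url)  # → http://0.0.0.0:11434/api/embed
```

The expected and observed URLs differ in host, port, and path, so no part of the configured provider is being used for the embedding request.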
To Reproduce
Try to use oobabooga as the embedding provider.
Expected behavior
The configured provider, URL, port, and path should be used.
API Provider
Oobabooga
Chat or Auto Complete?
Embedding
Model Name
all-mpnet-base-v2
Additional context
Chat works as expected with oobabooga (chat) provider.