
Support for Ollama #616

Closed
Mulugruntz opened this issue Jan 25, 2024 · 6 comments


Mulugruntz commented Jan 25, 2024

Is your feature request related to a problem? Please describe.
When I point the extension at my local Mistral model served by Ollama, the request fails (HTTP 403).
[screenshot of the 403 error]

The settings:
[screenshot of the extension settings]
I haven't dug into the code yet, but maybe it's because it uses a different API.

You can see here that Ollama WebUI works as expected:
[screenshot of Ollama WebUI]

and that Ollama is running:
[screenshot showing Ollama running]

Describe the solution you'd like
ChatGPTBox to work with Ollama.

Additional context
Yes, see above :).

@josStorer (Owner)

Could you please capture the request through the F12 Network tab in the Ollama WebUI, and provide the relevant request headers and body?

@Donno191

It would be nice to use Ollama! Maybe it works through Ollama's OpenAI-compatible API? Could you share example settings?


josStorer commented Mar 3, 2024

@Mulugruntz @Donno191

After consulting the Ollama documentation, I found that Ollama provides an OpenAI-compatible API.

As for the 403 error, it is actually caused by cross-origin requests. According to https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama, you can set the environment variable `OLLAMA_ORIGINS` to `*`, then restart Ollama to allow requests from browser extensions.

The settings of ChatGPTBox are as follows:
[screenshot of the ChatGPTBox settings]
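
If you want to verify the OpenAI-compatible endpoint directly before configuring the extension, a minimal check with curl looks like this (the model name `mistral` is only an example; use any model you have pulled):

```
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

If this returns a chat completion, the API side is working and any remaining 403 in the browser is a CORS issue.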

@BarfingLemurs

Thanks for the tip. Filling in the model name, e.g. `gemma:2b`, is required for Ollama to work, along with `export OLLAMA_ORIGINS=*` on Linux.
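
Note that if Ollama runs as a systemd service on Linux, an `export` in your shell won't reach it. A sketch of the persistent setup described in the Ollama FAQ (service name `ollama.service` assumed):

```
# Open a drop-in override for the service
sudo systemctl edit ollama.service
# In the editor, add under [Service]:
#   Environment="OLLAMA_ORIGINS=*"
# Then reload and restart so the variable takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama
```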

@Katzenwerfer

Something to note: setting origins with a wildcard is generally not recommended. It is better to be specific and ideally add `moz-extension://[ID OF THE EXTENSION]`, or, if you can't find the ID, use a wildcard: `moz-extension://*`.
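
`OLLAMA_ORIGINS` accepts a comma-separated list, so you can allow both Firefox and Chromium extension origins without opening everything up. A sketch (the extension ID is a placeholder):

```
# Allow one specific Firefox extension plus any Chromium extension
OLLAMA_ORIGINS="moz-extension://<EXTENSION-ID>,chrome-extension://*" ollama serve
```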

@lrq3000

lrq3000 commented Nov 28, 2024

For reference, the issue was discussed on the Ollama issue tracker.

For better security, it was suggested there to set the variable to the following:

```
OLLAMA_ORIGINS=chrome-extension://* ollama serve
```

Note that on Windows you need to log out and log back in, or restart the computer, for the environment variable change to take effect. If you want to try it right away, you can instead kill Ollama, then open a command-line terminal and write:

```
set OLLAMA_ORIGINS=chrome-extension://*
ollama serve
```

This will temporarily allow cross-origin access, but it won't persist. If you want a permanent environment change, use instead:

```
setx OLLAMA_ORIGINS *
```

Of course, you need to have downloaded a model in Ollama beforehand (e.g., `ollama run phi3.5`).
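
To confirm the origin is actually allowed after restarting, one way is to simulate a browser preflight request with curl; the extension origin below is hypothetical:

```
# Simulate the CORS preflight a browser extension would send
curl -i -X OPTIONS http://localhost:11434/v1/chat/completions \
  -H "Origin: chrome-extension://abcdefghijklmnop" \
  -H "Access-Control-Request-Method: POST"
```

Look for an `Access-Control-Allow-Origin` header in the response; if it is missing, the variable has not taken effect yet.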
