Replies: 2 comments 1 reply
-
Hi @Nisim1! I'm here to help you with any bugs or questions you have. Let's squash those bugs together! To resolve the "Embedding model client not properly initialized" error when using HuggingFace Embeddings Inference in Docker, you can follow these steps:
These steps should help in diagnosing and resolving the issue with the HuggingFace Embeddings Inference component in Docker [1].
-
Hey, what's your Langflow version, and which browser and OS are you using?
-
Hi community!
I am trying to use HuggingFace Embeddings Inference.
My goal is to use open-source embedding models to perform text embedding.
I am getting an error that says:
"Embedding model client not properly initialized."
When I try to run the same model via Postman, it works! (An equivalent request is sketched below.)
How can this be resolved?
(Langflow is running in Docker, in case that matters.)
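For reference, here is a minimal sketch of the kind of request that works via Postman, assuming the embeddings are served by a HuggingFace Text Embeddings Inference (TEI) server; the host, port, and `/embed` path are placeholders, not the actual endpoint from this setup:

```python
# Minimal sketch of the request Postman would send, assuming a HuggingFace
# Text Embeddings Inference (TEI) server. The URL, port, and /embed path are
# assumptions -- replace them with the actual endpoint of your deployment.
import requests

TEI_URL = "http://localhost:8080/embed"  # hypothetical endpoint; adjust to your setup

response = requests.post(
    TEI_URL,
    json={"inputs": ["Hello world", "Open-source embedding models"]},
    timeout=30,
)
response.raise_for_status()

embeddings = response.json()  # TEI returns one embedding vector per input string
print(len(embeddings), "vectors of dimension", len(embeddings[0]))
```

One hedged note on the Docker angle: if a request like this succeeds from the host machine (e.g., in Postman) but Langflow itself runs in a container, `localhost` inside that container refers to the container, not the host, so the component may need a URL such as `http://host.docker.internal:8080` or the inference service's name on a shared Docker network. This is an assumption about the setup, not a confirmed diagnosis.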
Thanks in Advance!