Replies: 1 comment
I tried it out, sharing it with everyone. The snippet was cut off, so here it is filled out (the vectorstore setup was omitted in my original paste; a `MemoryVectorStore` is one option):

```typescript
import { ChatOpenAI } from "langchain/chat_models";
import { RetrievalQAChain } from "langchain/chains";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";

async function v_chat2() {
  const model = new ChatOpenAI();
  /* Create the vectorstore */
  const vectorStore = await MemoryVectorStore.fromTexts(
    ["..."], [{}], new OpenAIEmbeddings()
  );
  const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever());
  return chain.call({ query: "..." });
}
```
For example: reading documents from CSV, JSON, or web loaders, then using ChatOpenAI to answer questions based on the profile I provide.
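To make the loading step concrete, here is a minimal sketch of what a CSV document loader produces: one "document" per row, with column values joined into `pageContent` and the row number kept as metadata. This is an illustrative stand-alone version, not LangChain's actual `CSVLoader`; the `Doc` shape and `loadCsv` name are my own.

```typescript
// Illustrative sketch of a CSV-to-documents loader (not LangChain's CSVLoader).
interface Doc {
  pageContent: string;
  metadata: { line: number };
}

function loadCsv(text: string): Doc[] {
  const [header, ...rows] = text.trim().split("\n");
  const cols = header.split(",");
  return rows.map((row, i) => {
    const values = row.split(",");
    // One document per row: "column: value" pairs, one per line.
    const content = cols.map((c, j) => `${c}: ${values[j]}`).join("\n");
    return { pageContent: content, metadata: { line: i + 1 } };
  });
}

const docs = loadCsv("name,role\nAlice,admin\nBob,user");
console.log(docs.length);          // 2
console.log(docs[0].pageContent);  // "name: Alice\nrole: admin"
```

Each resulting document can then be embedded and put into the vectorstore that the retrieval chain queries.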