ChatOpenAI streaming in Next.js 13 route handler #1350
Unanswered · JasperAlexander asked this question in Q&A
How can I return an OpenAI stream from a Next.js 13 route handler? The handleLLM callbacks seem to be called quite randomly and produce a lot of errors.
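A minimal sketch of one way to do this, assuming a recent langchain version where `ChatOpenAI` accepts a `callbacks` array (class names and import paths such as `HumanMessage` vs. `HumanChatMessage` vary between releases): pipe tokens from `handleLLMNewToken` into a `TransformStream` and return its readable side from the route handler.

```ts
// app/api/chat/route.ts
import { NextRequest } from "next/server";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";

export async function POST(req: NextRequest) {
  const { question } = await req.json();

  const encoder = new TextEncoder();
  const stream = new TransformStream();
  const writer = stream.writable.getWriter();

  const model = new ChatOpenAI({
    streaming: true,
    callbacks: [
      {
        // Forward each token into the response stream as it arrives.
        handleLLMNewToken: async (token: string) => {
          await writer.write(encoder.encode(token));
        },
        handleLLMEnd: async () => {
          await writer.close();
        },
        handleLLMError: async (e: Error) => {
          await writer.abort(e);
        },
      },
    ],
  });

  // Start the call without awaiting it, so the Response (and the
  // readable side of the stream) is returned to the client immediately.
  model.call([new HumanMessage(question)]).catch(console.error);

  return new Response(stream.readable, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```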
Replies: 1 comment
It seems that when I call ConversationalRetrievalQAChain, the answer to the CONDENSE_PROMPT is also returned through the callback function. Why is that, and how can I get only the answer to the QA_PROMPT?
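The condense/rephrase step runs through the same LLM, so its tokens hit the same streaming callback as the final answer. One possible workaround, sketched below and assuming a langchain version that supports `questionGeneratorChainOptions` (older releases may not have this option), is to give the chain a separate non-streaming model for question generation and attach the streaming callback only to the model that answers the QA_PROMPT:

```ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationalRetrievalQAChain } from "langchain/chains";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";

// Toy in-memory vector store for the example; use your real store instead.
const vectorStore = await MemoryVectorStore.fromTexts(
  ["LangChain supports token streaming via callbacks."],
  [{}],
  new OpenAIEmbeddings()
);

// Non-streaming model used only to rephrase the follow-up question
// (the CONDENSE_PROMPT step), so none of its output reaches the callback.
const questionGeneratorModel = new ChatOpenAI({ temperature: 0 });

// Streaming model used only for the final QA_PROMPT answer.
const streamingModel = new ChatOpenAI({
  streaming: true,
  callbacks: [
    {
      handleLLMNewToken(token: string) {
        // Only tokens from the QA step arrive here.
        process.stdout.write(token);
      },
    },
  ],
});

const chain = ConversationalRetrievalQAChain.fromLLM(
  streamingModel,
  vectorStore.asRetriever(),
  {
    // Route the condense/rephrase step to the non-streaming model.
    questionGeneratorChainOptions: { llm: questionGeneratorModel },
  }
);

const res = await chain.call({
  question: "How do I stream tokens?",
  chat_history: "",
});
console.log("\nFinal answer:", res.text);
```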