From 0555a2d44288d6ff2a2fb95a015b18a799e7845a Mon Sep 17 00:00:00 2001
From: Stainless Bot
Date: Tue, 19 Mar 2024 14:13:34 +0000
Subject: [PATCH] docs: assistant improvements

---
 README.md  | 31 +++++++++++++++++++++++++++++++
 helpers.md | 37 ++++++++++++++++++++++++++++++-------
 2 files changed, 61 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index 68d337e81..1eca06c85 100644
--- a/README.md
+++ b/README.md
@@ -100,6 +100,37 @@ Documentation for each method, request param, and response field are available i
 > [!IMPORTANT]
 > Previous versions of this SDK used a `Configuration` class. See the [v3 to v4 migration guide](https://github.com/openai/openai-node/discussions/217).
 
+### Streaming Helpers
+
+The SDK also includes helpers to process streams and handle the incoming events.
+
+```ts
+const run = openai.beta.threads.runs
+  .createAndStream(thread.id, {
+    assistant_id: assistant.id,
+  })
+  .on('textCreated', (text) => process.stdout.write('\nassistant > '))
+  .on('textDelta', (textDelta, snapshot) => process.stdout.write(textDelta.value))
+  .on('toolCallCreated', (toolCall) => process.stdout.write(`\nassistant > ${toolCall.type}\n\n`))
+  .on('toolCallDelta', (toolCallDelta, snapshot) => {
+    if (toolCallDelta.type === 'code_interpreter') {
+      if (toolCallDelta.code_interpreter.input) {
+        process.stdout.write(toolCallDelta.code_interpreter.input);
+      }
+      if (toolCallDelta.code_interpreter.outputs) {
+        process.stdout.write('\noutput >\n');
+        toolCallDelta.code_interpreter.outputs.forEach((output) => {
+          if (output.type === 'logs') {
+            process.stdout.write(`\n${output.logs}\n`);
+          }
+        });
+      }
+    }
+  });
+```
+
+More information on streaming helpers can be found in the dedicated documentation: [helpers.md](helpers.md)
+
 ### Streaming responses
 
 This library provides several conveniences for streaming chat completions, for example:
diff --git a/helpers.md b/helpers.md
index 9f01a126a..9a94a618e 100644
--- a/helpers.md
+++ b/helpers.md
@@ -36,6 +36,29 @@ const run = openai.beta.threads.runs
   });
 ```
 
+### Starting a stream
+
+There are three helper methods for creating streams:
+
+```ts
+openai.beta.threads.runs.createAndStream();
+```
+
+This method can be used to start and stream the response to an existing run with an associated thread
+that is already populated with messages.
+
+```ts
+openai.beta.threads.createAndRunStream();
+```
+
+This method can be used to add a message to a thread, start a run and then stream the response.
+
+```ts
+openai.beta.threads.runs.submitToolOutputsStream();
+```
+
+This method can be used to submit a tool output to a run waiting on the output and start a stream.
+
 ### Assistant Events
 
 The assistant API provides events you can subscribe to for the following events.
@@ -108,25 +131,25 @@ The last event send when a stream ends.
 The assistant streaming object also provides a few methods for convenience:
 
 ```ts
-.currentEvent()
+.currentEvent(): AssistantStreamEvent | undefined
 
-.currentRun()
+.currentRun(): Run | undefined
 
-.currentMessageSnapshot()
+.currentMessageSnapshot(): Message
 
-.currentRunStepSnapshot()
+.currentRunStepSnapshot(): Runs.RunStep
 ```
 
 These methods are provided to allow you to access additional context from within event handlers. In many cases
 the handlers should include all the information you need for processing, but if additional context is required it
 can be accessed.
 
-Note: There is not always a relevant context in certain situations (these will be undefined in those cases).
+Note: There is not always a relevant context in certain situations (these will be `undefined` in those cases).
 
 ```ts
-await.finalMessages();
+await .finalMessages(): Promise<Message[]>;
 
-await.finalRunSteps();
+await .finalRunSteps(): Promise<RunStep[]>;
 ```
 
 These methods are provided for convenience to collect information at the end of a stream. Calling these events
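Taken together, the helpers documented in this patch can be combined roughly as in the sketch below. It is only an illustration, not part of the patch: the `asst_123` ID and the prompt are placeholders, and the sketch uses only the methods named above (`createAndRunStream`, the `text*` events, and `finalMessages`).

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  // Start a run on a brand-new thread and stream the response.
  const stream = openai.beta.threads
    .createAndRunStream({
      assistant_id: 'asst_123', // hypothetical assistant ID
      thread: { messages: [{ role: 'user', content: 'Give me a one-line summary of streaming helpers.' }] },
    })
    // Print text to stdout as it arrives.
    .on('textCreated', () => process.stdout.write('\nassistant > '))
    .on('textDelta', (delta) => process.stdout.write(delta.value ?? ''));

  // Once the stream ends, collect the completed messages.
  const messages = await stream.finalMessages();
  console.log(`\nRun produced ${messages.length} message(s).`);
}

main();
```

Error handling and the tool-call events are omitted here for brevity.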