feat!: Migrate from function calling to tool calling (#400)
davidmigloz authored May 5, 2024
1 parent 2a50aec commit 44413b8
Showing 63 changed files with 2,358 additions and 1,416 deletions.
9 changes: 5 additions & 4 deletions docs/_sidebar.md
@@ -51,10 +51,10 @@
- [Ollama](/modules/model_io/models/llms/integrations/ollama.md)
- [Chat Models](/modules/model_io/models/chat_models/chat_models.md)
- How-to
- [Caching](/modules/model_io/models/chat_models/how_to/caching.md)
- [LLMChain](/modules/model_io/models/chat_models/how_to/llm_chain.md)
- [Prompts](/modules/model_io/models/chat_models/how_to/prompts.md)
- [Streaming](/modules/model_io/models/chat_models/how_to/streaming.md)
- [Tool calling](/modules/model_io/models/chat_models/how_to/tools.md)
- [LLMChain](/modules/model_io/models/chat_models/how_to/llm_chain.md)
- Integrations
- [OpenAI](/modules/model_io/models/chat_models/integrations/openai.md)
- [GCP Vertex AI](/modules/model_io/models/chat_models/integrations/gcp_vertex_ai.md)
@@ -67,7 +67,8 @@
- [Prem App](/modules/model_io/models/chat_models/integrations/prem.md)
- [Output parsers](/modules/model_io/output_parsers/output_parsers.md)
- [String output parser](/modules/model_io/output_parsers/string.md)
- [Functions output parsers](/modules/model_io/output_parsers/functions.md)
- [JSON output parser](/modules/model_io/output_parsers/json.md)
- [Tools output parser](/modules/model_io/output_parsers/tools.md)
- [Retrieval](/modules/retrieval/retrieval.md)
- [Document loaders](/modules/retrieval/document_loaders/document_loaders.md)
- How-to
@@ -116,7 +117,7 @@
- [Memory](/modules/memory/memory.md)
- [Agents](/modules/agents/agents.md)
- [Agent types](/modules/agents/agent_types/agent_types.md)
- [OpenAI functions](/modules/agents/agent_types/openai_functions_agent.md)
- [OpenAI tools](/modules/agents/agent_types/openai_tools_agent.md)
- [Tools](/modules/agents/tools/tools.md)
- [Calculator](/modules/agents/tools/calculator.md)
- [DALL-E Image Generator](/modules/agents/tools/openai_dall_e.md)
145 changes: 80 additions & 65 deletions docs/expression_language/cookbook/prompt_llm_parser.md
@@ -27,15 +27,22 @@ final chain = promptTemplate | model;
final res = await chain.invoke({'foo': 'bears'});
print(res);
// ChatResult{
// generations: [
// ChatGeneration{
// output: AIChatMessage{
// content: Why don't bears wear shoes?\n\nBecause they have bear feet!,
// },
// },
// ],
// usage: ...,
// modelOutput: ...,
// id: chatcmpl-9LBNiPXHzWIwc02rR6sS1HTcL9pOk,
// output: AIChatMessage{
// content: Why don't bears wear shoes?\nBecause they have bear feet!,
// },
// finishReason: FinishReason.stop,
// metadata: {
// model: gpt-3.5-turbo-0125,
// created: 1714835666,
// system_fingerprint: fp_3b956da36b
// },
// usage: LanguageModelUsage{
// promptTokens: 13,
// responseTokens: 13,
// totalTokens: 26,
// },
// streaming: false
// }
```

@@ -61,19 +68,26 @@ final chain = promptTemplate | model.bind(ChatOpenAIOptions(stop: ['\n']));
final res = await chain.invoke({'foo': 'bears'});
print(res);
// ChatResult{
// generations: [
// ChatGeneration{
// output: AIChatMessage{
// content: Why don't bears wear shoes?,
// },
// },
// ],
// usage: ...,
// modelOutput: ...,
// id: chatcmpl-9LBOohTtdg12zD8zzz2GX1ib24UXO,
// output: AIChatMessage{
// content: Why don't bears wear shoes? ,
// },
// finishReason: FinishReason.stop,
// metadata: {
// model: gpt-3.5-turbo-0125,
// created: 1714835734,
// system_fingerprint: fp_a450710239
// },
// usage: LanguageModelUsage{
// promptTokens: 13,
// responseTokens: 8,
// totalTokens: 21
// },
// streaming: false
// }
```

### Attaching Function Call information
### Attaching Tool Call information

```dart
final openaiApiKey = Platform.environment['OPENAI_API_KEY'];
@@ -83,10 +97,10 @@ final promptTemplate = ChatPromptTemplate.fromTemplate(
'Tell me a joke about {foo}',
);
const function = ChatFunction(
const tool = ToolSpec(
name: 'joke',
description: 'A joke',
parameters: {
inputJsonSchema: {
'type': 'object',
'properties': {
'setup': {
@@ -105,30 +119,41 @@ const function = ChatFunction(
final chain = promptTemplate |
model.bind(
ChatOpenAIOptions(
functions: const [function],
functionCall: ChatFunctionCall.forced(functionName: function.name),
tools: const [tool],
toolChoice: ChatToolChoice.forced(name: tool.name),
),
);
final res = await chain.invoke({'foo': 'bears'});
print(res);
// ChatResult{
// generations: [
// ChatGeneration{
// output: AIChatMessage{
// content: ,
// functionCall: AIChatMessageFunctionCall{
// name: joke,
// arguments: {
// setup: Why don't bears wear shoes?,
// punchline: Because they already have bear feet!
// },
// },
// },
// },
// ],
// usage: ...,
// modelOutput: ...,
// id: chatcmpl-9LBPyaZcFMgjmOvkD0JJKAyA4Cihb,
// output: AIChatMessage{
// content: ,
// toolCalls: [
// AIChatMessageToolCall{
// id: call_JIhyfu6jdIXaDHfYzbBwCKdb,
// name: joke,
// argumentsRaw: {"setup":"Why don't bears like fast food?","punchline":"Because they can't catch it!"},
// arguments: {
// setup: Why don't bears like fast food?,
// punchline: Because they can't catch it!
// },
// }
// ],
// },
// finishReason: FinishReason.stop,
// metadata: {
// model: gpt-3.5-turbo-0125,
// created: 1714835806,
// system_fingerprint: fp_3b956da36b
// },
// usage: LanguageModelUsage{
// promptTokens: 77,
// responseTokens: 24,
// totalTokens: 101
// },
// streaming: false
// }
```

@@ -157,9 +182,9 @@ print(res);

Notice that this now returns a string - a much more workable format for downstream tasks.
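
For reference, a minimal sketch of such a chain (assuming the `promptTemplate` and `model` defined earlier on this page; the printed joke is illustrative):

```dart
// Piping the chat model through StringOutputParser converts the ChatResult
// into a plain String (sketch; the exact joke will vary).
final chain = promptTemplate | model | StringOutputParser();
final res = await chain.invoke({'foo': 'bears'});
print(res);
// Why don't bears wear shoes?
// Because they have bear feet!
```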

### Functions Output Parser
### Tools Output Parser

When you specify the function to return, you may just want to parse that directly.
When you specify a tool that the model should call, you may just want to parse the tool call directly.

```dart
final openaiApiKey = Platform.environment['OPENAI_API_KEY'];
@@ -169,10 +194,10 @@ final promptTemplate = ChatPromptTemplate.fromTemplate(
'Tell me a joke about {foo}',
);
const function = ChatFunction(
const tool = ToolSpec(
name: 'joke',
description: 'A joke',
parameters: {
inputJsonSchema: {
'type': 'object',
'properties': {
'setup': {
@@ -191,32 +216,22 @@ const function = ChatFunction(
final chain = promptTemplate |
model.bind(
ChatOpenAIOptions(
functions: const [function],
functionCall: ChatFunctionCall.forced(functionName: function.name),
tools: const [tool],
toolChoice: ChatToolChoice.forced(name: tool.name),
),
) |
JsonOutputFunctionsParser();
ToolsOutputParser();
final res = await chain.invoke({'foo': 'bears'});
print(res);
// {setup: Why don't bears wear shoes?, punchline: Because they have bear feet!}
```

Or we can even extract one of the arguments from the function call using the `JsonKeyOutputFunctionsParser`:

```dart
final chain = promptTemplate |
model.bind(
ChatOpenAIOptions(
functions: const [function],
functionCall: ChatFunctionCall.forced(functionName: function.name),
),
) |
JsonKeyOutputFunctionsParser(keyName: 'setup');
final res = await chain.invoke({'foo': 'bears'});
print(res);
// Why don't bears wear socks?
// [ParsedToolCall{
// id: call_tDYrlcVwk7bCi9oh5IuknwHu,
// name: joke,
// arguments: {
// setup: What do you call a bear with no teeth?,
// punchline: A gummy bear!
// },
// }]
```
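
Since `ToolsOutputParser` returns a list of parsed tool calls, you can also pull a single argument straight out of the result. A hedged sketch based on the `arguments` map printed above (the joke itself is illustrative):

```dart
// Sketch: extract one field from the first parsed tool call,
// reusing the chain defined above.
final calls = await chain.invoke({'foo': 'bears'});
final setup = calls.first.arguments['setup'];
print(setup);
// What do you call a bear with no teeth?
```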

## Simplifying input
@@ -232,7 +247,7 @@ final chain = map | promptTemplate | model | StringOutputParser();

*`Runnable.passthrough()` is a convenience method that creates a `RunnablePassthrough` object. This is a `Runnable` that takes the input it receives and passes it through as output.*
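
As a rough illustration of how the `map` step above could be built (a hedged sketch, assuming the `Runnable.fromMap` primitive; names are illustrative):

```dart
// Hedged sketch: the input string is passed through unchanged and exposed
// under the 'foo' key that the prompt template expects.
final map = Runnable.fromMap<String>({
  'foo': Runnable.passthrough(),
});
final chain = map | promptTemplate | model | StringOutputParser();
final res = await chain.invoke('bears');
```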

However, this is a bit verbose. We can simplify it by using `Runnable.getItemFromMap` which does the same under the hood:
However, this is a bit verbose. We can simplify it by using `Runnable.getMapFromInput` which does the same under the hood:

```dart
final chain = Runnable.getMapFromInput('foo') |
21 changes: 14 additions & 7 deletions docs/expression_language/primitives/binding.md
@@ -69,23 +69,23 @@ print(res);

Another similar use case is to use different `temperature` settings for different parts of the chain. You can easily do this by using `model.bind(ChatOpenAIOptions(temperature: 1))` as shown above.
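
For example, a hedged sketch (reusing the `model` and prompt from the example above; only the bound copy of the model is affected):

```dart
// Only this bound copy of the model uses temperature 1;
// the original `model` instance keeps its default options.
final creativeModel = model.bind(ChatOpenAIOptions(temperature: 1));
final chain = promptTemplate | creativeModel | StringOutputParser();
```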

## Attaching functions
## Attaching tools

One particularly useful application of `Runnable.bind()` is to attach the functions that the model can call.
One particularly useful application of `Runnable.bind()` is to attach the tools that the model can call.

```dart
final model = ChatOpenAI(apiKey: openaiApiKey);
final outputParser = JsonOutputFunctionsParser();
final outputParser = ToolsOutputParser();
final promptTemplate = ChatPromptTemplate.fromTemplates([
(ChatMessageType.system, 'Write out the following equation using algebraic symbols then solve it.'),
(ChatMessageType.human, '{equation_statement}'),
]);
const function = ChatFunction(
const tool = ToolSpec(
name: 'solver',
description: 'Formulates and solves an equation',
parameters: {
inputJsonSchema: {
'type': 'object',
'properties': {
'equation': {
@@ -103,10 +103,17 @@ const function = ChatFunction(
final chain = Runnable.getMapFromInput<String>('equation_statement')
.pipe(promptTemplate)
.pipe(model.bind(ChatOpenAIOptions(functions: [function])))
.pipe(model.bind(ChatOpenAIOptions(tools: [tool])))
.pipe(outputParser);
final res = await chain.invoke('x raised to the third plus seven equals 12');
print(res);
// {equation: x^3 + 7 = 12, solution: x = 1}
// [ParsedToolCall{
// id: call_T2Y3g7rU5s0CzEG4nL35FJYK,
// name: solver,
// arguments: {
// equation: x^3 + 7 = 12,
// solution: x = 1
// },
// }]
```
@@ -1,16 +1,16 @@
# OpenAI functions
# OpenAI tools

Certain OpenAI models (like `gpt-3.5-turbo` and `gpt-4`) have been
fine-tuned to detect when a function should be called and respond with the
inputs that should be passed to the function. In an API call, you can describe
functions and have the model intelligently choose to output a JSON object
containing arguments to call those functions. The goal of the OpenAI Function
APIs is to more reliably return valid and useful function calls than a generic
fine-tuned to detect when a tool should be called and respond with the
inputs that should be passed to the tool. In an API call, you can describe
tools and have the model intelligently choose to output a JSON object
containing arguments to call those tools. The goal of the OpenAI tools
API is to more reliably return valid and useful tool calls than a generic
text completion or chat API.

The OpenAI Functions Agent is designed to work with these models.
The OpenAI Tools Agent is designed to work with these models.
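
For illustration, a tool is described to the model as a name, a description, and a JSON schema for its inputs. A hedged sketch using the `ToolSpec` class shown elsewhere in these docs (the tool itself is hypothetical):

```dart
// Hedged sketch: how a tool and its input schema might be described.
const searchTool = ToolSpec(
  name: 'search',
  description: 'Searches the web for recent results',
  inputJsonSchema: {
    'type': 'object',
    'properties': {
      'query': {
        'type': 'string',
        'description': 'The search query',
      },
    },
    'required': ['query'],
  },
);
```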

> **Note**: Must be used with an [OpenAI Functions](https://platform.openai.com/docs/guides/gpt/function-calling) model.
> **Note**: Must be used with an [OpenAI Tools](https://platform.openai.com/docs/guides/function-calling) model.
```dart
final llm = ChatOpenAI(
@@ -21,7 +21,7 @@ final llm = ChatOpenAI(
),
);
final tool = CalculatorTool();
final agent = OpenAIFunctionsAgent.fromLLMAndTools(llm: llm, tools: [tool]);
final agent = OpenAIToolsAgent.fromLLMAndTools(llm: llm, tools: [tool]);
final executor = AgentExecutor(agent: agent);
final res = await executor.run('What is 40 raised to the 0.43 power? ');
print(res); // -> '40 raised to the power of 0.43 is approximately 4.8852'
@@ -94,7 +94,7 @@ final llm = ChatOpenAI(
);
final memory = ConversationBufferMemory(returnMessages: true);
final agent = OpenAIFunctionsAgent.fromLLMAndTools(
final agent = OpenAIToolsAgent.fromLLMAndTools(
llm: llm,
tools: [tool],
memory: memory,
@@ -131,19 +131,19 @@ final model = ChatOpenAI(
apiKey: openaiApiKey,
defaultOptions: ChatOpenAIOptions(
temperature: 0,
functions: [tool.toChatFunction()],
tools: [tool],
),
);
const outputParser = OpenAIFunctionsAgentOutputParser();
const outputParser = OpenAIToolsAgentOutputParser();
List<ChatMessage> buildScratchpad(final List<AgentStep> intermediateSteps) {
return intermediateSteps
.map((final s) {
return s.action.messageLog +
[
ChatMessage.function(
name: s.action.tool,
ChatMessage.tool(
toolCallId: s.action.id,
content: s.observation,
),
];
2 changes: 1 addition & 1 deletion docs/modules/agents/agents.md
@@ -91,7 +91,7 @@ Finally, let's initialize an agent with the tools, the language model, and the
type of agent we want to use.

```dart
final agent = OpenAIFunctionsAgent.fromLLMAndTools(llm: llm, tools: tools);
final agent = OpenAIToolsAgent.fromLLMAndTools(llm: llm, tools: tools);
```

Now let's create the agent executor and test it out!
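
This step typically mirrors the executor usage shown in the other agent examples on this page (a hedged sketch; the query and output are illustrative):

```dart
final executor = AgentExecutor(agent: agent);
final res = await executor.run('What is 40 raised to the 0.43 power?');
print(res);
```
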
2 changes: 1 addition & 1 deletion docs/modules/agents/tools/calculator.md
@@ -14,7 +14,7 @@ final llm = ChatOpenAI(
),
);
final tool = CalculatorTool();
final agent = OpenAIFunctionsAgent.fromLLMAndTools(llm: llm, tools: [tool]);
final agent = OpenAIToolsAgent.fromLLMAndTools(llm: llm, tools: [tool]);
final executor = AgentExecutor(agent: agent);
final res = await executor.run('What is 40 raised to the 0.43 power? ');
print(res); // -> '40 raised to the power of 0.43 is approximately 4.8852'
2 changes: 1 addition & 1 deletion docs/modules/agents/tools/openai_dall_e.md
@@ -18,7 +18,7 @@ final tools = <Tool>[
CalculatorTool(),
OpenAIDallETool(apiKey: openAiKey),
];
final agent = OpenAIFunctionsAgent.fromLLMAndTools(llm: llm, tools: tools);
final agent = OpenAIToolsAgent.fromLLMAndTools(llm: llm, tools: tools);
final executor = AgentExecutor(agent: agent);
final res = await executor.run(
'Calculate the result of 40 raised to the power of 0.43 and generate a funny illustration with it. '
3 changes: 0 additions & 3 deletions docs/modules/model_io/models/chat_models/how_to/caching.md

This file was deleted.
