Add support for RunnableRouter (#386)
davidmigloz authored Apr 22, 2024
1 parent d5e6b4d commit 827e262
Showing 26 changed files with 997 additions and 74 deletions.
2 changes: 1 addition & 1 deletion analysis_options.yaml
@@ -116,7 +116,7 @@ linter:
- prefer_final_fields
- prefer_final_in_for_each
- prefer_final_locals
- - prefer_final_parameters
+ # - prefer_final_parameters # adds too much verbosity
- prefer_for_elements_to_map_fromIterable
- prefer_foreach
- prefer_function_declarations_over_variables
1 change: 1 addition & 0 deletions docs/_sidebar.md
@@ -8,6 +8,7 @@
- Cookbook
- [Prompt + LLM](/expression_language/cookbook/prompt_llm_parser.md)
- [Multiple chains](/expression_language/cookbook/multiple_chains.md)
- [Route logic based on input](/expression_language/cookbook/routing.md)
- [Adding memory](/expression_language/cookbook/adding_memory.md)
- [Retrieval](/expression_language/cookbook/retrieval.md)
- [Using Tools](/expression_language/cookbook/tools.md)
163 changes: 163 additions & 0 deletions docs/expression_language/cookbook/routing.md
@@ -0,0 +1,163 @@
# Dynamically route logic based on input

This guide covers how to do routing with the LangChain Expression Language.

Routing allows you to create non-deterministic chains where the output of a previous step defines the next step. Routing helps provide structure and consistency around interactions with LLMs.

## Using RunnableRouter

We’ll illustrate how to perform routing using a two-step sequence: the first step classifies an input question as being about `LangChain`, `Anthropic`, or `Other`; the second routes it to a corresponding prompt chain.

First, let’s create a chain that will identify incoming questions as being about `LangChain`, `Anthropic`, or `Other`:

```dart
final chatModel = ChatOllama(
  defaultOptions: const ChatOllamaOptions(model: 'llama3'),
);

final classificationChain = PromptTemplate.fromTemplate('''
Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.
Do not respond with more than one word.
<question>
{question}
</question>
Classification:
''') | chatModel | StringOutputParser();

final res1 = await classificationChain.invoke({
  'question': 'how do I call Anthropic?',
});
print(res1);
// Anthropic
```

Now, let’s create three sub-chains:

```dart
final langchainChain = PromptTemplate.fromTemplate('''
You are an expert in langchain.
Always answer questions starting with "As Harrison Chase told me".
Respond to the following question:
Question: {question}
Answer:
''') | chatModel | StringOutputParser();

final anthropicChain = PromptTemplate.fromTemplate('''
You are an expert in anthropic.
Always answer questions starting with "As Dario Amodei told me".
Respond to the following question:
Question: {question}
Answer:
''') | chatModel | StringOutputParser();

final generalChain = PromptTemplate.fromTemplate('''
Respond to the following question:
Question: {question}
Answer:
''') | chatModel | StringOutputParser();
```

`RunnableRouter` is a type of runnable that uses a function to route its input to a specific `Runnable`. You can use `Runnable.fromRouter` to create a `RunnableRouter`.

In this example, we will return one of the three chains we defined earlier based on the topic returned by the classification chain.

```dart
final router = Runnable.fromRouter((Map<String, dynamic> input, _) {
  final topic = (input['topic'] as String).toLowerCase();
  if (topic.contains('langchain')) {
    return langchainChain;
  } else if (topic.contains('anthropic')) {
    return anthropicChain;
  } else {
    return generalChain;
  }
});

final fullChain = Runnable.fromMap({
  'topic': classificationChain,
  'question': Runnable.getItemFromMap('question'),
}) | router;

final res2 = await fullChain.invoke({
  'question': 'how do I use Anthropic?',
});
print(res2);
// As Dario Amodei told me, using Anthropic is a straightforward process that...

final res3 = await fullChain.invoke({
  'question': 'how do I use LangChain?',
});
print(res3);
// As Harrison Chase told me, using LangChain is a breeze!

final res4 = await fullChain.invoke({
  'question': 'whats 2 + 2',
});
print(res4);
// The answer is... 4!
```

## Routing by semantic similarity

One especially useful technique is to use embeddings to route a query to the most relevant prompt.

Here’s an example where we have two specialized prompts, one for physics and one for history. We will use embeddings to determine which prompt is best suited to answer a given question.

```dart
const physicsTemplate = '''
You are a very smart physicist.
You are great at answering questions about physics (e.g. black holes, quantum mechanics, etc.)
in a concise and easy to understand manner.
When you don't know the answer to a question you admit that you don't know.
Here is a question:
{query}
''';

const historyTemplate = '''
You are a very good historian.
You are great at answering history questions (e.g. about the Roman Empire, World War II, etc.)
in a detailed and engaging manner.
You are able to provide a lot of context and background information.
Here is a question:
{query}
''';

final embeddings = OllamaEmbeddings(model: 'llama3');
final promptTemplates = [physicsTemplate, historyTemplate];
final promptEmbeddings = await embeddings.embedDocuments(
  promptTemplates.map((final pt) => Document(pageContent: pt)).toList(),
);

final chain = Runnable.fromMap<String>({'query': Runnable.passthrough()}) |
    Runnable.fromRouter((input, _) async {
      final query = input['query'] as String;
      final queryEmbedding = await embeddings.embedQuery(query);
      final mostSimilarIndex =
          getIndexesMostSimilarEmbeddings(queryEmbedding, promptEmbeddings).first;
      print('Using ${mostSimilarIndex == 0 ? 'Physicist' : 'Historian'}');
      return PromptTemplate.fromTemplate(promptTemplates[mostSimilarIndex]);
    }) |
    ChatOllama(
      defaultOptions: const ChatOllamaOptions(model: 'llama3'),
    ) |
    StringOutputParser();

final res1 = await chain.invoke("What's a black hole?");
print(res1);
// Using Physicist
// Black holes! One of my favorite topics!
// A black hole is a region in space where the gravitational pull is so strong...

final res2 = await chain.invoke('When did World War II end?');
print(res2);
// Using Historian
// A great question to start with! World War II ended on September 2, 1945...
```
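Under the hood, `getIndexesMostSimilarEmbeddings` ranks the prompt embeddings by cosine similarity to the query embedding and returns the indexes in descending order of similarity. A minimal standalone sketch of that idea (the function name and shape here are illustrative, not the library implementation):

```dart
import 'dart:math';

/// Returns prompt indexes sorted from most to least similar to [query],
/// using cosine similarity between embedding vectors.
List<int> rankBySimilarity(List<double> query, List<List<double>> embeddings) {
  double dot(List<double> a, List<double> b) {
    var sum = 0.0;
    for (var i = 0; i < a.length; i++) {
      sum += a[i] * b[i];
    }
    return sum;
  }

  double norm(List<double> v) => sqrt(dot(v, v));

  double cosine(List<double> a, List<double> b) =>
      dot(a, b) / (norm(a) * norm(b));

  final indexes = List<int>.generate(embeddings.length, (i) => i);
  indexes.sort(
    (a, b) =>
        cosine(query, embeddings[b]).compareTo(cosine(query, embeddings[a])),
  );
  return indexes;
}

void main() {
  // Toy 2-D embeddings: the query points mostly along the first axis.
  final query = [0.9, 0.1];
  final prompts = [
    [1.0, 0.0], // index 0: near-identical direction
    [0.0, 1.0], // index 1: orthogonal
  ];
  print(rankBySimilarity(query, prompts)); // [0, 1]
}
```

Real embedding vectors have hundreds or thousands of dimensions, but the ranking logic is the same: the first index is the prompt whose embedding points in the direction closest to the query's.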
31 changes: 31 additions & 0 deletions docs/expression_language/interface.md
@@ -23,6 +23,7 @@ The type of the input and output varies by component:
| `RunnableMap` | Runnable input type | `Map<String, dynamic>` |
| `RunnableBinding` | Runnable input type | Runnable output type |
| `RunnableFunction` | Runnable input type | Runnable output type |
| `RunnableRouter` | Runnable input type | Runnable output type |
| `RunnablePassthrough` | Runnable input type | Runnable input type |
| `RunnableItemFromMap` | `Map<String, dynamic>` | Runnable output type |
| `RunnableMapFromInput` | Runnable input type | `Map<String, dynamic>` |
@@ -285,6 +286,36 @@ print(res);
// 3 + 9 = 12
```

### RunnableRouter

A `RunnableRouter` takes the input it receives and routes it to the runnable returned by its `router` function.

You can create a `RunnableRouter` using the `Runnable.fromRouter` static method.

When you call `invoke` on a `RunnableRouter`, it passes the input to the `router` function and returns the output of the runnable that the function selects.

Example:
```dart
final router = Runnable.fromRouter((Map<String, dynamic> input, _) {
  return switch (input['topic'] as String) {
    'langchain' => langchainChain,
    'anthropic' => anthropicChain,
    _ => generalChain,
  };
});

final fullChain = Runnable.fromMap({
  'topic': classificationChain,
  'question': Runnable.getItemFromMap('question'),
}).pipe(router);

final res2 = await fullChain.invoke({
  'question': 'how do I use Anthropic?',
});
print(res2);
// As Dario Amodei told me, using Anthropic is a straightforward process that...
```

Check the [Routing guide](cookbook/routing.md) for more information.

### RunnablePassthrough

A `RunnablePassthrough` takes the input it receives and passes it through as output.
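
A minimal sketch (assuming the `Runnable.fromMap` and `Runnable.mapInput` helpers shown elsewhere in this interface; the keys are illustrative):

```dart
final map = Runnable.fromMap<String>({
  // Forwards the input unchanged.
  'original': Runnable.passthrough(),
  // Transforms the same input for comparison.
  'uppercase': Runnable.mapInput((final String input) => input.toUpperCase()),
});
final res = await map.invoke('hello');
print(res);
// {original: hello, uppercase: HELLO}
```

This is most useful inside a `RunnableMap`, where one branch should carry the raw input forward while other branches compute derived values.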
10 changes: 5 additions & 5 deletions examples/browser_summarizer/pubspec.lock
@@ -225,28 +225,28 @@ packages:
path: "../../packages/langchain"
relative: true
source: path
- version: "0.4.2"
+ version: "0.5.0+1"
langchain_community:
dependency: "direct main"
description:
path: "../../packages/langchain_community"
relative: true
source: path
- version: "0.0.1-dev.2"
+ version: "0.1.0"
langchain_core:
dependency: "direct overridden"
description:
path: "../../packages/langchain_core"
relative: true
source: path
- version: "0.0.1-dev.2"
+ version: "0.1.0"
langchain_openai:
dependency: "direct main"
description:
path: "../../packages/langchain_openai"
relative: true
source: path
- version: "0.4.1"
+ version: "0.5.0+1"
langchain_tiktoken:
dependency: transitive
description:
@@ -309,7 +309,7 @@ packages:
path: "../../packages/openai_dart"
relative: true
source: path
- version: "0.1.7"
+ version: "0.2.1"
path:
dependency: transitive
description:
