Deployed here: https://langtail-weather-chat.vercel.app/
This example demonstrates how to create a chat application using the Langtail library within a Next.js framework. It covers making requests to the Langtail API, handling responses on both the client and server sides, managing tool calls, and streaming the AI's responses to the client's UI in React.
```bash
git clone git@github.com:langtail/examples.git
cd examples/langtail-weather-chat
```
Set your Langtail API key, OpenAI API key, and OpenWeather API key:
```bash
export OPENAI_API_KEY=""
export LANGTAIL_API_KEY=""
export OPEN_WEATHER_API_KEY=""
```
Alternatively, you can set these in the `.env.example` file and rename it to `.env`.
Install the dependencies and start the development server:

```bash
npm install
npm run dev
```
Navigate to http://localhost:3000.
You can deploy this project to Vercel or any other platform that supports Next.js. Our instance runs here: https://langtail-weather-chat.vercel.app/.
This project showcases how to create a chat application using Langtail, including streaming AI responses and handling tool calls.
- Langtail AI Endpoint: A simple endpoint that forwards requests to the Langtail API.
- Weather Endpoint: A proxy endpoint that forwards requests to the OpenWeather API.
The AI and weather endpoints are called from the backend (Node.js) to ensure that secrets are not exposed to the browser.
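As a rough illustration of the proxy pattern described above, here is a minimal sketch of a server-side weather route in the Next.js App Router style. The file path (`app/api/weather/route.ts`), the helper name `buildWeatherUrl`, and the query-parameter choices are assumptions for illustration, not the repo's actual code; only the OpenWeather "current weather" endpoint and its `q`/`appid` parameters come from the public OpenWeather API.

```typescript
// Sketch of a hypothetical server-side proxy route (assumed path:
// app/api/weather/route.ts). The OpenWeather key is read from
// process.env on the server, so it never reaches the browser.

// Pure helper: build the OpenWeather request URL for a city.
export function buildWeatherUrl(city: string, apiKey: string): string {
  const params = new URLSearchParams({
    q: city,          // city name, per the OpenWeather current-weather API
    units: "metric",  // assumed unit choice
    appid: apiKey,    // the secret stays server-side
  });
  return `https://api.openweathermap.org/data/2.5/weather?${params}`;
}

// Route handler: the browser calls /api/weather?city=..., the server
// forwards the request upstream and relays the response body.
export async function GET(request: Request): Promise<Response> {
  const city = new URL(request.url).searchParams.get("city");
  if (!city) {
    return new Response(JSON.stringify({ error: "city is required" }), {
      status: 400,
      headers: { "content-type": "application/json" },
    });
  }
  const url = buildWeatherUrl(city, process.env.OPEN_WEATHER_API_KEY ?? "");
  const upstream = await fetch(url);
  return new Response(await upstream.text(), { status: upstream.status });
}
```

The same shape works for the Langtail AI endpoint: swap the upstream URL and pass `LANGTAIL_API_KEY` in an `Authorization` header instead of a query parameter.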
- Chat Page: Renders the chat interface, handles AI tools and function calls, and calls the weather endpoint to obtain weather data based on AI requests.
- Chat Component: Manages the AI fetch streams and renders them to the UI through the `messages` state. Calls the `functionHandler` callback in the chat page based on AI responses.
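When tool calls arrive over a stream, their function arguments are typically delivered as partial JSON fragments spread across many chunks (OpenAI-style deltas), and the client must stitch them back together before invoking a handler like the `functionHandler` mentioned above. The following is a minimal sketch of that accumulation step; the type names and the `accumulateToolCalls` function are illustrative, not the actual API of the Langtail library.

```typescript
// Hypothetical sketch: reassemble streamed tool-call deltas (OpenAI
// chat-completions chunk format) into complete tool calls.

interface ToolCallDelta {
  index: number; // which tool call this fragment belongs to
  id?: string;   // present only on the first fragment of a call
  function?: { name?: string; arguments?: string };
}

interface ToolCall {
  id: string;
  name: string;
  arguments: string; // JSON string; parse it once the stream finishes
}

export function accumulateToolCalls(deltas: ToolCallDelta[]): ToolCall[] {
  const calls: ToolCall[] = [];
  for (const d of deltas) {
    // Create the slot for this call index on first sight.
    const call = (calls[d.index] ??= { id: "", name: "", arguments: "" });
    if (d.id) call.id = d.id;
    // Name and arguments arrive as string fragments: concatenate them.
    if (d.function?.name) call.name += d.function.name;
    if (d.function?.arguments) call.arguments += d.function.arguments;
  }
  return calls;
}
```

Once the stream ends, the chat page can `JSON.parse` each call's `arguments`, fetch the weather through the backend endpoint, and feed the result back to the model as a tool message.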
- Basic Chat Example: https://langtail-weather-chat.vercel.app/