Deployed here: https://langtail-weather-chat-vercel-ai.vercel.app/
This example builds on the basic weather chat application, adding streamed responses and type-safe tool handling. The Langtail tools SDK provides type-safe definitions for AI tools and function calls.
Using Vercel's `ai` package, you can stream messages directly and handle the streams elegantly on the frontend.
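Under the hood, the `ai` package sends the browser a line-based data stream in which each part is encoded as `<type>:<json>`, with text deltas under type `0`. The helper below is a simplified illustration of reassembling such a stream, not the SDK's own code (the `useChat` hook does this parsing for you):

```typescript
// Simplified illustration of the line-based data stream that Vercel's
// `ai` package sends to the browser: each part is `<type>:<json>` on its
// own line, and text deltas use type "0". The useChat hook parses this
// for you; this standalone helper just shows what reassembly looks like.
function assembleText(stream: string): string {
  return stream
    .split("\n")
    .filter((line) => line.startsWith("0:"))
    .map((line) => JSON.parse(line.slice(2)) as string)
    .join("");
}

// Example: two streamed text chunks become one message.
console.log(assembleText('0:"It is "\n0:"sunny in Prague."'));
// → "It is sunny in Prague."
```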
```bash
git clone git@github.com:langtail/examples.git
cd examples/langtail-weather-chat-vercel-ai
```
Set your Langtail API key, OpenAI API key, and OpenWeather API key:
```bash
export OPENAI_API_KEY=""
export LANGTAIL_API_KEY=""
export OPEN_WEATHER_API_KEY=""
```
Alternatively, you can set these in the `.env.example` file and rename it to `.env`.
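After renaming, the `.env` file would contain the same three variables, one per line and without `export` (values shown are placeholders):

```shell
# .env — loaded by Next.js at startup; keep this file out of version control
OPENAI_API_KEY=""
LANGTAIL_API_KEY=""
OPEN_WEATHER_API_KEY=""
```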
```bash
npm install
npm run dev
```
Navigate to http://localhost:3000.
You can deploy this project to Vercel or any other platform that supports Next.js. Our instance is available here: https://langtail-weather-chat-vercel-ai.vercel.app/.
This project showcases how to create a chat application using Langtail, including streaming AI responses and handling tool calls.
- Langtail AI Endpoint: An endpoint where we use Vercel AI to stream data in a compatible format to the frontend. Additionally, we handle AI weather requests here to provide the AI with a message about the current weather in a human-readable format.
- Weather Endpoint: A proxy endpoint that forwards requests to the OpenWeather API.
The AI and weather endpoints are called from the backend (Node.js) to ensure that secrets are not exposed to the browser.
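A sketch of what that backend weather handling might look like (the helper names and response fields below are illustrative, not the example's actual code): the OpenWeather key is read from `process.env` on the server so the browser never sees it, and the raw JSON is turned into a human-readable sentence that can be returned to the model as a tool result.

```typescript
// Illustrative shape of OpenWeather's current-weather response,
// reduced to the fields used here.
interface WeatherPayload {
  name: string;
  main: { temp: number; humidity: number };
  weather: { description: string }[];
}

// Build the server-side request URL; the API key never leaves the backend.
function buildWeatherUrl(city: string, apiKey: string): string {
  const params = new URLSearchParams({ q: city, units: "metric", appid: apiKey });
  return `https://api.openweathermap.org/data/2.5/weather?${params}`;
}

// Turn the weather JSON into a sentence the model can quote back.
function describeWeather(data: WeatherPayload): string {
  const description = data.weather[0]?.description ?? "unknown conditions";
  return `Current weather in ${data.name}: ${description}, ` +
    `${Math.round(data.main.temp)} °C, humidity ${data.main.humidity}%.`;
}

// Example with a canned payload; a live call would fetch
// buildWeatherUrl(city, process.env.OPEN_WEATHER_API_KEY!) on the server.
const sample: WeatherPayload = {
  name: "Prague",
  main: { temp: 21.4, humidity: 60 },
  weather: [{ description: "scattered clouds" }],
};
console.log(describeWeather(sample));
// → "Current weather in Prague: scattered clouds, 21 °C, humidity 60%."
```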
- Chat Page: Renders the chat and requests JSON data about the weather to display weather tiles on the left, based on how the AI calls the tools.
- Chat Component: Manages the AI chat using the Vercel AI `useChat` hook, which simplifies stream handling. It also notifies `page.tsx` to request weather data for the tiles.
- Basic Chat Example: https://langtail-weather-chat-vercel-ai.vercel.app/