Generate text with Vercel's AI SDK
Before we start coding, don't forget to retrieve your API key from
OpenAI's dashboard. Create a .env file in the project root and insert the API key as the environment
variable OPENAI_API_KEY="sk-proj-...".
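If you prefer the terminal, the file can be created in one line. This is just a sketch; the "sk-proj-..." value is a placeholder, so paste your real key in its place:

```shell
# Create .env in the project root; "sk-proj-..." is a placeholder for your actual key.
echo 'OPENAI_API_KEY="sk-proj-..."' > .env
```

Make sure .env is listed in .gitignore so the key never ends up in version control.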
Okay, ready to go!
This tutorial uses AI SDK 6 (ai@^6 with provider packages like @ai-sdk/openai). The first step is very simple:
import the generateText() function from the ai package.
Simply pass the prompt and model to the generateText() function. The generated text is returned in the text
property of the result.
We will use OpenAI's gpt-4o-mini model, as it's cheap and more than enough to start with.
Changing the model or provider is as simple as replacing the model passed to the generateText() call.
But let's stick with gpt-4o-mini for now. The last step is to simply call the function with some initial prompt and
log the result:
```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function generateAnswer(prompt: string) {
  const { text } = await generateText({
    model: openai("gpt-4o-mini"),
    prompt,
  });

  return text;
}

// Call the function with an initial prompt and log the result.
console.log(await generateAnswer("What is the AI SDK?"));
```
In this lesson, we've learned how to use Vercel's AI SDK to generate text responses using different language models. We started with a simple implementation using OpenAI's gpt-4o-mini model and then saw how easy it is to switch to other providers like Anthropic's Claude. The SDK's unified interface makes it straightforward to experiment with different models while keeping your code clean and maintainable.