Adding a system prompt with Vercel's AI SDK
The system prompt is one of the most important inputs you can give an LLM. It shapes the model's behavior across the whole conversation.
What is a system prompt?
Have you ever wondered how all the Custom GPTs in ChatGPT are built? There are now thousands of them, and they are all built by simply passing a system prompt to the model.
You can define many behavioral traits in the system prompt. If you state the instructions clearly, the model will follow them.
Let's look at the most common things a system prompt defines:
- Define the tonality of the language
- Define if the model should be formal or casual
- Specify if the model should use emojis
- Define if you want the model to produce short or long answers
- Tip: Provide examples of correct answers
- Tip: Define guardrails, i.e. what topics the model is allowed to talk about
In-depth prompt engineering is outside the scope of this tutorial, but we will see examples of guardrails in the next lessons.
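To make the checklist above concrete, here is a hypothetical system prompt that touches each point (tone, formality, emoji use, answer length, an example answer, and guardrails). Every specific detail in it, such as the bookstore scenario, is invented purely for illustration:

```ts
// A hypothetical system prompt covering each item from the checklist above.
// The bookstore scenario and all rules are made up for illustration.
export const exampleSystemPrompt = [
  "You are a support assistant for an online bookstore.", // role
  "Tone: friendly and casual.", // tonality and formality
  "Do not use emojis.", // emoji rule
  "Keep answers under three sentences.", // short vs. long answers
  'Example: Q: "Where is my order?" A: "You can track it under My Orders."', // example answer
  "Only discuss books and orders; politely decline other topics.", // guardrails
].join("\n");
```

A prompt like this is just a plain string; you would pass it via the system property shown in the next section.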
Simply pass the system property to the generateText() function. Let's look at an example:
Let's write a system prompt that makes the model behave like a pirate.
Now let's extract the system prompt into a separate variable:
Here is the complete example, including the model and prompt:

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function generateAnswer(prompt: string) {
  const { text } = await generateText({
    model: openai("gpt-4o"), // any supported provider/model works here
    prompt,
    system: "-- your system prompt --",
  });

  return text;
}
```
In this lesson, you have learned how to define a system prompt and pass it to the generateText() function. The system prompt plays a crucial role in shaping the model's behavior.