# Full API Route Example

This page walks through a complete Next.js API route that combines all SDK features: the Supyagent client, skills, built-in tools, Prisma persistence, context management, and streaming.

This is what `create-supyagent-app` generates in `src/app/api/chat/route.ts`.
## The Route

```ts
import { convertToModelMessages, streamText, stepCountIs, type UIMessage } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { supyagent, createBashTool, createViewImageTool } from '@supyagent/sdk';
import { createPrismaAdapter } from '@supyagent/sdk/prisma';
import { prisma } from '@/lib/prisma';

// Initialize once at module scope
const client = supyagent({ apiKey: process.env.SUPYAGENT_API_KEY! });
const adapter = createPrismaAdapter(prisma);

export async function POST(req: Request) {
  const { messages, chatId }: { messages: UIMessage[]; chatId: string } = await req.json();

  // 1. Persist incoming messages
  await adapter.saveChat(chatId, messages as any);

  // 2. Load skills (cached for 5 minutes)
  const { systemPrompt: skillsPrompt, tools: skillTools } = await client.skills({ cache: 300 });

  // 3. Combine skill tools with built-in tools
  const tools = {
    ...skillTools,
    bash: createBashTool(),
    viewImage: createViewImageTool(),
  };

  // 4. Stream the response
  const result = streamText({
    model: anthropic('claude-sonnet-4-6-20250620'),
    system: `You are a helpful assistant.\n\n${skillsPrompt}`,
    messages: await convertToModelMessages(messages),
    tools,
    stopWhen: stepCountIs(5),
    // Inject images into context for subsequent steps
    prepareStep: async ({ steps }) => {
      const imageUrls: string[] = [];
      for (const step of steps) {
        for (const call of step.toolCalls) {
          if (call.toolName === 'viewImage' && call.input?.url) {
            imageUrls.push(call.input.url as string);
          }
        }
      }
      if (imageUrls.length === 0) return undefined;
      return {
        messages: [
          ...messages,
          {
            role: 'user' as const,
            content: imageUrls.map(url => ({
              type: 'file' as const,
              data: new URL(url),
              mediaType: 'image/jpeg',
            })),
          },
        ],
      };
    },
  });

  // 5. Stream back to client, persisting after each step and on finish
  return result.toUIMessageStreamResponse({
    originalMessages: messages,
    onStepFinish: async ({ messages: updatedMessages }) => {
      await adapter.saveChat(chatId, updatedMessages as any);
    },
    onFinish: async ({ messages: updatedMessages }) => {
      await adapter.saveChat(chatId, updatedMessages as any);
    },
  });
}
```

## Step-by-Step Breakdown
### 1. Module-Level Setup

```ts
const client = supyagent({ apiKey: process.env.SUPYAGENT_API_KEY! });
const adapter = createPrismaAdapter(prisma);
```

Both the Supyagent client and the Prisma adapter are created once when the module loads. The client's internal cache persists across requests.
### 2. Persist Incoming Messages

```ts
await adapter.saveChat(chatId, messages as any);
```

Before calling the LLM, save the user's message. This ensures no messages are lost even if the LLM call fails.
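The adapter's contract is small: persist a full message list per chat and load it back. As an illustration only (a hypothetical stand-in, not the real Prisma adapter from `@supyagent/sdk/prisma`), an in-memory version could look like this:

```ts
// Hypothetical sketch of the adapter shape used in this route; the
// real implementation ships with @supyagent/sdk/prisma.
type StoredMessage = { id: string; role: string; content?: unknown };

interface ChatAdapter {
  saveChat(chatId: string, messages: StoredMessage[]): Promise<void>;
  loadChat(chatId: string): Promise<StoredMessage[]>;
}

// In-memory stand-in, useful for tests or local development.
function createMemoryAdapter(): ChatAdapter {
  const store = new Map<string, StoredMessage[]>();
  return {
    async saveChat(chatId, messages) {
      // Overwrite the full message list, mirroring the route's
      // "save everything after each step" strategy.
      store.set(chatId, messages);
    },
    async loadChat(chatId) {
      return store.get(chatId) ?? [];
    },
  };
}
```

Because the whole list is rewritten on each save, the operation is idempotent: calling it again with the same messages is harmless.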
### 3. Load Skills

```ts
const { systemPrompt: skillsPrompt, tools: skillTools } = await client.skills({ cache: 300 });
```

`client.skills()` returns:

- `systemPrompt` — A text block listing available skills (e.g., "Gmail", "Slack", "Calendar") for the LLM
- `tools` — Two AI SDK tools: `loadSkill` (fetches detailed docs) and `apiCall` (makes authenticated API calls)

With `cache: 300`, this fetches from Supyagent Cloud at most once every 5 minutes.
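The caching behavior can be pictured as a simple time-to-live cache. This is an illustrative sketch of the idea, not the SDK's actual internals:

```ts
// Illustrative TTL cache; assumes `cache: 300` means "reuse the
// fetched value for up to 300 seconds before refetching".
function createTtlCache<T>(ttlSeconds: number, fetcher: () => Promise<T>) {
  let value: T | undefined;
  let fetchedAt = -Infinity;
  return async (now: number = Date.now()): Promise<T> => {
    if (now - fetchedAt >= ttlSeconds * 1000) {
      value = await fetcher(); // expired (or never fetched): refresh
      fetchedAt = now;
    }
    return value as T;
  };
}
```

Because the client lives at module scope, every request hitting the same server instance shares this cache.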
### 4. Combine Tools

```ts
const tools = {
  ...skillTools,                    // loadSkill + apiCall
  bash: createBashTool(),           // Shell execution
  viewImage: createViewImageTool(), // Image display
};
```

Spread the skill tools and add any built-in tools your app needs.
### 5. Stream with streamText

Key parameters:

- `stopWhen: stepCountIs(5)` — Prevents infinite tool-call loops. The agent stops after 5 steps and returns whatever text it has.
- `prepareStep` — Injects viewed images into the conversation so the model can see them in subsequent steps.
- `convertToModelMessages` — Converts the `UIMessage[]` format to the model's expected message format.
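Conceptually, `stepCountIs(5)` is just a predicate over the steps completed so far. A simplified sketch of the idea (the real helper ships with the `ai` package and has a richer signature; the names below are illustrative):

```ts
// Simplified model of a step-count stop condition: the agent loop
// checks the predicate after every step and stops once it is true.
type CompletedStep = { toolCalls: unknown[] };

function stepCountLimit(max: number) {
  return (steps: CompletedStep[]): boolean => steps.length >= max;
}

const shouldStop = stepCountLimit(5);
```

Raising the limit lets the agent chain more tool calls per request, at the cost of latency and tokens.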
### 6. Stream Response with Persistence

```ts
return result.toUIMessageStreamResponse({
  originalMessages: messages,
  onStepFinish: async ({ messages: updatedMessages }) => {
    await adapter.saveChat(chatId, updatedMessages as any);
  },
  onFinish: async ({ messages: updatedMessages }) => {
    await adapter.saveChat(chatId, updatedMessages as any);
  },
});
```

- `onStepFinish` — Called after each tool-use step. Persists intermediate state so you don't lose progress.
- `onFinish` — Called when the stream completes. Final persistence.
- `originalMessages` — The initial messages from the client, used to reconstruct the full message list.
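These callbacks run while the response is streaming, so a transient database error could silently drop a step's messages. One option, sketched here as a generic helper rather than anything the SDK provides, is to wrap each save in a small retry:

```ts
// Generic retry helper: re-attempts an async operation a few times
// before giving up, e.g. withRetry(() => adapter.saveChat(chatId, msgs)).
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err; // swallow and retry until attempts are exhausted
    }
  }
  throw lastError;
}
```

Since saves overwrite the full message list, retrying (or even double-saving from `onStepFinish` and `onFinish`) is safe.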
## Using Tools Mode Instead of Skills

If you prefer to load all tools upfront (simpler but uses more tokens):

```ts
const tools = await client.tools({ cache: 300 });

const result = streamText({
  model: anthropic('claude-sonnet-4-6-20250620'),
  system: 'You are a helpful assistant.',
  messages: await convertToModelMessages(messages),
  tools: {
    ...tools,
    bash: createBashTool(),
  },
});
```

No `systemPrompt` is needed — the tool definitions contain all the information the LLM needs.
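If you want to switch between the two modes without code changes, one option is a small environment-driven toggle. This is purely illustrative; `SUPYAGENT_MODE` is a made-up variable, not an SDK convention:

```ts
// Hypothetical toggle: default to skills mode, opt into tools mode
// by setting SUPYAGENT_MODE=tools in the environment.
function useSkillsMode(env: Record<string, string | undefined>): boolean {
  return env.SUPYAGENT_MODE !== 'tools';
}
```

In the route you would then branch on `useSkillsMode(process.env)` to call either `client.skills()` or `client.tools()`.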
## Adding Context Management

For long conversations, add the context manager to automatically summarize older messages:

```ts
import { createContextManager } from '@supyagent/sdk/context';

const contextManager = createContextManager({
  maxTokens: 200_000,
  summaryModel: anthropic('claude-haiku-4-5-20251001'),
});
```
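Conceptually, the compaction check compares the conversation's size against `maxTokens`. A rough sketch of the idea using the common "about 4 characters per token" heuristic (hypothetical internals; the real logic lives in `@supyagent/sdk/context`):

```ts
// Hypothetical sketch of a compaction check, not the SDK's actual
// implementation of shouldCompactify.
type Msg = { role: string; content: string };

// Rough heuristic: ~4 characters per token for English text.
function estimateTokens(messages: Msg[]): number {
  const chars = messages.reduce((n, m) => n + m.content.length, 0);
  return Math.ceil(chars / 4);
}

function needsCompaction(messages: Msg[], maxTokens: number): boolean {
  return estimateTokens(messages) > maxTokens;
}
```

When the check trips, older messages are summarized by the cheaper `summaryModel` so the conversation stays under the budget.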
```ts
export async function POST(req: Request) {
  let { messages, chatId } = await req.json();

  // Compactify if context is too large
  if (contextManager.shouldCompactify(messages)) {
    messages = await contextManager.compactify(messages);
    await adapter.saveChat(chatId, messages);
  }

  const result = streamText({
    // ... same as above
  });

  return result.toUIMessageStreamResponse({
    originalMessages: messages,
    onStepFinish: async ({ messages: updated, usage }) => {
      contextManager.recordUsage({
        inputTokens: usage.inputTokens,
        outputTokens: usage.outputTokens,
      });
      await adapter.saveChat(chatId, updated as any);
    },
    onFinish: async ({ messages: updated }) => {
      await adapter.saveChat(chatId, updated as any);
    },
  });
}
```

## What's Next
- Client — More on the `supyagent()` client API
- Built-in Tools — All tool options and safety features
- Persistence — Chat management API routes
- Context Management — Threshold configuration and custom summarizers