Vercel’s AI SDK 6 beta introduces groundbreaking features like tool execution approval, a new agent abstraction, and enhanced capabilities for building production-ready AI applications across frameworks like Next.js, React, Vue, and Svelte.[1][5] This release addresses key pain points in LLM integration, such as safely granting models powerful tools while abstracting provider differences.[1][3]

What is the Vercel AI SDK?

The AI SDK is a TypeScript-first toolkit that simplifies building AI-powered apps by providing a unified interface for multiple LLM providers, including OpenAI, Anthropic, Google, xAI (Grok), and more.[3][4] It eliminates boilerplate for chatbots, text generation, structured data, and now advanced agents, supporting frameworks like Next.js, Vue, Svelte, Node.js, React, Angular, and SolidJS.[3][4][6]

Core Components of the AI SDK

The SDK is divided into key libraries:

  • AI SDK Core: Handles server-side operations with functions like generateText, streamText, generateObject, and streamObject for text and structured JSON generation.[2][3]
  • AI SDK UI: Client-side hooks such as useChat, useCompletion, useObject, and useAssistant for real-time interactions, state management, and streaming UI.[2][6]
  • AI SDK RSC: Experimental React Server Components for server-streamed UI (use UI hooks for production).[2]

Key Benefit: Switch providers with minimal code changes—e.g., from OpenAI’s GPT-5 to Anthropic’s Claude by updating two lines.[3][4]
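
For instance, a minimal sketch of that swap, assuming the @ai-sdk/anthropic provider package and an illustrative Claude model ID; only the import and the model line change:

import { generateText } from 'ai';
// Before: import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

const { text } = await generateText({
  // Before: model: openai('gpt-5'),
  model: anthropic('claude-sonnet-4-5'),
  prompt: 'Summarize this changelog for release notes.',
});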

What’s New in AI SDK 6 Beta?

Announced by the Vercel team, version 6 focuses on agentic workflows and safety, making it easier to build reliable AI systems.[1][5]

1. Tool Execution Approval: Human-in-the-Loop Safety

Tool execution approval implements a “human in the loop” pattern, allowing developers to approve or deny LLM tool calls before execution.[1] This is crucial for tools with side effects, like file writing or API calls, preventing accidental damage while enabling powerful capabilities.[1]

// Illustrative sketch: gate each tool call behind a human approval step
const { toolResults } = await runAgent({
  agent,
  tools: { writeFile: writeFileTool },
  onToolCall: (call) => {
    // Ask a human before the tool runs
    if (confirm(`Approve ${call.name}?`)) {
      return call.execute(); // approved: execute the tool
    }
    return { denied: true }; // denied: skip execution and report back to the model
  },
});

This feature also doubles as a compatibility layer: approval is handled once in the SDK and mapped onto each provider’s tool-calling interface, so the same flow keeps working as providers ship new capabilities.[1]

2. New Agent Abstraction

AI SDK 6 introduces a declarative agent abstraction, separating agent definitions from usage.[1] Define agents once (e.g., a weather agent) and reuse them across API routes or apps—ideal for packaging via Vercel’s registry.[1]

Custom agents are now possible by implementing the agent interface as a class.[1]

// Declare a reusable agent once
const weatherAgent = createAgent({
  model: 'openai/gpt-5',
  tools: { getWeather: getWeatherTool }, // tools are registered by name
  instructions: 'You are a helpful weather assistant.',
});

// Use in API route
export async function GET() {
  const result = await weatherAgent.run('Weather in NYC?');
  return Response.json(result);
}
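
Because the beta’s exact interface isn’t shown here, the following is only a rough sketch of a hand-rolled agent class that exposes the same run method as the declared agent above; the class name and interface shape are assumptions:

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Hypothetical custom agent: any class exposing the same run() contract
// can be dropped in wherever a declared agent is used.
class LoggingWeatherAgent {
  async run(prompt: string) {
    console.log('weather agent prompt:', prompt); // custom pre-processing, logging, caching, etc.
    const { text } = await generateText({
      model: openai('gpt-5'),
      system: 'You are a helpful weather assistant.',
      prompt,
    });
    return { text };
  }
}

The API route above could then call new LoggingWeatherAgent().run(...) without any other changes.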

3. Additional Power Features

  • Full MCP Support: First-class integration with the Model Context Protocol, so agents can use tools exposed by MCP servers (see the sketch after this list).
  • Reranking: Reorders retrieved documents or candidate results by relevance to improve answer quality.
  • Image Editing: Built-in support for editing images with image models.
  • DevTools: Debugging tools for inspecting agents and streams.[5]
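
As a rough illustration, the experimental MCP client from earlier SDK releases connects to an MCP server and exposes its tools to a model; the server URL below is a placeholder, and the helper may be renamed as v6 stabilizes:

import { experimental_createMCPClient, generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Connect to a remote MCP server over SSE and fetch the tools it exposes
const mcpClient = await experimental_createMCPClient({
  transport: { type: 'sse', url: 'https://example.com/mcp' },
});

const { text } = await generateText({
  model: openai('gpt-5'),
  tools: await mcpClient.tools(), // MCP tools plug in like locally defined tools
  prompt: 'List the open tickets in the project tracker.',
});

await mcpClient.close(); // release the connection when done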

Building Your First AI Feature with AI SDK Core

Start with a simple text generator using generateText or streamText.[2][3]

Installation and Setup

npm install ai @ai-sdk/openai
// app/api/chat/route.ts
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai'; // reads OPENAI_API_KEY from your environment

export async function POST(req: Request) {
  const { prompt } = await req.json();
  const { text } = await generateText({
    model: openai('gpt-5'),
    prompt,
  });
  return Response.json({ text });
}

Streaming with AI SDK UI

Enhance the frontend with useCompletion for real-time updates.[2]

// app/page.tsx (React)
'use client';
import { useCompletion } from '@ai-sdk/react'; // the React hooks ship in the @ai-sdk/react package

export default function Page() {
  const { completion, input, handleInputChange, handleSubmit } = useCompletion();
  return (
    <form onSubmit={handleSubmit}>
      <input value={input} onChange={handleInputChange} />
      <p>{completion}</p>
    </form>
  );
}

This auto-manages streaming, loading states, and errors for engaging UIs.[2][6]
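
On the server side, the matching route streams tokens with streamText. A minimal sketch, assuming the hook’s default /api/completion endpoint and a client configured to consume a plain text stream:

// app/api/completion/route.ts
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { prompt } = await req.json();
  // streamText starts generating immediately and exposes streaming helpers
  const result = streamText({
    model: openai('gpt-5'),
    prompt,
  });
  // Stream the tokens back as plain text for the client hook to render
  return result.toTextStreamResponse();
}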

Generating Structured Data and Using Tools

Constrain outputs with Zod schemas for type-safe JSON.[3]

import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const recipeSchema = z.object({
  ingredients: z.array(z.string()),
  steps: z.array(z.string()),
});

const { object } = await generateObject({
  model: openai('gpt-5'),
  schema: recipeSchema,
  prompt: 'Generate a vegan pizza recipe.',
});

Tool calling is built in, so the model can invoke external functions such as APIs or databases during generation.[3]

import { tool } from 'ai';
import { z } from 'zod';

const tools = {
  getWeather: tool({
    description: 'Get the current weather for a city',
    inputSchema: z.object({ city: z.string() }),
    execute: async ({ city }) => fetchWeather(city), // fetchWeather is your own data source
  }),
};
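
To let the model actually call these tools, pass them to generateText. A small sketch; the multi-step stopWhen option and the prompt are illustrative:

import { generateText, stepCountIs } from 'ai';
import { openai } from '@ai-sdk/openai';

// The model may call getWeather, then use its result to write the final answer
const { text } = await generateText({
  model: openai('gpt-5'),
  tools,
  stopWhen: stepCountIs(3), // allow a tool-call step plus a final answer step
  prompt: 'Should I pack an umbrella for Amsterdam today?',
});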

UI Framework Support in AI SDK 6

  • React: full support for useChat, useCompletion, and useObject.
  • Vue.js, Svelte, and Angular: official packages with framework-native equivalents of these hooks.
  • SolidJS: ✅ community-maintained support.

Hooks like useChat abstract chat state, making cross-framework development seamless.[6]

Why Upgrade to AI SDK 6?

  • Provider Agnostic: 20+ providers with unified APIs.[4]
  • Agentic Focus: Safer, reusable agents reduce complexity.[1]
  • Production Ready: Streaming, tools, and UI hooks speed up development.[2][5]
  • TypeScript Native: Full type safety with Zod integration.[3]

Getting Started Resources

  • Official Docs: AI SDK Introduction[4]
  • Tutorial Video: Matt Pocock’s walkthrough on agents and tools.[1]
  • Community Tutorial: Vercel forum guide for v6 basics.[7]
  • Full Guide: Codecademy’s step-by-step app build.[2]

Conclusion

Vercel AI SDK 6 marks a leap forward for AI development, empowering developers to build sophisticated agents with safety features like tool approval and modular abstractions.[1][5] Whether you’re adding chat to Next.js or structured generation to Vue, the unified API and hooks make it faster and more reliable.

Experiment with the beta today—install via npm install ai@beta and start with a simple agent. As AI evolves, tools like this ensure your apps stay ahead without vendor lock-in. Dive in, approve those tools, and build the next generation of intelligent applications.[3][4]