How to Use the Vercel AI SDK
📖 This guide was prepared by the ToolPazar team. All our tools are free and ad-free.
What it is
The AI SDK (package name ai) normalizes the wire formats of OpenAI, Anthropic, Google, Mistral, Groq, Cohere, Amazon Bedrock, and dozens more behind a single API. You get generateText, streamText, generateObject (for Zod-validated structured output), and a set of React/Svelte/Vue hooks that plug straight into streaming UIs. It is the de facto standard for TypeScript AI apps.
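The structured-output path looks roughly like this (a sketch assuming the v4-era API; the model id, schema fields, and prompt are invented for illustration):

```typescript
import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Sketch: request output validated against a Zod schema.
// Schema and prompt here are illustrative assumptions.
const { object } = await generateObject({
  model: openai('gpt-4o'),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: 'Summarize this article as a title and a list of tags.',
});
// `object` is typed as { title: string; tags: string[] }
```

The schema does double duty: it constrains the model's output and gives you a typed value on the TypeScript side.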
Install
The SDK is Apache-2.0 licensed, maintained by Vercel, and split across ai (core), @ai-sdk/openai and siblings (providers), and @ai-sdk/react (UI hooks). It targets Node 18+, works on the Vercel Edge Runtime, and runs in the browser for providers that allow it.
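A typical install for a Next.js app pulls in the core, one provider package, the React hooks, and Zod (swap @ai-sdk/openai for whichever provider you use):

```shell
npm install ai @ai-sdk/openai @ai-sdk/react zod
```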
First run
Stream a response from an API route and render it in a React component:
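A minimal sketch, assuming the v4-era API and Next.js App Router conventions (file paths and the model id are illustrative):

```typescript
// app/api/chat/route.ts — server side
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: openai('gpt-4o'),
    messages,
  });
  // Streams the response in the SDK's data stream protocol.
  return result.toDataStreamResponse();
}
```

```typescript
// app/chat/page.tsx — client side
'use client';
import { useChat } from '@ai-sdk/react';

export default function Chat() {
  // useChat defaults to POSTing to /api/chat and consumes the stream.
  const { messages, input, handleInputChange, handleSubmit } = useChat();
  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>{m.role}: {m.content}</p>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```

The hook handles optimistic updates and incremental rendering; you never touch the raw stream.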
Everyday workflows
The SDK had a major version bump (v4 to v5) that changed message shape and tool-call semantics. Blog posts from 2024 often target v3; check the package version before copy-pasting. Also remember that toDataStreamResponse uses a custom protocol — if you consume it outside the built-in hooks, read the stream spec first.
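If you do consume the stream by hand, the v4 data stream protocol is line-oriented: each line is a type prefix followed by a JSON payload, with text deltas under the `0:` prefix. A simplified sketch of pulling the text back out (read the spec for the full set of part types; this handles only text):

```typescript
// Minimal illustration of decoding text parts from the v4-style
// data stream protocol, where each line looks like `0:"chunk"`.
function extractText(stream: string): string {
  let text = '';
  for (const line of stream.split('\n')) {
    if (line.startsWith('0:')) {
      // Text parts are JSON-encoded strings after the "0:" prefix.
      text += JSON.parse(line.slice(2));
    }
  }
  return text;
}
```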
Gotchas and tips
Edge runtime is fast but limited. No Node APIs, 1MB bundle cap, and some providers’ SDKs pull in fs or crypto transitively. Check your bundle with next build before deploying a chat route to the edge.
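Opting a route into the edge runtime is a one-line Next.js convention (not part of the AI SDK itself), which makes it easy to test a route on Node first and move it later:

```typescript
// In a Next.js App Router route file: run this route on the Edge Runtime.
export const runtime = 'edge';
```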
Who it’s for
Any TypeScript developer shipping an LLM feature into a web app. Tip: put all model selection behind one environment variable — swapping gpt-4o for claude-sonnet-4 then becomes a config change, not a refactor.
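That tip can be sketched as a tiny routing helper. The environment variable name and the prefix rule below are invented for illustration; in a real app each branch would call the matching provider factory from @ai-sdk/openai or @ai-sdk/anthropic:

```typescript
// Hypothetical helper: decide which provider factory a configured
// model id should be routed through. Purely illustrative.
function providerFor(modelId: string): 'anthropic' | 'openai' {
  return modelId.startsWith('claude') ? 'anthropic' : 'openai';
}

// e.g. route whatever AI_MODEL is set to in the environment:
// const modelId = process.env.AI_MODEL ?? 'gpt-4o';
```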