Providers & Models
VoltAgent is built directly on top of the Vercel AI SDK. You choose an ai-sdk provider package (OpenAI, Anthropic, Google, Mistral, Groq, Ollama, etc.) and pass the resulting LanguageModel to your Agent via the model prop.
- No extra VoltAgent provider package is required.
- Full compatibility with ai-sdk’s streaming, tool calling, and structured outputs.
Installation
Install the ai-sdk provider(s) you want to use:
npm:

```bash
# For example, to use OpenAI:
npm install @ai-sdk/openai

# Or Anthropic:
npm install @ai-sdk/anthropic

# Or Google:
npm install @ai-sdk/google
```

yarn:

```bash
# For example, to use OpenAI:
yarn add @ai-sdk/openai

# Or Anthropic:
yarn add @ai-sdk/anthropic

# Or Google:
yarn add @ai-sdk/google
```

pnpm:

```bash
# For example, to use OpenAI:
pnpm add @ai-sdk/openai

# Or Anthropic:
pnpm add @ai-sdk/anthropic

# Or Google:
pnpm add @ai-sdk/google
```
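Most ai-sdk providers read their API key from a conventional environment variable (OPENAI_API_KEY for @ai-sdk/openai, ANTHROPIC_API_KEY for @ai-sdk/anthropic, and so on). If you need to set the key or base URL explicitly, the first-party packages also export a create* factory; a minimal sketch with @ai-sdk/openai, where the environment variable name is just an example:

```ts
import { createOpenAI } from "@ai-sdk/openai";

// Explicit configuration instead of relying on the OPENAI_API_KEY env var.
// MY_OPENAI_KEY is a placeholder for whatever variable you actually use.
const openai = createOpenAI({
  apiKey: process.env.MY_OPENAI_KEY,
});

// The configured instance is used the same way as the default export:
const model = openai("gpt-4o-mini");
```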
Usage Example
```ts
import { Agent } from "@voltagent/core";
import { openai } from "@ai-sdk/openai";

const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  model: openai("gpt-4o-mini"),
});
```
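Once constructed, the agent is called like any other VoltAgent agent. A minimal sketch, assuming the generateText method described in the agent documentation:

```ts
// Ask the agent a question and print the model's reply.
const result = await agent.generateText("What is the capital of France?");
console.log(result.text);
```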
Available Providers
First-Party AI SDK Providers
These providers are maintained by Vercel and offer the highest level of support and integration:
Foundation Models
| Provider | Package | Documentation | Key Models |
|---|---|---|---|
| xAI Grok | @ai-sdk/xai | Docs | grok-4, grok-3, grok-2-vision |
| OpenAI | @ai-sdk/openai | Docs | gpt-4.1, gpt-4o, o3, o1 |
| Anthropic | @ai-sdk/anthropic | Docs | claude-opus-4, claude-sonnet-4, claude-3.5 |
| Google Generative AI | @ai-sdk/google | Docs | gemini-2.0-flash, gemini-1.5-pro |
| Google Vertex | @ai-sdk/google-vertex | Docs | gemini models, claude models via Vertex |
| Mistral | @ai-sdk/mistral | Docs | mistral-large, pixtral-large, mistral-medium |
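Switching between these providers only changes the import and the model factory call; the rest of the Agent configuration stays the same. A sketch using @ai-sdk/anthropic, where the model ID is illustrative:

```ts
import { Agent } from "@voltagent/core";
import { anthropic } from "@ai-sdk/anthropic";

const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  // Example model ID; use any Claude model your account has access to.
  model: anthropic("claude-sonnet-4-20250514"),
});
```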
Cloud Platforms
| Provider | Package | Documentation | Description |
|---|---|---|---|
| Amazon Bedrock | @ai-sdk/amazon-bedrock | Docs | Access to various models via AWS |
| Azure OpenAI | @ai-sdk/azure | Docs | OpenAI models via Azure |
| Vercel | @ai-sdk/vercel | Docs | v0 model for code generation |
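Cloud-platform providers typically need more configuration than a single API key. For example, @ai-sdk/azure is pointed at your Azure OpenAI resource and addressed by deployment name; a sketch with placeholder resource and deployment names:

```ts
import { createAzure } from "@ai-sdk/azure";

const azure = createAzure({
  resourceName: "my-azure-resource", // placeholder: your Azure OpenAI resource name
  apiKey: process.env.AZURE_API_KEY,
});

// Azure deployments are referenced by deployment name, not by raw model name.
const model = azure("my-gpt-4o-deployment"); // placeholder deployment name
```

The resulting model is passed to the Agent via the model prop just like any other provider.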
Specialized Providers
| Provider | Package | Documentation | Specialization |
|---|---|---|---|
| Groq | @ai-sdk/groq | Docs | Ultra-fast inference |
| Together.ai | @ai-sdk/togetherai | Docs | Open-source models |
| Cohere | @ai-sdk/cohere | Docs | Enterprise search & generation |
| Fireworks | @ai-sdk/fireworks | Docs | Fast open-source models |
| DeepInfra | @ai-sdk/deepinfra | Docs | Affordable inference |
| DeepSeek | @ai-sdk/deepseek | Docs | DeepSeek models including reasoner |
| Cerebras | @ai-sdk/cerebras | Docs | Fast Llama models |
| Perplexity | @ai-sdk/perplexity | Docs | Search-enhanced responses |
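These packages plug into an Agent exactly like the foundation-model providers. For instance, a sketch with @ai-sdk/groq, using an example model ID:

```ts
import { Agent } from "@voltagent/core";
import { groq } from "@ai-sdk/groq";

const agent = new Agent({
  name: "fast-agent",
  instructions: "You are a helpful assistant",
  // Example model ID; pick any chat model Groq currently hosts.
  model: groq("llama-3.3-70b-versatile"),
});
```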
Audio & Speech Providers
| Provider | Package | Documentation | Specialization |
|---|---|---|---|
| ElevenLabs | @ai-sdk/elevenlabs | Docs | Text-to-speech |
| LMNT | @ai-sdk/lmnt | Docs | Voice synthesis |
| Hume | @ai-sdk/hume | Docs | Emotional intelligence |
| Rev.ai | @ai-sdk/revai | Docs | Speech recognition |
| Deepgram | @ai-sdk/deepgram | Docs | Speech-to-text |
| Gladia | @ai-sdk/gladia | Docs | Audio intelligence |
| AssemblyAI | @ai-sdk/assemblyai | Docs | Speech recognition & understanding |
Community Providers
These providers are created and maintained by the open-source community:
| Provider | Package | Documentation | Description |
|---|---|---|---|
| Ollama | ollama-ai-provider | Docs | Local model execution |
| FriendliAI | @friendliai/ai-provider | Docs | Optimized inference |
| Portkey | @portkey-ai/vercel-provider | Docs | LLM gateway & observability |
| Cloudflare Workers AI | workers-ai-provider | Docs | Edge AI inference |
| OpenRouter | @openrouter/ai-sdk-provider | Docs | Unified API for multiple providers |
| Requesty | @requesty/ai-sdk | Docs | Request management |
| Crosshatch | @crosshatch/ai-provider | Docs | Specialized models |
| Mixedbread | mixedbread-ai-provider | Docs | Embedding models |
| Voyage AI | voyage-ai-provider | Docs | Embedding models |
| Mem0 | @mem0/vercel-ai-provider | Docs | Memory-enhanced AI |
| Letta | @letta-ai/vercel-ai-sdk-provider | Docs | Stateful agents |
| Spark | spark-ai-provider | Docs | Chinese language models |
| AnthropicVertex | anthropic-vertex-ai | Docs | Claude via Vertex AI |
| LangDB | @langdb/vercel-provider | Docs | Database-aware AI |
| Dify | dify-ai-provider | Docs | LLMOps platform |
| Sarvam | sarvam-ai-provider | Docs | Indian language models |
| Claude Code | ai-sdk-provider-claude-code | Docs | Code-optimized Claude |
| Built-in AI | built-in-ai | Docs | Browser-native AI |
| Gemini CLI | ai-sdk-provider-gemini-cli | Docs | CLI-based Gemini |
| A2A | a2a-ai-provider | Docs | Specialized models |
| SAP-AI | @mymediset/sap-ai-provider | Docs | SAP AI Core integration |
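Community providers follow the same pattern. A sketch running a local model through ollama-ai-provider, assuming an Ollama server on its default port with the model already pulled:

```ts
import { Agent } from "@voltagent/core";
import { ollama } from "ollama-ai-provider";

const agent = new Agent({
  name: "local-agent",
  instructions: "You are a helpful assistant",
  // Assumes `ollama pull llama3.1` has already been run locally.
  model: ollama("llama3.1"),
});
```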
OpenAI-Compatible Providers
For providers that follow the OpenAI API specification:
| Provider | Documentation | Description |
|---|---|---|
| LM Studio | Docs | Local model execution with GUI |
| Baseten | Docs | Model deployment platform |
| Any OpenAI-compatible API | Docs | Custom endpoints |
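One way to reach these endpoints is the @ai-sdk/openai-compatible package (installed separately), which wraps any server that speaks the OpenAI API. A sketch pointed at a local LM Studio server, where the base URL and model ID are placeholders:

```ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

const lmstudio = createOpenAICompatible({
  name: "lmstudio",
  baseURL: "http://localhost:1234/v1", // LM Studio's default local server URL
});

// Use whichever model ID your local server exposes.
const model = lmstudio("llama-3.2-1b");
```

The resulting model can then be passed to an Agent via the model prop like any other provider.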
Model Capabilities
Providers expose language models with varying capabilities. The table below summarizes what popular models support:
| Provider | Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
|---|---|---|---|---|---|
| xAI Grok | grok-4 | ❌ | ✅ | ✅ | ✅ |
| xAI Grok | grok-3 | ❌ | ✅ | ✅ | ✅ |
| xAI Grok | grok-3-fast | ❌ | ✅ | ✅ | ✅ |
| xAI Grok | grok-3-mini | ❌ | ✅ | ✅ | ✅ |
| xAI Grok | grok-3-mini-fast | ❌ | ✅ | ✅ | ✅ |
| xAI Grok | grok-2-1212 | ❌ | ✅ | ✅ | ✅ |
| xAI Grok | grok-2-vision-1212 | ✅ | ✅ | ✅ | ✅ |
| xAI Grok | grok-beta | ❌ | ✅ | ✅ | ✅ |
| xAI Grok | grok-vision-beta | ✅ | ❌ | ❌ | ❌ |
| Vercel | v0-1.0-md | ✅ | ✅ | ✅ | ✅ |
| OpenAI | gpt-4.1 | ✅ | ✅ | ✅ | ✅ |