⚙️ What are Tools?
Tools are the various ways you can extend an LLM's capabilities beyond simple text generation. When enabled, they allow your chatbot to do amazing things: search the web, scrape data, generate images, talk back using AI voices, and more.
Because there are several ways to integrate "Tools" in Open WebUI, it's important to understand which type you are using.
🧩 Tooling Taxonomy: Which "Tool" are you using?
Users often encounter the term "Tools" in different contexts. Here is how to distinguish them:
| Type | Location in UI | Best For... | Source |
|---|---|---|---|
| Native Features | Admin/Settings | Core platform functionality | Built-in to Open WebUI |
| Workspace Tools | Workspace > Tools | User-created or community Python scripts | Community Library |
| Native MCP (HTTP) | Settings > Connections | Standard MCP servers reachable via HTTP/SSE | External MCP Servers |
| MCP via Proxy (MCPO) | Settings > Connections | Local stdio-based MCP servers (e.g., Claude Desktop tools) | MCPO Adapter |
| OpenAPI Servers | Settings > Connections | Standard REST/OpenAPI web services | External Web APIs |
1. Native Features (Built-in)
These are deeply integrated into Open WebUI and generally don't require external scripts.
- Web Search: Integrated via engines like SearXNG, Google, or Tavily.
- Image Generation: Integrated with DALL-E, ComfyUI, or Automatic1111.
- RAG (Knowledge): The ability to query uploaded documents.
2. Workspace Tools (Custom Plugins)
These are Python scripts that run directly within the Open WebUI environment.
- Capability: Can do anything Python can do (web scraping, complex math, API calls).
- Access: Managed via the Workspace menu.
- Safety: Always review code before importing, as these run on your server.
- ⚠️ Security Warning: Never give normal or untrusted users permission to access the Workspace Tools section. This access allows a user to upload and execute arbitrary Python code on your server, which could lead to a full system compromise.
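A Workspace Tool is just a Python file containing a `Tools` class; Open WebUI exposes each method of that class as a callable tool, using its type hints and docstring to describe it to the model. A minimal sketch (the tool name and logic here are illustrative, not a real community Tool):

```python
"""
title: Example Calculator Tool
description: A minimal illustration of the Workspace Tool format.
version: 0.1.0
"""


class Tools:
    def __init__(self):
        pass

    def add_numbers(self, a: float, b: float) -> str:
        """
        Add two numbers and return the result as a sentence.
        :param a: The first number.
        :param b: The second number.
        """
        # Open WebUI reads the type hints and this docstring to build
        # the tool description shown to the model.
        return f"The sum of {a} and {b} is {a + b}"
```

Once imported, the model can invoke `add_numbers` during a chat; the returned string is fed back into the conversation as the tool result.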
3. MCP (Model Context Protocol)
MCP is an open standard that allows LLMs to interact with external data and tools.
- Native HTTP MCP: Open WebUI can connect directly to any MCP server that exposes an HTTP/SSE endpoint.
- MCPO (Proxy): Most community MCP servers use stdio (a local command-line transport). To use these in Open WebUI, run the MCPO proxy to bridge the connection.
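Launching the bridge is a one-liner: point MCPO at the stdio command of the server you want to wrap. A sketch, assuming `uv`/`uvx` is installed and using the `mcp-server-time` package as a stand-in for your own server:

```shell
# Run MCPO on port 8000, wrapping a stdio-based MCP server.
# Everything after "--" is the server's own launch command.
uvx mcpo --port 8000 -- uvx mcp-server-time --local-timezone=America/New_York
```

MCPO then serves the wrapped tools over HTTP (here at `http://localhost:8000`), which you can add in Settings > Connections like any other server.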
4. OpenAPI / Function Calling Servers
Generic web servers that provide an OpenAPI (.json or .yaml) specification. Open WebUI can ingest these specs and treat every endpoint as a tool.
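For example, a spec fragment like the following (a hypothetical weather service, shown only to illustrate the mapping) would surface one tool, named after its `operationId`:

```json
{
  "openapi": "3.1.0",
  "info": { "title": "Weather API", "version": "1.0.0" },
  "paths": {
    "/weather": {
      "get": {
        "operationId": "get_current_weather",
        "summary": "Get current weather for a city",
        "parameters": [
          {
            "name": "city",
            "in": "query",
            "required": true,
            "schema": { "type": "string" }
          }
        ]
      }
    }
  }
}
```

The `summary` and parameter schemas become the tool description the model sees, so well-documented specs make for better tool selection.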
📦 How to Install & Manage Workspace Tools
Workspace Tools are the most common way to extend your instance with community features.
- Go to the Community Tool Library.
- Choose a Tool, then click the Get button.
- Enter your Open WebUI instance's URL (e.g. http://localhost:3000).
- Click Import to WebUI.
Never import a Tool you don't recognize or trust. These are Python scripts and might run unsafe code on your host system. Crucially, ensure you only grant "Tool" permissions to trusted users, as the ability to create or import Tools is equivalent to the ability to run arbitrary code on the server.
🧠 How to Use Tools in Chat
Once installed or connected, here's how to enable them for your conversations:
Option 1: Enable on-the-fly (Specific Chat)
While chatting, click the ➕ (plus) icon in the input area. You'll see a list of available Tools, and you can enable them specifically for that session.
Option 2: Enable by Default (Global/Model Level)
- Go to Workspace ➡️ Models.
- Choose the model you're using and click the ✏️ edit icon.
- Scroll to the Tools section.
- ✅ Check the Tools you want this model to always have access to by default.
- Click Save.
You can also let your LLM auto-select the right Tools using the AutoTool Filter.
🔧 Choosing How Tools Are Used: Default vs Native
Once Tools are enabled, Open WebUI gives you two different ways to let your LLM interact with them. You can switch this via the chat settings:

- Open a chat with your model.
- Click ⚙️ Chat Controls > Advanced Params.
- Look for the Function Calling setting and switch between:
💡 Default Mode (Prompt-based)
Here, your LLM doesn't need to natively support function calling. We guide the model using a smart tool-selection prompt template to select and use a Tool.
- ✅ Works with practically any model (including smaller local models).
- ❌ Not as reliable as Native Mode when chaining multiple complex tools.
- 💡 Admin Note: You can also toggle the default mode for each specific model in the Admin Panel > Settings > Models > Advanced Parameters.
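Conceptually, the prompt-based flow works roughly like this (an illustrative sketch, not Open WebUI's actual template; the tool names and template wording are made up):

```python
import json

# Tool descriptions as a selection prompt might present them to the model.
tools = {
    "web_search": "Search the web for a query.",
    "get_weather": "Get the current weather for a city.",
}


def build_selection_prompt(user_message: str) -> str:
    """Ask the model to pick a tool and its arguments as JSON."""
    tool_list = "\n".join(f"- {name}: {desc}" for name, desc in tools.items())
    return (
        "You may use one of these tools:\n"
        + tool_list
        + '\nReply ONLY with JSON like {"tool": "...", "arguments": {...}}.\n'
        + f"User message: {user_message}"
    )


# Pretend the model replied with this text; in reality it comes from the LLM.
model_reply = '{"tool": "get_weather", "arguments": {"city": "Paris"}}'
call = json.loads(model_reply)
print(call["tool"], call["arguments"])  # → get_weather {'city': 'Paris'}
```

Because the whole exchange is plain text plus JSON parsing, any model that can follow instructions can participate, which is exactly why Default Mode is so broadly compatible (and also why a model that emits malformed JSON makes it less reliable).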
🟢 Native Mode (Built-in Function Calling)
If your model supports native function calling (like GPT-4o, Gemini, Claude, or GPT-5), use this for a faster, more accurate experience where the LLM decides exactly when and how to call tools.
- β Fast, accurate, and can chain multiple tools in one response.
- β Requires a model that explicitly supports tool-calling schemas.
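In Native Mode the model receives machine-readable tool schemas rather than a prompt; with OpenAI-compatible backends these follow the familiar function-calling format. A sketch of what that exchange looks like (the exact payload Open WebUI sends is an internal detail, and this tool is hypothetical):

```python
import json

# OpenAI-style function-calling schema for one tool.
tool_schema = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# A native-capable model replies with a structured tool call instead of
# free text; arguments arrive as a JSON string to be parsed.
tool_call = {"name": "get_current_weather", "arguments": '{"city": "Paris"}'}
args = json.loads(tool_call["arguments"])
print(args["city"])  # → Paris
```

Because the call is structured rather than scraped out of prose, the runtime can validate arguments against the schema and chain several calls in one response, which is where Native Mode's speed and accuracy come from.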
| Mode | Who it's for | Pros | Cons |
|---|---|---|---|
| Default | Practically any model (basic/local) | Broad compatibility, safer, flexible | May be less accurate or slower |
| Native | GPT-4o, Gemini, Claude, GPT-5, etc. | Fast, smart, excellent tool chaining | Needs proper function call support |
Summary & Next Steps
Tools bring your AI to life by giving it hands to interact with the world.
- Browse Tools: openwebui.com/search
- Advanced Setup: Learn more about MCP Support
- Development: Writing your own Custom Toolkits