
βš™οΈ What are Tools?

Tools are the various ways you can extend an LLM's capabilities beyond simple text generation. When enabled, they allow your chatbot to do amazing things: search the web, scrape data, generate images, talk back using AI voices, and more.

Because there are several ways to integrate "Tools" in Open WebUI, it's important to understand which type you are using.


🧩 Tooling Taxonomy: Which "Tool" are you using?

Users often encounter the term "Tools" in different contexts. Here is how to distinguish them:

| Type | Location in UI | Best For... | Source |
| --- | --- | --- | --- |
| Native Features | Admin/Settings | Core platform functionality | Built into Open WebUI |
| Workspace Tools | Workspace > Tools | User-created or community Python scripts | Community Library |
| Native MCP (HTTP) | Settings > Connections | Standard MCP servers reachable via HTTP/SSE | External MCP Servers |
| MCP via Proxy (MCPO) | Settings > Connections | Local stdio-based MCP servers (e.g., Claude Desktop tools) | MCPO Adapter |
| OpenAPI Servers | Settings > Connections | Standard REST/OpenAPI web services | External Web APIs |

1. Native Features (Built-in)

These are deeply integrated into Open WebUI and generally don't require external scripts.

  • Web Search: Integrated via engines like SearXNG, Google, or Tavily.
  • Image Generation: Integrated with DALL-E, ComfyUI, or Automatic1111.
  • RAG (Knowledge): The ability to query uploaded documents.

2. Workspace Tools (Custom Plugins)

These are Python scripts that run directly within the Open WebUI environment.

  • Capability: Can do anything Python can do (web scraping, complex math, API calls).
  • Access: Managed via the Workspace menu.
  • Safety: Always review code before importing, as these run on your server.
  • ⚠️ Security Warning: Normal or untrusted users should not be given permission to access the Workspace Tools section. This access allows a user to upload and execute arbitrary Python code on your server, which could lead to a full system compromise.
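As a concrete illustration, here is a minimal sketch of what a Workspace Tool file can look like: a `Tools` class whose typed, docstring-annotated methods are exposed to the model as callable tools. The tool name and behavior below are hypothetical examples for illustration, not built-ins.

```python
"""
title: Example Time Tool
description: Minimal illustrative Workspace Tool (hypothetical example).
version: 0.1.0
"""

from datetime import datetime, timezone


class Tools:
    def get_current_utc_time(self) -> str:
        """
        Return the current UTC time as an ISO 8601 string.
        The docstring and type hints are what the LLM sees
        when deciding whether to call this tool.
        """
        return datetime.now(timezone.utc).isoformat()
```

Because a file like this runs with the server's privileges, the warning above applies to every line of it: review tools before importing them.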

3. MCP (Model Context Protocol) 🔌

MCP is an open standard that allows LLMs to interact with external data and tools.

  • Native HTTP MCP: Open WebUI can connect directly to any MCP server that exposes an HTTP/SSE endpoint.
  • MCPO (Proxy): Most community MCP servers use stdio (local command line). Open WebUI can't reach these directly, so you bridge them through the MCPO proxy, which exposes the stdio server over HTTP.
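To make the bridging concrete, here is a hedged sketch of how a client might call a tool once MCPO has exposed it over HTTP. The base URL, tool name, and payload below are assumptions for illustration; the actual routes come from the proxied MCP server's own schema.

```python
# Hypothetical sketch: once MCPO is running (assumed here at
# http://localhost:8000), each proxied MCP tool becomes a plain HTTP
# endpoint. The tool name "get_weather" and its arguments are invented
# for illustration.
import json
import urllib.request


def build_tool_request(base_url: str, tool_name: str, arguments: dict) -> urllib.request.Request:
    """Build a POST request for an MCPO-proxied tool endpoint."""
    return urllib.request.Request(
        url=f"{base_url}/{tool_name}",
        data=json.dumps(arguments).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# The request would then be sent with urllib.request.urlopen(req).
req = build_tool_request("http://localhost:8000", "get_weather", {"city": "Berlin"})
```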

4. OpenAPI / Function Calling Servers

Generic web servers that provide an OpenAPI (.json or .yaml) specification. Open WebUI can ingest these specs and treat every endpoint as a tool.
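For a sense of what gets ingested, here is a minimal, hypothetical OpenAPI document (expressed as a Python dict) describing a single-endpoint tool server. An operation like this would surface as one callable tool; the path, `operationId`, and schema are invented for illustration.

```python
# A minimal, hypothetical OpenAPI 3.1 document for a one-endpoint tool
# server. Each operation (here POST /get_weather) is presented to the
# model as a callable tool; the operationId and description are what
# the model sees when deciding to call it.
minimal_spec = {
    "openapi": "3.1.0",
    "info": {"title": "Weather Tool Server", "version": "1.0.0"},
    "paths": {
        "/get_weather": {
            "post": {
                "operationId": "get_weather",
                "description": "Get current weather for a city.",
                "requestBody": {
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "properties": {"city": {"type": "string"}},
                                "required": ["city"],
                            }
                        }
                    }
                },
                "responses": {"200": {"description": "Weather report"}},
            }
        }
    },
}
```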


📦 How to Install & Manage Workspace Tools

Workspace Tools are the most common way to extend your instance with community features.

  1. Go to the Community Tool Library.
  2. Choose a Tool, then click the Get button.
  3. Enter your Open WebUI instance’s URL (e.g. http://localhost:3000).
  4. Click Import to WebUI.
Safety Tip

Never import a Tool you don’t recognize or trust. These are Python scripts and might run unsafe code on your host system. Crucially, ensure you only grant "Tool" permissions to trusted users, as the ability to create or import tools is equivalent to the ability to run arbitrary code on the server.


🔧 How to Use Tools in Chat

Once installed or connected, here’s how to enable them for your conversations:

Option 1: Enable on-the-fly (Specific Chat)

While chatting, click the ➕ (plus) icon in the input area. You'll see a list of available Tools, which you can enable specifically for that session.

Option 2: Enable by Default (Global/Model Level)

  1. Go to Workspace ➑️ Models.
  2. Choose the model you’re using and click the ✏️ edit icon.
  3. Scroll to the Tools section.
  4. ✅ Check the Tools you want this model to always have access to by default.
  5. Click Save.

You can also let your LLM auto-select the right Tools using the AutoTool Filter.


🧠 Choosing How Tools Are Used: Default vs Native

Once Tools are enabled, Open WebUI gives you two different ways to let your LLM interact with them. You can switch this via the chat settings:

Chat Controls

  1. Open a chat with your model.
  2. Click ⚙️ Chat Controls > Advanced Params.
  3. Look for the Function Calling setting and switch between:

🟡 Default Mode (Prompt-based)

Here, your LLM doesn't need native function-calling support. Open WebUI guides the model with a tool-selection prompt template that instructs it to pick and invoke a Tool.

  • ✅ Works with practically any model (including smaller local models).

  • 💡 Admin Note: You can also toggle the default mode for each specific model in the Admin Panel > Settings > Models > Advanced Parameters.

  • ❗ Not as reliable as Native Mode when chaining multiple complex tools.
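The idea behind prompt-based tool use can be sketched as follows. This is an illustrative simplification, not Open WebUI's actual template: tool descriptions are injected into the prompt, and the model's JSON reply is parsed to decide which tool to run.

```python
# Illustrative sketch of prompt-based tool selection (Default Mode).
# The tool names, descriptions, and prompt wording are invented examples.
import json

TOOLS = [
    {"name": "get_weather", "description": "Get current weather for a city."},
    {"name": "web_search", "description": "Search the web for a query."},
]


def build_selection_prompt(user_message: str) -> str:
    """Inject the tool list into a prompt that asks for a JSON reply."""
    tool_list = "\n".join(f"- {t['name']}: {t['description']}" for t in TOOLS)
    return (
        "You have access to these tools:\n"
        f"{tool_list}\n"
        'Reply ONLY with JSON: {"tool": "<name>", "arguments": {...}}\n\n'
        f"User: {user_message}"
    )


def parse_selection(model_reply: str) -> dict:
    # A real implementation needs fallbacks: prompt-based calling
    # fails whenever the model strays from the requested JSON shape,
    # which is exactly why this mode is less reliable for chaining.
    return json.loads(model_reply)


choice = parse_selection('{"tool": "get_weather", "arguments": {"city": "Oslo"}}')
```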

🟢 Native Mode (Built-in Function Calling)

If your model supports native function calling (like GPT-4o, Gemini, Claude, or GPT-5), use this for a faster, more accurate experience where the LLM decides exactly when and how to call tools.

  • ✅ Fast, accurate, and can chain multiple tools in one response.
  • ❗ Requires a model that explicitly supports tool-calling schemas.
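For contrast with Default Mode, here is a sketch of the kind of request Native Mode relies on: tool schemas travel in a structured `tools` field instead of being pasted into the prompt, and the model replies with structured tool calls. The shape below follows the common OpenAI-style convention as an assumption; other providers use similar but not identical schemas.

```python
# Hypothetical OpenAI-style chat request illustrating native function
# calling: the tool's JSON Schema is a first-class field, so the model
# can emit structured tool_calls rather than free-text JSON.
native_request = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}
```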

| Mode | Who it's for | Pros | Cons |
| --- | --- | --- | --- |
| Default | Practically any model (basic/local) | Broad compatibility, safer, flexible | May be less accurate or slower |
| Native | GPT-4o, Gemini, Claude, GPT-5, etc. | Fast, smart, excellent tool chaining | Needs proper function call support |

🚀 Summary & Next Steps

Tools bring your AI to life by giving it hands to interact with the world.