The Power of AI in Your IDE
DevoxxGenie is a fully Java-based LLM Code Assistant plugin for IntelliJ IDEA, designed to integrate with both local and cloud-based LLM providers.
With DevoxxGenie, developers can leverage the power of artificial intelligence to improve code quality, solve problems faster, and learn new concepts, all within their familiar IDE environment.
100% Open Source and Free - DevoxxGenie is completely open source and free to use, following the BYOK (Bring Your Own Keys) model for LLM API keys.
Unlock the Full Potential of AI in Your Development Workflow
Spec-driven Development
Define what needs to be built as structured task specs with acceptance criteria, and let the LLM agent figure out how to build it. Powered by Backlog.md integration, spec-driven development (SDD) brings disciplined, traceable AI-assisted development to your IDE.
Backlog Browser: Browse tasks grouped by status (To Do, In Progress, Done) in a dedicated tool window. Select any task and click "Implement with Agent" to inject the full spec as structured context into the LLM prompt.
17 Built-in Backlog Tools: Just type "Create a task for..." in the prompt to create structured specs instantly. The agent can also edit, search, and complete tasks, and manage documents and milestones, all from natural language. Acceptance criteria are checked off in real time as the agent works. A sample Backlog.md task spec is shown below.
---
id: TASK-42
title: Add caching layer
status: To Do
priority: high
milestone: v2.0
---

## Acceptance Criteria
- [ ] Redis client configured
- [ ] GET endpoints cached
- [ ] Cache invalidation on writes
- [ ] Metrics exposed
- [ ] Graceful degradation
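For example, typing "Create a task for a Redis caching layer with metrics and graceful degradation" would generate a spec along these lines, which you can then hand to the agent with "Implement with Agent".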

Advanced MCP Support
DevoxxGenie supports the Model Context Protocol (MCP), which enables advanced agent-like capabilities: the LLM can call external tools and services to produce more comprehensive and accurate responses.
Built-in MCP Marketplace: Browse, search, and install MCP servers directly from within IntelliJ IDEA. Discover tools for filesystem access, databases, web browsing, and more — all without leaving your IDE.
MCP Debugging: Built-in debugging panel lets you monitor MCP requests and responses in real-time, making it easy to understand how the LLM interacts with external tools and troubleshoot any issues.
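Under the hood, MCP traffic is JSON-RPC 2.0, which is exactly what the debugging panel surfaces. A tool invocation from the LLM to an MCP server looks roughly like this (illustrative payload; the tool name and arguments are made up for this example):

{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "read_file",
    "arguments": { "path": "src/main/java/App.java" }
  }
}

The server replies with a matching JSON-RPC result containing the tool output, and both directions of the exchange show up in the debugging panel.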
Agent Mode
Agent Mode enables the LLM to autonomously explore and modify your codebase using built-in tools — reading files, listing directories, searching for patterns, and making targeted edits.
Works with Local Models: Run Agent Mode entirely on your machine using powerful local models like GLM-4.7-flash via Ollama — no cloud API keys required.
Built-in Tools: read_file, write_file, edit_file, list_files, search_files, and run_command — all managed by the LLM with safety approvals for write operations.
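Conceptually, each built-in tool pairs an action with an approval gate. The Java sketch below illustrates that idea only; the interface and method names are hypothetical, not DevoxxGenie's actual internal API:

import java.util.Map;

// Hypothetical sketch of a built-in agent tool with a safety approval gate for
// write operations. Names are illustrative, not DevoxxGenie's real API.
interface AgentTool {
    String name();                       // e.g. "read_file", "edit_file"
    boolean mutatesWorkspace();          // true for write_file, edit_file, run_command
    String execute(Map<String, String> arguments) throws Exception;
}

interface UserApproval {
    boolean confirm(String toolName, Map<String, String> arguments);
}

final class AgentLoop {
    // Read-only tools run immediately; mutating tools wait for explicit user approval.
    String invoke(AgentTool tool, Map<String, String> args, UserApproval approval) throws Exception {
        if (tool.mutatesWorkspace() && !approval.confirm(tool.name(), args)) {
            return "Tool call rejected by the user.";
        }
        return tool.execute(args);
    }
}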


Parallel Sub-Agents
Spawn multiple read-only AI assistants that concurrently investigate different aspects of your project. Each sub-agent can use a different LLM provider/model and runs with its own isolated context and tool call budget.
Cost Optimization: Use smaller, faster models for sub-agents (e.g., gpt-4o-mini, gemini-flash) while keeping a powerful model for the main agent coordinator.
Per-Agent Model Overrides: Dynamically add or remove sub-agent slots (up to 10) and assign a specific model to each one — mix cloud and local providers as needed.
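As a mental model, each sub-agent is an independent, read-only worker submitted to a pool, carrying its own provider, model, and tool-call budget. The Java sketch below illustrates that idea only; the class and method names are hypothetical:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical illustration: parallel, read-only sub-agents, each with its own
// provider/model and tool-call budget. Not DevoxxGenie's real API.
record SubAgentTask(String question, String provider, String model, int toolCallBudget) {}

final class SubAgentRunner {

    List<String> investigate(List<SubAgentTask> tasks) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(tasks.size());
        try {
            List<Callable<String>> jobs = new ArrayList<>();
            for (SubAgentTask task : tasks) {
                jobs.add(() -> runReadOnlyAgent(task));   // isolated context per sub-agent
            }
            List<String> findings = new ArrayList<>();
            for (Future<String> future : pool.invokeAll(jobs)) {
                findings.add(future.get());               // the main agent merges these findings
            }
            return findings;
        } finally {
            pool.shutdown();
        }
    }

    private String runReadOnlyAgent(SubAgentTask task) {
        // Placeholder: would call the configured provider/model with read-only tools,
        // stopping once toolCallBudget is exhausted.
        return task.model() + " finding for: " + task.question();
    }
}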
CLI Runners
Execute prompts and spec tasks via external CLI tools like Claude Code, GitHub Copilot, OpenAI Codex, Google Gemini CLI, and Kimi — directly from the chat interface or the Spec Browser.
Chat Mode: Select a CLI runner as your provider and chat naturally. DevoxxGenie routes your prompts to the external CLI tool and streams the response back into the conversation.
Spec Task Execution: Run individual or batch spec tasks through any configured CLI runner. Combined with the Agent Loop, you can delegate entire task backlogs to your preferred coding assistant.
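Mechanically, a CLI runner amounts to launching the external tool as a subprocess, passing it the prompt, and streaming its stdout back into the conversation. A hypothetical Java sketch of that flow (the command-line flag is a placeholder, not the exact invocation DevoxxGenie or any specific CLI uses):

import java.io.BufferedReader;
import java.io.InputStreamReader;

// Hypothetical sketch of routing a prompt through an external CLI tool
// and streaming its output back. Command and flag are placeholders.
final class CliRunner {
    void run(String cliExecutable, String prompt) throws Exception {
        Process process = new ProcessBuilder(cliExecutable, "--prompt", prompt)
                .redirectErrorStream(true)
                .start();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);   // in the plugin, this streams into the chat panel
            }
        }
        process.waitFor();
    }
}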

Start Using DevoxxGenie Today
Join thousands of developers who are already using DevoxxGenie to improve their productivity.
100% free and open source with no hidden costs - just bring your own API keys (BYOK)!