
# ttyclaude

A minimal, keyboard-driven terminal client for conversing with Claude AI. Designed for Unix workflow integration, particularly with screen/tmux and Claude Code.

## Why?

Because switching to a browser breaks flow. Because your terminal is your home. Because `screen ttyclaude` should be as natural as `vim config.toml`.

The workflow:

```bash
screen -S work
# window 0: ttyclaude - strategy, planning, discussion
# window 1: claude - tactical execution, coding
# Ctrl-a 0/1 - flip between thinking and doing
```

## Status

🚧 v0.2.1 - Functional MVP

The core application is working! You can:

  • Chat with Claude via Anthropic API or OpenAI-compatible endpoints
  • Stream responses in real-time with beautiful markdown rendering
  • Switch between multiple buffers (like irssi/weechat)
  • Toggle between markdown and plain text rendering
  • Navigate input history

Still missing: Persistence (conversations lost on exit), vim modal editing, conversation search.

This is a learning project - our first substantial Rust application published on Codeberg. Expect iteration, refactoring, and occasional "interesting" architectural decisions as we learn.

## Features

### Implemented (v0.2.1)

  • Native Terminal UI - No browser, no Electron, pure terminal
  • Glow-style Markdown - Beautiful rendering with termimad (toggle with Ctrl-T)
  • Streaming Responses - Real-time token-by-token display with live markdown rendering
  • Multi-Buffer Support - Switch between conversations like irssi/weechat
  • Multiple Providers - Anthropic Claude, OpenAI-compatible endpoints, local LLMs
  • Config File - ~/.config/ttyclaude/config.ini for settings and API keys
  • Input History - Navigate previous inputs with Ctrl-P/Ctrl-N
  • Fast - Sub-100ms startup, minimal memory footprint

### Planned

  • Local Storage - sqlite for conversation persistence (v0.3)
  • Vim Bindings - Modal editing (v0.4)
  • Scriptable - Pipe content in/out, integrate with shell (v0.5)
  • Search - Find text within conversations (v0.4)
  • Export - Save conversations as markdown files (v0.3)
  • Private - No telemetry, no phone-home (already true, just not documented)

## Why Markdown Rendering Matters

Claude speaks markdown. We write markdown. Code blocks, tables, lists - they all look gorgeous when properly rendered. Why settle for raw text when you can have the Glow experience right in your conversation?

````
Instead of this:          You get this:

## Heading                ╔══════════════════╗
- item one                ║ Heading          ║
- item two                ╚══════════════════╝
```rust                   • item one
fn main() {}              • item two
                          ┌─────────────────┐
                          │ fn main() {}    │
                          └─────────────────┘
````

## Prerequisites

- Unix-like OS (Linux, BSD, macOS)
- Rust toolchain (1.70+) - Install from [rustup.rs](https://rustup.rs)
- Anthropic API key - Get from [console.anthropic.com](https://console.anthropic.com)

## Installation

```bash
# Clone
git clone https://codeberg.org/Threadpanic/ttyclaude
cd ttyclaude

# Build
cargo build --release

# Install (optional)
cargo install --path .

# Or copy binary
cp target/release/ttyclaude ~/.local/bin/
```

## Configuration

Create `~/.config/ttyclaude/config.ini`:

```ini
[general]
default_provider = anthropic
history_size = 100

[provider.anthropic]
api_key = env:ANTHROPIC_API_KEY
model = claude-sonnet-4-20250514
max_tokens = 4096
temperature = 0.7

[provider.lmstudio]
endpoint = http://127.0.0.1:1234/v1/chat/completions

[ui]
vim_mode = false
markdown_theme = auto
code_theme = monokai
```

API Key Options:

  • env:VARIABLE_NAME - Read from environment variable (recommended)
  • sk-ant-... - Literal key value (not recommended for version control)
  • Or set ANTHROPIC_API_KEY environment variable directly
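If you use the `env:` form, it can help to confirm the variable is actually set before launching. A minimal sketch (the `check_key` helper is ours for illustration, not part of ttyclaude):

```bash
# check_key is a hypothetical helper, not part of ttyclaude.
# It reports whether the variable referenced as env:NAME in config.ini
# is set in the current environment.
check_key() {
  if [ -n "$(printenv "$1")" ]; then
    echo "ok: $1 is set"
  else
    echo "missing: $1" >&2
    return 1
  fi
}
```

Something like `check_key ANTHROPIC_API_KEY && ttyclaude` fails fast in the shell instead of at the first API request.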

## Usage

```bash
# Start with default provider from config
ttyclaude

# Override provider via environment
PROVIDER=anthropic ttyclaude

# Use LM Studio (OpenAI-compatible local endpoint)
PROVIDER=lmstudio ttyclaude
```
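An OpenAI-compatible endpoint like LM Studio's accepts the standard chat-completions JSON, which is handy for sanity-checking the local server before pointing ttyclaude at it. A sketch (the `make_chat_request` helper and the model name are illustrative, not part of ttyclaude):

```bash
# Hypothetical helper: builds the JSON body an OpenAI-compatible
# /v1/chat/completions endpoint expects. Not part of ttyclaude.
make_chat_request() {
  # $1 = model name, $2 = user message; "stream": true asks for token streaming
  printf '{"model":"%s","messages":[{"role":"user","content":"%s"}],"stream":true}' "$1" "$2"
}
```

For example, `make_chat_request local-model "hello" | curl -s http://127.0.0.1:1234/v1/chat/completions -H 'Content-Type: application/json' -d @-` should stream tokens back if the endpoint from the config above is up.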

## Key Bindings (v0.2.1)

Message Input:

  • Enter - Insert newline (multi-line input)
  • F5 or Ctrl-Y - Send message
  • Backspace - Delete character
  • Ctrl-P / Ctrl-N - Navigate input history (previous/next)

Response Window:

  • PageUp / PageDown - Scroll response window
  • Ctrl-T - Toggle markdown/plain text rendering
  • Ctrl-L - Clear screen

Buffer Management:

  • Alt-1 / Alt-2 / Alt-3 ... - Switch to buffer 1, 2, 3, etc.
  • ESC-1 / ESC-2 / ESC-3 ... - Alternative (for terminals that intercept Alt)
  • Ctrl-Left / Ctrl-Right - Cycle through buffers

Application:

  • Ctrl-Q - Quit

Commands (type in input):

  • /buffer <name> - Create/switch to buffer
  • /provider <name> - Switch provider (echo, shell, lmstudio, anthropic)
  • /help - Show help (not yet implemented)

Note: Vim modal editing (hjkl navigation, command mode) is planned but not yet implemented.

## Roadmap

  • v0.1 - MVP: Basic TUI, streaming, single session
  • v0.2.1 - Multi-buffer, markdown rendering, multiple providers
  • v0.3 - Persistence: sqlite storage, history, export
  • v0.4 - Enhanced UI: vim mode, syntax highlighting, search
  • v0.5 - Integration: pipes, clipboard, files
  • v0.6 - Advanced: branching, conversation search, themes

## Design Philosophy

  1. Minimal dependencies - Standard Unix tools mindset
  2. Keyboard-first - Mouse is a crutch
  3. Fast - Instant response, no bloat
  4. Scriptable - Play nice with pipes and scripts
  5. Local-first - Your data stays yours
  6. No surprises - Do what you expect, nothing more

## Comparison

| Feature | ttyclaude v0.2.1 | Web UI | Claude Code |
|---|---|---|---|
| Terminal native | ✓ | ✗ | ✓ |
| Glow-style markdown | ✓ | ✓ | ✗ |
| Streaming responses | ✓ | ✓ | ✓ |
| Multi-buffer/sessions | ✓ | ✓ (Projects) | ✗ |
| Conversation history | (v0.3) | ✓ | ✗ |
| Code execution | ✗ | ✓ | ✓ |
| Vim bindings | (v0.4) | ✗ | ✗ |
| Local storage | (v0.3) | Cloud | Local |
| Scriptable I/O | (v0.5) | ✗ | ✓ |
| Screen/tmux native | ✓ | ✗ | ✓ |
| Multiple LLM providers | ✓ | ✗ | ✗ |

Use ttyclaude for planning, discussion, design, thinking, and multi-LLM workflows.
Use Claude Code for implementation, debugging, refactoring, and file operations.
Use the Web UI for artifacts, images, and complex interactions.

## Contributing

This is an open design process. See DESIGN.md and CLAUDE.md for architecture.

Current focus (v0.3):

  • sqlite schema for conversation persistence
  • Export functionality (markdown format)
  • Conversation list/search UI
  • Resume conversations
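To make the persistence discussion concrete, here is one way the sqlite schema could look. This is purely a sketch to invite feedback; the table and column names are ours, not the project's actual design:

```bash
# Hypothetical schema sketch for the planned v0.3 persistence layer.
# All names here are illustrative, not ttyclaude's actual design.
DB="$(mktemp -u).db"
sqlite3 "$DB" <<'SQL'
CREATE TABLE conversations (
  id         INTEGER PRIMARY KEY,
  title      TEXT NOT NULL,
  created_at TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE TABLE messages (
  id              INTEGER PRIMARY KEY,
  conversation_id INTEGER NOT NULL REFERENCES conversations(id),
  role            TEXT NOT NULL CHECK (role IN ('user','assistant')),
  body            TEXT NOT NULL,
  created_at      TEXT NOT NULL DEFAULT (datetime('now'))
);
SQL
```

A one-table-per-message layout like this keeps "resume conversation" a simple `SELECT ... ORDER BY id` and makes markdown export a straightforward walk over rows.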

Pull requests welcome for:

  • Core functionality
  • Bug fixes
  • Documentation
  • Tests

Not accepting:

  • GUI/web versions
  • Telemetry/analytics
  • "Enterprise" features
  • Complexity without clear value

## Development

```bash
# Build
cargo build

# Run
cargo run

# Test
cargo test

# Check
cargo clippy
```

## License

TBD - Likely GPL-3.0 or MIT

## Acknowledgments

Inspired by:

  • irssi, weechat - Terminal chat done right
  • vim - Modal editing paradigm
  • screen, tmux - Terminal multiplexing
  • Claude Code - API integration for developers
  • broot - Excellent use of termimad

Built as a learning project to explore Rust + async + TUI development while solving a real workflow problem.

## Contact


Built by terminal nerds, for terminal nerds.