
beeper/ai-bridge


Bridge AI chats into Beeper

Highly experimental Matrix ↔︎ AI bridge for Beeper, built on top of mautrix/bridgev2. Supports any OpenAI-compatible provider, including OpenRouter.

Currently works best with alpha versions of Beeper Desktop. Beeper Plus users can use it without providing their own keys by picking the Beeper AI provider when adding an account.

Highlights

  • Multi-provider routing with prefixed model IDs (for example openai/o3-mini, openai/gpt-5.2, anthropic/claude-sonnet-4.5 via OpenRouter)
  • Per-model contacts so each model appears as its own chat contact
  • Streaming responses with status updates
  • Multimodal input (images, PDFs, audio, video) when the selected model supports it
  • Per-room settings for model, temperature, system prompt, context limits, and tools
  • User-managed keys via login flow, plus optional Beeper-managed credentials
  • OpenClaw-style memory search over MEMORY.md + memory/*.md stored in the bridge DB
  • Virtual file tools (read, write, edit, ls, find, grep) backed by SQLite
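The prefixed model IDs above can be sketched as a simple provider/model split. This is an illustrative sketch under assumptions, not the bridge's actual routing code: splitModelID and the default-provider fallback are made up for the example.

```go
package main

import (
	"fmt"
	"strings"
)

// splitModelID splits a prefixed model ID like "anthropic/claude-sonnet-4.5"
// into a provider prefix and a bare model name. Illustrative only; the
// fallback to "openai" for unprefixed IDs is an assumption.
func splitModelID(id string) (provider, model string) {
	if prefix, rest, ok := strings.Cut(id, "/"); ok {
		return prefix, rest
	}
	// No prefix: assume a default provider.
	return "openai", id
}

func main() {
	provider, model := splitModelID("anthropic/claude-sonnet-4.5")
	fmt.Println(provider, model) // anthropic claude-sonnet-4.5
}
```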

Memory

Memory is DB-only (SQLite) and modeled after OpenClaw:

  • memory_search returns snippet results with path + line ranges.
  • memory_get reads line ranges from MEMORY.md or memory/*.md.
  • Memory files are stored in the bridge DB (not on disk) and are per-agent.
  • Optional session transcript indexing can be enabled via network.memory_search.experimental.session_memory.
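A minimal sketch of what a memory_get line-range read could look like. The bridge DB is replaced with an in-memory map, and getLines plus the sample file content are assumptions for illustration; only "read line ranges from MEMORY.md or memory/*.md" comes from the list above.

```go
package main

import (
	"fmt"
	"strings"
)

// Stand-in for memory files stored in the bridge DB (per-agent, not on disk).
var memoryFiles = map[string]string{
	"MEMORY.md": "# Memory\n\nUser prefers concise answers.\nTimezone: UTC.\n",
}

// getLines returns lines [from, to] (1-based, inclusive) of a stored memory
// file, mimicking a memory_get tool call.
func getLines(path string, from, to int) (string, error) {
	content, ok := memoryFiles[path]
	if !ok {
		return "", fmt.Errorf("no such memory file: %s", path)
	}
	lines := strings.Split(content, "\n")
	if from < 1 || to > len(lines) || from > to {
		return "", fmt.Errorf("line range %d-%d out of bounds", from, to)
	}
	return strings.Join(lines[from-1:to], "\n"), nil
}

func main() {
	snippet, err := getLines("MEMORY.md", 3, 4)
	if err != nil {
		panic(err)
	}
	fmt.Println(snippet)
}
```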

Login flows

There are three login flows:

  • Beeper: Select a Beeper domain (or set network.beeper.base_url in config) and provide a Beeper AI key unless it is already in config.yaml.
  • Magic Proxy: Provide a Magic Proxy link (the token is taken from the URL fragment after #).
  • Custom: Provide API keys for the services you want to use (OpenRouter/OpenAI plus optional search/fetch providers). Base URLs are configured only in config.yaml.
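The Magic Proxy step above (token taken from the URL fragment after #) can be sketched like this. The link format and the tokenFromMagicLink helper are hypothetical; only the fragment-extraction behavior comes from the flow description.

```go
package main

import (
	"fmt"
	"net/url"
)

// tokenFromMagicLink extracts the token from a Magic Proxy link's URL
// fragment (the part after #). The link shape shown below is hypothetical.
func tokenFromMagicLink(link string) (string, error) {
	u, err := url.Parse(link)
	if err != nil {
		return "", err
	}
	if u.Fragment == "" {
		return "", fmt.Errorf("magic link has no #fragment token")
	}
	return u.Fragment, nil
}

func main() {
	tok, err := tokenFromMagicLink("https://proxy.example.com/magic#tok-abc123")
	if err != nil {
		panic(err)
	}
	fmt.Println(tok) // tok-abc123
}
```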

Base URL overrides live in config:

  • network.providers.openai.base_url
  • network.providers.openrouter.base_url
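For illustration, the overrides above might look like this in config.yaml. Only the two key paths come from this README; the nesting shape and the base URL values are placeholders.

```yaml
network:
  providers:
    openai:
      base_url: http://localhost:11434/v1   # e.g. a local OpenAI-compatible server
    openrouter:
      base_url: https://openrouter.ai/api/v1
```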

When to use what

Common to all methods:

  • One-click setup for all Beeper Plus users (uses Beeper AI servers, rate limits apply)
  • BYOK (bring your own key) for any OpenAI Responses API-compatible provider
    • Supports local LLM providers when running with On-Device or Self-Hosted
| Method | Sync | Where requests run | Local LLMs | Notes |
| --- | --- | --- | --- | --- |
| Beeper Cloud | Syncs to all devices | Beeper servers | No | Good default when you want full sync and zero setup |
| Beeper On-Device | No sync | Your device | Yes | Everything stays on-device; Beeper never sees your messages |
| Beeper Self-Hosted | Syncs to all devices, encrypted | Your bridge host | Yes* | Beeper never sees your messages |

\* Local LLMs require an OpenAI-compatible provider endpoint.

Build

Requires libolm for encryption support.

./build.sh

Or use Docker:

docker build -t ai .
