Highly experimental Matrix ↔︎ AI bridge for Beeper, built on top of mautrix/bridgev2. Supports any OpenAI-compatible provider, including OpenRouter.
It currently works best with alpha versions of Beeper Desktop. Beeper Plus users can use it without providing their own keys by picking the Beeper AI provider when adding an account.
- Multi-provider routing with prefixed model IDs (for example `openai/o3-mini`, `openai/gpt-5.2`, `anthropic/claude-sonnet-4.5` via OpenRouter)
- Per-model contacts so each model appears as its own chat contact
- Streaming responses with status updates
- Multimodal input (images, PDFs, audio, video) when the selected model supports it
- Per-room settings for model, temperature, system prompt, context limits, and tools
- User-managed keys via login flow, plus optional Beeper-managed credentials
- OpenClaw-style memory search over `MEMORY.md` + `memory/*.md` stored in the bridge DB
- Virtual file tools (`read`, `write`, `edit`, `ls`, `find`, `grep`) backed by SQLite
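The prefixed-model-ID routing above can be sketched as follows. This is an illustrative Python sketch of the general technique, not the bridge's actual code; the function name and the `openrouter` default are assumptions.

```python
# Illustrative sketch (not the bridge's actual implementation): splitting a
# prefixed model ID such as "anthropic/claude-sonnet-4.5" into a provider
# name and a bare model name.
def route_model_id(model_id: str, default_provider: str = "openrouter") -> tuple[str, str]:
    """Split "provider/model" into (provider, model).

    IDs without a prefix fall back to a default provider (an assumption
    made here for illustration).
    """
    provider, sep, model = model_id.partition("/")
    if not sep:  # no "/" prefix present
        return default_provider, model_id
    return provider, model

print(route_model_id("openai/o3-mini"))  # → ('openai', 'o3-mini')
print(route_model_id("anthropic/claude-sonnet-4.5"))  # → ('anthropic', 'claude-sonnet-4.5')
```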
Memory is DB-only (SQLite) and modeled after OpenClaw:

- `memory_search` returns snippet results with path + line ranges.
- `memory_get` reads line ranges from `MEMORY.md` or `memory/*.md`.
- Memory files are stored in the bridge DB (not on disk) and are per-agent.
- Optional session transcript indexing can be enabled via `network.memory_search.experimental.session_memory`.
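For example, session transcript indexing could be enabled with a config fragment like the following. Only the `network.memory_search.experimental.session_memory` key path comes from this README; the surrounding layout is an assumption.

```yaml
# Hypothetical config.yaml fragment (key path from this README,
# structure assumed).
network:
  memory_search:
    experimental:
      session_memory: true
```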
There are three login flows:
- Beeper: Select a Beeper domain (or set `network.beeper.base_url` in config) and provide a Beeper AI key unless it is already in `config.yaml`.
- Magic Proxy: Provide a Magic Proxy link (the token is taken from the URL fragment after `#`).
- Custom: Provide API keys for the services you want to use (OpenRouter/OpenAI plus optional search/fetch providers). Base URLs are configured only in `config.yaml`.
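The Magic Proxy token extraction described above (everything after `#` in the link) can be sketched like this. The function name and example URL are hypothetical; only the fragment-after-`#` behavior comes from this README.

```python
# Illustrative sketch: pull the token out of a Magic Proxy link's URL
# fragment (the part after "#").
from urllib.parse import urlsplit

def token_from_magic_proxy_link(link: str) -> str:
    fragment = urlsplit(link).fragment
    if not fragment:
        raise ValueError("Magic Proxy link has no #fragment token")
    return fragment

# Hypothetical link for illustration:
print(token_from_magic_proxy_link("https://proxy.example/login#s3cr3t"))  # → s3cr3t
```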
Base URL overrides live in config:

- `network.providers.openai.base_url`
- `network.providers.openrouter.base_url`
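A config fragment using these overrides might look like the following. The two key paths are from this README; the URLs are placeholder examples, not required values.

```yaml
# Hypothetical config.yaml fragment; URLs are illustrative placeholders.
network:
  providers:
    openai:
      base_url: https://api.openai.com/v1
    openrouter:
      base_url: https://openrouter.ai/api/v1
```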
Common to all methods:
- One-click setup for all Beeper Plus users (uses Beeper AI servers, rate limits apply)
- BYOK (bring your own key) for any provider compatible with the OpenAI Responses API
- Supports local LLM providers when running with On-Device or Self-Hosted
| Method | Sync | Where requests run | Local LLMs | Notes |
|---|---|---|---|---|
| Beeper Cloud | Syncs to all devices | Beeper servers | No | Good default when you want full sync and zero setup |
| Beeper On-Device | No sync | Your device | Yes | Everything stays on-device; Beeper never sees your messages |
| Beeper Self-Hosted | Syncs to all devices, encrypted | Your bridge host | Yes* | Beeper never sees your messages |
\* Local LLMs require an OpenAI-compatible provider endpoint.
Requires libolm for encryption support.
```sh
./build.sh
```

Or use Docker:

```sh
docker build -t ai .
```