AI-powered document title generator for Paperless NGX, supporting OpenAI, Anthropic Claude, and Ollama.
ngx-renamer automatically generates intelligent titles for your Paperless NGX documents using AI. When you upload a document, it analyzes the OCR text and creates a meaningful title instead of generic filenames like scan_001.pdf.
Example transformations:
- `scan_2024_03_15.pdf` → Amazon - Monthly Prime Subscription Invoice
- `IMG_2043.pdf` → Deutsche Bank - Account Statement March 2024
- `document.pdf` → Versicherung - Änderungen AVB DKV 2026
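Under the hood, ngx-renamer runs as a Paperless post-consume hook: it fetches the OCR text of the newly consumed document from the Paperless API, asks the configured LLM for a title, and writes that title back to the document. The sketch below only illustrates that flow; it is not the project's actual code, and `generate_title` is a placeholder for the provider-specific call.

```python
# Illustrative sketch of a Paperless post-consume title hook (not the actual
# ngx-renamer code). Paperless passes the new document's id to post-consume
# scripts via the DOCUMENT_ID environment variable.
import os
import requests

PAPERLESS_URL = os.environ["PAPERLESS_NGX_URL"]        # e.g. http://webserver:8000/api
PAPERLESS_TOKEN = os.environ["PAPERLESS_NGX_API_KEY"]
HEADERS = {"Authorization": f"Token {PAPERLESS_TOKEN}"}

def generate_title(ocr_text: str) -> str:
    """Placeholder for the LLM call (OpenAI, Claude, or Ollama)."""
    raise NotImplementedError

def rename_document(document_id: int) -> None:
    # 1. Fetch the document, including its OCR'd content
    doc = requests.get(f"{PAPERLESS_URL}/documents/{document_id}/", headers=HEADERS).json()
    # 2. Ask the LLM for a title based on the OCR text
    title = generate_title(doc["content"])
    # 3. Write the new title back to Paperless
    requests.patch(
        f"{PAPERLESS_URL}/documents/{document_id}/",
        headers=HEADERS,
        json={"title": title},
    ).raise_for_status()

if __name__ == "__main__":
    rename_document(int(os.environ["DOCUMENT_ID"]))
```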
- Multiple LLM Providers - OpenAI (cloud), Anthropic Claude (cloud), or Ollama (local/private)
- Zero-Setup Installation - Automatic initialization, no manual setup
- Smart Title Generation - Context-aware titles in the document's language
- Configurable Prompts - Customize via YAML settings
- Production Ready - Handles errors gracefully
- Paperless NGX in Docker
- Choose one: an OpenAI API key, an Anthropic Claude API key, or a local Ollama installation
- Paperless API token (from your user profile)
- Quick Start
- LLM Provider Setup - OpenAI, Claude, or Ollama
- Configuration
- Troubleshooting
- Development & Architecture - See ARCHITECTURE.md
Choose your installation method:
Fully automated setup with persistent configuration.
1. Clone the repository
```bash
cd ~/paperless   # Your docker-compose.yml location
git clone https://github.com/chriskoch/ngx-renamer.git ngx-renamer
```

2. Configure credentials
Add to your docker-compose.env:
```env
PAPERLESS_NGX_API_KEY=your-paperless-api-token
PAPERLESS_NGX_URL=http://webserver:8000/api

# For OpenAI:
OPENAI_API_KEY=sk-your-key-here

# OR for Anthropic Claude:
CLAUDE_API_KEY=sk-ant-your-key-here

# OR for Ollama:
OLLAMA_BASE_URL=http://host.docker.internal:11434
OLLAMA_API_KEY=   # Optional, leave empty for local/unauthenticated Ollama
```

3. Update docker-compose.yml
Add to your webserver service:
```yaml
webserver:
  volumes:
    - ./ngx-renamer:/usr/src/ngx-renamer:ro
    - ngx-renamer-venv:/usr/src/ngx-renamer-venv
  environment:
    PAPERLESS_POST_CONSUME_SCRIPT: /usr/src/ngx-renamer/scripts/post_consume_wrapper.sh
  entrypoint: /usr/src/ngx-renamer/scripts/init-and-start.sh

volumes:
  ngx-renamer-venv:
```

4. Restart

```bash
docker compose down && docker compose up -d
```

First startup takes 30-60 seconds. Check the logs: `docker compose logs webserver | grep ngx-renamer`
Minimal setup - one file only.
1. Download the standalone script
```bash
cd ~/paperless
wget https://raw.githubusercontent.com/<repo>/ngx-renamer/main/ngx-renamer-standalone.py
```

2. Update docker-compose.yml
```yaml
webserver:
  volumes:
    - ./ngx-renamer-standalone.py:/usr/local/bin/ngx-renamer.py:ro
  environment:
    PAPERLESS_POST_CONSUME_SCRIPT: python3 /usr/local/bin/ngx-renamer.py
    OPENAI_API_KEY: ${OPENAI_API_KEY}
    PAPERLESS_NGX_URL: http://webserver:8000/api
    PAPERLESS_NGX_API_KEY: ${PAPERLESS_API_KEY}
```

3. Add credentials to .env

```env
OPENAI_API_KEY=sk-your-key-here
PAPERLESS_API_KEY=your-token-here
```

4. Restart

```bash
docker compose down && docker compose up -d
```

Choose between OpenAI (cloud), Anthropic Claude (cloud), or Ollama (local/private).
Quick setup for cloud-based AI:
- Get API key: https://platform.openai.com/settings/organization/api-keys
- Add to docker-compose.env:

  ```env
  OPENAI_API_KEY=sk-your-key-here
  ```

- Edit settings.yaml:

  ```yaml
  llm_provider: "openai"

  openai:
    model: "gpt-4o-mini"   # or "gpt-4o" for better quality
  ```
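For reference, the OpenAI request behind this setup boils down to a single chat-completion call. The snippet below is a minimal illustration using the official `openai` Python package, not ngx-renamer's actual code, and the prompt is shortened:

```python
# Minimal illustration of a title request against OpenAI (not ngx-renamer's code).
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

def openai_title(ocr_text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Generate a short document title: Sender - Brief Description."},
            {"role": "user", "content": ocr_text[:4000]},  # truncate long OCR text
        ],
    )
    return response.choices[0].message.content.strip()
```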
High-quality cloud AI with strong reasoning:
- Get API key: https://console.anthropic.com/settings/keys
- Add to docker-compose.env:

  ```env
  CLAUDE_API_KEY=sk-ant-your-key-here
  ```

- Edit settings.yaml:

  ```yaml
  llm_provider: "claude"

  claude:
    model: "claude-sonnet-4-5-20250929"   # Latest Claude Sonnet 4.5 model
  ```
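The equivalent Claude request goes through Anthropic's Messages API. A minimal sketch, assuming the key from `CLAUDE_API_KEY` is passed to the client explicitly (the `anthropic` package otherwise looks for `ANTHROPIC_API_KEY`):

```python
# Minimal illustration of a title request against Claude (not ngx-renamer's code).
import os
import anthropic

client = anthropic.Anthropic(api_key=os.environ["CLAUDE_API_KEY"])

def claude_title(ocr_text: str) -> str:
    response = client.messages.create(
        model="claude-sonnet-4-5-20250929",
        max_tokens=128,
        system="Generate a short document title: Sender - Brief Description.",
        messages=[{"role": "user", "content": ocr_text[:4000]}],  # truncate long OCR text
    )
    return response.content[0].text.strip()
```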
For local/private AI (free, no API costs):
- Install Ollama on your host:

  ```bash
  # macOS/Linux
  curl -fsSL https://ollama.ai/install.sh | sh
  # Windows: Download from https://ollama.ai/download
  ```

- Pull a model:

  ```bash
  ollama pull gpt-oss:latest   # Or: llama3, mistral, qwen2.5
  ```

- Configure ngx-renamer in docker-compose.env:

  ```env
  # Mac/Windows:
  OLLAMA_BASE_URL=http://host.docker.internal:11434
  # Linux:
  OLLAMA_BASE_URL=http://172.17.0.1:11434

  # Optional: Only needed for authenticated Ollama instances
  OLLAMA_API_KEY=   # Leave empty for local/self-hosted Ollama
  ```

- Edit settings.yaml:

  ```yaml
  llm_provider: "ollama"

  ollama:
    model: "gpt-oss:latest"
  ```
Authentication Note: Local Ollama installations don't require an API key. Only set OLLAMA_API_KEY if your Ollama instance is behind an authentication proxy or cloud service that requires it.
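For local use, title generation is just an HTTP call to Ollama's chat endpoint, with no API key involved. A minimal sketch using only the Python standard library (the prompt text and truncation limit are illustrative, not taken from ngx-renamer):

```python
# Minimal illustration of a title request against a local Ollama instance
# (not ngx-renamer's code). Assumes OLLAMA_BASE_URL is set as shown above.
import json
import os
import urllib.request

OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://host.docker.internal:11434")

def ollama_title(ocr_text: str) -> str:
    payload = json.dumps({
        "model": "gpt-oss:latest",
        "stream": False,
        "messages": [
            {"role": "system", "content": "Generate a short document title: Sender - Brief Description."},
            {"role": "user", "content": ocr_text[:4000]},  # truncate long OCR text
        ],
    }).encode()
    request = urllib.request.Request(
        f"{OLLAMA_BASE_URL}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["message"]["content"].strip()
```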
Switch providers anytime by editing llm_provider in settings.yaml - no restart needed!
Edit settings.yaml to customize title generation.
Choose model:
llm_provider: "openai" # or "claude" or "ollama"
openai:
model: "gpt-4o-mini" # or "gpt-4o" for better quality
claude:
model: "claude-sonnet-4-5-20250929" # Latest Claude Sonnet 4.5 model
ollama:
model: "gpt-oss:latest"Date prefix:
with_date: false # Set true to add YYYY-MM-DD prefixCustomize prompt:
```yaml
prompt:
  main: |
    * generate a title for the given OCR text in its original language
    * add sender/author (max 20 chars)
    * format: Sender - Brief Description
    * max 127 characters total
    * no asterisks or currencies
  with_date: |
    * find document date and add as YYYY-MM-DD prefix
  no_date: |
    * use format: sender title
```

See settings.yaml for full configuration options.
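One plausible way these settings fit together: `main` always applies, and either `with_date` or `no_date` is appended depending on the `with_date` flag. The sketch below shows that assembly under those assumptions; the actual logic lives in the project code and may differ:

```python
# Rough sketch of how the settings could drive prompt assembly
# (illustrative only; ngx-renamer's real logic may differ).
import yaml  # PyYAML

def build_system_prompt(settings_path: str = "settings.yaml") -> str:
    with open(settings_path) as fh:
        settings = yaml.safe_load(fh)
    prompt = settings["prompt"]["main"]
    if settings.get("with_date", False):
        prompt += settings["prompt"]["with_date"]
    else:
        prompt += settings["prompt"]["no_date"]
    return prompt
```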
"Failed to get document details"
- Verify `PAPERLESS_NGX_URL` uses `http://webserver:8000/api` (not `localhost`, and it must end with `/api`)
- Check that the API token is correct (see the probe below)
- Test: `docker compose exec webserver curl http://webserver:8000/api/`
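If the curl test succeeds but ngx-renamer still fails, the token itself is worth checking. Paperless expects a `Token` authorization header; a quick probe from inside the webserver container could look like this (illustrative; it reuses the environment variables configured above):

```python
# Quick check that the Paperless API URL and token are usable from inside
# the webserver container (illustrative helper, not part of ngx-renamer).
import os
import requests

url = os.environ["PAPERLESS_NGX_URL"].rstrip("/") + "/documents/"
token = os.environ["PAPERLESS_NGX_API_KEY"]

response = requests.get(url, headers={"Authorization": f"Token {token}"})
print(response.status_code)   # 200 means URL and token are fine; 401 means a bad token
```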
"OPENAI_API_KEY not set"
- Verify the `.env` file or docker-compose environment variables
- Check: `docker compose exec webserver env | grep OPENAI`
"Cannot connect to Ollama"
- Ensure Ollama is running: `curl http://localhost:11434/api/version`
- Check that `OLLAMA_BASE_URL` is correct for your platform (Mac/Windows: `host.docker.internal`, Linux: `172.17.0.1`); see the in-container probe below
- Verify the model is pulled: `ollama list`
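The curl test above runs on the host; what actually matters is whether the webserver container can reach Ollama at `OLLAMA_BASE_URL`. A small probe to run from inside the container, assuming only that `OLLAMA_BASE_URL` is set there:

```python
# Probe Ollama from inside the webserver container (illustrative helper,
# not part of ngx-renamer). Uses only the standard library.
import json
import os
import urllib.request

base_url = os.environ.get("OLLAMA_BASE_URL", "http://host.docker.internal:11434")

try:
    with urllib.request.urlopen(f"{base_url}/api/version", timeout=5) as response:
        print("Ollama reachable:", json.load(response))
except OSError as error:
    print(f"Cannot reach Ollama at {base_url}: {error}")
```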
Title generation not running
- Check the logs: `docker compose logs webserver | grep ngx-renamer`
- Verify `PAPERLESS_POST_CONSUME_SCRIPT` is set
- Upload a test document and watch the logs
Force rebuild after updates
```bash
docker compose exec webserver rm /usr/src/ngx-renamer-venv/.initialized
docker compose restart webserver
```

For AI Coding Agents: See AGENTS.md for development commands, testing, and code conventions.
For Human Developers: See ARCHITECTURE.md for detailed architecture documentation.
Quick links:
- System architecture and data flow → ARCHITECTURE.md
- Development setup and testing → AGENTS.md
- Component documentation → ARCHITECTURE.md
- Security considerations → ARCHITECTURE.md
GPL-3.0 License - see LICENSE file for details.