A dbt project template for Dune using Trino and uv for Python package management.
Using this as a template? See SETUP_FOR_NEW_TEAMS.md for first-time setup instructions.
Running dbt models on Dune from automated pipelines can quickly consume Dune credits. The CI workflows in this repo are disabled by default to prevent accidents; see the Development Workflow doc for more information.
When you're ready to enable automated dbt runs on PRs, pushes to main, or on a schedule, uncomment the triggers in the workflow files under `.github/workflows/`.
New to this repo? See the docs/ directory for complete guides:
- Getting Started - Initial setup for new developers
- Development Workflow - How to develop models
- dbt Best Practices - Patterns and configurations
- Testing - Test requirements
- CI/CD - GitHub Actions workflows
- Troubleshooting - Common issues
```shell
uv sync
```

Required variables:
| Variable | Description | Where to Get |
|---|---|---|
| `DUNE_API_KEY` | Your Dune API key for authentication | dune.com/settings/api |
| `DUNE_TEAM_NAME` | Your team name (determines schema where models are written) | Your Dune team name |
Optional variables:
| Variable | Description | Default |
|---|---|---|
| `DEV_SCHEMA_SUFFIX` | Personal dev schema suffix (creates `{team}__tmp_{suffix}`) | None (uses `{team}__tmp_`) |
See Getting Started for multiple options to set these variables (shell profile, session export, or inline).
```shell
uv run dbt deps   # Install dbt packages
uv run dbt debug  # Test connection
uv run dbt run    # Run models (uses dev target by default)
uv run dbt test   # Run tests
```

This project uses dbt targets to control schema naming, not API endpoints:
- Both `dev` and `prod` targets connect to the same production API (trino.api.dune.com)
- Target names control where models are written:
  - `dev` target (default): writes to `{team}__tmp_` schemas (safe for development)
  - `prod` target: writes to `{team}` schemas (production tables)
Local development uses dev target by default. To test with prod target locally:
```shell
uv run dbt run --target prod  # Use prod schema naming
```

Set the `DEV_SCHEMA_SUFFIX=your_name` environment variable to use schema `{team}__tmp_{your_name}` instead of `{team}__tmp_`.
```shell
# Add to shell profile for persistence
echo 'export DEV_SCHEMA_SUFFIX=your_name' >> ~/.zshrc
source ~/.zshrc

# Or export for current session
export DEV_SCHEMA_SUFFIX=your_name

# Or inline with command
DEV_SCHEMA_SUFFIX=your_name uv run dbt run
```

To disable the suffix after using it:

```shell
unset DEV_SCHEMA_SUFFIX
```

Common commands:

```shell
uv run dbt run                                     # Run all models
uv run dbt run --select model_name                 # Run specific model
uv run dbt run --select model_name --full-refresh  # Full refresh incremental model
uv run dbt test                                    # Run all tests
uv run dbt test --select model_name                # Test specific model
uv run dbt docs generate && uv run dbt docs serve  # View documentation
```

This repo includes optional Cursor AI guidelines in `.cursor/rules/`:
- `dbt-best-practices.mdc` - dbt patterns and best practices
  - Repository configs, development workflow, incremental models
  - Model organization, DuneSQL optimization, data quality
These are basic guidelines, not requirements. Cursor AI applies them automatically when available.
Note: SQL formatting preferences (sql-style-guide.mdc) are kept local and not committed to the repo.
Query your tables with the `dune` catalog prefix on the Dune app/API.
Pattern: dune.{team_name}.{table} (where {team_name} = DUNE_TEAM_NAME environment variable)
```sql
-- ❌ Won't work
select * from dune__tmp_.dbt_template_view_model

-- ✅ Correct (with DUNE_TEAM_NAME=dune)
select * from dune.dune.dbt_template_view_model
select * from dune.dune__tmp_.dbt_template_view_model
```

Note: dbt logs omit the catalog name, so copy-pasting queries from dbt output won't work directly; you must prepend `dune.` to the schema.
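As a small convenience sketch (the `sed` one-liner is just an illustration, not part of the repo), a schema-qualified name copied from dbt logs can be turned into a Dune-app-ready name by prepending the catalog:

```shell
# Prepend the "dune." catalog to a schema.table name copied from dbt logs
echo 'dune__tmp_.dbt_template_view_model' | sed 's/^/dune./'
# → dune.dune__tmp_.dbt_template_view_model
```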
| Type | File | Use Case |
|---|---|---|
| View | `dbt_template_view_model.sql` | Lightweight, always fresh |
| Table | `dbt_template_table_model.sql` | Static snapshots |
| Incremental (Merge) | `dbt_template_merge_incremental_model.sql` | Efficient updates via merge |
| Incremental (Delete+Insert) | `dbt_template_delete_insert_incremental_model.sql` | Efficient updates via delete+insert |
| Incremental (Append) | `dbt_template_append_incremental_model.sql` | Append-only with deduplication |
All templates are in models/templates/.
Runs on every PR. Enforces branch is up-to-date with main, then runs and tests modified models.
Target: Uses dev target with DEV_SCHEMA_SUFFIX=pr{number} for isolated testing
Steps:
- Enforces branch is up-to-date with main
- Runs modified models with full refresh
- Tests modified models
- Runs modified incremental models (incremental run)
- Tests modified incremental models
PR schema: {team}__tmp_pr{number} (e.g., dune__tmp_pr123)
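As a sketch of the naming shown above (the PR number 123 and team name are just examples), the CI suffix maps directly to the isolated schema:

```shell
# The CI job sets DEV_SCHEMA_SUFFIX=pr{number}, which yields {team}__tmp_pr{number}
team="dune"
export DEV_SCHEMA_SUFFIX="pr123"
echo "${team}__tmp_${DEV_SCHEMA_SUFFIX}"
# → dune__tmp_pr123
```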
Runs hourly on main branch. Uses state comparison to only full refresh modified models, then runs normal cadence runs.
Target: Sets DBT_TARGET: prod to write to production schemas ({team})
Steps:
- Downloads previous manifest (if exists)
- If state exists: Runs modified models with full refresh and tests
- Runs all models (handles incremental logic)
- Tests all models
- Uploads manifest for next run
- Sends email on failure
State comparison: Saves manifest.json after each run. Next run downloads it to detect changes. Manifest expires after 90 days.
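The branching on "if state exists" can be sketched as a small helper that picks dbt selection flags based on whether a previous manifest was downloaded. This is an illustrative sketch, not the workflow's actual implementation; the `state_flags` name and directory layout are made up, though `--select state:modified --state <dir>` is dbt's standard state-comparison syntax:

```shell
# Choose dbt flags: state comparison if a previous manifest.json exists,
# otherwise run everything (first run has no state to compare against)
state_flags() {
  if [ -f "$1/manifest.json" ]; then
    echo "--select state:modified --state $1 --full-refresh"
  else
    echo ""
  fi
}
# usage sketch: uv run dbt run $(state_flags ./prev_state)
```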
Required:
- Add Secret: `DUNE_API_KEY` (Settings → Secrets and variables → Actions → Secrets)
- Add Variable: `DUNE_TEAM_NAME` (Settings → Secrets and variables → Actions → Variables)
  - Optional, defaults to `'dune'` if not set
Recommended:
- Public repos: Require approval for outside contributor workflows (Settings → Actions → General → Fork pull request workflows)
- Protects secrets from unauthorized access
- See SETUP_FOR_NEW_TEAMS.md for details
Email notifications:
- Enable workflow notifications: Profile → Settings → Notifications → Actions → "Notify me for failed workflows only"
- Verify email address is set
- Watch repository: Click "Watch" (any level works, even "Participating and @mentions")
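If local runs fail with authentication or schema errors, a quick preflight over the two required variables narrows things down before digging into the checks below (POSIX shell sketch; the loop is just an illustration):

```shell
# Report any required variable that is unset or empty
for var in DUNE_API_KEY DUNE_TEAM_NAME; do
  eval "val=\${$var:-}"
  if [ -z "$val" ]; then
    echo "missing: $var"
  fi
done
```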
Environment variables not set:
```shell
# Verify variables are set
env | grep DUNE_API_KEY
env | grep DUNE_TEAM_NAME

# If not set, export them
export DUNE_API_KEY=your_api_key
export DUNE_TEAM_NAME=your_team_name
```

Connection errors:

```shell
uv run dbt debug  # Test connection and check for errors
```

dbt_utils not found:

```shell
uv run dbt deps
```

Dependency issues:

```shell
uv sync --reinstall
```

Project layout:

```
models/                         # dbt models and templates
macros/                         # Custom Dune macros (schema overrides, sources)
└── dune_dbt_overrides/
    └── get_custom_schema.sql   # Controls schema naming based on target
scripts/                        # Utility scripts for managing your Dune dbt project
└── drop_tables.py              # Drop tables/views by schema pattern or specific table
.cursor/                        # Cursor AI rules
└── rules/
    └── dbt-best-practices.mdc  # dbt patterns and configurations
profiles.yml                    # Connection profile (uses env_var() to read environment variables)
dbt_project.yml                 # Project configuration
```
The get_custom_schema.sql macro determines where models are written based on the dbt target:
| Target | DEV_SCHEMA_SUFFIX | Schema Name | Use Case |
|---|---|---|---|
| `prod` | (any) | `{team}` | Production tables |
| `dev` | Not set | `{team}__tmp_` | Local development |
| `dev` | Set to `pr123` | `{team}__tmp_pr123` | CI/CD per PR |
| `dev` | Set to `alice` | `{team}__tmp_alice` | Personal dev space |
This ensures safe isolation between development and production environments.
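The naming rule in the table can be sketched as a shell function. This is only an illustration; the real logic lives in `macros/dune_dbt_overrides/get_custom_schema.sql`, and the `schema_name` helper here is made up:

```shell
# Mirror of the schema-naming rule: prod → {team}, dev → {team}__tmp_{suffix}
schema_name() {
  if [ "$1" = "prod" ]; then
    echo "$2"
  else
    echo "${2}__tmp_${3}"
  fi
}
schema_name prod dune ""     # → dune
schema_name dev  dune ""     # → dune__tmp_
schema_name dev  dune pr123  # → dune__tmp_pr123
```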
The scripts/ directory contains utility scripts for managing tables and schemas. See scripts/README.md for details.