## Credit Usage in CI/CD

All dbt executions triggered by CI/CD workflows consume Dune credits. This includes pull request workflows, deploy workflows, and scheduled runs. The template repository ships with automated triggers disabled by default so you can enable them on your own terms. See Pricing & Best Practices for optimization guidance.
## Understanding Credit Costs in CI/CD

When you run dbt models through CI/CD, each execution consumes credits from your Dune plan. Here are a few things to keep in mind:

- Each push or PR can trigger a pipeline run. If your GitHub Actions run on every commit, frequent pushes during development will each use credits.
- All executions draw from the same pool. Whether a query is triggered locally, from a CI runner, or a scheduled job, it counts the same way.
- Concurrent executions add up. Multiple CI jobs running simultaneously (e.g., several open PRs) each consume credits independently.
- Timed-out queries still use credits. If a query runs for 30 minutes before timing out, credits are consumed for the compute used during that time.

To keep usage under control:

- Start with `workflow_dispatch` (manual trigger only) until you've validated your models
- Use `--select state:modified` to run only changed models in CI
- Set appropriate query timeouts to keep costs predictable
- Monitor your credit usage in the Dune dashboard after enabling automated workflows
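The `state:modified` selector relies on a manifest from a previous run. A minimal CI invocation might look like the following sketch, where `./prod_artifacts` is a placeholder for wherever your pipeline restores the last production manifest:

```shell
# Placeholder path: fetch the previous production manifest first,
# e.g. from a CI artifact store, into ./prod_artifacts/manifest.json
dbt run  --select state:modified --state ./prod_artifacts
dbt test --select state:modified --state ./prod_artifacts
```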
## Development Workflow

### Local Development

1. Create a feature branch.
2. Develop models locally.
3. Query your tables on Dune. Remember to use the `dune` catalog prefix.
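A sketch of those steps, with placeholder branch and model names (adjust to your project):

```shell
# 1. Create a feature branch (placeholder name)
git checkout -b feature/my-new-model

# 2. Build and test just the model you're working on
dbt run  --select my_new_model
dbt test --select my_new_model
```

Once the model has built, you can query it from Dune with the `dune` catalog prefix, e.g. something like `select * from dune.my_team.my_new_model` (schema and table names here are placeholders).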
### Pull Request Workflow

1. Push changes and open a PR.
2. Automated CI runs:
   - CI enforces that the branch is up-to-date with main
   - Runs modified models with `--full-refresh` in an isolated schema, `{team}__tmp_pr{number}`
   - Runs tests on modified models
   - Tests incremental run logic

   Tip: The pull request workflow is disabled by default in the template. To enable it, uncomment the `on:` trigger block in `.github/workflows/dbt_ci.yml`. Each PR sync event (new commits pushed to a PR branch) will trigger a run and consume credits.

3. Team review:
   - Review transformation logic in GitHub
   - Check CI results
   - Approve and merge when ready
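Enabling the PR trigger amounts to uncommenting a block along these lines. This is a sketch only; the actual contents of `.github/workflows/dbt_ci.yml` in the template are authoritative:

```yaml
on:
  workflow_dispatch:   # manual runs stay available
  pull_request:        # runs CI on every PR sync event once uncommented
    branches: [main]
```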
### Production Deployment

The production workflow includes an hourly schedule (`0 * * * *`), but it's commented out by default. You can enable it by uncommenting the corresponding lines when you're ready to run production jobs automatically.

Tip: The deploy and scheduled workflows are disabled by default. Only `workflow_dispatch` (manual trigger) is enabled out of the box. Uncomment the `push` and `schedule` triggers in the respective workflow files when you're ready to automate.

Each production run performs the following steps:

- State comparison: uses the manifest from the previous run to detect changes
- Full refresh of modified models: any changed models run with `--full-refresh`
- Incremental run: all models run with normal incremental logic
- Testing: all models are tested
- Notification: an email is sent on failure
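Conceptually, the production run boils down to a sequence like the following sketch, where `./prev_run` is a placeholder for wherever the previous run's manifest is restored:

```shell
# 1. Full-refresh anything that changed since the last run
dbt run --select state:modified --full-refresh --state ./prev_run

# 2. Normal incremental pass over all models
dbt run

# 3. Test everything
dbt test
```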
## CI/CD with GitHub Actions

The template includes two GitHub Actions workflows.

### CI Workflow (`.github/workflows/ci.yml`)

Runs on every pull request. Required secrets:

- `DUNE_API_KEY`
- `DUNE_TEAM_NAME`
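If you use the GitHub CLI, the secrets can be set from a terminal (you'll be prompted to paste each value):

```shell
# Sets repository secrets interactively
gh secret set DUNE_API_KEY
gh secret set DUNE_TEAM_NAME
```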
### Production Workflow (`.github/workflows/prod.yml`)

Runs hourly on the main branch.
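The trigger block you'll uncomment looks roughly like this (a sketch; the template's `prod.yml` is authoritative):

```yaml
on:
  workflow_dispatch:        # enabled by default
  # push:
  #   branches: [main]      # uncomment to deploy on merge to main
  # schedule:
  #   - cron: "0 * * * *"   # uncomment for hourly production runs
```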
## Troubleshooting

### Connection Issues

Problem: `dbt debug` fails with a connection error.

Solution:

- Verify `DUNE_API_KEY` and `DUNE_TEAM_NAME` are set correctly
- Check that you have Data Transformations enabled for your team
- Ensure `transformations: true` is in the session properties
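For orientation, a dbt profile with the required session property might look like the following. This is a hypothetical sketch: the project name, target, and connection type are placeholder assumptions, and the template's own `profiles.yml` is the authoritative reference.

```yaml
my_project:                      # placeholder project name
  target: prod
  outputs:
    prod:
      type: trino                # assumption: Trino-based connection
      session_properties:
        transformations: true    # must be present per the checklist above
```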
### Models Not Appearing in Dune

Problem: Can't find tables in Data Explorer or queries.

Solution:

- Check the Connectors section in Data Explorer under "My Data"
- Remember to use the `dune` catalog prefix in queries
- Verify the table was created in the correct schema
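For example, a table that dbt reports as built is addressed with the catalog prefix (team and table names below are placeholders):

```sql
-- Fully qualified: <catalog>.<schema>.<table>
select *
from dune.my_team.my_model
limit 10;
```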
### Incremental Models Not Working

Problem: Incremental models always do a full refresh.

Solution:

- Check that the `is_incremental()` macro is used correctly
- Verify that the `unique_key` configuration matches your table structure
- Ensure the target table exists before running incrementally
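A minimal incremental model that satisfies those checks might look like this sketch (table, column, and source names are placeholders):

```sql
{{ config(
    materialized='incremental',
    unique_key='event_id'
) }}

select event_id, event_time, payload
from {{ source('raw', 'events') }}
{% if is_incremental() %}
  -- This branch only runs when the target table already exists
  where event_time > (select max(event_time) from {{ this }})
{% endif %}
```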
### CI/CD Failures

Problem: GitHub Actions runs are failing.

Solution:

- Verify secrets and variables are set correctly in GitHub
- Check that the branch is up-to-date with main
- Review workflow logs for specific errors
## Limitations

### Metadata Discovery

There is limited support for some metadata discovery queries, such as `SHOW TABLES` or `SHOW SCHEMAS`, in certain contexts. This may affect autocomplete in some BI tools.

Workaround: Use the Data Explorer or query `information_schema` directly.
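For instance, listing tables in your schema without `SHOW TABLES` (the schema name is a placeholder):

```sql
select table_name
from information_schema.tables
where table_schema = 'my_team';
```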
Result Set Size
Large result sets may timeout. Consider:- Paginating with
LIMITandOFFSET - Narrowing filters to reduce data volume
- Breaking complex queries into smaller parts
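When paginating, include a deterministic `ORDER BY` so pages don't overlap or skip rows (table and column names are placeholders):

```sql
-- Page 3 with a page size of 1000 rows: offset = (page - 1) * page_size
select *
from dune.my_team.my_model
order by event_id
limit 1000 offset 2000;
```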
Read-After-Write Consistency
Tables and views are available for querying immediately after creation, but catalog caching may cause brief delays (typically < 60 seconds) before appearing in some listing operations.Rate Limits
Rate limits for Data Transformations align with the Dune Data API:- Requests are subject to the same rate limiting as API executions
- Large query operations run on the Large Query Engine tier
- See Rate Limits for detailed information