Use dbt to publish transformation outputs from your private Dune namespace into a configured Datashare target such as Snowflake or BigQuery. This is useful when you want to keep the transformation logic on Dune, but consume the resulting tables inside your own warehouse.
dbt to Datashare is an enterprise workflow that requires both Data Transformations and Datashare to be enabled for your team.
Datashare syncs are billed based on bytes transferred and byte-months of storage for the synced table.

How It Works

  1. dbt builds a table or incremental model in your Dune namespace.
  2. A dbt post-hook runs ALTER TABLE ... EXECUTE datashare(...) on that table.
  3. Dune registers or updates the datashare sync and sends the data to your configured target.
  4. You monitor the sync in dune.datashare.table_syncs and dune.datashare.table_sync_runs.
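The statement the post-hook emits has roughly this shape (the argument list is illustrative; the macro generates the real arguments from the model's meta.datashare config):

```sql
-- Illustrative only: the datashare procedure arguments shown here are
-- hypothetical; the dbt macro builds the real call from meta.datashare.
ALTER TABLE dune.your_team.my_model
EXECUTE datashare(...)
```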

Prerequisites

  • Enterprise account with Data Transformations enabled
  • Datashare configured for your team by Dune
  • Dune API key with write access
  • A dbt project using the Dune Trino connector
The fastest way to get started is the public template repo:

dune-dbt-template

It includes the datashare macro, a prod-only post-hook, and an opt-in example model.

Add The dbt Post-Hook

Add the datashare post-hook to your dbt_project.yml:
models:
  your_project:
    +post-hook:
      - sql: "{{ datashare_trigger_sync() }}"
        transaction: true
For safety, the template runs this hook only on the prod target. Temporary dev or CI schemas should not create datashare syncs by default.
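A prod-only guard can be as simple as checking the dbt target inside the macro. A minimal sketch (the macro name matches the template, but the internals here are illustrative, not the template's actual code):

```jinja
{# Illustrative sketch: only emit the datashare statement on prod. #}
{% macro datashare_trigger_sync() %}
  {% if target.name == 'prod' %}
    alter table {{ this }} execute datashare(...)
  {% else %}
    select 1  {# harmless no-op for dev and CI targets #}
  {% endif %}
{% endmacro %}
```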

Configure A Model

Enable the sync by setting meta.datashare on a table or incremental model:
{% set time_start = "current_date - interval '1' day" if is_incremental() else "current_date - interval '2' day" %}
{% set time_end = "current_date + interval '1' day" %}

{{ config(
    materialized = 'incremental',
    incremental_strategy = 'merge',
    unique_key = ['block_number', 'block_date'],
    meta = {
        "datashare": {
            "enabled": true,
            "time_column": "block_date",
            "time_start": time_start,
            "time_end": time_end
        }
    }
) }}

select ...
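Putting it together, the body of such a model might look like the following (the source table and columns are illustrative, not part of the template):

```sql
select
    block_number,
    block_date,
    count(*) as tx_count            -- illustrative columns
from ethereum.transactions          -- illustrative source table
where block_time >= {{ time_start }}
  and block_time < {{ time_end }}
group by 1, 2
```

Because time_start branches on is_incremental(), the same filter covers both the first build and subsequent incremental runs.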

Configuration Fields

Field               Required  Description
enabled             Yes       Must be true to trigger a sync.
time_column         Yes       Column used by the sync window.
time_start          Yes       SQL expression for the start of the window.
time_end            No        SQL expression for the end of the window. Defaults to now().
unique_key_columns  No        Unique row identity columns. Falls back to the model unique_key.
Keep time_column, time_start, and time_end at the same granularity. For example, if time_column is a date, use date-based expressions rather than hour-based timestamp windows.

Full Refresh Behavior

The sync uses full_refresh = true when:
  • the model is materialized as a table
  • the incremental model is running for the first time
  • the model is run with --full-refresh
Normal incremental runs use full_refresh = false.
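In dbt Jinja terms, that decision roughly corresponds to the following (a sketch, not the macro's actual code):

```jinja
{# Sketch of the full-refresh decision; not the macro's actual code. #}
{% set full_refresh =
    config.get('materialized') == 'table'
    or not is_incremental()
    or flags.FULL_REFRESH %}
```

is_incremental() is false both on a model's first run and under --full-refresh, which is why those cases resync the whole table.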

Manual Syncs

You can trigger a sync manually with dbt run-operation:
uv run dbt run-operation datashare_trigger_sync_operation --target prod --args '
model_selector: my_model
dry_run: true
'
This is useful for previewing the generated SQL or running one-off syncs outside a normal dbt run.
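To execute the sync as part of a normal build, run the model so the post-hook fires. These use standard dbt flags, with uv run mirroring the template's tooling:

```shell
# Build the model on prod; the post-hook triggers the datashare sync.
uv run dbt run --select my_model --target prod

# Force full_refresh = true on the sync by rebuilding from scratch.
uv run dbt run --select my_model --target prod --full-refresh
```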

Monitor Syncs

SELECT *
FROM dune.datashare.table_syncs
WHERE source_schema = 'your_team';

SELECT *
FROM dune.datashare.table_sync_runs
WHERE source_schema = 'your_team'
ORDER BY created_at DESC;
Use table_syncs for the current registration state and table_sync_runs for execution history.

Remove A Table From Datashare

ALTER TABLE dune.your_team.my_table EXECUTE delete_datashare;
