Overview

This guide is for data teams, researchers, and analysts who need to programmatically analyze onchain data, build custom reports, and create automated analytics pipelines.

What You’ll Learn

  • Execute SQL queries via API
  • Retrieve and process results
  • Handle query parameters
  • Build analytics pipelines
  • Filter and transform data

Prerequisites

Quick Start

1. Execute a Simple Query

Let’s start by executing a query to analyze recent DEX trading volume:
from dune_client.client import DuneClient
import pandas as pd

# Initialize client
dune = DuneClient(api_key="YOUR_API_KEY")

# Execute SQL query
sql = """
SELECT 
    blockchain,
    DATE_TRUNC('hour', block_time) as hour,
    SUM(amount_usd) as volume_usd,
    COUNT(*) as trade_count
FROM dex.trades
WHERE block_time > now() - interval '24' hour
GROUP BY 1, 2
ORDER BY 2 DESC
"""

results = dune.run_sql(query_sql=sql)

# Convert to pandas DataFrame for analysis
df = pd.DataFrame(results.result.rows)
print(df.head())
The run_sql() function in our Python SDK automatically handles query execution and polling for completion. With cURL, you’ll need to poll the execution status endpoint and then fetch results manually.
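If you are calling the HTTP API directly, the completion check the SDK performs for you can be sketched as a generic polling loop. The `QUERY_STATE_*` values below match Dune's execution status responses; `poll_execution` itself is an illustrative helper, not part of the SDK.

```python
import time

def poll_execution(get_status, timeout=300, interval=2):
    """Call get_status() until it reports a terminal state or the
    timeout elapses. get_status should return a dict with a 'state'
    key, as Dune's execution status endpoint does."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = get_status()
        state = status.get("state")
        if state == "QUERY_STATE_COMPLETED":
            return status
        if state in ("QUERY_STATE_FAILED", "QUERY_STATE_CANCELLED"):
            raise RuntimeError(f"Execution ended in state {state}")
        time.sleep(interval)
    raise TimeoutError("Query execution did not complete in time")
```

You would pass a closure that GETs the execution status endpoint with your API key; injecting `get_status` also makes the loop easy to test without network access.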

2. Use Parameterized Queries

For reusable analytics, create queries with parameters:
from dune_client.client import DuneClient
from dune_client.query import QueryBase
from dune_client.types import QueryParameter

dune = DuneClient(api_key="YOUR_API_KEY")

# Execute a saved query with parameters
query_id = 3493826  # Example: Popular DEX analysis query

query = QueryBase(
    query_id=query_id,
    params=[
        # Parameter names must match those defined in the saved query
        QueryParameter.text_type(name="blockchain", value="ethereum"),
        QueryParameter.number_type(name="limit", value=10),
    ],
)
results = dune.run_query(query=query)

# Process results
for row in results.result.rows:
    protocol = row.get('protocol') or row.get('project', 'Unknown')
    volume = row.get('volume') or row.get('volume_usd', 0)
    print(f"{protocol}: ${volume:,.2f}")
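The `.get()` fallbacks above handle saved queries whose column names vary. If you process many queries this way, that pattern can be factored into a small normalization helper; the helper and its alias map below are illustrative, not part of the SDK.

```python
def normalize_rows(rows, aliases):
    """Map result rows with varying column names onto a canonical schema.
    aliases maps each canonical name to a list of possible source columns,
    tried in order; missing values become None."""
    normalized = []
    for row in rows:
        out = {}
        for canonical, candidates in aliases.items():
            out[canonical] = next(
                (row[c] for c in candidates if row.get(c) is not None), None
            )
        normalized.append(out)
    return normalized

# Example alias map for the loop above
aliases = {
    "protocol": ["protocol", "project"],
    "volume": ["volume", "volume_usd"],
}
```

Downstream code can then rely on one schema regardless of which saved query produced the rows.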

3. Build an Analytics Pipeline

Create a pipeline that runs multiple queries and combines results:
from dune_client.client import DuneClient
import pandas as pd
from datetime import datetime

dune = DuneClient(api_key="YOUR_API_KEY")

def daily_dex_analysis():
    """Run daily DEX analytics pipeline"""
    
    # 1. Get overall volume
    volume_query = """
    SELECT 
        SUM(amount_usd) as total_volume,
        COUNT(DISTINCT taker) as unique_traders
    FROM dex.trades
    WHERE block_time > now() - interval '24' hour
    """
    volume_data = dune.run_sql(query_sql=volume_query)
    
    # 2. Get top protocols
    protocol_query = """
    SELECT 
        project as protocol,
        SUM(amount_usd) as volume,
        COUNT(*) as trades
    FROM dex.trades
    WHERE block_time > now() - interval '24' hour
    GROUP BY 1
    ORDER BY 2 DESC
    LIMIT 10
    """
    protocol_data = dune.run_sql(query_sql=protocol_query)
    
    # 3. Get chain breakdown
    chain_query = """
    SELECT 
        blockchain,
        SUM(amount_usd) as volume
    FROM dex.trades
    WHERE block_time > now() - interval '24' hour
    GROUP BY 1
    ORDER BY 2 DESC
    """
    chain_data = dune.run_sql(query_sql=chain_query)
    
    # Combine and process results
    report = {
        'timestamp': datetime.now(),
        'total_volume': volume_data.result.rows[0]['total_volume'],
        'unique_traders': volume_data.result.rows[0]['unique_traders'],
        'top_protocols': pd.DataFrame(protocol_data.result.rows),
        'chain_breakdown': pd.DataFrame(chain_data.result.rows)
    }
    
    return report

# Run pipeline
report = daily_dex_analysis()
print(f"24h Volume: ${report['total_volume']:,.2f}")
print(f"Unique Traders: {report['unique_traders']:,}")
print("\nTop Protocols:")
print(report['top_protocols'])
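A pipeline like this usually persists its output rather than just printing it. One way to do that, assuming the report dict shape produced by `daily_dex_analysis()` above, is to write the scalar metrics as JSON and the DataFrames as CSV, keyed by run timestamp (the file layout here is illustrative):

```python
import json
from pathlib import Path

def save_report(report, out_dir="reports"):
    """Persist a pipeline report: scalar metrics as JSON,
    DataFrames as CSV, keyed by the run timestamp."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    stamp = report["timestamp"].strftime("%Y%m%d_%H%M%S")
    summary = {
        "timestamp": report["timestamp"].isoformat(),
        "total_volume": report["total_volume"],
        "unique_traders": report["unique_traders"],
    }
    summary_path = out / f"summary_{stamp}.json"
    summary_path.write_text(json.dumps(summary))
    report["top_protocols"].to_csv(out / f"protocols_{stamp}.csv", index=False)
    report["chain_breakdown"].to_csv(out / f"chains_{stamp}.csv", index=False)
    return summary_path
```

Timestamped filenames keep successive runs side by side, which makes day-over-day comparisons straightforward.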

4. Filter and Transform Results

Use Dune’s powerful filtering to refine results:
from dune_client.client import DuneClient

dune = DuneClient(api_key="YOUR_API_KEY")

# Execute query with SQL-based filtering
sql = """
SELECT 
    blockchain,
    project,
    SUM(amount_usd) as volume_usd
FROM dex.trades
WHERE block_time > now() - interval '24' hour
    AND blockchain = 'ethereum'
    AND amount_usd > 1000000
GROUP BY 1, 2
ORDER BY 3 DESC
"""

results = dune.run_sql(query_sql=sql)

# Results are already filtered server-side
for row in results.result.rows:
    print(row)
Because the WHERE clause runs server-side on Dune's query engine, only the rows you need are returned. This saves bandwidth and client-side processing compared to fetching everything and filtering locally.
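With the heavy filtering and aggregation done in SQL, lightweight transforms on the returned rows are cheap client-side. For example, computing each project's share of the filtered volume with pandas (the rows below are hypothetical sample data shaped like the query's output):

```python
import pandas as pd

# Hypothetical sample rows shaped like the query's output
rows = [
    {"blockchain": "ethereum", "project": "uniswap", "volume_usd": 5_000_000.0},
    {"blockchain": "ethereum", "project": "curve", "volume_usd": 2_000_000.0},
]

df = pd.DataFrame(rows)
# Each project's share of the total filtered volume
df["share"] = df["volume_usd"] / df["volume_usd"].sum()
print(df.sort_values("share", ascending=False))
```

In practice you would build `df` from `results.result.rows` as shown in the Quick Start.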

Common Patterns

Scheduled Analytics

Run analytics on a schedule (e.g., with cron, Airflow, or cloud scheduler):
import schedule
import time
from dune_client.client import DuneClient

dune = DuneClient(api_key="YOUR_API_KEY")

def hourly_report():
    """Generate hourly DEX report"""
    sql = """
    SELECT 
        project,
        SUM(amount_usd) as hourly_volume
    FROM dex.trades
    WHERE block_time > now() - interval '1' hour
    GROUP BY 1
    ORDER BY 2 DESC
    """
    
    results = dune.run_sql(query_sql=sql)
    
    # Send to your dashboard/database/Slack
    send_to_dashboard(results.result.rows)

# Run every hour
schedule.every().hour.at(":00").do(hourly_report)

while True:
    schedule.run_pending()
    time.sleep(60)
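One caveat with a long-running scheduler loop: an unhandled exception in any run (a network blip, a rate limit) kills the whole process. A small wrapper, illustrative rather than part of the `schedule` library, keeps the loop alive by logging failures instead of propagating them:

```python
import logging

logging.basicConfig(level=logging.INFO)

def safe_run(job):
    """Wrap a scheduled job so one failed run is logged
    instead of killing the scheduler loop."""
    def wrapper():
        try:
            job()
        except Exception:
            logging.exception("Scheduled job %s failed", job.__name__)
    return wrapper

# Usage with the scheduler above:
# schedule.every().hour.at(":00").do(safe_run(hourly_report))
```

The same wrapper works unchanged for cron- or Airflow-invoked entry points.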

Onchain Monitoring

Monitor specific metrics and trigger alerts:
from dune_client.client import DuneClient
import time

dune = DuneClient(api_key="YOUR_API_KEY")

def monitor_large_trades(threshold=1000000):
    """Alert on large trades"""
    # threshold is interpolated into the SQL string; keep it a trusted number
    sql = f"""
    SELECT *
    FROM dex.trades
    WHERE block_time > now() - interval '5' minute
    AND amount_usd > {threshold}
    ORDER BY block_time DESC
    """
    
    results = dune.run_sql(query_sql=sql)
    
    if results.result.rows:
        for trade in results.result.rows:
            # send_alert: your notification hook (Slack, email, etc.)
            send_alert(f"Large trade: ${trade['amount_usd']:,.2f} on {trade['blockchain']}")

# Run continuously
while True:
    monitor_large_trades()
    time.sleep(300)  # Check every 5 minutes
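Because the 5-minute query window overlaps successive runs, the loop above can alert on the same trade twice. A small dedup helper avoids that; it assumes the rows carry a `tx_hash` column (dex.trades exposes one, though a single transaction can contain several trades, so you may want to combine it with another field for a stricter key):

```python
def filter_new_trades(trades, seen, key="tx_hash"):
    """Return only trades whose key has not been seen before,
    recording new keys in the (caller-owned) seen set."""
    fresh = []
    for trade in trades:
        k = trade.get(key)
        if k is not None and k not in seen:
            seen.add(k)
            fresh.append(trade)
    return fresh

# In the monitoring loop:
# seen = set()
# for trade in filter_new_trades(results.result.rows, seen):
#     send_alert(...)
```

For a long-lived monitor you would also evict old keys from `seen` periodically so it doesn't grow without bound.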

Next Steps

Example Use Cases

  • Research Reports: Generate daily/weekly reports on DeFi protocols, NFT markets, or chain activity
  • Automated Dashboards: Feed data to custom dashboards or BI tools
  • Alert Systems: Monitor metrics and trigger alerts when thresholds are met
  • Comparative Analysis: Compare metrics across chains, protocols, or time periods