💵
Stablecoin On-Chain Flows
Cross-chain USDC, USDT, and DAI mint/burn/transfer events
Category: Events (append-only transactions)
Granularity: Per-transaction
Format: Apache Parquet (Snappy compression, Hive-partitioned)
Nodes: 1 (EU-Central, Frankfurt; low latency to HyperLiquid & Ethereum)
About This Dataset
On-chain stablecoin events tracking mints, burns, and large transfers of USDC, USDT, and DAI, capturing fiat-equivalent capital movements across chains.
Schema
| Column | Type | Description |
|---|---|---|
| time_chain | timestamp | Block timestamp (UTC) |
| time_local | timestamp | Ingestion timestamp (UTC) |
| token | string | Stablecoin symbol (USDC, USDT, DAI) |
| event_type | string | Event: mint, burn, or transfer |
| amount_usd | float64 | Transfer amount in USD |
| chain | string | Blockchain network |
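To illustrate how the schema above supports supply analysis, here is a minimal sketch (with made-up rows; values are illustrative only) computing net supply change per token as mints minus burns:

```python
import pandas as pd

# Made-up rows shaped like the schema above (illustrative values only).
df = pd.DataFrame({
    "time_chain": pd.to_datetime(
        ["2026-04-20 00:01", "2026-04-20 00:05", "2026-04-20 00:09"], utc=True
    ),
    "token": ["USDC", "USDC", "DAI"],
    "event_type": ["mint", "burn", "mint"],
    "amount_usd": [50_000_000.0, 10_000_000.0, 5_000_000.0],
    "chain": ["ethereum", "ethereum", "ethereum"],
})

# Sign the amounts: mints expand supply, burns contract it.
signed = df["amount_usd"].where(df["event_type"] == "mint", -df["amount_usd"])

# Net supply change per token = sum of signed amounts.
net = signed.groupby(df["token"]).sum()
print(net)
```

Here USDC nets +$40M (a $50M mint against a $10M burn) while DAI nets +$5M.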
Use Cases
- Track stablecoin supply expansion and contraction
- Detect large stablecoin mints as a bullish capital-inflow signal
- Monitor cross-chain stablecoin migration patterns
- Correlate stablecoin flows with crypto market direction
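The large-mint use case above can be sketched as a simple filter: keep only mint events at or above a size threshold. The threshold and sample rows here are assumptions, not part of the dataset:

```python
import pandas as pd

THRESHOLD_USD = 25_000_000  # Illustrative cutoff; tune to your strategy.

# Made-up sample rows shaped like the dataset schema.
df = pd.DataFrame({
    "token": ["USDT", "USDC", "DAI"],
    "event_type": ["mint", "transfer", "mint"],
    "amount_usd": [100_000_000.0, 30_000_000.0, 1_000_000.0],
})

# Large mints only: transfers and sub-threshold mints are excluded.
large_mints = df[(df["event_type"] == "mint") & (df["amount_usd"] >= THRESHOLD_USD)]
print(large_mints)
```

In this sample only the $100M USDT mint survives the filter; the $30M transfer and the $1M mint do not.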
R2 Path
s3://algotick-data-lake/events/stablecoin_flows/year=YYYY/month=MM/day=DD/node={node}/data.parquet
Replace YYYY, MM, and DD with the target date. Data is collected from node=eu-central.
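A small helper can fill the partition template from a date. This is a hypothetical convenience function, not part of any official client; the path layout comes from the template above:

```python
from datetime import date


def flows_path(d: date, node: str = "eu-central") -> str:
    # Hypothetical helper: fills the Hive partition template for a given date.
    return (
        "s3://algotick-data-lake/events/stablecoin_flows/"
        f"year={d:%Y}/month={d:%m}/day={d:%d}/node={node}/data.parquet"
    )


print(flows_path(date(2026, 4, 20)))
```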
Query with DuckDB
```python
import duckdb

df = duckdb.sql("""
    SELECT *
    FROM read_parquet(
        's3://algotick-data-lake/events/stablecoin_flows/year=2026/month=04/day=20/node=eu-central/data.parquet'
    )
    LIMIT 100
""").df()
print(f"Rows: {len(df)}, Columns: {list(df.columns)}")
```
Download via API
```python
import requests

resp = requests.get(
    "https://algotick.dev/v2/history/raw?dataset=stablecoin_flows&date=2026-04-20",
    stream=True,
)
resp.raise_for_status()  # Fail fast on HTTP errors instead of saving an error body.
with open("stablecoin-flows.parquet", "wb") as f:
    for chunk in resp.iter_content(8192):
        f.write(chunk)

# Then query locally
import duckdb

df = duckdb.sql("SELECT * FROM 'stablecoin-flows.parquet' LIMIT 100").df()
print(df)
```
Related
Related signals: Whale Activity
Related datasets: Bridge Transfer Events, DEX Swaps
Don't just stare at the dashboard. Automate it.
Every metric on this page is available via our sub-millisecond API.
Build trading bots, backtest strategies, and power AI agents with institutional-grade data.