⚖️ Cross-Venue Orderbook Imbalance
Aggregated buy/sell pressure imbalance across venues for BTC, ETH, SOL
- Category: States (point-in-time snapshots)
- Granularity: Continuous (~1–5 second updates)
- Format: Apache Parquet (Snappy compression, Hive-partitioned)
- Nodes: 1, EU-Central (Frankfurt; low latency to HyperLiquid & Ethereum)
About This Dataset
Cross-venue orderbook imbalance computed from multiple data sources. Captures aggressive flow, absorption patterns, and momentum signals. Updated every 1–5 seconds.
Partitions
asset: btc, eth, sol

Schema
| Column | Type | Description |
|---|---|---|
time_chain | timestamp | Measurement timestamp (UTC) |
time_local | timestamp | Ingestion timestamp (UTC) |
asset | string | Asset symbol |
imbalance | float64 | Imbalance score (-1 to 1) |
bid_depth_usd | float64 | Total bid depth within 1% (USD) |
ask_depth_usd | float64 | Total ask depth within 1% (USD) |
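The page does not state how the imbalance score is computed. A common definition that is consistent with the documented [-1, 1] range and the two depth columns above is normalized depth imbalance; the sketch below is an illustrative reconstruction, not the documented formula.

```python
def depth_imbalance(bid_depth_usd: float, ask_depth_usd: float) -> float:
    """Normalized depth imbalance in [-1, 1]: +1 = all bids, -1 = all asks.

    Illustrative reconstruction only; the dataset's actual formula
    (which also captures aggressive flow and absorption) is not documented here.
    """
    total = bid_depth_usd + ask_depth_usd
    if total == 0.0:
        return 0.0  # empty book: no directional pressure
    return (bid_depth_usd - ask_depth_usd) / total
```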
Use Cases
- Detect short-term directional pressure
- Build orderbook-based trading signals
- Study imbalance vs forward returns correlation
- Combine with funding rate for confluence signals
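As a sketch of the forward-returns use case: join the imbalance series against a mid-price feed and correlate imbalance with returns some horizon ahead. Note that `mid_price` is not in this dataset's schema; it is assumed to be joined from a separate price source, and the function name is ours.

```python
import pandas as pd


def imbalance_forward_corr(df: pd.DataFrame, horizon: int = 12) -> float:
    """Pearson correlation between imbalance and forward mid-price returns.

    Assumes `df` is time-sorted with an `imbalance` column plus a
    `mid_price` column joined from a separate price feed (mid_price is
    NOT part of this dataset's schema). At ~5s snapshots, horizon=12
    looks roughly one minute ahead.
    """
    fwd_ret = df["mid_price"].shift(-horizon) / df["mid_price"] - 1.0
    return df["imbalance"].corr(fwd_ret)
```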
R2 Path
s3://algotick-data-lake/states/imbalance/asset={asset}/year=YYYY/month=MM/day=DD/node={node}/data.parquet
Replace YYYY, MM, DD with the target date. Data is collected from node=eu-central.
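The Hive-partitioned key can be built programmatically. This helper simply fills in the template above; the layout comes from the page, while the function name is ours.

```python
from datetime import date


def imbalance_path(asset: str, day: date, node: str = "eu-central") -> str:
    """Build the R2 object key for one asset/day partition."""
    return (
        "s3://algotick-data-lake/states/imbalance/"
        f"asset={asset}/year={day.year:04d}/month={day.month:02d}/day={day.day:02d}/"
        f"node={node}/data.parquet"
    )
```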
Query with DuckDB
```python
import duckdb

df = duckdb.sql("""
    SELECT *
    FROM read_parquet(
        's3://algotick-data-lake/states/imbalance/asset=btc/year=2026/month=04/day=20/node=eu-central/data.parquet'
    )
    LIMIT 100
""").df()
print(f"Rows: {len(df)}, Columns: {list(df.columns)}")
```
Download via API
```python
import duckdb
import requests

# Stream the day's file to disk to avoid holding it in memory
resp = requests.get(
    "https://algotick.dev/v2/history/raw?dataset=imbalance&coin=BTC&date=2026-04-20",
    stream=True,
)
resp.raise_for_status()
with open("imbalance.parquet", "wb") as f:
    for chunk in resp.iter_content(8192):
        f.write(chunk)

# Then query locally
df = duckdb.sql("SELECT * FROM 'imbalance.parquet' LIMIT 100").df()
print(df)
```
Related
Related signals: Whale Activity Composite Signal