
Cross-Venue Orderbook Imbalance

Aggregated buy/sell pressure imbalance across venues for BTC, ETH, SOL

Category: States (point-in-time snapshots)
Granularity: Continuous (~1–5 second updates)
Format: Apache Parquet (Snappy compression, Hive-partitioned)
Nodes: 1 (EU-Central, Frankfurt; low-latency to HyperLiquid & Ethereum)

About This Dataset

Cross-venue orderbook imbalance computed from multiple data sources. Captures aggressive flow, absorption patterns, and momentum signals. Updated every 1–5 seconds.

Partitions
asset: btc, eth, sol

Schema

Column         Type       Description
time_chain     timestamp  Measurement timestamp (UTC)
time_local     timestamp  Ingestion timestamp (UTC)
asset          string     Asset symbol
imbalance      float64    Imbalance score (-1 to 1)
bid_depth_usd  float64    Total bid depth within 1% (USD)
ask_depth_usd  float64    Total ask depth within 1% (USD)
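The page does not state how the imbalance score is computed. A common normalized formulation for a depth-based score bounded in -1 to 1, shown here purely as an assumption, is (bid - ask) / (bid + ask) over the two depth columns:

```python
def depth_imbalance(bid_depth_usd: float, ask_depth_usd: float) -> float:
    """Normalized depth imbalance in [-1, 1]; positive means bid-heavy.

    Assumed formula, not confirmed by the dataset docs:
    (bid - ask) / (bid + ask).
    """
    total = bid_depth_usd + ask_depth_usd
    if total == 0:
        return 0.0  # empty book: treat as balanced
    return (bid_depth_usd - ask_depth_usd) / total

print(depth_imbalance(1_500_000, 500_000))  # bid-heavy book -> 0.5
```

A value near +1 would indicate almost all resting liquidity within 1% sits on the bid side; near -1, the ask side.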

Use Cases

R2 Path

s3://algotick-data-lake/states/imbalance/asset={asset}/year=YYYY/month=MM/day=DD/node={node}/data.parquet
Replace YYYY, MM, DD with the target date. Data is collected from node=eu-central.
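The partition template above can be filled in programmatically; a small helper (the function name is ours) that builds the key for a given asset and date:

```python
from datetime import date


def imbalance_path(asset: str, day: date, node: str = "eu-central") -> str:
    """Build the Hive-partitioned R2 key for one day of imbalance data."""
    return (
        "s3://algotick-data-lake/states/imbalance/"
        f"asset={asset}/year={day.year:04d}/month={day.month:02d}/"
        f"day={day.day:02d}/node={node}/data.parquet"
    )


print(imbalance_path("btc", date(2026, 4, 20)))
```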

Query with DuckDB

import duckdb

# s3:// reads need DuckDB's httpfs extension and S3 credentials
# (httpfs is autoloaded in recent DuckDB releases)
df = duckdb.sql("""
    SELECT *
    FROM read_parquet(
        's3://algotick-data-lake/states/imbalance/asset=btc/year=2026/month=04/day=20/node=eu-central/data.parquet'
    )
    LIMIT 100
""").df()

print(f"Rows: {len(df)}, Columns: {list(df.columns)}")

Download via API

import requests

resp = requests.get(
    "https://algotick.dev/v2/history/raw?dataset=imbalance&coin=BTC&date=2026-04-20",
    stream=True,
    timeout=30,
)
resp.raise_for_status()  # fail fast instead of writing an error body to disk

with open("imbalance.parquet", "wb") as f:
    for chunk in resp.iter_content(8192):
        f.write(chunk)

# Then query locally
import duckdb
df = duckdb.sql("SELECT * FROM 'imbalance.parquet' LIMIT 100").df()
print(df)
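The endpoint presumably requires the API key from the link below, but this page does not document how to pass it. The Bearer `Authorization` header in this sketch is therefore an assumption, as is the helper function name:

```python
import requests


def build_download_request(coin: str, day: str, api_key: str) -> requests.PreparedRequest:
    """Prepare the raw-history download request without sending it.

    The Bearer header is an assumed auth scheme, not documented on this page.
    """
    return requests.Request(
        "GET",
        "https://algotick.dev/v2/history/raw",
        params={"dataset": "imbalance", "coin": coin, "date": day},
        headers={"Authorization": f"Bearer {api_key}"},  # assumption
    ).prepare()


req = build_download_request("BTC", "2026-04-20", "YOUR_API_KEY")
print(req.url)
# Send with: requests.Session().send(req, stream=True, timeout=30)
```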
Get API Key →

Related

Related signals: Whale Activity Composite Signal
Related datasets: L2 Order Book Snapshots, HLP Perpetual Trades, Perpetual Funding Rates

Don't just stare at the dashboard. Automate it.

Every metric on this page is available via our sub-millisecond API.
Build trading bots, backtest strategies, and power AI agents with institutional-grade data.

Explore API →

Continue the Research

⚡ API Reference · 🍳 Cookbook · 📊 Backtests · 📖 Playbooks