Crypto Market Data Library

Institutional-grade datasets stored as Hive-partitioned Apache Parquet files. Free download via R2 or stream through the API.

15 Datasets · 2 Ingest Nodes · Parquet Format · Daily Updates

Event Streams

Append-only transaction-level data. Every trade, swap, bridge transfer, and staking event captured as it happens.

State Snapshots

Point-in-time snapshots of market state. Orderbook depth, funding rates, gamma exposure, and network metrics.

Quick Start

# Read any dataset with DuckDB (no download needed)
import duckdb

# DuckDB resolves s3:// URLs through its httpfs extension (autoloaded in
# recent versions). For R2, point DuckDB at your account's R2 endpoint first:
# duckdb.sql("SET s3_endpoint='<account-id>.r2.cloudflarestorage.com'")

df = duckdb.sql("""
    SELECT * FROM read_parquet(
        's3://algotick-data-lake/events/trades/exchange=hyperliquid/year=2026/month=03/day=14/node=eu-central/data.parquet'
    ) LIMIT 100
""").df()
print(df)

Architecture

Dual-Citadel Ingest: Data is independently collected by two geographically separated nodes (Frankfurt, EU and Canada, NA). This enables cross-validation and geographic arbitrage analysis.
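Because each node writes its own `node={region}` partition, the two copies can be diffed directly. A minimal sketch of such a cross-validation, assuming each event carries a unique ID (the ID field and node names here are illustrative, not part of the documented schema):

```python
def cross_validate(node_a_ids: set, node_b_ids: set) -> dict:
    """Compare event IDs captured by two ingest nodes for the same partition.

    Events seen by only one node flag ingestion gaps or latency skew;
    the overlap ratio gives a quick data-quality score.
    """
    union = node_a_ids | node_b_ids
    return {
        "only_a": node_a_ids - node_b_ids,   # missed by node B
        "only_b": node_b_ids - node_a_ids,   # missed by node A
        "overlap": len(node_a_ids & node_b_ids) / max(len(union), 1),
    }

# Hypothetical trade IDs from the eu-central and na nodes for one day
report = cross_validate({"t1", "t2", "t3"}, {"t2", "t3", "t4"})
print(report["overlap"])  # 0.5
```

In practice the ID sets would come from two `read_parquet` calls differing only in the `node=` path segment.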

Hive Partitioning: Every file follows the pattern {category}/{dataset}/{partition}/year=YYYY/month=MM/day=DD/node={region}/data.parquet for efficient date-range and region-specific queries.
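The partition pattern above can be built programmatically when you need to target a specific file. A small helper following the documented layout (the example values mirror the quick-start path; zero-padding of month and day matches it):

```python
from datetime import date

def partition_path(category: str, dataset: str, partition: str,
                   day: date, node: str) -> str:
    """Build a Hive-style object key:
    {category}/{dataset}/{partition}/year=YYYY/month=MM/day=DD/node={region}/data.parquet
    """
    return (
        f"{category}/{dataset}/{partition}/"
        f"year={day.year}/month={day.month:02d}/day={day.day:02d}/"
        f"node={node}/data.parquet"
    )

print(partition_path("events", "trades", "exchange=hyperliquid",
                     date(2026, 3, 14), "eu-central"))
# events/trades/exchange=hyperliquid/year=2026/month=03/day=14/node=eu-central/data.parquet
```

For date-range queries, query engines such as DuckDB can also glob across these partitions and push `year`/`month`/`day` filters down, so only the matching files are read.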

Dual Timestamps: Every row has time_chain (on-chain/exchange event time) and time_local (ingestion time) for microsecond-accuracy latency analysis.
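The gap between the two timestamps is the ingestion latency. A minimal sketch, assuming both columns are timezone-aware timestamps (the sample values are made up for illustration):

```python
from datetime import datetime, timezone

def ingest_latency_us(time_chain: datetime, time_local: datetime) -> int:
    """Microseconds between the on-chain/exchange event time and the
    moment the ingest node recorded it."""
    return int((time_local - time_chain).total_seconds() * 1_000_000)

chain = datetime(2026, 3, 14, 12, 0, 0, 250_000, tzinfo=timezone.utc)
local = datetime(2026, 3, 14, 12, 0, 0, 251_500, tzinfo=timezone.utc)
print(ingest_latency_us(chain, local))  # 1500
```

Aggregating this per node and per venue separates genuine on-chain delays from collector-side lag.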

Don't just stare at the dashboard. Automate it.

Every metric on this page is available via our sub-millisecond API.
Build trading bots, backtest strategies, and power AI agents with institutional-grade data.

Explore API →

Continue the Research

⚡ API Reference · 🍳 Cookbook · 📊 Backtests · 📖 Playbooks