HLP Perpetual Trades
Tick-level trade executions from HyperLiquid perpetual futures
Category: Events (append-only transactions)
Granularity: Per-tick (50–500 ticks/sec during volatility)
Format: Apache Parquet (Snappy compression, Hive-partitioned)
Nodes: 1 (EU-Central), Frankfurt (low-latency to HyperLiquid & Ethereum)
About This Dataset
Every fill is captured with price, size, side, and a sub-millisecond timestamp. Covers BTC, ETH, SOL, and HYPE perpetual futures.
Partitions
exchange: hyperliquid

Schema
| Column | Type | Description |
|---|---|---|
time_chain | timestamp | Exchange execution timestamp (UTC) |
time_local | timestamp | Ingestion timestamp (UTC) |
coin | string | Asset symbol (BTC, ETH, SOL, etc.) |
side | string | Trade side: buy or sell |
price | float64 | Execution price (USD) |
size | float64 | Trade size in base asset units |
trade_id | string | Exchange trade identifier |
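Because the schema carries both the exchange execution timestamp and the ingestion timestamp, their difference gives per-trade ingestion latency. A minimal sketch with invented timestamp values (not real fills):

```python
from datetime import datetime, timezone

# Hypothetical timestamps shaped like time_chain / time_local (values invented)
time_chain = datetime(2026, 4, 20, 12, 0, 0, 120_000, tzinfo=timezone.utc)
time_local = datetime(2026, 4, 20, 12, 0, 0, 380_000, tzinfo=timezone.utc)

# Ingestion latency: local capture time minus exchange execution time
latency_ms = (time_local - time_chain).total_seconds() * 1000
print(latency_ms)
```

The same subtraction works column-wise on a DataFrame once both columns are parsed as timezone-aware timestamps.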
Use Cases
- Reconstruct OHLCV candles at any resolution
- Compute VWAP, TWAP, and cumulative volume delta
- Measure trade-flow toxicity and order-flow imbalance
- Identify institutional block trades and whale activity
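The first two use cases above can be sketched with pandas on a handful of synthetic rows shaped like the schema (prices and sizes are illustrative, not real market data):

```python
import pandas as pd

# Synthetic fills matching the time_chain / price / size columns
trades = pd.DataFrame({
    "time_chain": pd.to_datetime([
        "2026-04-20 12:00:00.100", "2026-04-20 12:00:30.500",
        "2026-04-20 12:00:59.900", "2026-04-20 12:01:10.000",
    ], utc=True),
    "price": [100.0, 102.0, 101.0, 103.0],
    "size":  [1.0,   2.0,   1.0,   0.5],
}).set_index("time_chain")

# Reconstruct 1-minute OHLCV bars from the tick stream
ohlc = trades["price"].resample("1min").ohlc()
volume = trades["size"].resample("1min").sum()

# Size-weighted average price over the whole window
vwap = (trades["price"] * trades["size"]).sum() / trades["size"].sum()
print(ohlc)
print(f"VWAP: {vwap:.4f}")
```

Swapping `"1min"` for any other offset alias (`"5s"`, `"1h"`, ...) re-bins the same ticks at a different resolution.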
R2 Path
s3://algotick-data-lake/events/trades/exchange=hyperliquid/year=YYYY/month=MM/day=DD/node={node}/data.parquet
Replace YYYY, MM, DD with the target date.
Data is collected from node=eu-central.
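Since the layout is Hive-partitioned by date, the object path for any day can be built with standard date formatting. A small sketch (the helper name is ours, not part of the API):

```python
from datetime import date

BASE = "s3://algotick-data-lake/events/trades/exchange=hyperliquid"

def partition_path(d: date, node: str = "eu-central") -> str:
    """Build the Hive-style partition path for one UTC day."""
    return f"{BASE}/year={d:%Y}/month={d:%m}/day={d:%d}/node={node}/data.parquet"

print(partition_path(date(2026, 4, 20)))
```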
Query with DuckDB
```python
import duckdb

df = duckdb.sql("""
    SELECT *
    FROM read_parquet(
        's3://algotick-data-lake/events/trades/exchange=hyperliquid/year=2026/month=04/day=20/node=eu-central/data.parquet'
    )
    LIMIT 100
""").df()
print(f"Rows: {len(df)}, Columns: {list(df.columns)}")
```
Download via API
```python
import requests

resp = requests.get(
    "https://algotick.dev/v2/history/raw?dataset=trades&date=2026-04-20",
    stream=True,
)
resp.raise_for_status()  # fail fast on an HTTP error instead of saving an error page
with open("trades.parquet", "wb") as f:
    for chunk in resp.iter_content(8192):
        f.write(chunk)

# Then query locally
import duckdb

df = duckdb.sql("SELECT * FROM 'trades.parquet' LIMIT 100").df()
print(df)
```
Related
Don't just stare at the dashboard. Automate it.
Every metric on this page is available via our sub-millisecond API.
Build trading bots, backtest strategies, and power AI agents with institutional-grade data.