
HLP Perpetual Trades

Tick-level trade executions from HyperLiquid perpetual futures

Category
Events
Append-only transactions
Granularity
Per-tick (50–500 ticks/sec during volatility)
Format
Apache Parquet
Snappy compression, Hive-partitioned
Nodes
1 (EU-Central)
Frankfurt — low-latency to HyperLiquid & Ethereum

About This Dataset

Every fill is captured with price, size, side, and a sub-millisecond timestamp. Covers BTC, ETH, SOL, and HYPE perpetual futures.

Partitions
exchange: hyperliquid

Schema

Column | Type | Description
time_chain | timestamp | Exchange execution timestamp (UTC)
time_local | timestamp | Ingestion timestamp (UTC)
coin | string | Asset symbol (BTC, ETH, SOL, etc.)
side | string | Trade side: buy or sell
price | float64 | Execution price (USD)
size | float64 | Trade size in base asset units
trade_id | string | Exchange trade identifier
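With this schema, common aggregates fall out directly from the `price`, `size`, and `coin` columns. A minimal sketch of a per-coin VWAP calculation in pandas (the sample rows are fabricated for illustration; real data comes from the Parquet files described below):

```python
import pandas as pd

# Illustrative rows only -- real data comes from the dataset's Parquet files.
trades = pd.DataFrame({
    "coin":  ["BTC", "BTC", "ETH"],
    "side":  ["buy", "sell", "buy"],
    "price": [70000.0, 70010.0, 3500.0],
    "size":  [0.5, 1.0, 2.0],
})

# Volume-weighted average price per coin: sum(price * size) / sum(size)
trades["notional"] = trades["price"] * trades["size"]
vwap = trades.groupby("coin")["notional"].sum() / trades.groupby("coin")["size"].sum()
print(vwap)
```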

R2 Path

s3://algotick-data-lake/events/trades/exchange=hyperliquid/year=YYYY/month=MM/day=DD/node={node}/data.parquet
Replace YYYY, MM, DD with the target date. Data is collected from node=eu-central.
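The date placeholders can be filled in programmatically from the partition template above. A minimal sketch (the `trades_path` helper name is our own, not part of any official client):

```python
from datetime import date

# Hive partition template from the R2 path above.
TEMPLATE = (
    "s3://algotick-data-lake/events/trades/exchange=hyperliquid/"
    "year={d.year:04d}/month={d.month:02d}/day={d.day:02d}/node={node}/data.parquet"
)

def trades_path(d: date, node: str = "eu-central") -> str:
    # Zero-pad month and day to match the partition layout.
    return TEMPLATE.format(d=d, node=node)

print(trades_path(date(2026, 4, 20)))
```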

Query with DuckDB

import duckdb

# Reading s3:// paths requires DuckDB's httpfs extension plus S3-style
# credentials for the R2 bucket (endpoint, key ID, secret).
duckdb.sql("INSTALL httpfs; LOAD httpfs;")

df = duckdb.sql("""
    SELECT *
    FROM read_parquet(
        's3://algotick-data-lake/events/trades/exchange=hyperliquid/year=2026/month=04/day=20/node=eu-central/data.parquet'
    )
    LIMIT 100
""").df()

print(f"Rows: {len(df)}, Columns: {list(df.columns)}")
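For backtests spanning several days, `read_parquet` also accepts a list of paths, so one query can cover a date range. A sketch that builds such a query from the partition layout above (the `day_paths` helper is our own; the final DuckDB call is commented out because it needs httpfs and R2 credentials configured):

```python
from datetime import date, timedelta

def day_paths(start: date, end: date, node: str = "eu-central") -> list[str]:
    # One Parquet file per day under the Hive partition layout.
    paths, d = [], start
    while d <= end:
        paths.append(
            "s3://algotick-data-lake/events/trades/exchange=hyperliquid/"
            f"year={d.year:04d}/month={d.month:02d}/day={d.day:02d}/node={node}/data.parquet"
        )
        d += timedelta(days=1)
    return paths

paths = day_paths(date(2026, 4, 18), date(2026, 4, 20))
sql = f"SELECT coin, count(*) AS n FROM read_parquet({paths!r}) GROUP BY coin"
# duckdb.sql(sql).df()  # requires httpfs + R2 credentials configured
```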

Download via API

import requests

# Authenticated endpoints may need your API key in a request header;
# check the API reference for the exact header name.
resp = requests.get(
    "https://algotick.dev/v2/history/raw?dataset=trades&date=2026-04-20",
    stream=True,
    timeout=60,
)
resp.raise_for_status()

with open("trades.parquet", "wb") as f:
    for chunk in resp.iter_content(8192):
        f.write(chunk)

# Then query locally
import duckdb
df = duckdb.sql("SELECT * FROM 'trades.parquet' LIMIT 100").df()
print(df)
Get API Key →

Related

Related signals: Whale Activity · Liquidations · Volatility
Related datasets: L2 Order Book Snapshots · Perpetual Funding Rates · Cross-Venue Orderbook Imbalance

Don't just stare at the dashboard. Automate it.

Every metric on this page is available via our sub-millisecond API.
Build trading bots, backtest strategies, and power AI agents with institutional-grade data.

Explore API →

Continue the Research

⚡ API Reference · 🍳 Cookbook · 📊 Backtests · 📖 Playbooks