LP Mint/Burn Events

Concentrated liquidity LP events from Uniswap V3 and Aerodrome

Category: Events (append-only transactions)
Granularity: Per-transaction (every LP event)
Format: Apache Parquet (Snappy compression, Hive-partitioned)
Nodes: 1 (EU-Central, Frankfurt), low-latency to HyperLiquid & Ethereum

About This Dataset

Mint, Burn, and Swap events from Uniswap V3 and Aerodrome concentrated liquidity pools. Together, these events allow exact reconstruction of each pool's liquidity distribution across tick ranges (the decentralized orderbook) for slippage modeling.

Partitions
chain: base, arbitrum, ethereum

Schema

Column     | Type      | Description
time_chain | timestamp | Block timestamp (UTC)
time_local | timestamp | Ingestion timestamp (UTC)
pool       | string    | Pool contract address
event_type | string    | Event type: Mint, Burn, or Swap
tick_lower | int64     | Lower tick boundary
tick_upper | int64     | Upper tick boundary
amount0    | float64   | Amount of token0
amount1    | float64   | Amount of token1
tx_hash    | string    | Transaction hash
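
As a sketch of how the schema is typically used, the query below nets Mint and Burn amounts per pool and tick range. It assumes a locally downloaded file named lp-flows.parquet (as produced by the download example further down) and that amount0/amount1 are reported as positive magnitudes for both event types, so Burn amounts are negated here.

import duckdb

# Net liquidity flow per (pool, tick range). Assumes amounts are positive
# magnitudes: Mint adds to the range, Burn removes (negated below).
net = duckdb.sql("""
    SELECT
        pool,
        tick_lower,
        tick_upper,
        SUM(CASE WHEN event_type = 'Mint' THEN amount0 ELSE -amount0 END) AS net_amount0,
        SUM(CASE WHEN event_type = 'Mint' THEN amount1 ELSE -amount1 END) AS net_amount1
    FROM 'lp-flows.parquet'
    WHERE event_type IN ('Mint', 'Burn')
    GROUP BY pool, tick_lower, tick_upper
    ORDER BY pool, tick_lower
""").df()

print(net.head())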

Use Cases

R2 Path

s3://algotick-data-lake/events/lp_flows/chain={chain}/year=YYYY/month=MM/day=DD/node={node}/data.parquet
Replace YYYY, MM, DD with the target date. Data is collected from node=eu-central.
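
For example, the template can be filled in with a plain format string (the helper name lp_flows_path is ours, for illustration only):

from datetime import date

# Build the object path for a given chain and day from the template above.
def lp_flows_path(chain: str, day: date, node: str = "eu-central") -> str:
    return (
        "s3://algotick-data-lake/events/lp_flows/"
        f"chain={chain}/year={day.year}/month={day.month:02d}/day={day.day:02d}/"
        f"node={node}/data.parquet"
    )

print(lp_flows_path("base", date(2026, 4, 20)))
# s3://algotick-data-lake/events/lp_flows/chain=base/year=2026/month=04/day=20/node=eu-central/data.parquet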

Query with DuckDB

import duckdb

df = duckdb.sql("""
    SELECT *
    FROM read_parquet(
        's3://algotick-data-lake/events/lp_flows/chain=base/year=2026/month=04/day=20/node=eu-central/data.parquet'
    )
    LIMIT 100
""").df()

print(f"Rows: {len(df)}, Columns: {list(df.columns)}")

Download via API

import requests

# Pass your API key as documented in the API reference (see "Get API Key" below).
resp = requests.get(
    "https://algotick.dev/v2/history/raw?dataset=lp_flows&chain=base&date=2026-04-20",
    stream=True,
)
resp.raise_for_status()  # fail early on HTTP errors

with open("lp-flows.parquet", "wb") as f:
    for chunk in resp.iter_content(8192):
        f.write(chunk)

# Then query locally
import duckdb
df = duckdb.sql("SELECT * FROM 'lp-flows.parquet' LIMIT 100").df()
print(df)
Get API Key →

Related

Related signals: Whale Activity
Related datasets: DEX Swaps, Stablecoin On-Chain Flows

Don't just stare at the dashboard. Automate it.

Every metric on this page is available via our sub-millisecond API.
Build trading bots, backtest strategies, and power AI agents with institutional-grade data.

Explore API →

Continue the Research

⚡ API Reference · 🍳 Cookbook · 📊 Backtests · 📖 Playbooks