Flashbots Builder Bids
Per-block Flashbots relay builder bids on Ethereum
Category
Events
Append-only transactions
Granularity
Per-block (every Ethereum block)
Format
Apache Parquet
Snappy compression, Hive-partitioned
Nodes
1 (EU-Central)
Frankfurt — low-latency to HyperLiquid & Ethereum
About This Dataset
Every Flashbots relay builder bid: block number, builder public key, bid value in ETH, gas used, and gas limit. Together these give a direct, per-block measure of what builders pay for Ethereum block space, and a window into MEV extraction and toxic order flow.
Partitions
chain: ethereum
Schema
| Column | Type | Description |
|---|---|---|
| time_chain | timestamp | Block timestamp (UTC) |
| block_number | int64 | Ethereum block number |
| builder_pubkey | string | Builder public key |
| bid_value_eth | float64 | Bid value in ETH |
| gas_used | int64 | Gas consumed by the block |
| gas_limit | int64 | Block gas limit |
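To make the schema concrete, here is a minimal sketch with made-up rows matching the columns above (the values are illustrative, not real bids); dividing `gas_used` by `gas_limit` yields each block's gas utilization.

```python
import pandas as pd

# Illustrative sample rows matching the dataset schema (not real bid data).
df = pd.DataFrame({
    "block_number": [19_000_000, 19_000_001],
    "builder_pubkey": ["0xaa...", "0xbb..."],
    "bid_value_eth": [0.042, 0.117],
    "gas_used": [14_500_000, 29_800_000],
    "gas_limit": [30_000_000, 30_000_000],
})

# Gas utilization: fraction of the block's gas limit actually consumed.
df["gas_utilization"] = df["gas_used"] / df["gas_limit"]
print(df[["block_number", "bid_value_eth", "gas_utilization"]])
```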
Use Cases
- Analyze MEV extraction trends over time
- Track builder competition and market concentration
- Correlate block space cost with network activity
- Study toxic order flow and its impact on traders
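Builder market concentration (the second use case) is commonly summarized with a Herfindahl-Hirschman index over winning builders. The sketch below uses hypothetical builder names and win counts, not real data.

```python
from collections import Counter

# Hypothetical winning builder per block for one day (illustrative only).
winners = ["builder_a"] * 60 + ["builder_b"] * 30 + ["builder_c"] * 10

counts = Counter(winners)
total = sum(counts.values())
shares = {builder: n / total for builder, n in counts.items()}

# Herfindahl-Hirschman index: sum of squared market shares.
# Ranges from 1/N (perfectly even across N builders) to 1.0 (monopoly).
hhi = sum(s ** 2 for s in shares.values())
print(f"HHI: {hhi:.3f}")
```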
R2 Path
s3://algotick-data-lake/events/mev_bids/chain=ethereum/year=YYYY/month=MM/day=DD/node={node}/data.parquet
Replace YYYY, MM, DD with the target date.
Data is collected from node=eu-central.
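A small helper can assemble the Hive-partitioned key for a given date; the function name `bids_path` is just for illustration.

```python
from datetime import date

BASE = "s3://algotick-data-lake/events/mev_bids/chain=ethereum"

def bids_path(d: date, node: str = "eu-central") -> str:
    """Build the object key for one day's bid file (illustrative helper)."""
    return f"{BASE}/year={d:%Y}/month={d:%m}/day={d:%d}/node={node}/data.parquet"

print(bids_path(date(2026, 4, 20)))
```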
Query with DuckDB
```python
import duckdb

df = duckdb.sql("""
    SELECT *
    FROM read_parquet(
        's3://algotick-data-lake/events/mev_bids/chain=ethereum/year=2026/month=04/day=20/node=eu-central/data.parquet'
    )
    LIMIT 100
""").df()
print(f"Rows: {len(df)}, Columns: {list(df.columns)}")
```
Download via API
```python
import requests

resp = requests.get(
    "https://algotick.dev/v2/history/raw?dataset=mev_bids&chain=ethereum&date=2026-04-20",
    stream=True,
)
resp.raise_for_status()
with open("mev-bids.parquet", "wb") as f:
    for chunk in resp.iter_content(8192):
        f.write(chunk)

# Then query locally
import duckdb
df = duckdb.sql("SELECT * FROM 'mev-bids.parquet' LIMIT 100").df()
print(df)
```
Related
Related signals: Gas Tracker
Related datasets: Network Congestion, HLP Perpetual Trades
Don't just stare at the dashboard. Automate it.
Every metric on this page is available via our sub-millisecond API.
Build trading bots, backtest strategies, and power AI agents with institutional-grade data.