Senior Data Engineer

Biconomy

Worldwide
Full-Time

Job Description

Biconomy empowers Web3 developers to build seamless, user-friendly dApps that work effortlessly across multiple blockchains. Our battle-tested modular account and execution stack eliminates traditional UX friction points, helping projects accelerate user adoption while reducing development costs. By processing over 70 million transactions across the 300+ dApps we've served, we're powering the future of onchain economies.

The Role - Powering the Next Generation of Real-Time Market Intelligence

HyperSignals is Biconomy’s high-throughput analytics engine, designed to extract millisecond-level insights from both on-chain activity and centralized exchange (CEX) trading flows. Our mission is to equip traders and protocols with alpha-rich signals across spot and perpetual futures markets.

As a Data Engineer, you’ll play a foundational role in building and scaling the infrastructure that powers HyperSignals. From ingesting terabytes of blockchain and CEX data to delivering analysis-ready datasets for our quant and product teams, you'll help create the data backbone of a truly cutting-edge trading intelligence system.

What You Will Be Doing

  • Design, build, and maintain streaming and batch ETL pipelines for on-chain sources across EVM, Solana, Sui, Starknet, and more
  • Develop NLP and sentiment pipelines for off-chain sources (Binance, Bybit, social platforms) to extract actionable market signals
  • Normalize and unify disparate market data schemas (order books, trades, liquidations, funding rates) into a single analytics model for perpetuals (see the sketch after this list)
  • Implement low-latency ingestion systems using Kafka, Kinesis, PubSub, WebSockets, or Firehose, with exactly-once guarantees
  • Build and optimize lakehouse/warehouse layers (Iceberg, Delta, Snowflake, BigQuery) with Z-ordering, partitioning, and materialized views
  • Enforce data quality and observability using dbt tests, Great Expectations, and OpenTelemetry
  • Collaborate with quants and backend engineers to deliver data via GraphQL/REST APIs and feature stores
  • Continuously optimize performance, cost, and scalability across AWS/GCP infrastructure
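
To make the schema-unification work concrete, here is a minimal sketch (not Biconomy code) of mapping two venues' trade messages into one analytics model. The `Trade` dataclass and the `from_binance` / `from_bybit` helpers are illustrative assumptions; the field names mirror Binance's aggTrade and Bybit's v5 publicTrade WebSocket payloads as commonly documented, and real pipelines would handle many more message types.

```python
"""Sketch: unify CEX trade messages into one analytics schema.

Field names follow Binance aggTrade and Bybit v5 publicTrade payloads
as commonly documented; treat them as illustrative, not exhaustive.
"""
from dataclasses import dataclass
from decimal import Decimal


@dataclass(frozen=True)
class Trade:
    """One row of the unified trades model."""
    exchange: str
    symbol: str
    price: Decimal
    size: Decimal
    side: str   # "buy" or "sell", taker-side convention
    ts_ms: int  # exchange event time, epoch milliseconds


def from_binance(msg: dict) -> Trade:
    # Binance aggTrade: "m" is True when the buyer is the maker,
    # i.e. the aggressor (taker) was selling.
    return Trade(
        exchange="binance",
        symbol=msg["s"],
        price=Decimal(msg["p"]),
        size=Decimal(msg["q"]),
        side="sell" if msg["m"] else "buy",
        ts_ms=int(msg["T"]),
    )


def from_bybit(msg: dict) -> Trade:
    # Bybit v5 publicTrade: "S" is already the taker side ("Buy"/"Sell").
    return Trade(
        exchange="bybit",
        symbol=msg["s"],
        price=Decimal(msg["p"]),
        size=Decimal(msg["v"]),
        side=msg["S"].lower(),
        ts_ms=int(msg["T"]),
    )


if __name__ == "__main__":
    binance_msg = {"s": "BTCUSDT", "p": "64000.10", "q": "0.5", "m": False, "T": 1700000000000}
    bybit_msg = {"s": "BTCUSDT", "p": "64001.00", "v": "0.25", "S": "Sell", "T": 1700000000123}
    print(from_binance(binance_msg))
    print(from_bybit(bybit_msg))
```

Note the use of Decimal rather than float: exchanges ship prices and sizes as strings precisely so consumers can avoid precision drift when aggregating downstream.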

Requirements

  • 4+ years of experience as a Data Engineer in high-throughput environments such as trading, crypto, or fintech
  • Expert-level Python (pandas, pyarrow, asyncio) and SQL skills, with strong fundamentals in algorithms and distributed systems
  • Proven experience with streaming frameworks (Flink, Spark Structured Streaming, Kafka Streams) and message buses (Kafka, Kinesis, Pulsar)
  • In-depth understanding of blockchain data structures (blocks, receipts, logs), indexers (The Graph, Substreams), and node/RPC infrastructure
  • Familiarity with CEX market APIs (REST & WebSocket) and the mechanics of perpetual futures (funding, mark price, open interest, liquidations); a worked funding example follows this list
  • Proficient in cloud-native development (AWS or GCP), including IaC (Terraform/CDK), CI/CD, and container orchestration (EKS/GKE)
  • A strong track record of building and owning production systems end-to-end, with clear documentation and operational rigor
  • Passionate about perpetual futures and market microstructure; you don’t need to be a trader, but curiosity is key
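
For candidates newer to perp mechanics, the snippet below is a minimal worked example of the funding relationship named above. It assumes the common convention (used by major venues, though exact formulas and intervals vary) that a positive funding rate means longs pay shorts; the `funding_payment` helper is purely illustrative.

```python
"""Sketch of perpetual-futures funding mechanics, under the common
convention that a positive rate means longs pay shorts. Exact formulas
and funding intervals vary by venue."""
from decimal import Decimal


def funding_payment(position_size: Decimal, mark_price: Decimal,
                    funding_rate: Decimal) -> Decimal:
    """Funding paid (+) or received (-) by the position holder.

    position_size: signed contract quantity (positive = long)
    mark_price:    venue mark price at the funding timestamp
    funding_rate:  per-interval rate, e.g. Decimal("0.0001") for 0.01%
    """
    notional = position_size * mark_price
    return notional * funding_rate


if __name__ == "__main__":
    # A 2 BTC long at a $64,000 mark with a +0.01% rate pays $12.80
    # to the shorts at this funding interval.
    print(funding_payment(Decimal("2"), Decimal("64000"), Decimal("0.0001")))
```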

What We Offer

  • Flexible Working Hours - Enjoy autonomy over your schedule
  • Generous Vacation Policy - 25 days vacation per year plus public holidays
  • Competitive Salary - With regular performance reviews
  • Token Allocation - Be rewarded with tokens as part of our compensation package
  • Growth Opportunities - Be part of an exciting new project with significant career growth potential
  • Innovative Work Culture - Join a team that’s at the cutting edge of Web3, AI, and DeFi, and help shape the future of the digital economy
  • Fun and Engaging Team Activities - Game nights, virtual celebrations, and work retreats to keep things exciting