Tempo is a layer-1 blockchain purpose-built for stablecoins and real-world payments, born from Stripe’s experience in global payments and Paradigm’s expertise in crypto tech.
Tempo’s payment-first design provides a scalable, low-cost, predictable backbone that meets the needs of high-volume payment use cases. Our goal is to move money reliably, cheaply, and at scale. Our north star is simplicity for users: fintechs, traditional banks, merchants, platforms, and anyone else looking to move their payments into the 21st century.
We're building Tempo with design partners who are global leaders in AI, e-commerce, and financial services: Anthropic, Coupang, Deutsche Bank, DoorDash, Mercury, Nubank, OpenAI, Revolut, Shopify, Standard Chartered, Visa, and more.
We’re a team of crypto-optimists, building the infrastructure needed to bring real, substantial economic flows onchain. Our team primarily works in-person out of our San Francisco and NYC offices. We like to move fast and swing for the fences — join us!
The Role
As Tempo’s first data engineer, you’ll architect and operate the core data infrastructure that powers analytics across the Tempo blockchain. You’ll design ETL pipelines, index on-chain data, and ensure the reliability and scalability of our data systems. Your work will make blockchain data accessible — fueling analytics, dashboards, and developer tools that power Tempo’s ecosystem. You’ll connect data from today’s stablecoin activity on existing blockchains with data from payment systems worldwide, producing meaningful metrics and reports that support moving payments on-chain.
Responsibilities
Create, from scratch, Tempo’s data pipelines and infrastructure using modern technologies and techniques, and onboard the company onto this infrastructure.
Design and build scalable data pipelines to extract, transform, and load blockchain data (transactions, validator metrics, token activity, smart contracts, etc.).
Develop and maintain data warehouses, ETLs, and APIs to support analytics, explorers, and ecosystem partners.
Work closely with protocol engineers to ensure accurate instrumentation of on-chain metrics.
Optimize data ingestion and storage systems for real-time and historical blockchain data.
Collaborate with data analysts/scientists and researchers to operationalize models and insights.
Build or contribute to open-source tools for blockchain data processing (e.g., indexers, query layers, analytics SDKs).
Ensure data quality, security, and scalability across Tempo’s data infrastructure.
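For illustration only, the extract-and-transform work described above might be sketched as follows. This assumes an EVM-style node exposing the standard Ethereum JSON-RPC method `eth_getBlockByNumber`; the function names and row schema are hypothetical, not part of any Tempo API.

```python
import json

# Hypothetical sketch of the extract/transform steps in an on-chain ETL
# pipeline: build a JSON-RPC request for a block, then flatten the block's
# transactions into rows ready for loading into a warehouse. Field names
# follow the standard Ethereum JSON-RPC spec; nothing here is Tempo-specific.

def build_block_request(block_number: int) -> bytes:
    """Build the JSON-RPC payload for eth_getBlockByNumber."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_getBlockByNumber",
        "params": [hex(block_number), True],  # True = include full tx objects
    }).encode()

def flatten_transactions(block: dict) -> list[dict]:
    """Pull the per-transaction fields analytics typically care about."""
    return [
        {
            "block_number": int(block["number"], 16),
            "tx_hash": tx["hash"],
            "from_addr": tx["from"],
            "to_addr": tx.get("to"),         # None for contract creation
            "value_wei": int(tx["value"], 16),
        }
        for tx in block.get("transactions", [])
    ]
```

In practice the load step would land these rows in a warehouse table keyed by `(block_number, tx_hash)`, with an orchestrator handling backfills and reorg-safe re-ingestion.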
Qualifications
5+ years of experience as a Data Engineer, Analytics Engineer, or similar role.
Strong proficiency in Python, SQL, and data pipeline frameworks.
Experience building data infrastructure and pipelines from scratch.
Experience with cloud data warehouses and data orchestration.
Familiarity with blockchain data structures, RPC APIs, and node querying.
Experience designing real-time or large-scale data systems.
Knowledge of EVM smart contract platforms and indexing architectures.
Strong understanding of data modeling, schema design, and performance optimization.
Attributes
Razor-sharp thinker with a strong focus on designing systems for reliability and performance.
Hands-on and proactive; comfortable owning infrastructure end-to-end.
Curious about blockchain internals and distributed systems.
Organized, detail-oriented, and execution-focused.
Collaborative mindset with an open-source ethos.