
Backend Engineer - Data Warehouse

Help us to increase the number of successful products in the world!

About PostHog

We're shipping every product that companies need to run their business, from their first day to the day they IPO, and beyond. The operating system for folks who build software.

We started with open-source product analytics, launched out of Y Combinator's W20 cohort. We've since shipped more than a dozen products, including:

  • A built-in data warehouse, so users can query product and customer data together using custom SQL insights.

  • A customer data platform, so they can send their data wherever they need with ease.

  • Max AI, an AI-powered analyst that answers product questions, helps users find useful session recordings, and writes custom SQL queries.

Next on the roadmap are CRM, messaging, revenue analytics, and support products. When we say every product that companies need to run their business, we really mean it!

We are:

  1. Product-led. More than 100,000 companies have installed PostHog, mostly driven by word-of-mouth. We have intensely strong product-market fit.

  2. Default alive. Revenue is growing 10% MoM on average, and we're very efficient. We raise money to push ambition and grow faster, not to keep the lights on.

  3. Well-funded. We've raised more than $100m from some of the world's top investors. We're set up for a long, ambitious journey.

We're focused on building an awesome product for end users, hiring exceptional teammates, shipping fast, and being as weird as possible.

Things we care about

  • Transparency: Everyone can read about our roadmap, how we pay (or even let go of) people, our strategy, and how we work, in our public company handbook. Internally, we share revenue, notes and slides from board meetings, and fundraising plans, so everyone has the context they need to make good decisions.

  • Autonomy: We don’t tell anyone what to do. Everyone chooses what to work on next based on what's going to have the biggest impact on our customers, and what they find interesting and motivating to work on. Engineers lead product teams and make product decisions. Teams are flexible and easy to change when needed.

  • Shipping fast: Why not now? We want to build a lot of products; we can't do that shipping at a normal pace. We've built the company around small teams – autonomous, highly-efficient groups of cracked engineers who can outship much larger companies because they own their products end-to-end.

  • Time for building: Nothing gets shipped in a meeting. We're a natively remote company. We default to async communication – PRs > Issues > Slack. Tuesdays and Thursdays are meeting-free days, and we prioritize heads down building time over perfect coordination. This will be the most productive job you've ever had.

  • Ambition: We want to solve big problems. We strongly believe that aiming for the best possible upside, and sometimes missing, is better than never trying. We're optimistic about what's possible and our ability to get there.

  • Being weird: Weird means redesigning an already world-class website for the 5th time. It means shipping literally every product that relates to customer data. It means building an objectively unnecessary developer toy with dubious shareholder value. Doing weird stuff is a competitive advantage. And it's fun.

Who we’re looking for

We're seeking a backend engineer who thrives on building robust, high-performance data pipelines. You’re passionate about turning complex ELT workflows into reliable products and working deeply with modern data formats like Arrow, Iceberg, and Delta. You enjoy pushing the boundaries of what data tools can do while ensuring they remain stable and production-ready.


Ideally, you're as much a data engineer as you are a software engineer. You would be a great fit if you've used the tools of the modern data stack (maybe you've even built some) and you've also built complex software from the ground up.

What makes this role unique

At PostHog, data warehousing is both a core product for our users and a foundational platform for our internal teams. You'll help build the tools that enable users to import, transform, and analyze their data via SQL, while also creating the infrastructure that powers current and future PostHog features.


Our data stack is end-to-end:

  • We’ve developed our own SQL parser from scratch

  • Built pipelines to import data from APIs and databases

  • Created a SQL editor for data exploration

  • Developed a materialization pipeline to transform and serve data efficiently
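
To make "a SQL parser from scratch" concrete: below is a toy illustration, not PostHog's actual parser, of the hand-rolled tokenize-then-parse skeleton such a parser starts from, handling only a tiny `SELECT cols FROM table` statement. All names here are hypothetical.

```python
# Toy sketch of a hand-rolled SQL parser: tokenize the input,
# then walk the token stream. Real parsers add expressions,
# joins, precedence, error recovery, etc.
import re

TOKEN = re.compile(r"\s*([A-Za-z_][A-Za-z0-9_]*|,|\*)")

def tokenize(sql):
    """Split a tiny SQL string into identifiers, commas, and '*'."""
    return TOKEN.findall(sql)

def parse_select(sql):
    """Parse 'SELECT col, ... FROM table' into a small AST dict."""
    toks = tokenize(sql)
    assert toks and toks[0].upper() == "SELECT", "expected SELECT"
    # Collect column names up to the FROM keyword.
    i, cols = 1, []
    while i < len(toks) and toks[i].upper() != "FROM":
        if toks[i] != ",":
            cols.append(toks[i])
        i += 1
    assert i + 1 < len(toks), "expected FROM <table>"
    return {"columns": cols, "table": toks[i + 1]}
```

For example, `parse_select("SELECT id, name FROM users")` yields `{"columns": ["id", "name"], "table": "users"}`.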


There’s a huge breadth of challenges and opportunities to tackle, and nothing is off-limits. Data tooling is a first-class product at PostHog, not an afterthought. You’ll have the chance to build the data tools you’ve always wanted to use.

What you'll be doing

Our team stretches from California to Hungary, but we are remote-first and take it seriously. Every team at PostHog figures out the way of working that suits its members best. You'll be joining a small team of 3 engineers and 1 product manager.

Your core responsibility will be to maintain and grow our data pipeline that enables our users to import their data from API and database sources. The work will span anything from expanding the source library for our users to refactoring how we stream data from ClickHouse to object storage using Arrow. You should be very comfortable building well-architected and well-tested code. However, you are also pragmatic and know how to scope implementations in a way that allows you to ship fast.


Examples of day-to-day work:

  • Designing and implementing a core interface that makes it easy to expand our source library

  • Debugging memory issues in our data pipeline service

  • Implementing granular schema control for users to configure when setting up an import

  • Building a graph traverser to materialize user-submitted queries

  • Instrumenting usage tracking to allow users to understand their import volume and costs
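
The "graph traverser" item above boils down to a classic technique: topologically sorting the dependency graph of user-submitted queries so each view is materialized only after its inputs. A minimal sketch with hypothetical view names, using the standard library:

```python
# Sketch: materialize views in dependency order via topological sort.
# deps maps each view to the set of views it reads from.
from graphlib import TopologicalSorter

def materialization_order(deps):
    """Return views ordered so dependencies come before dependents."""
    return list(TopologicalSorter(deps).static_order())

deps = {
    "daily_active_users": {"events"},
    "retention": {"daily_active_users"},
    "events": set(),
}
# Each view's inputs are materialized by the time it runs.
order = materialization_order(deps)
```

A real traverser would also detect cycles (`graphlib` raises `CycleError`), skip up-to-date nodes, and run independent branches concurrently.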

Requirements

  • Experience with Python and Django. Our core application backend and data pipeline services are built with Python and Django

  • Hands-on experience with the Arrow data format. We stream data from ClickHouse to object storage with Arrow as the intermediary format

  • Strong skills in designing, architecting, and building data systems from the ground up

  • While frontend may not be your primary focus, you’re not afraid to dive in when needed

Nice to have

  • Experience using Temporal

  • Experience with ClickHouse

  • Experience with open source table formats (Iceberg or Delta tables)

  • Experience with ASTs

  • You've carried a pager and have dealt with incidents

  • You're comfortable with provisioning and deploying infrastructure


We believe people from diverse backgrounds, with different identities and experiences, make our product and our company better. That’s why we dedicated a page in our handbook to diversity and inclusion. No matter your background, we'd love to hear from you! Alignment with our values is just as important as experience! 🙏


Also, if you have a disability, please let us know if there's any way we can make the interview process better for you - we're happy to accommodate!

Average salary estimate

$150,000 / year (est.), range $120,000–$180,000

Employment type: Full-time, remote
Date posted: August 22, 2025