Full Time
Fully Remote (US or Canada Based)
Nest is building the care plan platform for modern veterinary hospitals. Our technology helps practices design, deliver, and measure wellness programs that make preventative care easy to follow and sustainable to provide. For pet parents, that means predictable costs and better outcomes. For hospitals, it means stronger client relationships, steadier revenue, and more pets staying healthy under their care.
We’re a mission-driven team working at the intersection of design, technology, and medicine, and we’re just getting started. If you’ve ever wanted your work to both advance technology and improve the lives of millions of pets, Nest is the place to do it.
We’re seeking a Data Engineer to design and scale the systems that power our integrations and analytics platform. As our first dedicated data hire, you’ll own the pipelines that connect practice management systems (PIMS), third-party APIs, and internal applications, making veterinary and financial data accessible, reliable, and actionable.
This is a chance to lay the foundation for our data infrastructure and build the analytics backbone that empowers veterinarians, pet parents, and our internal teams alike.
What you'll do:
Build and maintain ETL pipelines to integrate veterinary PIMS data with our systems
Develop integrations with third-party APIs to expand product capabilities
Design data models and real-time pipelines that power analytics and reporting
Partner with our Director of Finance to enable self-service analytics through Sigma Computing
Write clean, maintainable, well-documented code supported by tests
Collaborate with cross-functional teams (Engineering, Product, Finance) to deliver end-to-end solutions
Proactively identify and resolve technical challenges in a fast-moving environment
Bring an ownership mentality: improve processes, troubleshoot production issues, and propose new solutions
You might be a great fit if:
You bring 3–5+ years of professional data engineering experience
You’re fluent in Python (Polars, Pandas) and SQL, and comfortable working with dbt
You’ve built and maintained real-time data pipelines
You’re comfortable working independently and can explain complex ideas clearly
You thrive in startup environments where you can take ownership and move fast
Bonus points if:
You’ve worked with Google Cloud Platform (Dataflow, Spanner, BigQuery)
You’ve built or maintained analytics pipelines at scale
You’ve integrated with practice management systems (PIMS) or other healthcare software
You know Java, or you’ve worked with Sigma or other modern BI platforms
You’re curious about veterinary workflows and how data can transform them
Why join Nest:
Mission-Driven Impact: Build technology that improves pet health and strengthens the human-animal bond
Foundational Role: Be our first data hire and define our analytics infrastructure
Growth & Development: Opportunities to expand your scope as Nest scales
Modern Stack: Work with GCP, BigQuery, Sigma, and other best-in-class tools
Remote Flexibility: Fully remote team with flexible work options
Ready to use your engineering skills to transform veterinary care and make a difference for millions of pets? Apply now.