Machinify is a leading healthcare intelligence company with expertise across the payment continuum, delivering unmatched value, transparency, and efficiency to health plan clients across the country. Deployed by over 60 health plans, including many of the top 20, and representing more than 160 million lives, Machinify pairs a fully configurable, content-rich, AI-powered platform with best-in-class expertise. We’re constantly reimagining what’s possible in our industry, creating disruptively simple, powerfully clear ways to maximize financial outcomes and drive down healthcare costs.
As a Data Engineering Manager, you will lead a high-performing team responsible for transforming raw external and customer data into actionable, trusted datasets. Your team’s work powers product decisions, ML models, operational dashboards, and client integrations.
You’ll combine hands-on technical expertise with people and project leadership, reviewing and designing production pipelines, mentoring engineers, and driving best practices. You will also be a key cross-functional partner, collaborating with product managers, server, platform, and UI teams, SMEs, account managers, analytics teams, ML/DS teams, and customer success to ensure data is accurate, reliable, and impactful.
This is a high-visibility role with both strategic and tactical impact — shaping data workflows, onboarding new customers, and scaling the team as the company grows.
Lead, mentor, and grow a high-performing team of Data Engineers, fostering technical excellence, collaboration, and career growth.
Own the design, review, and optimization of production pipelines, ensuring high performance, reliability, and maintainability.
Drive customer data onboarding projects, standardizing external feeds into canonical models.
Collaborate with senior leadership to define team priorities, project roadmaps, and data standards, translating objectives into actionable assignments for your team.
Lead sprint planning and work with cross-functional stakeholders to prioritize initiatives that improve customer metrics and product impact.
Partner closely with Product, ML, Analytics, Engineering, and Customer teams to translate business needs into effective data solutions.
Ensure high data quality, observability, and automated validations across all pipelines.
Contribute hands-on to architecture, code reviews, and pipeline design when needed.
Identify and implement tools, templates, and best practices that improve team productivity and reduce duplication.
Build cross-functional relationships to advocate for data-driven decision-making and solve complex business problems.
Hire, mentor, and develop team members, fostering a culture of innovation, collaboration, and continuous improvement.
Communicate technical concepts and strategies effectively to both technical and non-technical stakeholders.
Measure team impact through metrics and KPIs, ensuring alignment with company goals.
Degree in Computer Science, Engineering, or a related field.
3+ years of combined technical leadership and engineering management experience, preferably in a startup, with a proven track record of managing data teams and delivering high-impact projects from concept to deployment.
10+ years of experience in data engineering, including building and maintaining production pipelines on distributed computing frameworks.
Strong expertise in Python, Spark, SQL, and Airflow.
Hands-on experience in pipeline architecture, code review, and mentoring junior engineers.
Prior experience with customer data onboarding and standardizing non-canonical external data.
Deep understanding of distributed data processing, pipeline orchestration, and performance tuning.
Exceptional ability to manage priorities, communicate clearly, and work cross-functionally, with experience building and leading high-performing teams.
Demonstrated experience leading small teams, including performance management and career development.
Comfortable with ambiguity, taking initiative, thinking strategically, and executing methodically.
Ability to drive change, inspire distributed teams, and solve complex problems with a data-driven mindset.
Customer-oriented, ensuring work significantly advances product value and impact.
Bonus:
Familiarity with healthcare data (837/835 claims, EHR, UB04).
Experience with cloud platforms (AWS/GCP), Databricks, streaming frameworks (Kafka/SQS), and containerized workflows (Docker/Kubernetes).
Experience building internal DE tooling, frameworks, or SDKs to improve team productivity.
High Impact: Your team’s work powers key decisions across product, ML, operations, and customer-facing initiatives.
Ownership & Growth: Influence the data platform and pipeline architecture while mentoring a growing team.
Cross-Functional Exposure: Work with product, platform, engineering, ML, analytics, and customer teams to solve meaningful problems.
Remote Flexibility: Fully remote with opportunities to collaborate across teams.
Early Builder Advantage: Shape processes, standards, and practices as we scale.
Equal Employment Opportunity at Machinify
Machinify is committed to hiring talented and qualified individuals with diverse backgrounds for all of its positions. Machinify believes that the gathering and celebration of unique backgrounds, qualities, and cultures enriches the workplace.
See our Candidate Privacy Notice at: https://www.machinify.com/candidate-privacy-notice/