Job Summary:
The Data Engineer III - Databricks is responsible for engineering, architecting, developing, and implementing unique and innovative enterprise-wide solutions.
Essential Functions:
Primary responsibility will be driving the cloud solution utilizing Azure DevOps and Databricks.
Key contributor for data stewardship workgroups and teams in data management and governance activities
Be a champion or ambassador of metadata and data quality practices across the organization – leading lunch ‘n’ learn sessions, knowledge sharing sessions, etc.
Gain an understanding of the stakeholder’s business culture and priorities, and the need for different modeling techniques when eliciting definitions and data quality requirements
Perform analysis and research to support and/or develop glossary definitions, policies, programs, standards, guidelines and workflows
Lead the development and maintenance of metadata catalogs
Identify and quantify data related pain points within the organization and assist in the development of remediation plans
Collaborate with SMEs to develop business rules that measure and assure data quality; facilitating discussions with end users and data modelers
Lead efforts to quantify and qualify reported data quality issues; develop and report program metrics to demonstrate progress and compliance
Sets objectives for own job with an understanding of how output affects related job areas. Contributes and provides input to the development of operational area goals within the context of own job area. Work consists of tasks that are typically not routine; works independently and applies discretion within established operational boundaries and procedures. Work direction is provided only for new areas of work or unique assignments.
Problems and issues faced are vague and require reasoning and analysis to address. Updates and/or modifies work methods based on past experience, and identifies solutions to problems by recognizing similarities through data trends. Has higher autonomy to arrive at solutions to non-routine problems within established operational boundaries and procedures. Uses personal experience, makes reasonable judgment calls, and may call on other resources where necessary to identify solutions.
Perform any other job duties as requested
Education and Experience:
Bachelor’s degree in Computer Science or related field or equivalent years of relevant work experience is required
A minimum of three (3) years of healthcare payer experience is required
A minimum of five (5) years of experience designing and developing ETL solutions
Demonstrable experience in all stages of definition, design, implementation, testing and deployment of complex database, ETL, BI and Analytical solutions is required
Participation and experience working within a formal Data Warehouse and Governance environment is preferred
Understanding of Data Vault and Kimball models
Healthcare Domain knowledge
Experience with the following:
Databricks
Python/PySpark
Azure Data Factory
Competencies, Knowledge and Skills:
Proven ability to write complex queries
Proven ability to develop medium- to complex data quality profiling or metadata analysis processes with an industry-leading tool
Ability to manage multiple medium- to large-scale projects while demonstrating a sense of urgency
Solid understanding of data management and governance concepts, especially data quality and metadata management
Ability to establish effective working relationships with stakeholders at all levels
Effective problem-solving skills with attention to detail
Strong interpersonal skills including excellent written and verbal communication skills; listening and critical thinking
Ability to effectively prioritize and execute tasks while working both independently and in a team-oriented, collaborative environment
Customer Service oriented
Working Conditions:
General office environment; may be required to sit or stand for extended periods of time
Compensation Range:
$92,300.00 - $161,600.00
CareSource takes into consideration a combination of a candidate's education, training, and experience as well as the position's scope and complexity, the discretion and latitude required for the role, and other external and internal data when establishing a salary level. In addition to base compensation, you may qualify for a bonus tied to company and individual performance. We are highly invested in every employee's total well-being and offer a substantial and comprehensive total rewards package.
Compensation Type (hourly/salary):
Salary
Organization Level Competencies:
Create an Inclusive Environment
Cultivate Partnerships
Develop Self and Others
Drive Execution
Influence Others
Pursue Personal Excellence
Understand the Business