The HIM Data Integrity Specialist is responsible for maintaining data integrity in the Master Patient Index (MPI) database. Provides analytical and operational support to ensure necessary corrections and synchronization between various systems. Performs data integrity functions to ensure timely corrections and accurate information within the electronic health record. Performs systematic merges of duplicate patients in the appropriate systems to ensure that patient matches have been completed correctly. Ensures the appropriate validations for requested amendments to the medical record. Reports and resolves MPI-related inquiries in a timely manner. Applies departmental policies and procedures to minimize duplication of medical records while promoting continuous process improvement.
DCH Standards:
High school diploma or GED required. Minimum of one year of experience in a Medical Records Department preferred. Knowledge of Master Patient Index (MPI) workflow preferred. Must have strong personal computer skills and a high level of experience operating equipment such as printers, computers, and fax machines. Exhibits interpersonal skills and the ability to deal effectively with all levels of staff. Must be detail-oriented, self-motivated, and able to stay focused on tasks for extended periods of time. Knowledge of and experience in patient security, identity, and patient record matching preferred. Knowledge of applicable state and federal laws preferred. Must be able to read, write legibly, speak, and comprehend English.