Data Engineer

Job details

Job Title: Data Engineer
Position Type: Full-Time, Remote
Working Hours: U.S. client business hours (with flexibility for pipeline monitoring and data refresh cycles)

About the Role:
Our client is seeking a Data Engineer to design, build, and maintain reliable data pipelines and infrastructure that deliver clean, accessible, and actionable data. This role requires strong software engineering fundamentals, experience with modern data stacks, and an eye for quality and scalability. The Data Engineer ensures data flows seamlessly from source systems to warehouses and BI tools, powering decision-making across the business.

Responsibilities:
Pipeline Development:

  • Build and maintain ETL/ELT pipelines using Python, SQL, or Scala.
  • Orchestrate workflows with Airflow, Prefect, Dagster, or Luigi (see the Airflow sketch after this list).
  • Ingest structured and unstructured data from APIs, SaaS platforms, relational databases, and streaming sources.
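
As a rough illustration of the orchestration work above, here is a minimal hourly ELT DAG, assuming Airflow 2.x; the task names and helper logic are placeholders, not the client's actual pipeline.

    # Minimal hourly ELT DAG (illustrative task names and logic).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders():
        # Pull a batch of raw records from a source API or database (placeholder).
        print("extracting raw orders")

    def load_to_warehouse():
        # Write the extracted batch into a warehouse staging table (placeholder).
        print("loading staged orders")

    with DAG(
        dag_id="orders_elt",
        start_date=datetime(2025, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
        extract >> load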

Data Warehousing:

  • Manage data warehouses (Snowflake, BigQuery, Redshift).
  • Design schemas (star/snowflake) optimized for analytics.
  • Implement partitioning, clustering, and query performance tuning (a DDL sketch follows this list).
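
For the partitioning and clustering work above, a minimal sketch assuming BigQuery and the google-cloud-bigquery Python client; the dataset, table, and column names are illustrative.

    # Create a date-partitioned, clustered fact table (illustrative names).
    from google.cloud import bigquery

    client = bigquery.Client()  # relies on default project credentials

    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.fact_orders (
      order_id STRING,
      customer_id STRING,
      order_ts TIMESTAMP,
      amount NUMERIC
    )
    PARTITION BY DATE(order_ts)  -- prune scans to only the dates a query touches
    CLUSTER BY customer_id       -- co-locate rows on a common filter/join key
    """
    client.query(ddl).result()  # block until the DDL job completes

Snowflake and Redshift offer analogous levers (clustering keys, distribution and sort keys), each with warehouse-specific syntax.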

Data Quality & Governance:

  • Implement validation checks, anomaly detection, and logging for data integrity (see the validation sketch after this list).
  • Enforce naming conventions, lineage tracking, and documentation (dbt, Great Expectations).
  • Maintain compliance with GDPR, HIPAA, or industry-specific regulations.
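
As a small example of the validation work above, a framework-agnostic check written with pandas rather than any specific tool; the column names and rules are hypothetical.

    # Reject a batch with null keys, duplicate keys, or negative amounts.
    import pandas as pd

    def validate_orders(df: pd.DataFrame) -> list[str]:
        errors = []
        if df["order_id"].isnull().any():
            errors.append("null order_id values")
        if df["order_id"].duplicated().any():
            errors.append("duplicate order_id values")
        if (df["amount"] < 0).any():
            errors.append("negative order amounts")
        return errors

    batch = pd.DataFrame({"order_id": ["a1", "a2"], "amount": [10.0, 25.5]})
    problems = validate_orders(batch)
    if problems:
        raise ValueError(f"data quality check failed: {problems}")

In practice the same rules would typically live in dbt tests or Great Expectations suites so failures are logged and surfaced before data reaches BI tools.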

Streaming & Real-Time Data:

  • Develop and monitor streaming pipelines with Kafka, Kinesis, or Pub/Sub (see the consumer sketch after this list).
  • Ensure low-latency ingestion for time-sensitive use cases.
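
A minimal consumer sketch for the streaming work above, assuming the kafka-python client, a local broker, and a hypothetical "orders" topic.

    # Read JSON events from a Kafka topic and report per-message ingestion lag.
    import json
    import time

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "orders",                            # hypothetical topic name
        bootstrap_servers="localhost:9092",  # placeholder broker address
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    for message in consumer:
        lag_seconds = time.time() - (message.timestamp / 1000)  # broker timestamp is in ms
        print(f"offset={message.offset} lag={lag_seconds:.1f}s event={message.value}")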

Collaboration:

  • Partner with analysts and data scientists to provide curated, reliable datasets.
  • Support BI teams in building dashboards (Tableau, Looker, Power BI).
  • Document data models and pipelines for knowledge transfer.

Infrastructure & DevOps:

  • Containerize data services with Docker and orchestrate in Kubernetes.
  • Automate deployments via CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI).
  • Manage cloud infrastructure using Terraform or CloudFormation.

What Makes You a Perfect Fit:

  • Passion for clean, reliable, and scalable data.
  • Strong problem-solving skills and a debugging mindset.
  • Balance of software engineering rigor and data intuition.
  • Collaborative communicator who thrives in cross-functional environments.

Required Experience & Skills (Minimum):

  • 3+ years in data engineering or back-end development.
  • Strong Python and SQL skills.
  • Experience with at least one major data warehouse (Snowflake, Redshift, BigQuery).
  • Familiarity with pipeline orchestration tools (Airflow, Prefect).

Ideal Experience & Skills:

  • Experience with dbt for transformations and data modeling.
  • Streaming data experience (Kafka, Kinesis, Pub/Sub).
  • Cloud-native data platforms (AWS Glue, GCP Dataflow, Azure Data Factory).
  • Background in regulated industries (healthcare, finance) with strict compliance.

What Does a Typical Day Look Like?
A Data Engineer’s day revolves around keeping pipelines running, improving reliability, and enabling teams with high-quality data. You will:

  • Check pipeline health in Airflow/Prefect and resolve any failed jobs.
  • Ingest new data sources, writing connectors for APIs or SaaS platforms.
  • Optimize SQL queries and warehouse performance to reduce costs and latency.
  • Collaborate with analysts/data scientists to deliver clean datasets for dashboards and models.
  • Implement validation checks to prevent downstream reporting issues.
  • Document and monitor pipelines so they’re reproducible, scalable, and audit-ready.

In essence, you ensure the business has accurate, timely, and trustworthy data powering every decision.

Key Metrics for Success (KPIs):

  • Pipeline uptime ≥ 99%.
  • Data freshness within agreed SLAs (hourly, daily, weekly); a sample freshness check follows this list.
  • Zero critical data quality errors reaching BI/analytics.
  • Cost-optimized queries and warehouse performance.
  • Positive feedback from data consumers (analysts, scientists, leadership).
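
To make the freshness KPI concrete, a small sketch of a staleness check against an agreed SLA window; the SLA value and timestamps are hypothetical.

    # Flag a dataset as stale when its newest record is older than the SLA window.
    from datetime import datetime, timedelta, timezone

    FRESHNESS_SLA = timedelta(hours=1)  # hypothetical hourly SLA

    def is_stale(latest_loaded_at: datetime, sla: timedelta = FRESHNESS_SLA) -> bool:
        return datetime.now(timezone.utc) - latest_loaded_at > sla

    # Example: a table last loaded 90 minutes ago breaches an hourly SLA.
    last_load = datetime.now(timezone.utc) - timedelta(minutes=90)
    print("stale" if is_stale(last_load) else "fresh")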

Interview Process:

  • Initial Phone Screen
  • Video Interview with Pavago Recruiter
  • Technical Task (e.g., build a small ETL pipeline or optimize a SQL query)
  • Client Interview with Engineering/Data Team
  • Offer & Background Verification

Average salary estimate

$135,000 / year (est.)
Minimum: $110,000
Maximum: $160,000

If an employer mentions a salary or salary range on their job, we display it as an "Employer Estimate". If a job has no salary data, Rise displays an estimate if available.

Similar Jobs
  • Pavago (hybrid, location unspecified), posted 7 hours ago: Remote appointment setter needed to qualify prospects and schedule sales-ready meetings with U.S. sales teams using CRM and scheduling tools.

  • Pavago (hybrid, location unspecified), posted 6 hours ago: Remote QA Engineer needed to create and run manual and automated tests, integrate testing into CI/CD, and act as the last line of defense before U.S.-hour releases.

  • Senior BI Developer to design scalable analytics/BI solutions (BigQuery, Power BI, Tableau, Looker) that enable product and platform innovation across Brightspeed's technology organization.

  • Posted 21 hours ago: Lead the design and delivery of auditable finance data systems and agentic AI automation at OpenAI, connecting engineering systems to enterprise platforms like Oracle Fusion to enable reliable financial reporting and intelligent workflows.

  • Kioxia (hybrid, 2610 Orchard Pkwy, San Jose, CA 95134, USA), posted 13 hours ago: KIOXIA America is hiring an SAP BI/BODS Engineer to develop BI universes, BODS ETL jobs and Oracle/PL-SQL integrations that streamline reporting and data processes for business stakeholders.

  • Posted 20 hours ago: Help Blumen modernize permitting by researching, structuring, and quality-checking environmental and land-use regulations that feed an AI platform for national infrastructure projects.

  • A 12-week Data Engineering internship at Visa’s Global Data Office to build ETL pipelines, optimize Spark/Hive jobs, and support reporting and platform capabilities in a hybrid Foster City role.

  • KBR (hybrid, Camarillo, California), posted 5 hours ago: KBR is hiring a seasoned Data Engineer to design, operate, and present large-scale ETL and analytics solutions that support DoD test & evaluation data and stakeholder needs.

  • Posted 9 hours ago: Lead the design and delivery of scalable ETL/ELT data pipelines and analytics solutions for a healthcare-focused company in a remote Data Engineer / Lead Analyst role.

  • Middlebury College is hiring a Manager of Enterprise Data, Reporting and Analytics to lead the team that maintains the enterprise reporting platform, data architecture, integrations, and analytics tooling.

  • Awesome Motive (hybrid, New York, NY), posted 22 hours ago: Data Engineer Intern at Coinbase working on scalable data pipelines, analytics models, and LLM data integrations in a hybrid New York role.

  • Jobgether (hybrid, location unspecified), posted 16 hours ago: Client-facing Data Engineer needed to design and implement scalable ETL pipelines, cloud data warehouses, and BI solutions while advising and mentoring clients across industries.

  • Contribute to Visa’s Global Data Office as a Data Strategy Intern, performing market research, quantitative analysis, and strategic communications to support product, data, and AI initiatives.

  • Posted 22 hours ago: Lead the design and delivery of scalable ETL/ELT pipelines and cloud data solutions for a healthcare-focused organization as a remote Data Engineer/Lead Analyst across the U.S.

  • Posted 19 hours ago: Experienced data engineer needed to build scalable ETL pipelines, improve data quality, and mentor junior engineers within a healthcare analytics environment.

Pavago - thinking globally to grow locally 🌍 Welcome to Pavago, where the world is your talent pool. We believe in a borderless future where businesses can harness the best of international expertise without breaking the bank. 🌟 Why choose Pav...

Employment Type: Full-time, remote
Date Posted: October 15, 2025