Job Description Summary
The Cloud Data Engineer supports the design, implementation, and maintenance of data pipelines and predictive analytics workflows within Bentley’s cloud data ecosystem. This role plays a key part in sourcing, transforming, and delivering data from enterprise systems to data lakes and warehouses while supporting analytics and future data science initiatives through scalable and secure infrastructure. The position involves collaboration with a variety of stakeholders, including data analysts and IT teams, to ensure data is accessible, reliable, and actionable.
Essential Duties:
Participate in Bentley University’s data modernization initiatives by developing cloud-based data pipelines and analytics infrastructure that support data engineering, analytics, and predictive modeling needs.
Build and maintain ETL/ELT processes to extract, transform, and load data from enterprise systems into cloud platforms such as Azure Synapse and Microsoft Fabric.
Use programming languages such as SQL, PL/SQL, Python, and R, along with tools like Azure Data Factory, Alteryx, Power Query, SnapLogic, or Informatica, to support data preparation, transformation, and integration activities.
Develop reusable scripts and tools to streamline analytics workflows and improve data accessibility across teams.
Collaborate with data users and analysts to design infrastructure that supports analytics and machine learning workflows, including feature engineering pipelines and model output integration.
Assist in deploying and scaling predictive models within Bentley’s cloud environment to support initiatives such as student success analytics.
Design and maintain data warehouse structures (e.g., star and snowflake schemas with fact, bridge, and dimension tables) using Slowly Changing Dimension (SCD) techniques in databases such as Azure SQL (serverless and dedicated) and Oracle.
Partner with business stakeholders to clarify data and reporting requirements and ensure that pipelines meet evolving analytical needs.
Monitor and troubleshoot data jobs and model pipelines to ensure data quality, integrity, and timely delivery.
Implement logging, alerting, and monitoring processes to ensure data reliability and performance.
Contribute to the development and maintenance of metadata management tools, data catalogs, and data dictionaries in collaboration with the Data Governance team.
Support secure data access practices and collaborate with cloud engineering, integration, and security teams to ensure compliance and operational reliability.
Minimum Qualifications:
Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field.
Minimum of three years of professional experience, including at least two consecutive years in data engineering or a related technical role.
Hands-on experience with the Azure data ecosystem, such as Azure Synapse, Microsoft Fabric, and Azure Data Factory; experience with other cloud platforms (e.g., AWS or GCP) is a plus.
Experience with data integration and pipeline development tools, such as SnapLogic, Informatica, or Azure Data Factory, to design, automate, and maintain data flows across multiple systems.
Experience in data warehouse design and modeling, including star/snowflake schemas and Slowly Changing Dimension (SCD) techniques, using platforms such as Azure SQL or Oracle.
Proficiency in SQL for query development and optimization.
Hands-on experience with Python or R for data wrangling, transformation, and basic machine learning tasks.
Strong understanding of ETL/ELT processes and structured and semi-structured data, and exposure to machine learning operations (MLOps) workflows and tools.
Demonstrated ability to collaborate effectively with cross-functional teams, including data analysts, data scientists, and IT professionals.
Preferred Qualifications:
Experience supporting predictive analytics initiatives (e.g., churn models, student success models, forecasting).
Familiarity with visualization tools such as Power BI or similar.
Exposure to metadata management and data governance practices.
Experience in higher education or a similar mission-driven environment.
Understanding of Workday data structures, particularly Workday Learning or Student, is a plus.
Work Environment:
Typical office setting with extended screen time and computer use.
Some campus travel may be required for meetings and collaboration.
This role offers a flexible work arrangement, combining in-person attendance with remote work. On-site presence is required based on business needs, team collaboration, or scheduled meetings. This work arrangement is subject to change.
Bentley University requires reference checks and may conduct other pre-employment screening.
DIVERSITY STATEMENT
Bentley University strives to create a campus community that welcomes the exchange of ideas and fosters a culture that values differences and views them as a strength in our community.
Bentley University is an Equal Opportunity Employer, building strength through diversity. The University is committed to building a community of talented students, faculty and staff who reflect the diversity of global business. We strongly encourage applications from persons from underrepresented groups, individuals with disabilities, covered veterans and those with diverse experiences and backgrounds.