School scheduling is a hard problem, one that technology was meant to solve. Yet for decades, the master schedule has been built by hand, consuming educators’ valuable time while delivering a sub-optimal result for students. We started Timely to build the tool we wish we had when we were in schools. And today, we are changing how schools create high-quality schedules for teachers and students. Timely combines groundbreaking AI technology with support from a team of expert school schedulers, saving schedulers time and surfacing opportunities for leaders to unlock staff potential. Timely delivers optimized master schedules that are more efficient, accommodate diverse educator needs, and enable student access to courses and pathways that impact their lives.
Our district and school partners agree that scheduling is a key opportunity for educational innovation, and that Timely is the indispensable partner in addressing the challenge. We’re experiencing exciting growth as a company while doubling down on our mission and our people-centered approach to leveraging powerful technology. We serve districts and charter school organizations across 15+ states, growing (and renewing) at rates that lead the education technology industry. This growth has been driven by our strong product-market fit, deep understanding of education, and commitment to customer success.
Come work with us as a seasonal data integrations engineer, helping us manage the data pipeline between Timely and our clients.
As a data integrations engineer, you will work across engineering and customer success to ensure that customers can integrate with and leverage our scheduling technology for their needs. You’ll own a portfolio of clients who are integrating with Timely, partnering closely with both teams to see each integration through.
Full-time or close to full-time availability during the primary scheduling season (January to July, exact start and end dates negotiable)
Partner with data integration engineers, customer success, SIS representatives, and district IT staff to plan requirements for each customer’s integration.
Design and document the architecture, scope, and requirements for an integration with a single SIS; this scope can expand with relevant experience.
Own the ingestion, transformation, and export of data between district systems and the Timely data model and product (a brief illustrative sketch of this kind of work follows below).
Meet with customers as needed to execute integration workflows and troubleshoot any challenges. Some travel may be required.
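For a sense of what this work can look like day to day, here is a minimal, purely illustrative sketch of a CSV-based ingest/transform/export step in Python with pandas. The file paths, column names, and helper function are all hypothetical and do not reflect Timely’s actual schema or tooling.

```python
# Purely illustrative sketch (not Timely's actual schema or tooling): ingest a
# district's CSV export, normalize it, and write a file shaped for a
# downstream scheduling data model. All names here are hypothetical.
import pandas as pd


def transform_course_requests(source_path: str, output_path: str) -> pd.DataFrame:
    # Ingest: district SIS exports often arrive as loosely formatted CSVs.
    raw = pd.read_csv(source_path, dtype=str).fillna("")

    # Transform: normalize headers and map source columns to the target names.
    raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
    cleaned = raw.rename(
        columns={"student_number": "student_id", "crs_code": "course_code"}
    )
    cleaned["course_code"] = cleaned["course_code"].str.strip().str.upper()

    # Drop rows missing the keys the target model requires.
    cleaned = cleaned[
        (cleaned["student_id"] != "") & (cleaned["course_code"] != "")
    ]

    # Export: write the normalized file for the scheduling system to load.
    cleaned.to_csv(output_path, index=False)
    return cleaned


# Example usage with hypothetical file names:
# transform_course_requests("district_export.csv", "timely_course_requests.csv")
```

In practice a real integration may pull from a service API rather than a flat file, but the ingest → transform → export shape of the work is the same.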
Experience working with school and district data as an educator, administrator or IT staff member. We’re particularly excited about candidates with school scheduling and programming experience.
Prior experience successfully implementing data ingestion, transformation, cleaning and export through a variety of means (ranging from CSV-based integrations to service APIs).
Experience with at least one programming language (we predominantly use Python & TypeScript) and 1+ years of experience with data query and analysis technologies (SQL, Python pandas, BigQuery, etc.) in a client-facing capacity, including exposure to these technologies in production environments. (A short example of this kind of day-to-day check appears below.)
Demonstrated experience leveraging communication and collaboration skills to work with customers and cross-functional team members.
Ability and excitement to step into an early (seed-stage) startup role, with the urgency of execution, breadth of responsibilities, and opportunities for growth that come with it.
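As a hypothetical example of the client-facing data work described above, the sketch below shows a routine data-quality check one might run before handing an integration back to a customer: flagging duplicate student/course rows and summarizing row counts per school. The column names are illustrative only.

```python
# Hypothetical data-quality check; column names are illustrative only.
import pandas as pd


def summarize_load(enrollments: pd.DataFrame) -> pd.DataFrame:
    # Duplicate student/course pairs would double-count seat requests.
    dupes = enrollments[
        enrollments.duplicated(subset=["student_id", "course_code"], keep=False)
    ]
    if not dupes.empty:
        print(f"{len(dupes)} duplicate student/course rows to review with the district.")

    # Per-school row counts are an easy sanity check against the SIS's own reports.
    return (
        enrollments.groupby("school_id", as_index=False)
        .size()
        .rename(columns={"size": "row_count"})
    )
```

The same check could just as easily run as SQL against a warehouse such as BigQuery; the pandas version is simply the most self-contained way to show the idea.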
This role is remote, with potential to expand into other data engineering roles for candidates with relevant experience.