KAYAK, part of Booking Holdings (NASDAQ: BKNG), is the world’s leading travel search engine. With billions of queries across our platforms, we help people find their perfect flight, stay, rental car and vacation package. We’re also transforming business travel with a new corporate travel solution, KAYAK for Business.
As an employee of KAYAK, you will be part of a travel company that operates a portfolio of global metasearch brands including momondo, Cheapflights and HotelsCombined, among others. From start-up to industry leader, innovation is in our DNA and every employee has an opportunity to make their mark. Our focus is on building the best travel search engine to make it easier for everyone to experience the world.
We’re looking for a Data Engineer with a solid foundation in building robust data pipelines and a collaborative mindset to join our Marketing Data Engineering team. If you thrive in a fast-paced environment, enjoy working cross-functionally, and are eager to embrace change (especially innovations like AI coding assistants), this could be your next adventure!
In This Role You Will:
Design, build, and maintain high-performance data pipelines and orchestration workflows
Write clean, modular Python code to transform, parse, clean, and enrich large datasets
Support stakeholders by developing dashboards and visualizations
Partner closely with marketing analysts, engineers, and data scientists to define and deliver data needs
Actively participate in agile ceremonies, code reviews, and planning discussions
Experiment with and use AI coding tools to boost productivity and code quality
Our Tech Stack:
Languages: Python, SQL
Workflow orchestration: Airflow
Query engine: Trino
Data warehouse: Vertica
Source control: Git
AI coding tools: Cursor
Please apply if you have:
6+ years of professional experience in data engineering
Proficiency in SQL and Python, with the know-how to write scalable, maintainable code
Experience with AI coding tools and excitement about how they're shaping the future of development
A solid understanding of modern data architecture, from ingestion to transformation to delivery
Experience building and operating Airflow pipelines (or a similar orchestration tool)
Comfort estimating project scope, managing timelines, and delivering reliably
Soft Skills We Value:
You're an excellent collaborator and communicator, comfortable working with technical and non-technical peers
You're solution-oriented and driven by curiosity
You welcome change and innovation, and you're quick to adapt your tools and practices
You thrive in an international, fast-paced, and feedback-driven environment
A variety of factors go into determining a salary range, including but not limited to external market benchmark data, geographic location, and years of experience sought/required. The base salary range for this United States-based role is $115,000 - $130,000.
We offer a competitive base salary and benefits including: health benefits; flexible spending account; retirement benefits; life insurance; paid time off (including PTO, paid sick leave, medical leave, bereavement leave, floating holidays and paid holidays); and parental leave benefits. This role is eligible to be considered for an annual bonus.
Inclusion
At KAYAK, we want everyone to have the space to grow, share ideas and do great work. That’s why we’re focused on hiring the best talent from all walks of life and experiences, supporting them well and making sure no one feels like they have to fit a mold to belong here.
Need any adjustments for the interview, application or on the job? No problem - just give us a heads-up. We’ve got you.