Booking NL

Data Engineer (For independent contractors)

Posted May 6, 2026
Project ID: 12442-1
Location
Amsterdam, NH
Hours/week
40 hrs/week
Timeline
6 months
Starts: May 8, 2026
Ends: Nov 7, 2026
Payrate range
€50–€100/hr

Data Engineers are responsible for the development, performance, quality, and scaling of Booking’s data pipelines, with a special focus on data quality. The incumbent will execute technical tasks within the scope of data management. As part of a team, the data engineer builds internal tools and infrastructure for other teams, or contributes directly to user-facing products.


In this role, you will be expected to operate with independence across core architectural and deployment tasks, while applying hands-on experience in building reliable software and data solutions. You will also play a key role in advancing our "Data as a Product" mindset, ensuring our datasets are robust enough to power the next generation of autonomous and programmatic systems.



Key Responsibilities


Data Architecture & Modeling:


  • Independently segment data assets into sustainable and business-enabling domains.


  • Autonomously create physical data models that meet business requirements, and map data flows between systems and workflows.


  • Support the definition of data architecture requirements and processing methodologies.


Data as a Product & Advanced Consumption:


  • Data as a Product: Design, build, and maintain well-managed, unified data solutions treated as standalone products.


  • AI & MCP Enablement: Engineer highly reliable, high-quality data assets that serve not only traditional BI dashboards but are also optimized for programmatic consumption via the Model Context Protocol (MCP) and autonomous AI agents.


  • Build extensible data pipelines spanning different data encodings to support these varied, advanced business requirements.


Data Solution Build & System Ownership:


  • Independently deploy code to production while maintaining end-to-end system ownership.


  • Implement scalable tooling for data flow automation, efficient data ingestion solutions, and batch/event-based streams.


  • Ensure reliable solutions through relevant SLIs and SLOs, observability, application monitoring, and engineering for failure.


Data Quality & Governance:


  • Independently support the development and use of data validation solutions for values and schemas to guarantee data accuracy and reliability.


  • Implement and use solutions for monitoring, failure detection, and data availability to ensure data timeliness and completeness.


  • Ensure data solutions meet all regulatory, compliance, and risk management requirements by supporting the proper definition of processes and controls along data flows.


Software Engineering Best Practices:


  • Apply solid experience in writing and refactoring code, building tools, and managing project dependencies.


  • Adhere to core engineering principles (KISS, SOLID, DRY) and utilize technical documentation and test automation.



Requirements


  • Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.


  • Proven independence in data architecture, physical data modeling, and deploying code to production.


  • Hands-on experience with SLIs/SLOs, test automation, and scalable data flow tooling.


  • Forward-thinking mindset, with experience in or strong interest in structuring data to power LLMs, MCP servers, and agentic workflows.


Technical Requirements


  • Core Data & Semantic Layer: Advanced proficiency in SQL and Python, with deep expertise in Snowflake data modeling and building robust semantic layers.


  • AI & Programmatic Enablement: Genuine interest in engineering data as a product for AI agents and LLMs, including practical knowledge of the Model Context Protocol (MCP) and agent-mesh architectures.


  • Governance & Security: Hands-on experience with data access control, IAM integrations, and governance platforms like Immuta.


  • DevEx & Consumption: Strong engineering fundamentals (CI/CD, API development, containerization) and experience serving data to downstream BI tools and interactive apps like Streamlit and Tableau.
