Booking NL
Data Engineer II
The Data Engineer Customer Service (CS) is a technical subject matter expert responsible for the development, performance, quality, and scaling of Booking’s data assets, with a particular focus on data engineering and data quality. As part of the CS Data Engineering team, the data engineer builds internal tools and infrastructure for other teams or contributes directly to user-facing products. The Data Engineer CS reports to the Manager, Data Engineering CS.
What You'll Be Doing:
Rapidly developing next-generation scalable, flexible, and high-performance data pipelines;
Solving issues with data and data pipelines, prioritizing by customer impact, and building solutions that address the root cause so issues do not recur;
Owning the data quality of our core data assets end to end;
Driving efficiency and resilience by mapping data flows between systems and workflows across the company;
Supporting the data requirements of new and existing solutions by developing scalable, extensible physical data models that can be operationalized within the company’s workflows and infrastructure;
Developing integrations between multiple applications and services, both on premises and in the cloud;
Experimenting with new tools and technologies to meet business requirements regarding performance, scaling, and data quality;
Contributing to self-organizing tools that help the analytics community discover data, assess quality, explore usage, and find peers with relevant expertise;
Building effective monitoring of data, and jumping in to handle outages.
What You'll Bring:
You understand that success starts with accountability and ownership;
You thrive with change and get things done;
You team up: you care more about being successful and reaching goals together than individually;
You have a low ego, are open and friendly, and remember that diversity gives us strength.
Level of Education:
Bachelor's degree in Computer Science or related field
Years of relevant job knowledge:
2 years of experience building data pipelines and transformations at scale, with technologies such as Hadoop, Cassandra, Kafka, Spark, HBase, MySQL, and AWS
2 years of experience in data modeling
2 years of experience with data streaming
Special knowledge/skills requirements:
Intermediate knowledge of data governance requirements based on standard methodologies (e.g. DAMA) and of tooling for continuous, automated data governance activities
Proven understanding of data quality requirements and implementation methodologies
Excellent English communication skills, both written and verbal.