Duties & Accountabilities
- Apply data engineering practices and standards to develop robust and maintainable data pipelines.
- Analyze and organize raw data ingestion pipelines.
- Evaluate business needs and objectives.
- Support senior business stakeholders in defining new data product use cases and their value.
- Take ownership of data product pipelines and their maintenance.
- Explore ways to enhance data quality and reliability, acting as the “Quality Gatekeeper” for developed Data Products.
- Adapt and apply best practices from the Data One community.
- Constantly look for ways to improve practices and efficiency, and make concrete proposals.
- Take the lead and collaborate proactively with other teams to keep things moving.
- Be flexible and take on other responsibilities within the scope of the Agile Team.
Requirements
- Minimum 7 years of relevant work experience in data-related roles (e.g., Data Engineer/Data Architect).
- 2-3 years of database/ETL development experience; Talend and dbt preferred.
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Software Engineering, or a related field.
- Technical expertise in data modeling techniques (Data Vault).
- Advanced expertise with ETL tools (Talend, Alteryx, etc.).
- Advanced SQL programming experience. Python is a plus.
- Previous experience with Agile methodologies in software development.
- Previous experience with data transformation tools such as dbt.
- Previous experience with DevOps and DataOps practices (CI/CD, GitLab, DataOps.live).
- Hands-on experience with Snowflake.
- Experience with the lifecycle management of data products.
- Knowledge of Data Mesh and FAIR principles.
- Relevant certifications are a plus.
- Open to working in EMEA time zones.