Role Purpose
The Data Engineer position is an integral part of the Data team within the Digital and Information function. This role entails overseeing the availability, integrity, quality, and accuracy of data. The primary responsibilities include designing, developing, and maintaining data pipelines and workflows to support the organisation's data processing needs and facilitate BI reporting and analytics delivery. The Data Engineer will contribute to enhancing our Azure-based data platform using technologies such as Azure Data Factory and Azure Databricks and will have the opportunity to work on exciting digital transformation projects.
Operating within Agile principles and methodologies, the Data Engineer collaborates with other data engineers and key stakeholders to shape and manage the portfolio of reporting and data requirements. Working alongside peers, this role contributes to refining the overall data strategy.
Key Responsibilities
Design, develop, and maintain scalable data pipelines and workflows using Azure Data Factory and Databricks.
Perform data transformation and integration tasks to consolidate and enrich data from various sources into a unified format, whilst ensuring data governance and compliance by design.
Implement data quality checks and validation processes to ensure accuracy, completeness, and consistency of data.
Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements, perform UAT, and deliver solutions accordingly.
Monitor pipeline performance and troubleshoot issues promptly to ensure uninterrupted data processing.
Document data pipelines, workflows, and processes to ensure knowledge sharing and maintainability.
Ensure the effective and timely management and closure of second-line data and reporting support requests through the appropriate ticketing systems, minimising impact on the operations of the wider business.
Actively participate in daily stand-ups, weekly stakeholder meetings, and retrospectives.
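As an illustration of the data quality checks and validation processes mentioned above, the sketch below shows one way such rules might look in plain Python. The function name, field names, and rules are hypothetical examples for this description only, not the team's actual checks.

```python
# Hypothetical illustration of simple data quality rules covering the three
# dimensions named above: completeness, accuracy, and consistency.

def run_quality_checks(records):
    """Return a dict mapping rule name -> list of failing record indices."""
    failures = {"completeness": [], "accuracy": [], "consistency": []}
    seen_ids = set()
    for i, rec in enumerate(records):
        # Completeness: required fields must be present and non-null.
        if rec.get("id") is None or rec.get("amount") is None:
            failures["completeness"].append(i)
            continue
        # Accuracy: amounts are expected to be non-negative.
        if rec["amount"] < 0:
            failures["accuracy"].append(i)
        # Consistency: each 'id' should appear only once.
        if rec["id"] in seen_ids:
            failures["consistency"].append(i)
        seen_ids.add(rec["id"])
    return failures


sample = [
    {"id": 1, "amount": 100},
    {"id": 2, "amount": -5},    # fails accuracy
    {"id": 1, "amount": 50},    # fails consistency (duplicate id)
    {"id": 3, "amount": None},  # fails completeness
]
print(run_quality_checks(sample))
# → {'completeness': [3], 'accuracy': [1], 'consistency': [2]}
```

In practice, checks like these would typically be expressed as pipeline validation activities or notebook steps rather than standalone functions.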
Requirements
Hands-on Experience with Azure Data Factory: Proven experience in designing, developing, and managing data pipelines using Azure Data Factory.
Good knowledge of Databricks for data processing and analytics.
Experience with data modelling techniques and data warehousing concepts.
Proficiency in SQL for data querying and manipulation.
Ability to work effectively in a fast-paced, dynamic environment and manage multiple priorities, following Agile ways of working.
Problem-solving Skills: Ability to analyse complex data problems and propose effective solutions.
Communication and Collaboration: Good communication skills with the ability to collaborate effectively with cross-functional teams.
Experience with Python is highly desirable.