My client, a B2B software company, is looking for a Data Engineer with expertise in designing, modelling, and building data and analytics platforms, including data lakes/lakehouses, data warehouses, Azure Data Factory, etc.
Duties & Responsibilities:
Help to design and implement the core Group reporting and analytics architecture
Integrate data from various Group sources into the reporting and analytics platforms
Create enterprise data models that are fit for purpose for current and future reporting needs
Work with stakeholders at all levels to understand data and reporting needs, and help create the most appropriate data sources for those needs
Build scalable and flexible data pipelines from multiple sources to destination, ensuring all data is available in the right place, in the right shape, and at the right time
Underpin all technical data platform design and development with best practice, driven by robust Data Governance principles
Operate within the regulatory and compliance framework and take personal responsibility for the Individual Conduct Rules relevant to the role
Uphold the company's vision, mission, and values
Expertise:
Understanding of how to design, model, and build data and analytics platforms, including data lakes/lakehouses, data warehouses, etc.
Familiarity with various data modelling and architecture concepts (Kimball, etc.)
Key Skills & Experience:
Azure Data Factory / Azure Data Lake / Azure Synapse Analytics
SQL Server / T-SQL / Python
Experience developing ETL/ELT pipelines
Experience working with version control, testing frameworks, and CI/CD
Experience working with large and complex datasets, both structured and unstructured (databases, XML, JSON, etc.)
Desirable skills:
SSIS, SSRS, SSAS
Power BI
Streaming / real-time data pipelines
Databricks
Experience with, or some exposure to, Microsoft Fabric for data engineering