We're seeking an experienced Senior Data Engineer for a 6-month contract engagement with a leading public sector organization. This role requires deep expertise in data engineering on Microsoft Azure, including Azure Data Factory (ADF), Databricks, Python, and SQL. You'll play a key role in designing and modernizing data pipelines and lakehouse structures within a collaborative Agile environment.
Key Responsibilities:
- Collaborate with product teams to analyze system requirements and design cloud-based data solutions
- Design and build data lakes and lakehouse structures using Azure-native tools
- Migrate legacy pipelines from Synapse Analytics and ADF to modernized Databricks solutions using Delta Lake
- Develop automated and scalable data pipelines
- Work closely with IT teams to resolve issues and improve adoption of analytics products
- Contribute to the development of reusable frameworks and data engineering standards
- Conduct peer code reviews, promote best practices, and share knowledge across teams
- Prepare and deliver knowledge transfer documentation and sessions
Key Requirements:
- 5+ years of hands-on experience in Microsoft Azure data engineering environments
- Strong background (5+ years) in building and orchestrating data pipelines with Azure Data Factory (ADF) and Databricks
- Proficient in Python and SQL for data transformation and analytics
- Previous experience in the public sector or with government clients is a strong asset
We're an equal opportunity employer committed to increasing diversity and inclusion in today's workforce. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. Minorities, women, LGBTQ candidates, and individuals with disabilities are encouraged to apply. If you require an accommodation, please review our accessibility policy and reach out to our accessibility officer with any questions.