Standard Bank Group is a leading Africa-focused financial services group and an innovative player on the global stage, offering a variety of career-enhancing opportunities plus the chance to work alongside some of the sector's most talented, motivated professionals. Our clients range from individuals and businesses of all sizes to high net worth families and large multinational corporates and institutions. We're passionate about creating growth in Africa, bringing true, meaningful value to our clients and the communities we serve, and creating a real sense of purpose for you.
Job Description

To design and implement scalable and efficient data pipelines using Microsoft Fabric components such as OneLake, Dataflows Gen2, and Lakehouse. Develop ETL/ELT processes using Azure Data Factory, PySpark, Spark SQL, and Python. Ensure data quality, integrity, and security across all platforms. Collaborate with stakeholders to gather requirements and deliver technical solutions. Optimize data workflows and troubleshoot performance issues. Support hybrid cloud deployments and integrate on-premises and cloud environments while maintaining documentation and following best practice in data engineering, including version control and modular code design.
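For illustration only, a minimal PySpark sketch of the kind of ETL work described above. The file paths and column names (transaction_id, amount, transaction_date) are hypothetical placeholders, not part of the role specification.

```python
# Illustrative sketch only: a simple extract-transform-load flow in PySpark.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

# Extract: read raw transactions from a landing zone (hypothetical path).
raw = spark.read.option("header", True).csv("/landing/transactions.csv")

# Transform: basic de-duplication, typing, and null filtering as stand-ins
# for business rules.
clean = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

# Load: write to a curated zone as Parquet, partitioned by date.
clean.write.mode("overwrite").partitionBy("transaction_date").parquet("/curated/transactions")
```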
Qualifications

BSc Computer Science or Information Technology, as well as a Microsoft certification in Azure Data Engineering or Microsoft Fabric
Minimum 3 years' experience in a data engineering role, with strong hands-on experience in Microsoft Fabric, Azure Synapse, Azure SQL, and Databricks
Proficiency in SQL, Python, and Power BI
Solid understanding of data modelling, data governance, and data warehousing
Experience with CI/CD pipelines, DevOps, or machine learning workflows is a plus.

Additional Information

Behavioural Competencies:
Adopting Practical Approaches
Checking Things
Developing Expertise
Embracing Change
Examining Information

Technical Competencies:
Big Data Frameworks and Tools
Data Engineering
Data Integrity
IT Knowledge
Stakeholder Management (IT)