Standard Bank Group is a leading Africa-focused financial services group and an innovative player on the global stage, offering a variety of career-enhancing opportunities and the chance to work alongside some of the sector's most talented, motivated professionals. Our clients range from individuals and businesses of all sizes to high net worth families and large multinational corporates and institutions. We're passionate about creating growth in Africa, bringing true, meaningful value to our clients and the communities we serve, and creating a real sense of purpose for you.
Job Description

To design, code, verify, test, document, amend, and secure data pipelines and data stores according to agreed architecture, solution designs, standards, policies and governance requirements. To monitor and report on own progress and proactively identify issues related to data engineering activities. To collaborate in reviews of work with others where appropriate.
Qualifications

Bachelor's Degree in: Information Technology, Informatics, Software Development, Statistics, Mathematics, Physics or a related field
Experience:
3-5 years:
- Experience in building databases, warehouses, reporting and data integration solutions.
- Experience building and optimising big data pipelines, architectures and data sets.
- Experience in creating and integrating APIs.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Experience in database programming languages, including SQL, Oracle or appropriate data tooling.
- Understanding of data pipelining and performance optimisation, data principles, and how data fits into an organisation, including customer, product and transactional information.
- Knowledge of integration patterns, styles, protocols and systems theory.

Key Responsibilities:
- Analyse data elements and systems, data flows, dependencies and relationships to contribute to conceptual, physical and logical data models.
- Assist in the execution of the design, definition and development of Application Programming Interfaces (APIs), aligning to relevant frameworks and guidelines.
- Contribute to the build of the infrastructure required for optimal extraction, transformation and loading of data from various data sources, using various technologies (e.g. AWS and SQL technologies).
- Develop on one application platform to build, create and optimise data pipelines, and move them into production, enabling data consumers to use data for reporting purposes.
- Give input to data traceability standards, guidelines and processes to ensure data quality, keeping track of data sources and setting up appropriate data version control processes and systems to ensure alignment to the defined data architecture.

Behavioural Competencies:
- Adopting Practical Approaches
- Articulating Information
- Examining Information
- Interpreting Data
- Producing Output

Technical Competencies:
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- IT Knowledge
- Stakeholder Management (IT)