Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
- Design and Develop Scalable Data Pipelines: Architect and implement end-to-end data workflows using Apache Airflow for orchestration, integrating multiple data sources and sinks across cloud and on-prem environments.
- BigQuery Data Modeling and Optimization: Build and optimize data models in Google BigQuery for performance and cost-efficiency, including partitioning, clustering, and materialized views to support analytics and reporting use cases.
- ETL/ELT Development and Maintenance: Design and maintain robust ETL/ELT pipelines to extract, transform, and load structured and semi-structured data, ensuring data quality, reliability, and availability.
- Cloud-Native Engineering on GCP: Leverage GCP services such as Cloud Storage, Pub/Sub, Dataflow, and Cloud Functions to build resilient, event-driven data workflows.
- CI/CD and Automation: Implement CI/CD for data pipelines using tools such as Cloud Composer (managed Airflow), Git, and Terraform, ensuring automated deployment and versioning of workflows.
- Data Governance and Security: Ensure proper data classification, access control, and audit logging within GCP, adhering to data governance and compliance standards.
- Monitoring and Troubleshooting: Build proactive monitoring for pipeline health and data quality using tools such as Cloud Monitoring (formerly Stackdriver) and custom Airflow alerting mechanisms.
- Collaboration and Stakeholder Engagement: Work closely with data analysts, data scientists, and business teams to understand requirements and deliver high-quality, timely data products.