Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
Be expected to define and contribute at a high level to many aspects of our collaborative Agile development process:
- Big Data development and automated testing of new and existing components in an Agile, DevOps-driven, dynamic environment.
- Working with data delivery teams to set up new Hadoop users, including creating Linux accounts, setting up Kerberos principals, and testing HDFS, Spark, and MapReduce access for each new user.
- Executing review, acceptance, and tuning activities, and ensuring 24x7 environment availability.
- General operational excellence, including strong troubleshooting skills and an understanding of system capacity and bottlenecks.

Must haves:
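As an illustration of the user-onboarding responsibility above, a minimal sketch might look like the following. The user name, group, and Kerberos realm are illustrative assumptions, and the script defaults to a dry run because `useradd` and `kadmin.local` require root and a live KDC:

```shell
#!/usr/bin/env bash
# Sketch of onboarding a new Hadoop user (illustrative names only).
# DRY_RUN=1 (the default) prints each command instead of executing it.
set -euo pipefail

NEW_USER="${1:-analyst01}"      # hypothetical new user
REALM="EXAMPLE.COM"             # hypothetical Kerberos realm
DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "DRY-RUN: $*"
  else
    "$@"
  fi
}

# 1. Create the Linux account on the edge node
run sudo useradd -m -G hadoop "$NEW_USER"

# 2. Create a Kerberos principal for the new user
run sudo kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"

# 3. Smoke-test HDFS access under the new identity
run sudo -u "$NEW_USER" hdfs dfs -mkdir -p "/user/${NEW_USER}"
run sudo -u "$NEW_USER" hdfs dfs -ls "/user/${NEW_USER}"
```

A real onboarding script would also verify Spark and MapReduce access, for example by submitting a trivial example job as the new user.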
- Strong problem-solving skills and adaptability in a complex environment.
- Providing technical support for, and designing, Hadoop Big Data platforms (preferably Cloudera distributions: Hive, Beeline, Spark, HDFS, Kafka, YARN, ZooKeeper, etc.); identifying and handling possible failure scenarios (incident management); responding to platform end users on data or application issues; and reporting and monitoring daily SLAs to identify vulnerabilities and opportunities for improvement.
- Hands-on experience with large-scale Big Data environment builds, capacity planning, performance tuning, and monitoring, including end-to-end Cloudera cluster installation.
- Handling Hadoop security using Apache Ranger, Knox, TLS, Kerberos, and encryption zone management.
- Expertise in software installation, configuration, orchestration, and automation with tools such as Jenkins and Ansible.
- Improving the current estate by incorporating centralised S3 data storage (VAST) throughout the platform processing stack.
- 5 years' experience engineering solutions in an on-premises or cloud Big Data environment.
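For the automation skills listed above, a typical Ansible task file for rolling out Kerberos client configuration across cluster nodes might look like this sketch. The host group, package name (RHEL-style here), template path, and service name are all illustrative assumptions:

```yaml
# Hypothetical Ansible tasks: distribute Kerberos client config to workers.
- hosts: hadoop_workers
  become: true
  tasks:
    - name: Install Kerberos client packages
      ansible.builtin.package:
        name: krb5-workstation
        state: present

    - name: Distribute krb5.conf from a template
      ansible.builtin.template:
        src: krb5.conf.j2
        dest: /etc/krb5.conf
        owner: root
        mode: "0644"

    - name: Restart the DataNode after the config change
      ansible.builtin.service:
        name: hadoop-hdfs-datanode
        state: restarted
```

In practice a handler would usually replace the unconditional restart, so services only bounce when the template actually changes.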
To be successful in this role, you should meet the following requirements:
- Forward-thinking, independent, creative, and self-sufficient; able to work with limited documentation, with exposure to testing complex multi-tiered integrated applications; able to work on your own initiative, with minimal supervision, on multiple tasks simultaneously.
- Ability to develop shell scripts and use Linux utilities and commands in the context of Hadoop system management.
- Experience in monitoring and diagnosing Apache Spark jobs, including performance tuning and optimization for large-scale data processing.
- Ability to implement and manage CI/CD pipelines using Jenkins and Ansible to automate deployment processes and infrastructure provisioning.
- Ability to collaborate with Spark processing designers to build more efficient data processing at massive scale.
- Exposure to Agile project methodology, as well as to other methodologies such as Kanban.

You’ll achieve more when you join HSBC.
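The Spark monitoring and shell-scripting skills above can be sketched as a small helper that flags long-running Spark applications using JSON from the YARN ResourceManager REST API (`/ws/v1/cluster/apps`). The threshold and sample payload are illustrative; in production the JSON would come from something like `curl -sf "$RM_URL/ws/v1/cluster/apps?states=RUNNING&applicationTypes=SPARK"`:

```shell
#!/usr/bin/env bash
# Sketch: flag Spark apps running longer than a threshold, from RM JSON.
set -euo pipefail

THRESHOLD_SECS="${THRESHOLD_SECS:-600}"   # illustrative 10-minute threshold

# Reads ResourceManager JSON on stdin; prints "<app id><TAB><elapsed>s"
# for every app whose elapsedTime (milliseconds) exceeds the threshold.
flag_long_running() {
  python3 -c '
import json, sys
threshold_ms = int(sys.argv[1]) * 1000
data = json.load(sys.stdin)
for app in (data.get("apps") or {}).get("app", []):
    if app["elapsedTime"] > threshold_ms:
        print(app["id"] + "\t" + str(app["elapsedTime"] // 1000) + "s")
' "$THRESHOLD_SECS"
}

# Sample payload in the shape the ResourceManager returns
SAMPLE='{"apps":{"app":[
  {"id":"application_001","name":"nightly-etl","elapsedTime":720000},
  {"id":"application_002","name":"adhoc-query","elapsedTime":90000}]}}'

echo "$SAMPLE" | flag_long_running
```

A helper like this could feed the daily SLA reporting mentioned earlier, for example by emailing the flagged application IDs to the platform team.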
www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India