At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. We do this by driving Responsible Growth and delivering for our clients, teammates, communities and shareholders every day.
Being a Great Place to Work is core to how we drive Responsible Growth. This includes our commitment to being an inclusive workplace, attracting and developing exceptional talent, supporting our teammates’ physical, emotional, and financial wellness, recognizing and rewarding performance, and how we make an impact in the communities we serve.
Bank of America is committed to an in-office culture with specific requirements for office-based attendance, while allowing an appropriate level of flexibility for our teammates and businesses based on role-specific considerations.
At Bank of America, you can build a successful career with opportunities to learn, grow, and make an impact. Join us!
Job Description:
This job is responsible for developing and delivering complex requirements to accomplish business goals. Key responsibilities of the job include ensuring that software is developed to meet functional, non-functional, and compliance requirements, and that solutions are well designed, with maintainability, ease of integration, and testing built in from the outset. Job expectations include strong knowledge of development and testing practices common to the industry, as well as design and architectural patterns.
LOB Specific Summary:
The Core Java Big Data developer will be responsible for designing and delivering a next-generation event processing, alerting, and analytics system. The candidate will be a hands-on developer working within the Bank of America Global Markets Equities Data and Analytics group.
The Data and Analytics team is responsible for building systems that help the Bank leverage the power of data, serving a wide variety of users including front office execution services consultants, sales traders, data scientists, and compliance.
The role involves working with large volumes of data generated at very high velocity, typically billions of messages a day with peaks of several thousand messages per second. Candidates must have a passion for building efficient, scalable, and highly resilient systems, and for working in a collaborative environment with multiple stakeholders.
Responsibilities:
- Codes solutions and unit tests to deliver a requirement/story per the defined acceptance criteria and compliance requirements
- Designs, develops, and modifies architecture components, application interfaces, and solution enablers while ensuring principal architecture integrity is maintained
- Mentors other software engineers and coaches the team on Continuous Integration and Continuous Development (CI-CD) practices and the automation tool stack
- Executes story refinement, definition of requirements, and estimation of the work necessary to realize a story through the delivery lifecycle
- Performs spikes/proofs of concept as necessary to mitigate risk or implement new ideas
- Automates manual release activities
- Designs, develops, and maintains automated test suites (integration, regression, performance)
- Designs and develops distributed, high-volume, high-velocity, multi-threaded event processing systems using the Core Java technology stack in an agile setting
- Develops highly efficient software code leveraging Core Java and Big Data technologies for the various use cases built on the platform
- Provides a high level of operational excellence, guaranteeing high availability and platform stability
- Participates in solution and design discussions using a cloud-based architecture
- Creates, deploys, and configures applications in an AWS Cloud environment
Required Qualifications:
- 3+ years of recent Core Java development experience
- 3+ years of experience developing distributed multi-threaded systems
- Excellent understanding of object-oriented design and development principles
- Team player with excellent interpersonal skills and integrity
Desired Qualifications:
- Experience in developing large-scale, fault-tolerant systems with Apache Kafka, Apache Storm, NoSQL databases, or related technologies
- Experience with other Big Data frameworks
- Experience with the DevOps model and tools like Ansible
- Experience in the financial industry
- Hands-on experience with or knowledge of the AWS environment
Skills:
- Application Development
- Automation
- Influence
- Solution Design
- Technical Strategy Development
- Architecture
- Business Acumen
- DevOps Practices
- Result Orientation
- Solution Delivery Process
- Analytical Thinking
- Collaboration
- Data Management
- Risk Management
- Test Engineering
Minimum Education Requirements: Bachelor's Degree or equivalent professional experience.
Shift:
1st shift (United States of America)
Hours Per Week:
40