DESCRIPTION:
Duties: Design, analyze, develop and test data pipeline applications using full-stack development tools and technologies to create high-quality software and new products. Use tools and services, including within AWS, to deploy the application to the public cloud. Translate business requirements into workable ETL pipeline software components. Perform unit, integration and automated end-to-end testing of data pipelines to ensure that software functions per business requirements and technical specifications. Ensure that developed applications comply with enterprise risk controls and policies. Collaborate with Level 2 production support teams to assist with production releases. Estimate, plan and manage software development tasks following Agile methodology.
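For illustration only (not an additional requirement of the role): a minimal sketch of the kind of unit-testable ETL transform component the duties describe, assuming JUnit 5 is on the test classpath. The TradeTransformer and TradeRecord names and the pipe-delimited input format are hypothetical examples, not technologies named in this posting.

// Minimal, illustrative ETL transform step plus a unit test (JUnit 5 assumed).
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

class TradeTransformer {
    // Typed record produced by the transform step (extract -> transform -> load).
    record TradeRecord(String id, String symbol, double notional) {}

    // Transform one pipe-delimited source line into a typed record,
    // validating field count and numeric parsing as a unit-testable step.
    TradeRecord transform(String line) {
        String[] parts = line.split("\\|");
        if (parts.length != 3) {
            throw new IllegalArgumentException("Expected 3 fields, got " + parts.length);
        }
        return new TradeRecord(parts[0].trim(), parts[1].trim(), Double.parseDouble(parts[2].trim()));
    }
}

class TradeTransformerTest {
    private final TradeTransformer transformer = new TradeTransformer();

    @Test
    void transformsWellFormedLine() {
        var rec = transformer.transform("T-100 | AAPL | 2500.50");
        assertEquals("T-100", rec.id());
        assertEquals("AAPL", rec.symbol());
        assertEquals(2500.50, rec.notional(), 1e-9);
    }

    @Test
    void rejectsMalformedLine() {
        // Malformed input should fail fast so bad records never reach the load stage.
        assertThrows(IllegalArgumentException.class, () -> transformer.transform("T-101|AAPL"));
    }
}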
QUALIFICATIONS:
Minimum education and experience required: Master's degree in Software Engineering, Computer Science, Computer Engineering, Computer Information Systems, Information Technology, or in a related field of study plus three (3) years of experience in the job offered or as Software Engineer, Software Developer, Software Development Engineer, IT Analyst, or in a related occupation. The employer will alternatively accept a Bachelor's degree in Software Engineering, Computer Science, Computer Engineering, Computer Information Systems, Information Technology, or in a related field of study plus five (5) years of experience in the job offered or as Software Engineer, Software Developer, Software Development Engineer, IT Analyst, or in a related occupation.
Skills Required: This position requires experience with the following: Designing and developing web applications using J2EE, Spring, HTML, Java, Bootstrap, WebSphere, JSON, Spring Boot, JSP, Servlet, Struts, Google Guava, Redis, Node.js, YAML, XML, Swagger, Apache Kafka, Apache Spark, and RabbitMQ; Using project development tools such as IntelliJ IDEA, Eclipse, and Visual Studio; Using databases and data-access technologies such as Hibernate, JDBC, SQL, SQL Server, Cassandra, Oracle, PostgreSQL, and MySQL for migration projects; Monitoring the performance of microservices using tools such as Grafana, Postman, and Splunk; Migrating applications to the AWS public cloud platform using S3, Lambda, IAM, DynamoDB, SQS, SNS, EC2, CloudWatch, CloudFormation, CloudTrail, EKS, ECS, Fargate, Aurora, Redshift, Glue, RDS, ALB, NLB, Route53, MSK, Step Functions, and Elastic Beanstalk; Utilizing Jenkins, Maven, Docker, Kubernetes, and Jules to develop, deploy, and validate application flows; Performing automated, functional, manual, performance, regression, unit, and user acceptance testing using JaCoCo, Cucumber, JUnit, PowerMock, Mockito, JMeter, and BlazeMeter; and Using Apache Subversion, Git, and Bitbucket to securely maintain and version source code.
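For illustration only: a minimal sketch of publishing a transformed pipeline record to Apache Kafka (one of the technologies listed above) from a Java service. The broker address, topic name, key, and payload shown here are placeholders, not values specified by this posting.

// Minimal, illustrative Kafka producer (kafka-clients library assumed on the classpath).
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PipelinePublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address is a placeholder; in practice this comes from configuration.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish one transformed record; downstream consumers load it into the target store.
            producer.send(new ProducerRecord<>("trades-transformed", "T-100",
                    "{\"symbol\":\"AAPL\",\"notional\":2500.5}"));
            producer.flush();
        }
    }
}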
Job Location: 8181 Communications Pkwy, Plano, TX 75024.