Own your opportunity to work alongside federal civilian agencies. Make an impact by providing services that help the government ensure the well-being of U.S. citizens.
Job Description

Seize your opportunity to make a personal impact as a Kafka Subject Matter Expert (SME) supporting the United States Postal Service (USPS). GDIT is your place to make meaningful contributions to challenging projects and grow a rewarding career.
This position is aligned to the USPS Office of the Chief Information Officer (CIO) Architecture, Strategy, & Innovation (CASI) organization, under the Technology & Infrastructure Innovation Program, which requires a Kafka SME to support the team with analysis to integrate new Kafka cluster and event stream requirements into established large-scale, production enterprise architectures. The SME will conduct design and architecture assessments and provide written recommendations for integrating Kafka event streams within an application domain or for enterprise event brokering. Additionally, the SME will help troubleshoot production Kafka event streams and software integration issues as top-tier internal support, including reviewing performance issues and proposing resolutions.
HOW A KAFKA SME WILL MAKE AN IMPACT:
- Conduct research and provide written reviews of Kafka ecosystem best practices and innovative strategies for hybrid multi-cloud high availability.
- Conduct proof-of-concept activities and build prototypes for Kafka technology stack components in sandbox environments to assess new capabilities.
- Define and implement standards and patterns for the Kafka ecosystem life cycle, test-driven protection schemes, and automated implementation strategies.
- Devise strategies and roadmaps for enterprise expansion of Kafka ecosystem capabilities, integrations, and governance.
- Provide strategy and compliance reviews and recommendations for enterprise Kafka ecosystem elements, which may include: architecture frameworks, databases, network, web and application architecture resources, backup and recovery, high availability, disaster recovery, patch management, and analytics.
- Hands-on experience designing and implementing Kafka event streaming capabilities in applications and infrastructure across hybrid multi-cloud environments.
- Experience producing IT technical artifacts, with an emphasis on Kafka event streams, including design documents, architecture diagrams, architecture assessments, white papers, test plans, requirements mapping, and implementation plans.
- In-depth knowledge of the design principles and inner workings of Kafka implementations, and of applicable use cases for migrating applications from legacy to modern styles using event streams and async APIs.
- Demonstrable knowledge of:
  - Apache Kafka, Confluent Platform, and Confluent Cloud.
  - Strimzi and Confluent for Kubernetes.
  - Cloud services with Kafka API compatibility (e.g., Azure Event Hubs, Amazon MSK).
  - Confluent Schema Registry and serialization using JSON and Avro schemas.
  - Kafka replication options, including MirrorMaker, Confluent Cluster Linking, and Schema Linking.
  - Confluent identity management and RBAC, integrated/federated with an enterprise IdP and role management.
  - Kafka client APIs (Producer, Consumer, Streams).
  - Kafka Connect and connectors (e.g., JDBC, Cassandra, Google BigQuery, WebSphere MQ).
  - Sizing and capacity planning for Kafka clusters.
  - Kafka topic partitioning strategies, including partition key design (a minimal illustrative sketch follows below).

WHAT YOU'LL NEED TO SUCCEED:
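Before the formal requirements, a purely illustrative sketch of the partition-key design mentioned above, using the standard Apache Kafka Java Producer API. The topic name, broker address, and tracking-number key below are hypothetical placeholders, not a description of any USPS system.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PackageEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; in practice this comes from environment-specific config.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Using a stable business identifier as the record key means the default
            // partitioner hashes it to a fixed partition, preserving per-key ordering
            // while spreading distinct keys across the topic's partitions.
            String trackingNumber = "9400100000000000000000"; // hypothetical example key
            ProducerRecord<String, String> record =
                new ProducerRecord<>("package-events", trackingNumber, "{\"status\":\"IN_TRANSIT\"}");

            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Wrote to partition %d at offset %d%n",
                        metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```

The design point in this sketch is the key choice: a key with high cardinality and stable meaning keeps related events ordered on one partition without creating hot partitions.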
Required Experience:
13+ years of hands-on experience

Education:
Bachelor’s degree in computer science, information technology, or a related field, or equivalent experience (a degree is not required).

Required Skills:
- Using design patterns for building scalable and maintainable applications/solutions.
- Clearly documenting code, models, and technical solutions.
- Proficiency in Generative AI and prompt engineering.
- Continuous learning and adaptability in a very large IT organization.
- Communicating complex technical concepts to both technical and executive stakeholders.
- Troubleshooting software and technical implementations in large-scale enterprise ecosystems.
- Understanding of concepts such as CI/CD, containerization, and deployment strategies for Kafka components in large-scale production environments.
- Querying and managing data in both SQL and NoSQL databases.
- Proficiency in creating technical diagrams with products such as Microsoft Visio or Draw.io.
- Proficiency in creating technical design and architecture documents in Microsoft Word.
- Proficiency in creating business and technical presentations in Microsoft PowerPoint.
- Proficiency in creating data representations, charts, and reports in tools such as Microsoft Excel and Power BI.

Desired Skills:
- Integrating Kafka event streams with Agentic AI workflows.

Location:
- Raleigh, NC (preferred)
- Remote

Security Clearance Level:
Must be able to obtain and maintain a Public Trust clearance and successfully pass a thorough government background screening process requiring the completion of detailed forms and fingerprinting.

This position has a U.S. residency requirement. The USPS security clearance process requires the selected candidate to have resided in the U.S. (including U.S. Territories) for the last five years, as follows: U.S. citizens cannot have left the U.S. (including U.S. Territories) for longer than 6 months consecutively in the last 3 years (unless they meet certain exceptions). Non-U.S. citizens cannot have left the U.S. (including U.S. Territories) for longer than 90 days consecutively in the last 3 years.

GDIT IS YOUR PLACE:
- 401K with company match
- Comprehensive health and wellness packages
- An internal mobility team dedicated to helping you own your career
- Professional growth opportunities, including paid education and certifications
- Cutting-edge technology you can learn from
- Rest and recharge with paid vacation and holidays

#KafkaSME #zxc726