Pune
Lead II - Pipeline Engineer

We are seeking a skilled Pipeline Engineer to design, develop, and optimize scalable data and processing pipelines supporting large-scale distributed systems. The ideal candidate will have hands-on experience with microservices, big data technologies, and modern CI/CD workflows. You will be instrumental in building and maintaining robust data and service pipelines that power our core systems.

Key Responsibilities:

Design, build, and maintain scalable data pipelines using Java or Scala.

Develop microservices and backend components with Spring Boot.

Implement and support big data workflows leveraging Kafka, Spark, Zeppelin, and Hadoop.

Create efficient data integration workflows to ensure high performance and reliability.

Work with distributed data stores including S3, Cassandra, MongoDB, Couchbase, Redis, and Elasticsearch.

Utilize modern serialization formats like Protocol Buffers, Avro, or Thrift for efficient data exchange.

Build and maintain CI/CD pipelines to automate testing and continuous delivery.

Containerize applications with Docker and orchestrate using Kubernetes and Helm.

Deploy and manage infrastructure on AWS cloud services.

Collaborate cross-functionally to ensure data quality, reliability, and optimal performance.

Mandatory Skills:

Proficient in Java with 2+ years of experience building scalable microservices.

Strong expertise in Spring Boot and backend development.

Solid understanding of functional programming principles.

Experience with version control (Git) and CI/CD tools such as Jenkins or GitLab CI.

Proficient with big data technologies including Kafka, Spark, Zeppelin, Hadoop, and AWS EMR.

Experience with distributed storage and databases like S3, Cassandra, MongoDB, Elasticsearch, Couchbase, or Redis.

Familiarity with serialization formats (Protocol Buffers, Avro, Thrift).

Hands-on experience with Docker, Kubernetes, and Helm.

Knowledge of AWS cloud services and infrastructure management.

Good to Have:

Experience with real-time data streaming and processing.

Familiarity with observability tools (Prometheus, Grafana, ELK stack).

Understanding of security best practices for data pipelines and APIs.

Exposure to infrastructure as code (IaC) tools such as Terraform or CloudFormation.

Experience with ETL processes.

Soft Skills:

Excellent written and verbal communication skills.

Strong ownership and ability to independently drive tasks to completion.

Proactive in identifying blockers and proposing solutions.

Collaborative mindset to work effectively with backend, frontend, and DevOps teams.

Comfortable working in fast-paced, asynchronous environments.
