Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT, and Business Solutions. With over 27,000 associates, it is the largest software development center of Bosch outside Germany, making it the Technology Powerhouse of Bosch in India with a global footprint and presence in the US, Europe, and the Asia Pacific region.
Job Description
Job Summary: We are seeking an experienced AWS Databricks Developer to design, develop, and maintain scalable data processing and analytics solutions on the AWS and Databricks platforms. The ideal candidate will have strong expertise in building data pipelines, optimizing performance, and integrating multiple AWS services to deliver high-quality data solutions.
Key Responsibilities:
· Overall 5 to 8 years of experience in the IT industry, with a minimum of 6 years in Data Engineering.
· Design and develop solutions combining Cloudera, Palantir, and Databricks.
· Develop Python-based automation scripts to optimize workflows and processes.
· Integrate Databricks with AWS ecosystem components such as S3, Glue, Athena, Redshift, and Lambda.
· Define architecture for data-driven solutions.
· Provide solutioning for web applications and data pipelines.
· Deliver solutions for data analytics requirements using Palantir.
· Stay updated with emerging technologies and quickly adapt to new tools and frameworks.
· Work with CI/CD pipelines.
· Ensure scalability, security, and performance of deployed solutions.
· Collaborate with business teams, data engineers, and developers to align solutions with business goals.
Mandatory Skills:
· Databricks:
o Expertise in designing Databricks jobs and workloads.
o Proficiency in PySpark/Scala for data manipulation, transformation, and analysis.
o Optimization of Databricks clusters and notebooks.
· Python: Hands-on experience in automation and scripting.
· Cloudera: Strong knowledge of Spark, Hive, Impala, HDFS, Kafka, HBase, and cluster management.
· Palantir: Hands-on experience with analytics, data fusion, and developing analytical workflows.
· DevOps Basics: Familiarity with Jenkins CI/CD pipelines.
· AWS Integration: Experience integrating Databricks with AWS ecosystem components such as S3, Glue, Athena, Redshift, and Lambda.
· Communication: Excellent verbal and written communication skills.
· Fast Learner: Ability to quickly grasp new technologies and adapt to changing requirements.
Good-to-have Skills:
· Familiarity with containerization technologies (Docker, Kubernetes).
· Experience with real-time streaming technologies (e.g., Kafka Streams, Spark Streaming).
· Exposure to machine learning operationalization (MLOps) workflows.
· Background in regulated industries with strict data compliance requirements.
Additional Information:
Certifications: Databricks Certified Associate
Educational qualification:
Bachelor’s degree in Computer Science, IT, or related field.
Experience:
Overall 5 to 8 years of experience in the IT industry, with a minimum of 6 years in Data Engineering.
Mandatory/Required Skills:
AWS Databricks, Python, Cloudera, and Palantir
Preferred Skills: