Location: India
Cloud Data Engineer – Cloud Platforms, ETL/ELT & Big Data Systems

Skills:
Cloud Data Engineering | AWS | Azure | GCP | ETL/ELT | Big Data | Data Warehousing | Data Pipelines | Data Governance

Department: Data Engineering / Cloud Platform
Employment Type: Full Time
Work Mode: Onsite / Hybrid / Remote
Experience: 5–10 Years

About the Role

We are seeking a highly skilled Cloud Data Engineer to design, build, and maintain scalable, high-performance data pipelines across cloud platforms.

This role focuses on enabling data-driven decision-making by ensuring efficient data ingestion, transformation, storage, and accessibility for analytics and business intelligence. You will work with large volumes of structured and unstructured data, contributing to enterprise-scale data architecture and cloud transformation initiatives.

Key Responsibilities

Data Pipeline Development & Optimization
- Design, build, and maintain scalable ETL/ELT pipelines across cloud environments
- Process large-scale structured and unstructured datasets efficiently
- Optimize pipelines for performance, reliability, and cost efficiency

Cloud Data Architecture
- Develop and manage cloud-native data solutions using AWS, Azure, or GCP
- Work with data warehouses such as Redshift, BigQuery, Snowflake, or Synapse
- Design and implement data lake and data warehouse architectures

Data Engineering & Processing
- Build robust data workflows using Python, SQL, and Spark (PySpark)
- Develop batch and real-time data processing pipelines
- Ensure high data quality, integrity, and consistency across systems

Data Orchestration & Automation
- Implement orchestration workflows using Apache Airflow or similar tools
- Automate data ingestion, transformation, and deployment pipelines
- Support CI/CD practices for data engineering workflows

Monitoring, Performance & Troubleshooting
- Monitor pipeline performance and resolve bottlenecks
- Optimize query performance and data processing efficiency
- Ensure system reliability through proactive issue resolution

Security, Compliance & Governance
- Implement data security, encryption, and access control mechanisms
- Ensure compliance with data privacy and regulatory standards
- Support data governance, metadata management, and auditing processes

Collaboration & Cross-Functional Work
- Collaborate with data scientists, analysts, and engineering teams
- Enable data access for analytics, reporting, and machine learning use cases
- Support enterprise-wide data initiatives and platform integrations

Cloud Migration & Innovation
- Support cloud migration and modernization initiatives
- Evaluate new tools and technologies for data engineering improvements
- Contribute to proof-of-concept (POC) and architecture design decisions

Technical Skills

Programming & Data Processing
- Required: Python, SQL
- Preferred: PySpark, Scala, Java

Databases & Data Management
- Relational databases: PostgreSQL, MySQL, SQL Server
- Cloud data warehouses: Redshift, BigQuery, Snowflake, Synapse
- Experience with data lake architectures

Cloud Technologies
- AWS, Azure, or GCP
- Cloud-native storage, compute, and data services

Frameworks & Tools
- Apache Spark, Hadoop ecosystem
- Data integration tools (Apache NiFi or similar)

Orchestration & DevOps
- Airflow, Terraform, CloudFormation
- Jenkins, Git, Docker, Kubernetes

Security & Compliance
- Data masking, encryption, IAM
- Compliance frameworks (GDPR, HIPAA, or equivalent)

Experience Requirements
- 5+ years of experience in cloud data engineering and pipeline development
- Proven experience with ETL/ELT processes and large-scale data systems
- Hands-on experience with cloud data platforms and warehouses
- Experience optimizing data workflows for scalability and performance
- Exposure to cloud migration and infrastructure automation preferred

Day-to-Day Responsibilities
- Design and implement scalable data pipelines
- Collaborate with cross-functional teams on data workflows
- Automate ingestion and transformation processes
- Troubleshoot pipeline issues and optimize performance
- Ensure data security and governance compliance
- Support cloud migration and modernization projects
- Maintain documentation and architecture diagrams

Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
- Certifications in cloud data platforms (AWS / GCP / Azure) preferred
- Experience working in enterprise-scale or regulated environments is a plus

Professional Competencies
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Ability to manage multiple priorities in fast-paced environments
- Leadership mindset with mentoring capabilities
- Strategic thinking for scalable and secure data architecture
- Continuous learning mindset with adaptability to new technologies

Why This Role is High Impact
- Work on large-scale cloud data platforms and enterprise systems
- Enable data-driven decision-making across business functions
- Build high-performance, scalable data infrastructure
- Contribute to digital transformation and modernization initiatives

#CloudDataEngineer #DataEngineering #BigData #ETL #DataPipelines #AWS #Azure #GCP #DataWarehousing #Spark #Airflow #DataGovernance #CloudComputing #TechCareers #HiringNow
All jobs from Antal International