Lead I - Data Engineering

Job Title: Data Engineer

Experience: 5+ years

Location: Trivandrum, Kochi, Chennai, Pune

Job Summary

We are seeking a skilled Data Engineer to design, build, and maintain data pipelines and data models that support analytical and business intelligence needs. The ideal candidate will have hands-on experience with Python or SQL and Google Cloud Platform (GCP), along with a strong understanding of data management, quality, and security best practices.

Key Responsibilities

Build and maintain moderately complex data pipelines, ensuring reliable data flow, transformation, and usability for analytical projects.

Design and implement data models, optimizing for performance and scalability.

Apply knowledge of data characteristics and supply patterns to develop rules and tracking processes that support data quality models.

Prepare data for analytical use by gathering, integrating, cleansing, and structuring data from multiple sources and systems.

Design, create, and interpret large, highly complex datasets.

Troubleshoot pipeline and data issues to ensure accuracy and reliability.

Stay up to date with GCP advancements and recommend innovative solutions.

Implement security best practices within data pipelines and cloud infrastructure.

Collaborate with global teams to share and adopt best practices in data management, maintenance, reporting, and security.

Develop and execute data quality checks to ensure consistency and integrity.

Work with credit data products and perform analysis using tools such as Google BigQuery, BigTable, DataFlow, and Spark/PySpark (a brief sketch follows this list).
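
As a rough illustration of the pipeline and data quality work described above, the PySpark sketch below reads raw Parquet data, cleanses it, and applies a simple row-level quality check before writing a curated output. The bucket paths, column names, and the 1% null threshold are hypothetical examples, not part of any specific codebase used in this role.

```python
# Minimal PySpark sketch: cleanse raw records and apply a simple quality check.
# Paths, column names, and the null-rate threshold are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("credit-data-cleanse").getOrCreate()

# Read raw records landed as Parquet (e.g., from a GCS bucket).
raw = spark.read.parquet("gs://example-bucket/raw/credit_accounts/")

# Cleanse: drop duplicates, normalize a status column, cast amounts.
clean = (
    raw.dropDuplicates(["account_id"])
       .withColumn("account_status", F.lower(F.trim(F.col("account_status"))))
       .withColumn("balance", F.col("balance").cast("decimal(18,2)"))
)

# Simple data quality check: fail the run if too many balances are null.
total = clean.count()
null_balances = clean.filter(F.col("balance").isNull()).count()
if total > 0 and null_balances / total > 0.01:
    raise ValueError(f"Quality check failed: {null_balances}/{total} null balances")

# Write the curated output back to Parquet for downstream analytical use.
clean.write.mode("overwrite").parquet("gs://example-bucket/curated/credit_accounts/")
```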

Mandatory Skills

Python or SQL Proficiency: Proficiency in Python or SQL, including intermediate-level scripting for data manipulation and processing.

GCP & Cloud Fundamentals: Intermediate understanding and experience with Google Cloud Platform (GCP) and overall cloud computing concepts.

Data Pipeline Construction: Proven ability to build, maintain, and troubleshoot moderately complex pipelines.

Data Modeling & Optimization: Experience designing and optimizing data models for performance.

Data Quality Governance: Ability to develop rules, tracking processes, and checks to support a data quality model.

Data Preparation & Structuring: Skilled in integrating, consolidating, cleansing, and structuring data for analytical use.

Security Implementation: Knowledge of security best practices in pipelines and cloud infrastructure.

Big Data Analysis Tools: Hands-on experience with Google BigQuery, BigTable, DataFlow, and Scala with Spark or PySpark (see the example after this list).

Advanced Data Formats: Experience working with JSON, Avro, and Parquet formats.

Communication & Best Practices: Strong communication skills to promote global best practices and guide adoption.
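
As a small illustration of the Python, SQL, and BigQuery skills listed above, the sketch below uses the google-cloud-bigquery client to run a parameterized SQL query; the project, dataset, table, and column names are hypothetical.

```python
# Minimal sketch: run a parameterized SQL query against BigQuery from Python.
# The project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT account_status, COUNT(*) AS accounts
    FROM `example-project.credit_data.accounts`
    WHERE snapshot_date = @snapshot_date
    GROUP BY account_status
    ORDER BY accounts DESC
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("snapshot_date", "DATE", "2024-01-31"),
    ]
)

# Submit the query and iterate over the result rows.
for row in client.query(sql, job_config=job_config).result():
    print(row.account_status, row.accounts)
```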

Preferred Qualifications

Cloud certification (e.g., GCP Data Engineer, AWS, Azure).

Experience with credit data products.

Familiarity with data governance frameworks and metadata management tools.

Technical Skills

Python | SQL | GCP | BigQuery | BigTable | DataFlow | Spark / PySpark | JSON | Avro | Parquet
