Plantation, FL
Sr Data Engineer

The Senior Data Engineer is responsible for building, maintaining, and optimizing reliable ELT/ETL workflows that support enterprise-wide data and analytics requirements. This role focuses on hands-on development of data pipelines, data models, and transformation logic to ensure high-quality, accessible, and well-structured data for analytical and operational use cases.

The Senior Data Engineer works closely with data analysts, data scientists, product managers, and business stakeholders to understand requirements, contribute to technical solution design, and implement robust data engineering capabilities. The ideal candidate can translate business needs into technical specifications, write efficient SQL and Python code, and apply modern data engineering best practices across cloud-based environments.

Key Responsibilities

- Develop, maintain, and optimize scalable ELT/ETL pipelines using modern data engineering tools and frameworks.
- Implement and refine data models, schemas, and data structures to support analytics, reporting, and operational workloads.
- Participate in solution design discussions, contributing to data strategies and technical decisions.
- Create and maintain high-quality SQL queries, transformation logic, and reusable data processing components.
- Ensure data quality by implementing validation rules, profiling checks, monitoring, and reconciliation processes.
- Collaborate with cross-functional teams to gather requirements, troubleshoot issues, and deliver reliable data solutions.
- Monitor pipeline performance, address failures, and continuously improve reliability, cost, and scalability.
- Support data governance and security policies across cloud and on-premise systems.
- Produce clear documentation and communicate technical concepts in a user-friendly manner.
- Contribute to team best practices, standards, and process improvements.

Required Qualifications

- 5–10 years of experience in data engineering or related fields.
- Strong proficiency in SQL and experience with data transformation technologies (e.g., Azure Data Factory, Apache Spark, AWS Glue).
- Experience building and maintaining ELT/ETL pipelines on cloud platforms (AWS, Azure, or GCP).
- Solid understanding of data modeling (dimensional modeling, star schemas, normalization).
- Hands-on experience with modern data warehousing platforms such as Databricks, Snowflake, or Azure Synapse.
- Strong programming skills in Python or another modern scripting language.
- Experience working with CI/CD workflows; familiarity with tools like Databricks Asset Bundles (DABs) is a plus.
- Ability to translate business requirements into technical specifications and data solutions.
- Excellent communication and documentation skills.
- Experience building Power BI data models or developing analytical datasets is preferred.

Soft Skills

- Strong analytical and problem-solving skills with a proactive mindset.
- Ability to work independently and collaborate effectively within cross-functional teams.
- Comfortable participating in technical conversations and contributing ideas.
- High attention to detail, ownership, and accountability in delivering high-quality solutions.