Warren, NJ, 07059, USA
Databricks Architect
Job Description
- Lead the migration and modernization of enterprise data platforms from legacy systems (SSIS, SSRS, SQL Server) to Azure cloud-native solutions.
- Design and implement scalable, secure, and high-performance data architectures aligned with business needs.
- Develop and optimize OLAP and OLTP data models for analytical and transactional systems.
- Design semantic layers and implement best practices for data governance, lineage, and quality across hybrid environments.
- Build and maintain ETL/ELT pipelines for batch and streaming data ingestion using Azure Data Factory (ADF) and Databricks.
- Implement Medallion architecture-based data estates and Delta Lake Lakehouse solutions using DLT, PySpark jobs, and Databricks Workflows (see the first sketch after the requirements list).
- Design and optimize data workflows for ingestion, transformation, and analytics leveraging Azure-native services.
- Develop data solutions using PySpark, SparkSQL, and Python within Databricks on Azure.
- Implement key data concepts such as Change Data Capture (CDC), streaming vs. batch ingestion, and source-to-target mapping (see the CDC sketch after the requirements list).
- Ensure robust data ingestion strategies using the Azure stack and best practices for pull vs. push paradigms.
- Work closely with cross-functional teams to gather requirements and deliver tailored data solutions for P&C Insurance (commercial lines).
- Provide technical leadership and mentorship to data engineers and developers.
- Monitor Databricks and Azure services for performance issues and resolve them proactively.
- Establish CI/CD pipelines using Git and VS Code for automated deployments.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Skills and Requirements
• Knowledge of P&C Insurance (commercial lines) is required.
• Experience modernizing enterprise data platforms from legacy systems to the Azure cloud (solution design and hands-on execution).
• Deep knowledge of and experience with data modelling (OLAP and OLTP), data lakes, data warehousing, and ETL/ELT pipelines, both on legacy on-premises platforms and on Azure.
• Lead migration and modernization of legacy ETL processes from SSIS, SSRS, and SQL Server to cloud-native solutions.
• Design and optimize data workflows for ingestion, transformation, and analytics using Azure-native services.
• Design and build data pipelines and solutions using Databricks (PySpark and SparkSQL) on Azure.
• Experience building Medallion architecture-based data estates.
• Experience building Databricks Delta Lake-based Lakehouses using DLT, PySpark jobs, and Databricks Workflows.
• Proficient in SQL, Python, PySpark, ADF, and the Azure stack.
• Working knowledge of Git, CI/CD, and VS Code.
• Proficient in the Azure data ingestion stack.
• Implementation experience with key data concepts such as CDC (Change Data Capture), streaming and/or batch ingestion, pull vs. push paradigms, source-to-target mapping, and so on.
• Collaborate with cross-functional teams to gather requirements and deliver scalable, secure, and high-performance data solutions.
• Strong semantic layer modelling and implementation experience.
• Establish best practices for data governance, lineage, and quality across hybrid environments.
• Provide technical leadership and mentoring to data engineers and developers.
• Monitor and troubleshoot performance issues across Databricks and Azure services.
• Understanding of key reporting stacks such as Power BI, Tableau, and Excel BI Add-Ins.
• Databricks/Azure Certified Data Engineer Associate certification is a plus.
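For illustration, the Medallion/Delta Lake responsibilities above could look something like the following minimal PySpark sketch of a bronze-to-silver hop on Databricks. It is a sketch under stated assumptions, not a prescribed implementation: it assumes the Databricks-provided spark session, Auto Loader (cloudFiles) as the ingestion mechanism, and hypothetical paths, table names, and columns (lakehouse.bronze.policies, policy_id, effective_date).

    # Bronze-to-silver sketch for a Medallion-style Lakehouse on Databricks.
    # Assumes a Databricks cluster where `spark` is already provided; all paths,
    # catalog/schema/table names, and columns below are hypothetical placeholders.
    from pyspark.sql import functions as F

    # Bronze: land raw policy records incrementally with Auto Loader (cloudFiles).
    bronze_stream = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/lake/_schemas/policies")
        .load("/mnt/landing/policies/")
    )

    query = (
        bronze_stream.writeStream
        .option("checkpointLocation", "/mnt/lake/_checkpoints/bronze_policies")
        .trigger(availableNow=True)  # drain the current backlog, then stop
        .toTable("lakehouse.bronze.policies")
    )
    query.awaitTermination()

    # Silver: deduplicate and conform the bronze data for downstream analytics.
    silver_df = (
        spark.read.table("lakehouse.bronze.policies")
        .dropDuplicates(["policy_id"])
        .withColumn("effective_date", F.to_date("effective_date"))
        .filter(F.col("policy_id").isNotNull())
    )
    silver_df.write.format("delta").mode("overwrite").saveAsTable("lakehouse.silver.policies")

In a DLT pipeline or Databricks Workflow the same bronze/silver split would typically be expressed as separate tasks or declarative tables rather than a single script.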
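The CDC requirement is commonly met on Databricks with a Delta Lake MERGE (DLT's APPLY CHANGES INTO is an alternative). The sketch below is only one possible pattern and relies on assumed names: the change-feed table lakehouse.bronze.policy_changes, the op and updated_at columns, and the policy_id key are all hypothetical.

    # CDC upsert sketch: apply deduplicated change records to a silver Delta table.
    # Assumes the Databricks-provided `spark` session; table and column names are hypothetical.
    from delta.tables import DeltaTable
    from pyspark.sql import functions as F, Window

    # Incoming change records, e.g. extracted from SQL Server CDC.
    changes = spark.read.table("lakehouse.bronze.policy_changes")

    # Keep only the latest change per key so MERGE sees one row per policy_id.
    latest = Window.partitionBy("policy_id").orderBy(F.col("updated_at").desc())
    deduped = (
        changes
        .withColumn("rn", F.row_number().over(latest))
        .filter("rn = 1")
        .drop("rn")
    )

    target = DeltaTable.forName(spark, "lakehouse.silver.policies")

    (target.alias("t")
        .merge(deduped.alias("s"), "t.policy_id = s.policy_id")
        .whenMatchedDelete(condition="s.op = 'D'")         # source deletes
        .whenMatchedUpdateAll(condition="s.op <> 'D'")     # updates
        .whenNotMatchedInsertAll(condition="s.op <> 'D'")  # new rows
        .execute())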