San Francisco, CA, 94103, USA
INTL - INDIA - Lead Data Engineer
Job Description

Responsibilities:
- Data Architecture & Modeling: Design logical and physical data models that ensure high quality and minimal redundancy.
- Data Integration: Develop and manage ETL pipelines that integrate data from multiple sources for reporting and analytics.
- BI Implementation: Evaluate and implement BI tools and technologies; configure reporting and visualization platforms.
- Performance Optimization: Monitor and tune queries, optimize data structures, and ensure efficient data processing.
- Collaboration: Partner with business stakeholders, analysts, and engineering teams to deliver solutions aligned with business needs.
- Governance & Security: Establish data governance policies, access controls, and compliance measures.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Skills and Requirements
- 8-10 years of software development and deployment experience
- 5+ years of hands-on experience with SQL, Databricks, Azure Data Factory (ADF), and DataStage (or an equivalent ETL tool)
- Experience with SSAS cubes, Cognos, Tableau, ThoughtSpot, or other BI tools
- Proven ability to write production-grade SQL for raw data processing, Kafka ingestion, ADF pipelines, and rigorous data validation/QA
- API integration experience (collecting/ingesting data via REST, GraphQL, or similar)
- Strong programming skills in Scala and Python
- Deep database expertise across SQL and NoSQL systems