Bengaluru, IND
Lead Data/AI Engineering
**Job Description:**

**Job Title / Advertise Job Title:** Lead Data/AI Engineering

**Job Summary:**

The Lead Engineer manages the design, development, and deployment of scalable data and AI solutions, leading complex projects that solve major business challenges and deliver organizational value. This role guides engineering teams in building robust data pipelines, analytics models, and machine learning applications, while collaborating cross-functionally to translate business needs into technical strategies. The Lead Engineer ensures best practices in data architecture, quality, and governance, fosters innovation with the latest technologies, and mentors team members to drive data-driven decision-making and business growth.

**Roles and Responsibilities:**

+ **Design, Development, Testing, and Deployment:** Drive the development of scalable Data & AI warehouse applications by leveraging software engineering best practices such as automation, version control, and CI/CD. Implement comprehensive testing strategies to ensure reliability and optimal performance. Manage deployment processes end-to-end, effectively addressing configuration, environment, and security concerns.
+ **Engineering and Analytics:** Transform Data Warehouse and AI use case requirements into robust data models and efficient pipelines, ensuring data integrity by applying statistical quality controls and advanced AI methodologies.
+ **API & Microservice Development:** Design and build secure, scalable APIs and microservices for seamless data integration across warehouse platforms, ensuring usability, strong security, and adherence to best practices.
+ **Platform Scalability & Optimization:** Assess and select the most suitable technologies for cloud and on-premises data warehouse deployments, implementing strategies to ensure scalability, robust performance monitoring, and cost-effective operations.
+ **Lead:** Lead the execution of complex data engineering and AI projects to solve critical business problems and deliver impactful results.
+ **Technologies:** Leverage deep expertise in Data & AI technologies (such as Spark, Kafka, Databricks, and Snowflake), programming (including Java, Scala, Python, and SQL), API integration patterns (like HTTP/REST and GraphQL), and leading cloud platforms (Azure, AWS, GCP) to design and deliver data warehousing solutions.

**Shift timing (if any):** 12:30 PM to 9:30 PM IST

**Location / Additional Location (if any):** Bangalore, Hyderabad

**Overall Experience:**

+ Typically requires a minimum of 15 years of progressive experience in data engineering, data architecture, or related fields.
+ Demonstrated experience leading complex data projects, managing teams, and delivering end-to-end data solutions in large or matrixed organizations is highly valued.
+ Experience with cloud data platforms, big data technologies, and implementing best practices in data governance and DevOps is strongly preferred.

**Primary / Mandatory skills:**

+ **Delivery:** Proven experience in managing and delivering complex data engineering and AI solutions for major business challenges.
+ **Data Architecture & Modeling:** Expertise in designing scalable, high-performance data architectures (e.g., data warehouses, data lakes, data marts) and creating robust data models.
+ **ETL/ELT Development:** Advanced skills in building, optimizing, and maintaining data pipelines using modern ETL/ELT tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
+ **Cloud Platforms:** Proficiency with cloud data services and platforms such as AWS, Azure, or Google Cloud (e.g., Redshift, Snowflake, Databricks, BigQuery).
+ **Programming Languages:** Strong coding ability in SQL and at least one general-purpose language (e.g., Python, Scala, Java).
+ **Big Data Technologies:** Experience with distributed data processing frameworks (e.g., Spark, Hadoop) and real-time streaming tools (e.g., Kafka).
+ **Data Governance & Quality:** Knowledge of data governance practices, data lineage, data cataloging, and implementing data quality checks.
+ **CI/CD & Automation:** Experience in automating data workflows, version control (e.g., Git), and deploying CI/CD pipelines for data applications.
+ **Analytics & AI/ML Integration:** Ability to support advanced analytics and integrate machine learning pipelines with core data platforms.
+ **Leadership & Collaboration:** Proven track record in leading teams, mentoring engineers, and collaborating with business, analytics, and IT stakeholders.
+ **Problem Solving & Communication:** Strong analytical, troubleshooting, and communication skills to translate business needs into technical solutions.

**Secondary / Desired skills:**

+ **Data Visualization:** Experience with BI tools such as Power BI, Tableau, or Looker for creating dashboards and visual analytics.
+ **AI/ML Model Operationalization:** Familiarity with deploying, monitoring, and scaling machine learning models in production environments (MLOps).
+ **API & Microservices Development:** Understanding of building and consuming RESTful APIs and microservices for data integration.
+ **Data Security & Privacy:** Knowledge of data encryption, access controls, and compliance with data privacy regulations (GDPR, CCPA, SOX).
+ **Data Catalogs & Metadata Management:** Experience with tools like Alation, Collibra, or Azure Purview for cataloging and managing metadata.
+ **Workflow Orchestration:** Hands-on experience with workflow tools (e.g., Apache Airflow, Control-M, Prefect) for scheduling and monitoring data jobs.
+ **Performance Tuning:** Skills in optimizing queries, storage, and processing for cost and speed.
+ **Change/Data Release Management:** Experience in managing data schema evolution, versioning, and deployment coordination.
+ **GitHub & Copilot Proficiency:** Proficient in using GitHub for version control, collaboration, and CI/CD pipelines; experience leveraging GitHub Copilot to enhance coding efficiency and foster team productivity.
+ **DevOps for Data:** Exposure to infrastructure-as-code (Terraform, CloudFormation) and containerization (Docker, Kubernetes) for data workloads.
+ **Domain Knowledge:** Understanding of Finance, Telecom, Retail, or the relevant business domain to better align data solutions with business needs.
+ **Project Management:** Familiarity with Agile, Scrum, or Kanban methodologies for managing data projects.
+ **Stakeholder Management:** Ability to effectively engage with non-technical users, translate requirements, and manage expectations.

**Additional information (if any):**

+ **Leadership & Mentorship:** Expected to mentor and develop junior engineers, foster a culture of knowledge sharing, and lead by example in adopting best practices.
+ **Cross-Functional Collaboration:** Will work closely with data scientists, business analysts, product managers, and IT teams to deliver end-to-end solutions that meet business needs.
+ **Innovation & Continuous Improvement:** Encouraged to stay current with emerging technologies and trends in data engineering, AI/ML, and cloud platforms, and to proactively recommend and implement improvements.
+ **Ownership & Accountability:** Responsible for the entire data engineering lifecycle, including architecture, implementation, monitoring, and optimization.
+ **Communication Skills:** Must be able to translate complex technical concepts into clear, actionable insights for non-technical stakeholders and leadership.
+ **Change Management:** Experience managing change in fast-paced environments and guiding teams through technology transformations is highly valued.
+ **Quality & Compliance Focus:** Commitment to data quality, security, and compliance is essential, with experience in implementing and maintaining controls and standards.
+ **Business Impact:** Expected to contribute to measurable business outcomes by enabling data-driven decision-making and supporting organizational goals.

**Education Qualification:**

+ Bachelor's degree in Computer Science, Information Technology, or a related field is required.
+ Master's degree in Data Science, Computer Science, Engineering, or a related discipline.

**Certifications (if any specific):**

+ Cloud Platform Certifications (AWS, Azure, GCP)
+ Data Engineering & Big Data (Databricks, CCDP)
+ Database & Data Warehousing (SnowPro, GCP)
+ General Data & AI (CDMP, AI/ML integration, Microsoft)
+ DevOps & Automation (GitHub, GitLab CI/CD)

**Weekly Hours:** 40

**Time Type:** Regular

**Location:** IND:KA:Bengaluru / Innovator Building, Itpb, Whitefield Rd - Adm: Intl Tech Park, Innovator Bldg

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made. AT&T will consider for employment qualified applicants in a manner consistent with the requirements of federal, state and local laws. We expect employees to be honest, trustworthy, and operate with integrity. Discrimination and all unlawful harassment (including sexual harassment) in employment is not tolerated.
We encourage success based on our individual merits and abilities without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, disability, marital status, citizenship status, military status, protected veteran status, or employment status.