Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Job Summary:
We are seeking a highly experienced and innovative Senior Data Engineer with deep expertise in Google Cloud Platform (GCP), Python, and AI-driven data solutions. This role will focus on building scalable, cloud-native data pipelines and enabling advanced analytics and machine learning capabilities across the organization.
Primary Responsibilities:
- Design, develop, and maintain robust data pipelines using GCP services such as BigQuery, Cloud Dataflow, Cloud Composer, and Cloud Storage
- Build and optimize data lakes and data warehouses to support analytics and AI workloads
- Develop data ingestion, transformation, and orchestration workflows using Python, Apache Beam, and Airflow
- Collaborate with data scientists and AI engineers to operationalize machine learning models and integrate them into production data flows
- Ensure data quality, governance, and security across all stages of the data lifecycle
- Implement CI/CD pipelines for data engineering workflows using tools like Cloud Build, GitHub, or Bitbucket
- Monitor and troubleshoot production data pipelines and optimize performance and cost-efficiency
- Document technical designs, processes, and best practices for knowledge sharing and scalability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
- 7+ years of experience in data engineering, with a solid focus on cloud-native architectures
- Hands-on expertise in Google Cloud Platform (GCP) services, especially BigQuery, Dataflow, Pub/Sub, and Cloud Composer
- Solid programming skills in Python, with experience in building scalable data solutions
- Experience with AI/ML integration, including feature engineering, model deployment, and inference pipelines
- Solid understanding of ETL/ELT, data modeling, and distributed computing
- Proficiency in SQL and working with large-scale structured and semi-structured data
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus
- Solid problem-solving skills and ability to work independently and collaboratively
- Proven excellent communication and stakeholder management skills
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.