At Schwab, you’re empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us “challenge the status quo” and transform the finance industry together.
Job Duties: Build out core data platform (WAM-Ex) capabilities, cloud-native data platform capabilities (BigQuery/Snowflake), and core data capabilities, including orchestration, data security, and data quality, to be shared across the Company’s Wealth and Asset Management organizations. Develop data APIs and data delivery services to support critical operational processes, analytical models, and machine learning applications. Define and build best practices and standards for federated development on the WAM-Ex data platform. Design consistent and connected logical and physical data models across data domains. Design a consistent data engineering life cycle for building data assets across WAM-Ex initiatives. Analyze and implement best practices in management of enterprise data, including master data, reference data, metadata, data quality, and lineage. Collaborate with other engineers, architects, data scientists, analytics teams, and business product owners to develop software in an Agile development environment. Architect, build, and support the operation of cloud and on-premises enterprise data infrastructure and tools. Support selection and integration of data-related tools, frameworks, and applications required to expand platform capabilities. Design and develop robust, reusable, and scalable data-driven solutions and data pipeline frameworks to automate ingestion, processing, and delivery of both unstructured batch and real-time streaming data.
What you have
Job Requirements: Bachelor’s in Computer Science, Engineering, or a related field and 60 months of progressive, post-Bachelor’s experience in a related occupation. Experience must include 36 months of experience involving the following: cloud infrastructure development using Amazon Web Services (AWS), Azure, or Google Cloud Platform (GCP); big data processing using Apache Spark, PySpark, Python, and SQL; data warehousing using Snowflake, Amazon Redshift, or BigQuery; workflow orchestration tools such as Apache Airflow; cloud-native batch and real-time ETL/ELT pipelines; secure access and RBAC (Role-Based Access Control) for data platforms; data quality, lineage, and governance using tools or custom frameworks; and CI/CD and DevOps tools including Git, Bitbucket, Jenkins, Bamboo, or GitHub.
We offer competitive pay and benefits. Starting compensation depends on related experience. Annual bonus and other eligible earnings are not included in the ranges above. Benefits include: 401(k) w/ company match; employee stock purchase plan; paid vacation, volunteering, 28-day sabbatical after every 5 years of service for eligible positions; paid parental leave and family building benefits; tuition reimbursement; health, dental, and vision insurance; hybrid/remote work schedule available for eligible positions (subject to Schwab’s internal approach to workplace flexibility).
What’s in it for you
At Schwab, you’re empowered to shape your future. We champion your growth through meaningful work, continuous learning, and a culture of trust and collaboration—so you can build the skills to make a lasting impact. Our Hybrid Work and Flexibility approach balances our ongoing commitment to workplace flexibility, serving our clients, and our strong belief in the value of being together in person on a regular basis.
We offer a competitive benefits package that takes care of the whole you – both today and in the future:
401(k) with company match and Employee stock purchase plan
Paid time for vacation, volunteering, and 28-day sabbatical after every 5 years of service for eligible positions
Paid parental leave and family building benefits
Tuition reimbursement
Health, dental, and vision insurance