At Schwab, you’re empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us “challenge the status quo” and transform the finance industry together.
Job Duties:
- Build out core data platform (WAM-Ex) capabilities, a cloud-native data platform (BigQuery/Snowflake), and core data capabilities, including orchestration, data security, and data quality, to be shared across the Company’s Wealth organization and Asset Management.
- Develop data APIs and data delivery services to support critical operational processes, analytical models, and machine learning applications.
- Define and build best practices and standards for federated development on the WAM-Ex data platform.
- Design consistent and connected logical and physical data models across data domains.
- Design a consistent data engineering life cycle for building data assets across WAM-Ex initiatives.
- Analyze and implement best practices in the management of enterprise data, including master data, reference data, metadata, data quality, and lineage.
- Collaborate with other engineers, architects, data scientists, analytics teams, and business product owners to develop software in an Agile development environment.
- Architect, build, and support the operation of cloud and on-premises enterprise data infrastructure and tools.
- Support selection and integration of data-related tools, frameworks, and applications required to expand platform capabilities.
- Design and develop robust, reusable, and scalable data-driven solutions and data pipeline frameworks to automate ingestion, processing, and delivery of both unstructured batch and real-time streaming data (see the sketch below).
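For illustration only, the sketch below shows the kind of reusable PySpark pipeline the last duty describes, covering one batch source and one streaming source. The paths, topic name, and broker address are hypothetical placeholders, not part of Schwab’s WAM-Ex platform.

```python
# Minimal PySpark sketch of a batch + streaming ingestion pipeline.
# All paths, topics, and column names below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

# Batch ingestion: read raw files, apply a light data-quality rule, write curated output.
batch_df = (
    spark.read.format("parquet")
    .load("s3://example-bucket/raw/trades/")           # hypothetical source path
    .withColumn("ingest_ts", F.current_timestamp())
    .filter(F.col("account_id").isNotNull())           # simple data-quality check
)
batch_df.write.mode("append").format("parquet").save("s3://example-bucket/curated/trades/")

# Streaming ingestion: consume a Kafka topic (requires the Spark Kafka connector)
# and deliver micro-batches to the same curated layer.
stream_df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "trade-events")               # hypothetical topic
    .load()
    .selectExpr("CAST(value AS STRING) AS payload")
    .withColumn("ingest_ts", F.current_timestamp())
)
query = (
    stream_df.writeStream.format("parquet")
    .option("path", "s3://example-bucket/curated/trade_events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/trade_events/")
    .start()
)
```

In a reusable framework of this kind, the source locations, quality rules, and output formats would typically be driven by configuration so the same pipeline code can serve multiple data domains.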
What you have
Job Requirements: Bachelor’s in Computer Science, Engineering, or a related field and 60 months of progressive, post-Bachelor’s experience in a related occupation. Experience must include 36 months involving the following: cloud infrastructure development using Amazon Web Services (AWS), Azure, or Google Cloud Platform (GCP); big data processing using Apache Spark, PySpark, Python, and SQL; data warehousing using Snowflake, Amazon Redshift, or BigQuery; workflow orchestration tools such as Apache Airflow; cloud-native batch and real-time ETL/ELT pipelines; secure access and RBAC (Role-Based Access Control) for data platforms; data quality, lineage, and governance using tools or custom frameworks; and CI/CD and DevOps tools including Git, Bitbucket, Jenkins, Bamboo, or GitHub.
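As a hedged illustration of the workflow-orchestration experience listed above, here is a minimal Apache Airflow DAG sketch (Airflow 2.4+ style); the DAG id, schedule, and task bodies are hypothetical placeholders, not part of the job description.

```python
# Minimal Airflow DAG sketch: orchestrate a daily extract -> transform -> load flow.
# DAG id, schedule, and task functions are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from a source system")          # placeholder

def transform():
    print("apply transformations and quality checks")    # placeholder

def load():
    print("publish curated tables to the warehouse")     # placeholder

with DAG(
    dag_id="example_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Declare task ordering: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```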
We offer competitive pay and benefits. Starting compensation depends on related experience; annual bonus and other eligible earnings are not included in starting compensation. Benefits include: 401(k) with company match; employee stock purchase plan; paid vacation, volunteering, and a 28-day sabbatical after every 5 years of service for eligible positions; paid parental leave and family building benefits; tuition reimbursement; health, dental, and vision insurance; and a hybrid/remote work schedule available for eligible positions (subject to Schwab’s internal approach to workplace flexibility).
Why work for us?
At Schwab, we’re committed to empowering our employees’ personal and professional success. Our purpose-driven, supportive culture and focus on your development mean you’ll get the tools you need to make a positive difference in the finance industry.
We offer a competitive benefits package to our full-time employees that takes care of the whole you – both today and in the future:
- 401(k) with company match and employee stock purchase plan
- Paid time for vacation, volunteering, and a 28-day sabbatical after every 5 years of service for eligible positions
- Paid parental leave and family building benefits
- Tuition reimbursement
- Health, dental, and vision insurance