Why GM Financial?
GM Financial is the wholly owned captive finance subsidiary of General Motors, headquartered in Fort Worth, Texas. We are a global provider of auto finance solutions, with operations in North America, South America, and the Asia-Pacific region. Through our long-standing relationships with auto dealers, we offer attractive retail financing and lease programs to meet the needs of each customer. We also offer commercial lending products that help dealers finance and grow their businesses.
At GM Financial, our team members define and shape our culture — an environment that welcomes new ideas, fosters integrity and creates a sense of community and belonging. Here we do more than work — we thrive.
Our Purpose: We pioneer the innovations that move and connect people to what matters
Position open until filled
NOTE: We are unable to offer sponsorship for this position
What makes you a dream candidate?
Experience with processing large data sets using Hadoop, HDFS, Spark, Kafka, Flume or similar distributed systems
Experience ingesting source data in a variety of formats, such as JSON, Parquet, and SequenceFile, from sources including cloud databases, message queues (MQ), and relational databases such as Oracle
Experience with cloud technologies (such as Azure, AWS, and GCP) and native toolsets such as Azure ARM templates, HashiCorp Terraform, and AWS CloudFormation
Understanding of cloud computing technologies, business drivers and emerging computing trends
Thorough understanding of Hybrid Cloud Computing: virtualization technologies, Infrastructure as a Service, Platform as a Service and Software as a Service Cloud delivery models and the current competitive landscape
Working knowledge of object storage technologies, including but not limited to Azure Data Lake Storage (ADLS) Gen2, Amazon S3, MinIO, and Ceph
Experience with containerization, including but not limited to Docker, Kubernetes, Spark on Kubernetes, and the Spark Operator
Working knowledge of Agile development / SAFe Scrum and Application Lifecycle Management
Strong background with source control management systems (Git or Subversion), build systems (Maven, Gradle, Webpack), code quality tools (Sonar), artifact repository managers (Artifactory), and continuous integration/continuous deployment (Azure DevOps)
Experience with NoSQL data stores such as CosmosDB, MongoDB, Cassandra, Redis, Riak or other technologies that embed NoSQL with search such as MarkLogic or Lily Enterprise
Experience creating and maintaining ETL processes
Knowledge of best practices in information technology governance and privacy compliance
Experience with Adobe solutions (ideally Adobe Experience Platform, DTM/Launch) and REST APIs
Ability to troubleshoot complex problems and work across teams to meet commitments
Excellent computer skills and proficiency in digital data collection
Ability to work in an Agile/Scrum team environment
Strong interpersonal, verbal, and writing skills
Knowledge of digital technology solutions (DMPs, CDPs, tag management platforms, cross-device tracking, SDKs, etc.)
Knowledge of Real-Time CDP and Journey Analytics solutions
Understanding of big data platforms and architectures, data stream processing pipelines and platforms, data lakes, and data lakehouses
SQL experience: querying data and communicating the insights that can be derived from it
Understanding of cloud solutions such as Google Cloud Platform, Microsoft Azure, and Amazon Web Services (AWS) architectures and services
Understanding of GDPR and other privacy and security topics
Experience and Education
Experience with data engineering preferred
Bachelor’s Degree in related field or equivalent experience required
What We Offer: Generous benefits package available on day one, including: 401(k) matching, bonding leave for new parents (12 weeks, 100% paid), tuition assistance, training, GM employee auto discount, community service pay, and nine company holidays.
Our Culture: Our team members define and shape our culture — an environment that welcomes innovative ideas, fosters integrity, and creates a sense of community and belonging. Here we do more than work — we thrive.
Compensation: Competitive pay and bonus eligibility
Work Life Balance: Flexible hybrid work environment with a minimum of two days a week in office in Fort Worth, Texas
About the role:
The ABS Data Engineer I, Finance is a critical technical role within the GMF North America Securitization and Conduit Reporting team. This position helps the ABS reporting team build and maintain reliable, scalable data pipelines, leveraging expertise in Python, SQL, and Azure cloud technologies to extract, transform, and load data efficiently, enabling seamless data access and analysis for accounting business users. The role involves a high level of coordination with other departments and third-party software vendors.
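To illustrate the kind of extract-transform-load work this role describes, here is a minimal sketch of a pipeline in Python. It uses the standard-library sqlite3 module as a stand-in for an Azure SQL source, and the table and column names (loans, pool_balances) are hypothetical, not taken from GM Financial systems.

```python
import sqlite3

def run_pipeline(conn: sqlite3.Connection) -> None:
    """Sketch of an ETL step: aggregate loan balances by securitization pool."""
    # Extract: pull raw loan records from the source table
    rows = conn.execute("SELECT pool_id, balance FROM loans").fetchall()

    # Transform: sum outstanding balance per pool
    totals: dict = {}
    for pool_id, balance in rows:
        totals[pool_id] = totals.get(pool_id, 0.0) + balance

    # Load: write the aggregates into a reporting table
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pool_balances (pool_id TEXT PRIMARY KEY, total REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO pool_balances VALUES (?, ?)", totals.items()
    )
    conn.commit()

if __name__ == "__main__":
    # Demo with an in-memory database and made-up pool identifiers
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE loans (pool_id TEXT, balance REAL)")
    conn.executemany(
        "INSERT INTO loans VALUES (?, ?)",
        [("2024-A", 100.0), ("2024-A", 50.0), ("2024-B", 75.0)],
    )
    run_pipeline(conn)
    print(dict(conn.execute("SELECT pool_id, total FROM pool_balances")))
```

In a production pipeline of the sort the posting describes, the extract and load steps would target cloud data stores and the transform logic would be far richer; the shape of the work, however, follows this extract, transform, load pattern.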
JOB DUTIES
Work with internal business partners to identify, capture, collect, and format data from external sources, internal systems, and the data warehouse to extract features of interest
Contribute to evaluation, research, and experimentation efforts with batch and streaming data engineering technologies in a lab environment to keep pace with industry innovation
Work with data engineering related groups to inform on and showcase capabilities of emerging technologies and to enable the adoption of these new technologies and associated techniques
Coordinate with Privacy Compliance to ensure proper data collection and handling
Create and implement business rules and functional enhancements for data schemas and processes
Perform data load monitoring and resolution
Work with internal business clients to resolve data availability and activation issues