Pleasanton, California, USA
Senior Staff Software Developer - FinOps Cloud Development Platform

Company Description

It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today — ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.

Job Description

Team

Join the Global Cloud Services organization's FinOps Tools team, which is building ServiceNow's next-generation analytics and financial governance platform. Our team owns the full modern data stack: Trino for distributed queries, dbt for transformations, Iceberg for lakehouse architecture, Lightdash for business intelligence, and Argo Workflows for orchestration. You will be the founding engineer dedicated to building the Cloud Development Platform that empowers our 30+ data practitioners (data scientists, analysts, and FinOps engineers) to collaborate and productionize analytics at scale.

Role

We are building a cloud-native data development platform that bridges the gap between exploratory analysis and production-grade workflows. As our founding Staff Software Developer focused on Cloud Development Infrastructure, you will design, architect, and rapidly implement a platform built on VS Code, Coder, and Jupyter that seamlessly integrates with our existing data stack (Trino, dbt, Iceberg, Lightdash, Argo Workflows).

You will establish opinionated, automated pathways from notebook experimentation to production pipelines, moving at startup speed within an enterprise environment. This role demands aggressive execution: working prototype in 3 months, production-ready platform in 6 months.

This is a unique opportunity to build from the ground up and define how data development happens at ServiceNow's scale.

What you get to do in this role:

 

- Design and develop scalable, maintainable, and reusable software components with a strong emphasis on performance and reliability.
- Collaborate with product managers to translate requirements into well-architected solutions, owning features from design through delivery.
- Build intuitive and extensible user experiences using modern UI frameworks, ensuring flexibility for customer-specific needs.
- Contribute to the design and implementation of new products and features while enhancing existing product capabilities.
- Integrate automated testing into development workflows to ensure consistent quality across releases.
- Participate in design and code reviews, ensuring best practices in performance, maintainability, and testability.
- Develop comprehensive test strategies covering functional, regression, integration, and performance aspects.
- Promote a culture of engineering craftsmanship, knowledge-sharing, continuous learning, and thoughtful quality practices across the team.

Technical Leadership & Architecture

- Design and architect the foundational cloud development platform for notebook-based data workflows.
- Lead technical decision-making on workspace provisioning, developer experience, and productionization pathways.
- Establish best practices for notebook-to-production workflows, including git integration, parameterization, validation, and automated deployment.
- Drive innovation in data development platforms, leveraging AI/ML tools for enhanced developer productivity.
- Move fast: deliver a working MVP in 3 months and a production-scale system in 6 months.

Hands-On Development

- Build and customize cloud workspace infrastructure using Coder (open source) on Kubernetes.
- Develop VS Code extensions (TypeScript) for productionization workflows: notebook validation, parameterization, and Argo Workflow generation.
- Implement opinionated notebook templates and validation rules for production-ready data pipelines.
- Create seamless integrations between notebooks and ServiceNow's data stack: Trino queries, Iceberg table outputs, Lightdash previews, and dbt transformations.
- Build backend services (Python) for workflow orchestration, notebook parsing, and metadata management.
- Deploy JupyterHub initially, then progressively replace components with custom platform features based on user feedback.
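To make the "notebook parsing" responsibility above concrete: since .ipynb files are plain JSON, a backend validation service can enforce productionization rules with the standard library alone. The two rules below (require a Papermill-style "parameters"-tagged cell; reject empty code cells) are hypothetical examples of such rules, not the team's actual policy.

```python
import json

def validate_notebook(nb_json: str) -> list[str]:
    """Return validation problems for a notebook's JSON source.

    Illustrative rules a notebook-parsing backend might enforce
    before allowing productionization:
      - the notebook must contain a cell tagged "parameters"
        (the convention Papermill uses for parameter injection)
      - code cells must not be empty
    """
    nb = json.loads(nb_json)
    cells = nb.get("cells", [])
    problems = []
    if not any("parameters" in c.get("metadata", {}).get("tags", [])
               for c in cells):
        problems.append("no cell tagged 'parameters'")
    for i, cell in enumerate(cells):
        source = "".join(cell.get("source", []))
        if cell.get("cell_type") == "code" and not source.strip():
            problems.append(f"cell {i} is an empty code cell")
    return problems

# Tiny in-memory notebook: one parameters cell, one empty code cell.
demo = json.dumps({
    "cells": [
        {"cell_type": "code", "metadata": {"tags": ["parameters"]},
         "source": ["run_date = '2024-01-01'\n"]},
        {"cell_type": "code", "metadata": {}, "source": []},
    ],
    "nbformat": 4, "nbformat_minor": 5,
})
print(validate_notebook(demo))  # → ["cell 1 is an empty code cell"]
```

In a real service these checks would sit behind an API and run as a pre-merge gate; the point of the sketch is that notebook validation is ordinary JSON inspection.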

Platform Foundation

- Design container images with embedded security policies, pre-configured data access to Trino/Iceberg tables, and optimized dependencies.
- Implement git-native workflows with automated notebook versioning, code review integration, and CI/CD pipelines.
- Build observability and monitoring for workspace health, user activity, and pipeline success rates.
- Establish an infrastructure foundation that scales from 5 early adopters to 30+ practitioners within the first year.

Developer Experience & Automation

- Create "template-based" notebook workflows with opinionated structure: parameterization (Papermill-style), Iceberg table outputs, and validation checkpoints.
- Build CLI and UI tooling for one-click productionization: notebook → Argo Workflow with minimal manual intervention.
- Establish developer guardrails: credential management, data access policies, and resource quotas.
- Collaborate closely with early-adopter data scientists to rapidly iterate on workflows and validate usability.
- Prioritize platform stability and clear productionization paths over feature breadth in the first 6 months.
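The notebook → Argo Workflow productionization step above can be sketched as a pure function from a notebook path and parameters to a Workflow manifest. The manifest shape follows Argo's Workflow CRD; the runner image name is a placeholder, and this is an illustrative sketch, not the platform's actual generator.

```python
def notebook_to_argo_workflow(notebook_path: str, parameters: dict) -> dict:
    """Render a minimal Argo Workflow manifest that executes a notebook
    with papermill (`papermill in.ipynb out.ipynb -p key value ...`)."""
    papermill_args = [notebook_path, "/tmp/out.ipynb"]
    for key, value in parameters.items():
        papermill_args += ["-p", key, str(value)]
    return {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Workflow",
        "metadata": {"generateName": "notebook-run-"},
        "spec": {
            "entrypoint": "run-notebook",
            "templates": [{
                "name": "run-notebook",
                "container": {
                    # Placeholder image with papermill and the data-stack
                    # clients baked in; a real platform would pin a digest.
                    "image": "example.com/finops/papermill-runner:latest",
                    "command": ["papermill"],
                    "args": papermill_args,
                },
            }],
        },
    }

wf = notebook_to_argo_workflow("reports/cost_summary.ipynb",
                               {"run_date": "2024-01-01"})
print(wf["spec"]["templates"][0]["container"]["args"])
# → ['reports/cost_summary.ipynb', '/tmp/out.ipynb', '-p', 'run_date', '2024-01-01']
```

A CLI or VS Code command would serialize this dict to YAML and submit it to the Argo server; the "one-click" experience is largely about generating manifests like this so users never hand-write them.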

AI-Driven Development

- Leverage cutting-edge AI development tools (e.g., Cursor, Windsurf, ChatGPT, GitHub Copilot) to accelerate development velocity.
- Establish AI-augmented development practices and mentor future team members on effective AI tool utilization.
- Drive innovation in AI-assisted code generation, testing, and platform optimization.

Collaboration & Integration

- Work autonomously with guidance from Engineering and FinOps leadership.
- Collaborate with the DevOps team on Kubernetes infrastructure, CI/CD pipelines, and security policies.
- Partner with FinOps Tools team members working on Trino, dbt, Lightdash, and Iceberg to ensure seamless integrations.
- Contribute to open-source projects in the notebook and developer tooling ecosystem.

Qualifications

To be successful in this role you have:

- Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
- 12+ years of software engineering experience with a Bachelor's degree in Computer Science, Engineering, or a related technical field (or 8 years with a Master's degree, 5 years with a PhD, or equivalent experience), with a track record of delivering high-quality products and deep expertise in full-stack development and cloud-native architecture.
- Strong Python skills for backend services, API development, and data tooling (notebook parsing, workflow generation).
- Proven track record of rapid execution in greenfield environments with evolving requirements.
- Hands-on experience building and scaling developer platforms or internal tools at enterprise scale.
- Deep understanding of cloud development environments (Coder, GitHub Codespaces, Gitpod, or similar).
- Strong Kubernetes and containerization expertise for cloud-native application deployment.
- Experience with data workflows and tooling: Jupyter notebooks, orchestration systems (Airflow/Argo), and data catalogs.
- Full professional proficiency in English.
- Proficiency in Python, Java, or similar object-oriented languages.
- Experience with modern front-end frameworks such as Angular, React, or Vue.
- Strong knowledge of data structures, algorithms, object-oriented design, design patterns, and performance optimization.
- Familiarity with automated testing frameworks (e.g., JUnit, Selenium, TestNG) and integrating tests into CI/CD pipelines.
- Understanding of software quality principles, including reliability, observability, and production readiness.
- Ability to troubleshoot complex systems and optimize performance across the stack.
- Experience with AI-powered tools or workflows, including validation of datasets, model predictions, and inference consistency.
- Comfort with development tools such as IDEs, debuggers, profilers, source control, and Unix-based systems.

Technical Expertise

- VS Code ecosystem: Extension API, webview development, command palette, language servers, debugger protocols.
- Coder or similar platforms: workspace provisioning, remote development environments, infrastructure customization.
- Jupyter ecosystem: JupyterHub, Jupyter Server, Papermill, nbconvert, or similar notebook tooling.
- Kubernetes & containerization: pod management, custom resource definitions, Helm charts, image security.
- Infrastructure as Code: Terraform, Kubernetes operators, GitOps workflows.
- Git workflows: branching strategies, code review automation, CI/CD integration.
- Modern data stack: familiarity with Trino, dbt, Iceberg, Argo Workflows, or similar technologies.
- API design: RESTful services, authentication (OAuth/SAML), webhook integrations.

Platform Engineering & Developer Experience

- Proven track record building internal developer platforms or productivity tools from scratch.
- Experience designing opinionated workflows that balance flexibility with guardrails.
- Strong understanding of developer personas: data scientists, analysts, engineers.
- Ability to iterate rapidly with early adopters and incorporate feedback without over-engineering.
- Experience with workspace security: secrets management, network policies, image scanning.
- Comfort operating at startup velocity within enterprise constraints.

Leadership & Communication

- Proven ability to work autonomously and drive technical decisions in ambiguous, greenfield environments.
- Strong bias toward action: prototype quickly, gather feedback, iterate aggressively.
- Strong technical writing and documentation skills for developer-facing content.
- Excellent collaboration skills across engineering, DevOps, and data teams.
- Ability to establish technical foundations for new products with long-term vision while delivering short-term results.

Nice to Have

- Open-source contributions to the Jupyter ecosystem or developer tooling.
- Experience with Argo Workflows, Tekton, or Kubernetes-native CI/CD systems.
- Familiarity with data validation frameworks (Great Expectations, dbt tests, etc.).
- Experience with Apache Iceberg or lakehouse architectures.
- Conference speaking or technical blogging on developer platforms or data tooling.

Why Join Us

- Build and deliver high-impact software that powers digital experiences for millions of users.
- Collaborate in a culture that values craftsmanship, quality, and innovation.
- Work symbiotically with AI and automation tools that enhance engineering excellence and drive product reliability.
- Be part of a culture that encourages innovation, continuous learning, and shared success.

GCS-23

 

 

 

For positions in this location, we offer a base pay of $187,600 - $328,300, plus equity (when applicable), variable/incentive compensation and benefits. Sales positions generally offer a competitive On Target Earnings (OTE) incentive compensation structure. Please note that the base pay shown is a guideline, and individual total compensation will vary based on factors such as qualifications, skill level, competencies, and work location. We also offer health plans, including flexible spending accounts, a 401(k) Plan with company match, ESPP, matching donations, a flexible time away plan and family leave programs. Compensation is based on the geographic location in which the role is located and is subject to change based on work location.

Additional Information

Work Personas

We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work and their assigned work location. Learn more here. To determine eligibility for a work persona, ServiceNow may confirm the distance between your primary residence and the closest ServiceNow office using a third-party service.

Equal Opportunity Employer

ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements. 

Accommodations

We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact [email protected] for assistance. 

Export Control Regulations

For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities. 

From Fortune. ©2025 Fortune Media IP Limited. All rights reserved. Used under license. 
