Senior Data Engineer
Location: Bucharest, Romania

The people we all rely on to make the world go round – they rely on Thales. Thales relies on its employees to invent the future: right here, right now.

Present in Romania for over 40 years, Thales is expanding its presence in the country by growing its Digital capabilities and by developing a Group Engineering Competence Centre (ECC). Operating from Bucharest, Thales delivers solutions in a number of core businesses, from ground transportation, space and defence, to security and aeronautics.
Several professional opportunities have arisen. If you are looking for the solidity of a global Group at the forefront of innovation, combined with the agility of a human-scale structure that supports the personal development of its employees and offers opportunities to grow in an international environment, then this is the place for you!

Background:

We are seeking a passionate Senior Data Engineer to join our Engineering Project Dashboard team, which provides KPIs and metrics to monitor the engineering activities of projects' engineering work packages. Customers of the Engineering Dashboard digital services are spread all around the world, lead teams of varying sizes, and are looking for contextual information related to their projects.

Mission:

Our Data Engineer colleague will define and implement data transformations from a Data Lake dedicated to engineering, to be exploited through Looker Studio on GCP (Google Cloud Platform). The goal is to provide the Engineering Project Dashboard with KPIs and metrics to monitor the engineering activities of projects' engineering work packages.

Main responsibilities:

Design, build, and maintain scalable and reliable data pipelines that integrate various data sources

Develop ETL / ELT processes using GCP tools such as Cloud Data Fusion and Dataflow (Apache Beam); a minimal pipeline sketch follows this list

Collect and process data in a format suitable for the organization's needs.

Perform and integrate data quality checks to identify and correct errors or discrepancies.

Create and maintain documentation covering data flows and models, the transformations applied, and validation procedures.

Optimize performance and cost-efficiency of GCP data services

Ensure security and compliance best practices in data handling

Maintain clear and close collaboration with both the development team and the project stakeholders / key users.
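By way of illustration, the sketch below shows the general shape of such a pipeline in the Apache Beam Python SDK: read raw work-package records, apply a basic data quality filter, and write the result to BigQuery, where Looker Studio can consume it. All project, bucket, table, and field names are illustrative placeholders, not actual Thales systems.

# Minimal Apache Beam batch pipeline (Python SDK). Illustrative only:
# project, bucket, table, and field names below are assumed placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Parse one CSV line into a dict; the field layout is assumed for illustration.
    package_id, hours_spent, status = line.split(",")
    return {"package_id": package_id, "hours_spent": float(hours_spent), "status": status}

def is_valid(row):
    # Basic data quality check: drop rows with a missing ID or negative effort.
    return bool(row["package_id"]) and row["hours_spent"] >= 0

def run():
    options = PipelineOptions(
        runner="DataflowRunner",            # or "DirectRunner" for local testing
        project="my-gcp-project",           # placeholder project
        region="europe-west1",
        temp_location="gs://my-bucket/tmp", # placeholder bucket
    )
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadRaw" >> beam.io.ReadFromText(
               "gs://my-bucket/raw/work_packages.csv", skip_header_lines=1)
         | "Parse" >> beam.Map(parse_line)
         | "QualityFilter" >> beam.Filter(is_valid)
         | "WriteToBQ" >> beam.io.WriteToBigQuery(
               "my-gcp-project:engineering.work_package_kpis",
               schema="package_id:STRING,hours_spent:FLOAT,status:STRING",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))

if __name__ == "__main__":
    run()

A dashboard data source in Looker Studio would then point at the resulting BigQuery table; the same pattern extends to streaming sources by swapping the text source for a Pub/Sub read.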

Requirements:

Bachelor’s degree in Computer Science, Computer Engineering, or a relevant technical field.

5+ years of experience with cloud data platforms (e.g., AWS, Azure, GCP); GCP experience is highly desirable, with at least minimal exposure expected.

Strong experience with Dataflow (GCP) and Apache Beam.

Proficiency in Python (or similar languages) with solid software engineering fundamentals (testing, modularity, version control).

Hands-on experience with SQL and NoSQL data stores, such as PostgreSQL, Redshift, DynamoDB, or MongoDB

Good understanding of data warehousing and modern architectures (e.g., data lakehouse, data mesh)

Familiarity with DevOps/CI-CD practices, infrastructure-as-code (Terraform, CloudFormation), and containerization (Docker/Kubernetes)

Understanding of data quality, observability, lineage, and metadata management practices

Good communication skills and working relationships with stakeholders and team members

Able to give and receive constructive feedback; able to listen and share

Fluent English; French would be a plus

Agile mindset & practices

At Thales we provide CAREERS and not only jobs. With Thales employing 80,000 people in 68 countries, our mobility policy enables thousands of employees each year to develop their careers at home and abroad, in their existing areas of expertise or by branching out into new fields. Together we believe that embracing flexibility is a smarter way of working. Great journeys start here, apply now!