MAKE HISTORY WITH US!
At PMI, we’ve chosen to do something incredible. We’re totally transforming our business and building our future on smoke-free products that have the power to deliver a smoke-free future.
With huge change comes huge opportunity. So, wherever you join us, you’ll enjoy the freedom to dream up and deliver better, brighter solutions and the space to move your career forward in endlessly different directions.
JOIN US!
You are highly driven to innovate and create positive change, and, more importantly, you consistently deliver great results. As a data engineer, you will have the opportunity to be part of something different. This position offers exceptional opportunities to grow your technical and non-technical skills. If you are selected, you can make a real difference to our business by inventing, enhancing, and building new systems, delivering results, and working on exciting and challenging projects.
WHO ARE WE LOOKING FOR?
· 3–5 years of experience in data engineering or a related field.
· Strong experience with AWS, particularly S3 for managing and organizing large-scale data storage.
· Knowledge of Terraform.
· Familiarity with IAM roles, bucket policies, and data lifecycle management in AWS.
· Strong knowledge of dbt.
· Proven experience designing and maintaining ETL/ELT pipelines for structured and semi-structured data.
· Strong Python programming skills for data processing, automation, and pipeline orchestration.
· Ability to build scalable, modular, and maintainable data workflows with robust error handling and logging.
· Advanced proficiency in Snowflake, including:
   · Writing efficient Snowflake SQL for data transformation and analytics.
   · Understanding Snowflake’s architecture, performance tuning, and cost optimization.
   · Experience with role-based access control and data sharing.
· Experience using Bitbucket (or similar tools like GitHub/GitLab) for source control and collaboration.
· Familiarity with CI/CD practices in data engineering, including automated testing and deployment of data pipelines.
· Proficiency in Agile methodologies and experience working with project management tools like Jira.
· Experience with Matillion is a strong plus.
WHAT DO WE OFFER YOU?
· Private medical care, life insurance
· Subsidized meals
· Office or hybrid working model with flexible working arrangements
· Employee pension plan
· Multisport program
· Cafeteria benefits program with various perks to choose from
· Free bike and car parking for all employees
· In this position you will earn no less than 17 108 PLN gross monthly
HOW CAN YOU MAKE HISTORY WITH US?
· Create, Test, and Deploy Pipelines: Design, develop, and maintain data pipelines for efficient data ingestion, processing, and delivery.
· Work with Vendors: Collaborate with external vendors to integrate new data sources into the data ecosystem, ensuring seamless data flow and integration.
· Optimize and Maintain the Warehouse: Continuously improve the performance and scalability of the data warehouse, implementing optimization techniques and best practices.
· Ensure Data Completeness and Quality: Implement mechanisms to ensure data completeness, accuracy, and consistency across all data sources and processes.
· Adapt to Changing Requirements: Work effectively in an environment where priorities shift frequently, adjusting plans with agility while maintaining delivery quality.
· Be Proactive: Actively anticipate and resolve potential data issues before they escalate, taking a proactive approach to data management.
· Analyze Data: Analyze and interpret large datasets to derive meaningful insights and provide actionable recommendations to stakeholders, supporting data-driven decision-making.
Please note that only online applications will be taken into consideration. Only selected candidates will be contacted.