
Inetum

Regular Data Engineer


Published 16 Feb 2026
Warsaw, Poland
Full Time


Role Highlights

Languages used

Python
SQL

Key skills

CI/CD
Data Processing
Data Quality
Transformation
Infrastructure
ELT
ETL
Datasets
Storage
Data Engineer

Tools, Libraries and Frameworks

SAP
Salesforce
Airflow
Git
Apache Spark
Hadoop

Description

Company Description

Inetum Polska is part of the global Inetum Group and plays a key role in driving the digital transformation of businesses and public institutions. Operating in cities such as Warsaw, Poznan, Katowice, Lublin, Rzeszow, and Lodz, the company offers a wide range of IT services. Inetum Polska actively supports employee development by fully funding training, certifications, and participation in technology conferences. Additionally, the company is involved in local social initiatives, such as charitable projects and promoting an active lifestyle. It prides itself on fostering a diverse and inclusive work environment, ensuring equal opportunities for all.

Globally, Inetum operates in 19 countries and employs over 28,000 professionals. The company focuses on four key areas:

Consulting (Inetum Consulting): strategic advisory services that help organizations define and implement innovative solutions.
Infrastructure and Application Management (Inetum Technologies): designing and managing IT systems tailored to clients' individual needs.
Software Implementation (Inetum Solutions): deploying partner solutions from industry leaders like Microsoft, SAP, Salesforce, and ServiceNow.
Custom Software Development (Inetum Software): creating unique software solutions to meet specific client needs.

With strategic partnerships with major technology giants, including Microsoft, SAP, Salesforce, and ServiceNow, Inetum delivers advanced technological solutions tailored to customer requirements. In 2023, Inetum reported revenues of 2.5 billion euros, underscoring its strong position in the digital services market. Inetum distinguishes itself by offering a comprehensive range of benefits that meet the diverse needs of employees, providing flexibility, support, and commitment.
Here's what makes working at Inetum unique:

Flexible and hybrid work:
Flexible working hours.
Hybrid work model, allowing employees to divide their time between home and modern offices in key Polish cities.

Attractive financial benefits:
A cafeteria system that allows employees to personalize benefits by choosing from a variety of options.
Generous referral bonuses, offering up to PLN 6,000 for referring specialists.
Additional revenue-sharing opportunities for initiating partnerships with new clients.

Professional development and team support:
Ongoing guidance from a dedicated Team Manager for each employee.
Tailored technical mentoring from an assigned technical leader, depending on individual expertise and project needs.

Community and well-being:
Dedicated team-building budget for online and on-site team events.
Opportunities to participate in charitable initiatives and local sports programs.
A supportive and inclusive work culture with an emphasis on diversity and mutual respect.

Job Description

Key Responsibilities:
Design, develop, and implement efficient ELT/ETL processes for large datasets.
Build and optimize data processing workflows using Apache Spark.
Utilize Python for data manipulation, transformation, and analysis.
Develop and manage data pipelines using Apache Airflow.
Write and optimize SQL queries for data extraction, transformation, and loading.
Collaborate with data scientists, analysts, and other engineers to understand data requirements and deliver effective solutions.
Work within an on-premise computing environment for data processing and storage.
Ensure data quality, integrity, and performance throughout the data lifecycle.
Participate in the implementation and maintenance of CI/CD pipelines for data processes.
Utilize Git for version control and collaborative development.
Troubleshoot and resolve issues related to data pipelines and infrastructure.
Contribute to the documentation of data processes and systems.
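To give a flavor of the responsibilities above, here is a minimal, hypothetical sketch of an ELT step with a data-quality gate. The posting publishes no code, so this is written in plain Python as a stand-in for the Spark/Airflow stack it names; all function and field names (extract, transform, quality_check, load, name, amount) are illustrative assumptions.

```python
# Minimal, hypothetical ELT sketch with a data-quality gate,
# written in plain Python as a stand-in for the Spark-based workflow.
from typing import Iterable


def extract(raw_rows: Iterable[dict]) -> list[dict]:
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(raw_rows)


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize names and cast the amount field."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
    ]


def quality_check(rows: list[dict]) -> list[dict]:
    """Data-quality gate: reject the batch if any amount is negative."""
    bad = [r for r in rows if r["amount"] < 0]
    if bad:
        raise ValueError(f"{len(bad)} record(s) failed quality checks")
    return rows


def load(rows: list[dict], target: list) -> int:
    """Load: write validated records to the target store, return row count."""
    target.extend(rows)
    return len(rows)


warehouse: list[dict] = []
raw = [{"name": "  ada lovelace ", "amount": "10.5"},
       {"name": "alan turing", "amount": "3"}]
loaded = load(quality_check(transform(extract(raw))), warehouse)
print(loaded)  # 2
```

In a real Spark deployment each step would operate on DataFrames rather than lists, and an orchestrator such as Airflow would schedule the stages as separate tasks.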
Qualifications

Minimum 2 years of professional experience as a programmer working with large datasets.
Experience in at least one project involving the processing of large datasets.
Experience in at least one project programming with Python.
Experience in at least one project within an on-premise computing environment.
Proven experience programming with Apache Spark.
Proven experience programming with Python.
Proven experience programming with Apache Airflow.
Proven experience programming with SQL.
Familiarity with Hadoop concepts.
Proven experience in programming ELT/ETL processes.
Understanding of CI/CD principles and practices.
Proficiency in using a version control system (Git).
Strong self-organization skills and a goal-oriented approach.
Excellent interpersonal and organizational skills, including planning.
Strong communication, creativity, independence, professionalism, stress resistance, and inquisitiveness.
Adaptability and flexibility, with an openness to continuous learning and development.

Additional Information

Work is conducted in a hybrid system: twice a month from one of our offices in Warsaw, Katowice, Poznan, Rzeszow, Lodz, or Lublin. We hereby inform you that Inetum Polska sp. z o.o. has implemented an internal reporting (whistleblowing) procedure. The content of the procedure and the possibility to submit an internal report are available at:

Required Qualifications and Skills

The role requires a minimum of 2 years of professional experience as a programmer working with large datasets, including experience in at least one project involving large dataset processing and one project programming with Python. Proven experience is needed in programming with Apache Spark, Python, Apache Airflow, and SQL. Familiarity with Hadoop concepts and proven experience in programming ELT/ETL processes are also required, along with proficiency in a version control system (Git) and an understanding of CI/CD principles. Strong self-organization, a goal-oriented approach, and excellent interpersonal and organizational skills, including planning, are essential. The candidate should also possess strong communication, creativity, independence, professionalism, stress resistance, adaptability, flexibility, and an openness to continuous learning.
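The SQL-centric ELT pattern the qualifications refer to (load raw data first, then transform inside the database) can be sketched with Python's built-in sqlite3 module. This is an illustrative assumption, not the employer's actual schema; the table and column names (staging_orders, orders, id, amount) are hypothetical.

```python
# Hypothetical ELT sketch in SQL via sqlite3: land raw data in a staging
# table, then transform (dedupe + cast) inside the database itself.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Load: insert raw records untouched into a staging table.
cur.execute("CREATE TABLE staging_orders (id INTEGER, amount TEXT)")
cur.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                [(1, "10.50"), (2, "3.00"), (2, "3.00")])  # note the duplicate

# Transform: deduplicate and cast text amounts to numbers in SQL.
cur.execute("""
    CREATE TABLE orders AS
    SELECT DISTINCT id, CAST(amount AS REAL) AS amount
    FROM staging_orders
""")

total = cur.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 13.5 -- the duplicate row was dropped before summing
conn.close()
```

In a production warehouse the same load-then-transform split would typically be expressed as staged SQL models scheduled by an orchestrator, with the quality and deduplication rules versioned in Git.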

Disclaimer

Disclaimer: Job and company description information and some of the data fields may have been generated via GPT-4 summarisation and could contain inaccuracies. The full external job listing link should always be relied on for authoritative information.

About the company

Inetum

Size

19844

Founded

HQ

St.-Ouen, FR

Description

Inetum is a European leader in digital services. Inetum's team of 28,000 consultants and specialists strive every day to make a digital impact for businesses, public sector entities, and society. Inetum's solutions aim at contributing to its clients' performance and innovation as well as the common good. Present in 19 countries with a dense network of sites, Inetum partners with major software publishers to meet the challenges of digital transformation with proximity and flexibility. Driven by its ambition for growth and scale, Inetum generated sales of 2.5 billion euros in 2023. Inetum was named Top Employer Europe 2024.

