… of programming languages such as Python, C++, or Java ● Experience with version control and code review tools such as Git ● Knowledge of current data pipeline orchestration tools such as Airflow ● Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools (e.g., Docker, Terraform, CloudFormation) ● Familiarity with data quality, data governance, and observability tools (e.g., Great …
Coventry, West Midlands, England, United Kingdom Hybrid / WFH Options
Lorien
… executing tests. Requirements: Strong experience as a Data Engineer (migrating legacy systems onto AWS, building data pipelines); strong Python experience; required tech stack: AWS Glue, Redshift, Lambda, PySpark, Airflow; SSIS or SAS experience (desirable). Benefits: Salary up to £57,500 + up to 20% bonus; hybrid working: 1 to 2 days a week in the office; 28 days …
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
… execution. Ability to work under pressure and manage competing priorities. Desirable qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS); knowledge of scheduling tools such as Airflow; exposure to SAS Migration Factory delivery models. The successful candidate will also need to be SC vetted. All profiles will be reviewed against the required skills and experience. Due …
Staffordshire, England, United Kingdom Hybrid / WFH Options
MSA Data Analytics Ltd
… and strengthen the organisation's data engineering and analytics capability within its AWS-based environment. We're ideally looking for someone with strong hands-on experience across AWS services, Airflow, Python, and SQL. You'll play a key role in designing, building, and maintaining the modern data infrastructure that powers insight-led decision-making across the business. Working within a … and key stakeholders to deliver practical, scalable solutions that make a real impact. Key Responsibilities: Design, build, and maintain robust, scalable ETL/ELT pipelines using tools such as Airflow and AWS services (S3, Redshift, Glue, Lambda, Athena); integrate new data sources and continuously optimise performance and cost efficiency; ensure data quality, integrity, and security across all systems; … with new tools and trends in data engineering, particularly within the AWS ecosystem. Skills & Experience: Strong hands-on experience with AWS (S3, Redshift, Glue, Lambda, Athena); skilled in Airflow for workflow orchestration; advanced SQL and proficient in Python for data engineering; experience with data modelling (e.g. dimensional) and familiarity with NoSQL databases (e.g. Elasticsearch); confident using Git …