Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
principles. Experience working with cloud platforms such as AWS, Azure, or GCP. Exposure to modern data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: Up to …
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
Vivedia Ltd
/ELT pipelines, data modeling, and data warehousing. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent …
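The ELT tooling named in these listings (Airflow, dbt, Prefect) largely automates one core pattern: land raw data in the warehouse, then rebuild modelled tables from it idempotently. A minimal, hypothetical sketch of that pattern — table names and values are invented, and Python's built-in sqlite3 stands in for a real warehouse:

```python
import sqlite3

# Stand-in "warehouse": an in-memory SQLite database.
conn = sqlite3.connect(":memory:")

# Extract/Load: land raw records as-is (the "EL" of ELT).
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL, country TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 10.0, "UK"), (2, 24.5, "UK"), (3, 5.0, "FR")],
)

# Transform: build a modelled table inside the warehouse, dropping and
# recreating it on each run so the step is idempotent (the pattern dbt
# automates with a `table` materialization).
conn.execute("DROP TABLE IF EXISTS orders_by_country")
conn.execute(
    """CREATE TABLE orders_by_country AS
       SELECT country, COUNT(*) AS n_orders, SUM(amount) AS revenue
       FROM raw_orders GROUP BY country"""
)

rows = conn.execute(
    "SELECT country, n_orders, revenue FROM orders_by_country ORDER BY country"
).fetchall()
print(rows)  # [('FR', 1, 5.0), ('UK', 2, 34.5)]
```

In an orchestrator such as Airflow, the load and transform steps would become separate tasks with the transform downstream of the load.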
Leeds, England, United Kingdom Hybrid/Remote Options
KPMG UK
having resided in the UK for at least the past 5 years and being a UK national or dual UK national. Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Experience with the design, build and maintenance of data pipelines and infrastructure …
Leeds, England, United Kingdom Hybrid/Remote Options
Fruition Group
Airflow, and Git for version control. Excellent collaboration and communication skills, with strong attention to detail and data quality. Desirable: Exposure to AI/ML data preparation, Python or Spark, Data Vault 2.0, data governance, GDPR, and experience working in mobility, logistics, financial services, or tech-enabled environments. What's in it for me? Hybrid and remote working flexibility. …
requirements. In order to secure one of these Senior Data Engineer roles you must be able to demonstrate the following experience: Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Experience with the design, build and maintenance of data pipelines and infrastructure …
able to work across the full data cycle. Proven experience working with AWS data technologies (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation), GitHub, and CI/CD. Coding experience in Apache Spark, Iceberg or Python (Pandas). Experience in change and release management. Experience in data warehouse design and data modelling. Experience managing data migration projects. Cloud data platform development … the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB. Good experience with dbt, Apache Iceberg, Docker, Microsoft BI stack (nice to have). Experience in data warehouse design (Kimball and lakehouse, medallion and data vault) is a definite preference, as is knowledge of … other data tools and programming languages such as Python & Spark, and strong SQL experience. Experience in building data lakes and building CI/CD data pipelines. A candidate is expected to understand and be able to demonstrate experience across the delivery lifecycle, and to understand both Agile and Waterfall methods and when to apply them. Experience: This position requires several years of …
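The warehouse-design styles this listing names (Kimball dimensional modelling, lakehouse/medallion, data vault) differ in detail, but Kimball's core idea is separating numeric facts from descriptive dimensions. A minimal, hypothetical star-schema sketch — all table and column names are invented for illustration, with sqlite3 as a stand-in warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes keyed by a surrogate key.
conn.execute(
    "CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT)"
)
conn.executemany(
    "INSERT INTO dim_customer VALUES (?, ?, ?)",
    [(1, "Acme Ltd", "Yorkshire"), (2, "Globex", "London")],
)

# Fact table: measures plus foreign keys pointing at the dimensions.
conn.execute("CREATE TABLE fact_sales (customer_key INTEGER, qty INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?)",
    [(1, 2, 100.0), (1, 1, 50.0), (2, 5, 250.0)],
)

# Typical star-schema query: aggregate the facts, sliced by a dimension attribute.
sales_by_region = conn.execute(
    """SELECT d.region, SUM(f.amount) AS revenue
       FROM fact_sales f JOIN dim_customer d USING (customer_key)
       GROUP BY d.region ORDER BY d.region"""
).fetchall()
print(sales_by_region)  # [('London', 250.0), ('Yorkshire', 150.0)]
```

The same star shape scales up directly to Redshift or an Iceberg-backed lakehouse; only the storage engine changes.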
Leeds, England, United Kingdom Hybrid/Remote Options
Fruition Group
best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and ETL/ELT development in …
demonstrate the following experience: Commercial experience gained in a Data Engineering role on any major cloud platform (Azure, AWS or GCP). Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Some experience with the design, build and maintenance of data pipelines and …
and listed on the London Stock Exchange. With 3,000 employees and 32 offices in 12 countries we're a business with lots of opportunity for people with talent, spark and lots of ambition. If you want to build a great career with a company that prioritises strong values - such as integrity and courage - where our people always pull …
Data Engineer - Professional Services - £75,000 - Leeds (hybrid) A leading global professional services firm is seeking a Design Data Engineer to join its growing data and analytics team. This role offers the opportunity to play a pivotal part in shaping …
leeds, west yorkshire, yorkshire and the humber, united kingdom Hybrid/Remote Options
Axiom Software Solutions Limited
in Kafka. • Configuring, deploying, and maintaining Kafka clusters to ensure high availability and scalability. • Integrating Kafka with other data processing tools and platforms such as Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink and Beam. • Collaborating with cross-functional teams to understand data requirements and design solutions that meet business needs. • Implementing security measures to protect Kafka clusters … open-source Kafka experience • Spend a lot of time on disaster recovery aspects • Read up on and explain new features like KRaft • Kafka resiliency • Any real-time tech experience such as Spark Required Skills & Experience • Extensive experience with Apache Kafka and real-time architecture including event-driven frameworks. • Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry …
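The high-availability and resiliency duties above come down largely to replication settings on the brokers and topics. A hypothetical broker configuration sketch — the keys are standard Kafka broker properties, but the values are illustrative, not prescriptive:

```properties
# server.properties (per broker) — illustrative values only
# Each partition is copied to 3 brokers, so one broker can fail without data loss.
default.replication.factor=3
# Producers using acks=all wait for this many in-sync replicas before success.
min.insync.replicas=2
# Internal topics need the same protection as user topics.
offsets.topic.replication.factor=3
transaction.state.log.replication.factor=3
transaction.state.log.min.isr=2
# Never promote an out-of-sync replica to leader; prefer unavailability to data loss.
unclean.leader.election.enable=false
```

With 3 replicas and `min.insync.replicas=2`, the cluster tolerates the loss of one broker per partition while still accepting acknowledged writes.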