meetings. What You Need to Succeed Strong skills in Python and SQL Demonstrable hands-on experience in AWS cloud Data ingestion, both batch and streaming, and data transformations (Airflow, Glue, Lambda, Snowflake Data Loader, Fivetran, Spark, Hive, etc.). Apply agile thinking to your work, delivering in iterations that incrementally build on what went before. Excellent problem-solving … translate concepts into easily understood diagrams and visuals for technical and non-technical people alike. AWS cloud products (Lambda functions, Redshift, S3, Amazon MQ, Kinesis, EMR, RDS (Postgres)). Apache Airflow for orchestration. DBT for data transformations. Machine Learning for product insights and recommendations. Experience with microservices using technologies like Docker for local development. Apply engineering best practices More ❯
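The batch ingestion-and-transformation pattern this listing describes can be sketched in miniature without any of the named tools. The example below is a tool-agnostic illustration only, using Python's stdlib sqlite3 in place of a warehouse such as Snowflake or Redshift; the table, columns, and data are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract, standing in for a batch file landed in S3.
RAW_CSV = """order_id,customer,amount
1,acme,120.50
2,globex,75.00
3,acme,30.25
"""

def ingest_and_transform(raw_csv: str) -> list[tuple]:
    """Batch-load raw rows, then transform them into a business-ready summary."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL)")
    rows = [(int(r["order_id"]), r["customer"], float(r["amount"]))
            for r in csv.DictReader(io.StringIO(raw_csv))]
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)  # batch load step
    # Transformation step: aggregate raw orders per customer.
    return conn.execute(
        "SELECT customer, ROUND(SUM(amount), 2) FROM orders "
        "GROUP BY customer ORDER BY customer"
    ).fetchall()

print(ingest_and_transform(RAW_CSV))
```

In a production stack the load step would be a Glue job or Snowflake Data Loader run and the aggregation a DBT model; the shape of the work is the same.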
Sheffield, England, United Kingdom Hybrid / WFH Options
Autodesk
such as AWS, Azure, or GCP · Docker · Documenting code, architectures, and experiments · Linux systems and bash terminals Preferred Qualifications · Databases and/or data warehousing technologies, such as Apache Hive, Iceberg, etc. · Data transformation via SQL and DBT. · Orchestration platforms such as Apache Airflow, Argo Workflows, etc. · Data catalogs and metadata management tools More ❯
production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes … cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills: Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend More ❯
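The workflow-orchestration idea behind tools like Apache Airflow or Temporal reduces to one core mechanism: tasks form a directed acyclic graph and run only after their upstream dependencies complete. A minimal stdlib sketch, with hypothetical task names (not Airflow's actual API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run_pipeline(dag: dict[str, set[str]]) -> list[str]:
    """Execute tasks in dependency order, as an orchestrator's scheduler would."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # a real orchestrator dispatches a worker here
    return order

run_pipeline(dag)
```

Airflow adds scheduling, retries, and distributed workers on top, but its DAG model is exactly this dependency-ordered execution.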
Castleford, England, United Kingdom Hybrid / WFH Options
PTSG
or in a similar role. Strong proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, MongoDB). Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Luigi). Familiarity with cloud platforms and services and a particular knowledge of Google Cloud would be preferable. Proficiency in programming languages such as Python, Java, or More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
ingestion, transformation, and modelling pipelines. Strong SQL and Python skills – essential for building and validating test cases. Proven experience with Snowflake (or similar cloud data platforms), dbt, Fivetran, and Airflow. Knowledge of automation frameworks such as Cucumber, Gherkin, TestNG. Experience integrating test automation into large-scale delivery functions. Experience Proven track record in testing complex data solutions in More ❯
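The test cases this role builds are typically data-quality checks of the kind a dbt suite declares as `not_null` or `unique` tests. A minimal sketch of the same checks in plain SQL over stdlib sqlite3; the table, columns, and data are hypothetical:

```python
import sqlite3

# Hypothetical business-ready table under test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "a@example.com"), (2, "b@example.com"), (3, None)])

def not_null_failures(conn, table: str, column: str) -> int:
    """Count rows violating a NOT NULL expectation."""
    return conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]

def unique_failures(conn, table: str, column: str) -> int:
    """Count values that appear more than once in a column."""
    return conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]

print(not_null_failures(conn, "customers", "email"))  # one null email in the sample
print(unique_failures(conn, "customers", "id"))       # ids are unique in the sample
```

In a dbt project these become one-line schema tests; wiring them into CI is the "integrating test automation into delivery" part of the role.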
platform, ensuring scalability, reliability, and security. Drive modernisation by transitioning from legacy systems to a lean, scalable platform. Act as a lead expert for technologies such as AWS, DBT, Airflow, and Databricks. Establish best practices for data modelling, ingestion, storage, streaming, and APIs. Governance & Standards Ensure all technical decisions are well-justified, documented, and aligned with business needs. Lead … in data engineering and cloud engineering, including data ingestion, transformation, and storage. Significant hands-on experience with AWS and its data services. Expert-level skills in SQL, Python, DBT, Airflow and Redshift. Confidence in coding, scripting, configuring, versioning, debugging, testing, and deploying. Ability to guide and mentor others in technical best practices. A product mindset, focusing on user needs More ❯
Halifax, England, United Kingdom Hybrid / WFH Options
Alpha Associates Recruitment
DB) Python for data scripting Great problem-solving and communication skills Desirable: PHP or hybrid cloud/on-prem experience Exposure to AI/ML or orchestration tools (e.g., Airflow, dbt) What’s on Offer Hybrid working (flexibility after probation) Open, collaborative tech environment Equipment of your choice (Mac, Windows, Linux) Competitive salary & pension Free in-office breakfasts (twice More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
have experience with GCP, Data Products, Data Mesh, ETL, EDW Composer, BigQuery, DataProc, DBT, Spark, Hive, Kafka, Pub/Sub, Jira, GCP Dataflow, BigTable, Python, Scala, Java, .NET, ANSI SQL, Airflow, Shell Scripts, Control-M, Git, CI & CD, HDFS, Unix File System, RDBMS, Azure DevOps, Harness. More about the role: Define Data Product Definition, Architecture, and Roadmap. Lead the delivery More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
Energy Jobline
description: Proficiency in GCP, Data Products, Data Mesh, ETL, EDW Composer, BigQuery, DataProc, DBT, Spark, Hive, Kafka, Pub/Sub, Jira, GCP Dataflow, BigTable, Python, Scala, Java, .NET, ANSI SQL, Airflow, Shell Scripts, Control-M, Git, CI & CD, HDFS, Unix File System, RDBMS, Azure DevOps, Harness. More about the role: Able to define Data Product Definition, Architecture, and Roadmap. Define More ❯
common life sciences data acquisition software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes). ZONTAL is an More ❯
Sheffield, England, United Kingdom Hybrid / WFH Options
Goodlord
Please note before applying: this is an entry-level role and would be a great fit for someone who has recently completed a degree, bootcamp, or course in a related field such as Data Science, Statistics, or Computer Science. If More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
PEXA Group Limited
end data quality, from raw ingested data to business-ready datasets Optimise PySpark-based data transformation logic for performance and reliability Build scalable and maintainable pipelines in Databricks and Airflow Implement and uphold GDPR-compliant processes around PII data Collaborate with stakeholders to define what "business-ready" means, and confidently sign off datasets as fit for consumption Put testing … internal and external customers Skills & Experience Required Extensive hands-on experience with PySpark, including performance optimisation Deep working knowledge of Databricks (development, architecture, and operations) Proven experience working with Airflow for orchestration Proven track record in managing and securing PII data, with GDPR compliance in mind Experience in data governance processes; Alation experience preferred, but similar tools welcome Strong SQL More ❯
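A common building block of GDPR-compliant PII handling like this listing describes is deterministic pseudonymisation: hash identifying fields with a secret salt so records remain joinable without exposing raw values. A minimal stdlib sketch; the field names, salt, and record are hypothetical, and in PySpark the same transform would typically use the built-in `sha2` column function rather than a Python loop:

```python
import hashlib

# Assumption: in production the salt comes from a secrets manager, never source code.
SALT = "replace-with-secret-from-a-vault"

def pseudonymise(value: str) -> str:
    """Deterministically hash a PII value so equal inputs stay joinable."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()

record = {"user_id": "u123", "email": "jane@example.com", "amount": 42}
PII_FIELDS = {"user_id", "email"}

# Mask only the PII columns; non-PII measures pass through unchanged.
masked = {k: pseudonymise(v) if k in PII_FIELDS else v
          for k, v in record.items()}
print(masked["amount"])
```

Determinism is the design choice worth noting: it preserves joins and group-bys across datasets, at the cost that a leaked salt would allow dictionary attacks, which is why the salt must be vaulted.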
teams to embed data-driven thinking across the organisation Ensure data governance, quality, and security best practices are in place Utilising the AWS tech stack including S3, Lambda, Redshift, Airflow Preferably some experience with Snowflake although this is not essential This is a hands-on leadership role - ideal for someone who enjoys both strategic thinking and technical delivery. Benefits More ❯
progress, identify risks, and help remove blockers to keep delivery on track. Your Qualifications: Strong familiarity with data integration, transformation, and orchestration workflows; experience with tools like Informatica, SnapLogic, Airflow, or Databricks is a plus. Experience with Anaplan, other planning applications, or multi-dimensional modeling tools is highly desirable. Working knowledge of API-driven architectures, event-driven design, and More ❯