Scala/Python/Java experience across most data and cloud technologies such as Hadoop, Hive, Spark, Pig, Sqoop, Flume, PySpark, Databricks, Cloudera, Airflow, Oozie, S3, Glue, Athena, Terraform, etc. Experience with schema design using semi-structured and structured data. Experience with messaging technologies - Kafka, Spark Streaming more »
architectures and CAP theorem. A good understanding of functional paradigms and type theory. Confident JVM knowledge. Modern Java, Ruby, or Clojure knowledge. Experience with Airflow or other Python-based workflow orchestration tools. Proficiency in domain-driven design and domain modeling. Exposure to Kubernetes, Docker, Linux, Kafka, RabbitMQ, or Git. more »
South East London, London, United Kingdom Hybrid / WFH Options
Kennedy Pearce Consulting
objectives. Key Responsibilities: Build and maintain efficient data pipelines on Google Cloud Platform (GCP), ensuring scalability and reliability. Utilise tools such as Google BigQuery, Apache Spark, Apache Beam, Airflow, and Cloud Composer to manage and process large datasets. Collaborate with engineering, product, and data teams to create … hands-on experience in cloud platforms (experience with Google Cloud is a plus). Strong knowledge of data warehousing (e.g., Google BigQuery), data processing (Apache Spark, Beam), and pipeline orchestration (Airflow, Cloud Composer). Proficiency with SQL and NoSQL databases (e.g., Cloud Datastore, MongoDB), and storage systems more »
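To give a flavour of the GCP pipeline work described above, here is a minimal Apache Beam sketch that reads from and writes to BigQuery; the project, dataset, table names and query are illustrative placeholders rather than anything specified by the role.

```python
# Minimal Apache Beam sketch: read rows from BigQuery, filter them, and write
# a summary table back. All project/dataset/table names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    options = PipelineOptions(
        runner="DirectRunner",               # swap for "DataflowRunner" on GCP
        project="my-gcp-project",            # placeholder project id
        temp_location="gs://my-bucket/tmp",  # placeholder bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadOrders" >> beam.io.ReadFromBigQuery(
                query="SELECT order_id, amount FROM `my-gcp-project.sales.orders`",
                use_standard_sql=True,
            )
            | "KeepLargeOrders" >> beam.Filter(lambda row: row["amount"] > 100)
            | "WriteSummary" >> beam.io.WriteToBigQuery(
                "my-gcp-project:sales.large_orders",
                schema="order_id:STRING, amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```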
South West London, London, United Kingdom Hybrid / WFH Options
John Lewis & Partners
of some of the following languages and tools would be very helpful: familiarity with SQL, Terraform, DBT, the Snowflake Data Platform, Docker and Kubernetes, Apache Airflow, observability tools - especially those relevant to Data Platforms - and GitLab CI. Benefits of the John Lewis Partnership: You will enjoy 25% discount at more »
Pandas, Matplotlib, and Scikit-Learn). Experience with causal inference and decision-making under uncertainty. Extensive working experience with tools and frameworks such as Airflow, Docker, REST APIs (Flask, FastAPI), GitHub, Jenkins, and familiarity working in a CI/CD environment. Familiarity with reinforcement learning frameworks like OpenAI more »
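As a rough illustration of the REST API side of this stack, a FastAPI service wrapping a pre-trained scikit-learn model might look like the sketch below; the model file name and feature layout are assumptions made purely for the example.

```python
# Minimal FastAPI sketch: expose a scikit-learn model behind a REST endpoint.
# The model path and the flat feature-vector layout are illustrative assumptions.
from typing import List

import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.pkl")  # assumed pre-trained scikit-learn estimator

class Features(BaseModel):
    values: List[float]  # one flat feature vector

@app.post("/predict")
def predict(features: Features):
    x = np.asarray(features.values).reshape(1, -1)
    prediction = model.predict(x)
    return {"prediction": prediction.tolist()}

# Run locally with: uvicorn app:app --reload
```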
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
experience. Snowflake experience. Proficiency across an AWS tech stack. DevOps experience building and deploying using Terraform. Nice to Have: DBT, Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% Bonus, Up to 14% Pension Contribution, 29 Days Annual Leave + Bank Holidays, Free Company Shares. Interviews ongoing more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
understanding of data modeling and modern data environments like data warehouses or data lakes (dbt skills are ideal). Familiarity with orchestration tools (e.g., Apache Airflow, Dagster) and cloud platforms like AWS. Data-driven mindset with the ability to explain complex technical concepts clearly. Additional Information: We're a more »
Provide guidance and mentorship to junior team members. To be successful in this role you will have: extensive experience with Snowflake; experience with DBT, Airflow or Python; Cloud Data Engineering experience with AWS/Azure/GCP. This is a hybrid role based from the company's London office with more »
SR2 | Socially Responsible Recruitment | Certified B Corporation™
financial services company. Location: London (remote-first working). Salary: £45,000 - £50,000 (plus 10% bonus). Tech/Tools: Python, SQL, ELT, Airflow, Power BI, BigQuery, Snowflake, Databricks, Azure and/or AWS. Key responsibilities: As one of the Data Engineers, you will be responsible more »
Lambda, Kinesis, SQS/SNS, or similar services). Proficiency in Python and SQL. Experience with modern orchestration and data transformation tools, such as DBT, Airflow, or similar. Understanding of CI/CD and DevSecOps practices. Experience with real-time data processing and event-driven architectures. Understanding of data more »
routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of DBT, Airflow, Python, SQL and AWS is highly desirable. Strong background in developing, constructing, testing, and maintaining practical data architectures, and in driving improvements in data reliability, efficiency more »
Experience leading a Snowflake migration is a MUST. Strong knowledge of Snowflake, Fivetran, DBT, and orchestration tools (e.g., Apache Airflow, Prefect), plus ADO (Azure DevOps) and CI/CD standards/pipelines. Good to Have: Snowflake Certified Data Engineer or AWS Certified Data Engineer. more »
the UK and requiring no visa sponsorship. The Lead Data Engineer will: Build and maintain efficient ETL/ELT pipelines using tools such as Apache Airflow and PySpark. Develop database schemas, dimensional models (Kimball/Inmon), and support data normalisation for relational and NoSQL databases. Participate in the more »
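For a sense of how Apache Airflow and PySpark typically combine in an ETL pipeline of this kind, the following is a minimal scheduling sketch; the DAG id, schedule and script path are placeholders, not details from the listing.

```python
# Minimal Airflow sketch: schedule a nightly spark-submit of a PySpark ETL script.
# The DAG id, schedule and script path are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="spark_submit_orders_etl",
        # {{ ds }} passes the logical run date to the PySpark job
        bash_command="spark-submit --master yarn /opt/etl/orders_etl.py {{ ds }}",
    )
```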
systems are behaving consistently. We utilize a myriad of open-source technologies to build our systems, such as S3, Ceph, Celery, RabbitMQ, Kafka, Docker, Apache Airflow and many more. We'll trust you to: Own the performance and availability of our storage products, innovating to continuously mature our more »
in Python or Scala. An understanding of Big Data technologies such as Spark, messaging services like Kafka or RabbitMQ, and workflow management tools like Airflow. SQL & NoSQL expertise, ideally including Postgres, Redis, MongoDB, etc. Experience with AWS, and with tools like Docker & Kubernetes. As well as this you will more »
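As an indicative sketch of the Spark-plus-Kafka side of such a stack, a minimal PySpark Structured Streaming job consuming a Kafka topic could look like this; the broker address and topic name are placeholders, and the spark-sql-kafka connector package is assumed to be available on the Spark classpath.

```python
# Minimal PySpark Structured Streaming sketch: consume a Kafka topic and
# append the raw events to the console. Broker and topic are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_events_demo").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "events")                         # placeholder topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```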
cloud provider. Frontend development with React or another modern web framework. DevOps. Kubernetes. Infrastructure engineering with Terraform, Pulumi or similar. Data workload orchestration with Airflow or similar. Containerisation with Docker. Experience with SQL, as well as relational database design and administration. Experience in other tools not listed is also more »
in DataOps methodologies and tools with experience in implementing CI/CD pipelines and managing containerized applications. Proficiency in workflow orchestration tools such as Apache Airflow. Experience designing, building and maintaining Data Warehouses. Experience working in a collaborative environment with other functional experts (e.g. other engineering teams, product, design more »
platform stack: Python as our main programming language; Databricks as our data lake platform; Kubernetes for data services and task orchestration; Streamlit for data applications; Airflow purely for job scheduling and tracking; CircleCI for continuous deployment; Parquet and Delta file formats on S3 for data lake storage; Spark for more »
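By way of illustration of the Parquet/Delta-on-S3 pattern in a stack like this, here is a minimal PySpark sketch that reads raw Parquet and writes a partitioned Delta table; the bucket paths are placeholders, and the session config assumes the delta-spark package is installed.

```python
# Minimal sketch of the Parquet/Delta-on-S3 pattern: read raw Parquet, derive a
# partition column, and write a Delta table. Paths are placeholders, and the
# delta-spark package is assumed to be on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = (
    SparkSession.builder.appName("parquet_to_delta")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

raw = spark.read.parquet("s3a://example-bucket/raw/transactions/")  # placeholder path

cleaned = raw.withColumn("trade_date", to_date(col("timestamp")))

(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("trade_date")
    .save("s3a://example-bucket/curated/transactions/")  # placeholder path
)
```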
London, England, United Kingdom Hybrid / WFH Options
Harnham
warehouse, building efficient ETL pipelines in Python, and setting up a Snowflake data warehouse. Additionally, you'll be responsible for developing orchestration pipelines using Apache Airflow to automate and streamline data flows. This data infrastructure will empower the client to make more informed investment decisions, such as understanding more »
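To sketch what a Python load step into Snowflake might look like in such a pipeline, the example below stages a local CSV and copies it into a table using the Snowflake connector; the account, credentials, file path and object names are placeholders only.

```python
# Minimal sketch of a Python load step into Snowflake: stage a local CSV file
# on the table stage and COPY it in. All connection details and object names
# below are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # placeholder account identifier
    user="ETL_USER",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Upload the local file to the POSITIONS table stage
    cur.execute("PUT file:///tmp/positions.csv @%POSITIONS OVERWRITE = TRUE")
    # Load from the table stage into the table
    cur.execute(
        "COPY INTO POSITIONS "
        "FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"' SKIP_HEADER = 1)"
    )
finally:
    conn.close()
```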
create a system that can integrate vast and disparate datasets into one location, allowing Quant Researchers to access them easily. Stack: Python, AWS, Spark, Airflow. The team is partnering with tier-one Quant Traders and Researchers, and the role would allow you to push yourself technically and professionally. They need an more »