performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar … big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus.
with hands-on experience in ETL/ELT pipelines and data governance best practices. Proficiency in modern big data frameworks and tools such as Apache Spark, Apache Airflow, dbt, as well as familiarity with cloud-based data services (AWS, Azure, or GCP). Strong understanding of distributed … business analysts, application developers) to translate business requirements into scalable, high-performance data solutions. Design and implement robust data pipelines using tools like Spark, Airflow, and dbt, ensuring data quality, reliability, and availability for analytics and reporting. Oversee data architecture standards and governance practices, including data security, compliance, lineage …
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
Science, Math, or Financial Engineering degree Strong knowledge in other programming language(s) – e.g., JavaScript, Typescript, Kotlin Strong knowledge of data orchestration technologies – e.g., Apache Airflow, Dagster, AWS Step Functions Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing.
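For context on what "data orchestration technologies" like Airflow, Dagster, or Step Functions have in common: they run tasks in dependency order. A toy, library-free sketch of that idea, using only the Python standard library (the task names and `run` helper are illustrative, not any real tool's API):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical pipeline: each task lists the tasks it depends on.
tasks = {
    "extract": [],
    "validate": ["extract"],
    "transform": ["validate"],
    "load": ["transform"],
}

def run(tasks):
    """Execute tasks in dependency order, like a minimal orchestrator."""
    order = list(TopologicalSorter(tasks).static_order())
    for name in order:
        print(f"running {name}")  # a real tool would dispatch work here
    return order

order = run(tasks)
```

Real orchestrators add scheduling, retries, and distributed execution on top of this core ordering idea.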
of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log …
the big 3 cloud ML stacks (AWS, Azure, GCP). Hands-on experience with open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/Big Data technologies, such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka. Experience with workflow orchestration … tools like Apache Airflow. Experience with containerisation using Docker and deployment on Kubernetes. Experience with NoSQL and graph databases. Unix server administration and shell scripting experience. Experience in building scalable data pipelines for highly unstructured data. Experience in building DWH and data lake architectures. Experience in working in cross …
diagram of proposed tables to enable discussion. Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users. Worked on Apache Airflow before to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks …
Proficiency in version control tools like Git ensures effective collaboration and management of code and data models. Experience with workflow automation tools, such as Apache Airflow, is crucial for streamlining and orchestrating complex data processes. Skilled at integrating data from diverse sources, including APIs, databases, and third-party …
Intelligence, Statistical & Data Analysis, Computational Algorithms, Data Engineering, etc. Experience working with a variety of complex, large datasets. Experience building automated pipelines (e.g., Jenkins, Airflow, etc.). Experience building or understanding end-to-end, distributed, and high-performance software infrastructures. Proven ability to work collaboratively as part of a …
across the company. Role requirements 4+ years of experience You have an understanding of developing ETL pipelines using Python frameworks such as Luigi or Airflow; You have experience with the development of Python-based REST APIs/services and their integration with databases (e.g. Postgres); You are familiar with …
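To make the "ETL pipelines in Python" requirement above concrete, here is a minimal extract-transform-load pass using only the standard library. The data, table name, and column names are made up for the example; frameworks like Luigi or Airflow would wrap steps like these in scheduled, dependency-tracked tasks:

```python
import sqlite3

# Extract: hypothetical raw rows, as if read from an API or CSV file.
raw = [("alice", "42"), ("bob", "17"), ("carol", "n/a")]

def transform(rows):
    """Keep only rows whose value parses as an integer."""
    out = []
    for name, value in rows:
        try:
            out.append((name, int(value)))
        except ValueError:
            continue  # drop malformed records
    return out

# Load: write the cleaned rows into an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, score INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)", transform(raw))
count = conn.execute("SELECT COUNT(*) FROM scores").fetchone()[0]
```

Production pipelines replace each stage with real sources and sinks (Postgres, S3, a warehouse), but the extract/transform/load shape is the same.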
MPP databases such as Redshift, BigQuery, or Snowflake, and modern transformation/query engines like Spark, Flink, Trino. Familiarity with workflow management tools (e.g., Airflow) and/or dbt for transformations. Comprehensive understanding of modern data platforms, including data governance and observability. Experience with cloud platforms (AWS, GCP, Azure …
as Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning.
navigate client relationships and translate technical insights into business value. Experience with cloud platforms (e.g., Snowflake, AWS) and ETL/ELT pipeline tools like Airflow/dbt. Benefits £6,000 per annum training & conference budget to help you up-skill and elevate your career Pension contribution scheme (up to …
architectures and CAP theorem. A good understanding of functional paradigms and type theory. Confident JVM knowledge. Modern Java, Ruby, or Clojure knowledge. Experience with Airflow or other Python-based workflow orchestration tools. Exposure to Kubernetes, Docker, Linux, Kafka, RabbitMQ, or Git. Knowledge of financial concepts, exchange trading, or physical …
maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ years' data engineering experience Snowflake experience Proficiency across an AWS tech stack dbt Expertise Terraform Experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pension Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
working. Required Skills 8+ years of experience in data engineering, data architecture, or related roles, with technical expertise in AWS, Python, PySpark, Pandas, Snowflake, Airflow Strong domain experience, working with Quant groups or Portfolio Managers with large investment banks or in the Hedge Fund Industry Proven track record of …
Python, R, and Java. Experience scaling machine learning on data and compute grids. Proficiency with Kubernetes, Docker, Linux, and cloud computing. Experience with Dask, Airflow, and MLflow. MLOps, CI, Git, and Agile processes. Why you should not miss this career opportunity: We are a mission-driven firm …