data processing, data analysis and data visualisation; SQL and time-series databases; cloud: AWS services such as S3, EC2, RDS etc.; ETL tools such as Airflow; Git, CI/CD, testing tools, supporting documentation and best practices; best practice and tooling including TDD, BDD. Domain and soft skills more »
building APIs & SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker & Kubernetes for containerisation & orchestration. Understanding of Airflow & DAGs. Experience in building applications using Kafka. Solid OOP principles & design patterns. Permanent/Full-Time Employment. Hybrid working environment (2/3 days more »
how these and other technologies can be applied to business problems to generate value. We currently work in an AWS, Snowflake, Looker, Python and Airflow stack; you should be comfortable with these (or similar). The person we’re looking for: We are looking for a self-starter who more »
Azure SQL Data Warehouse, or Amazon Redshift. Support and learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow. Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of timing of data loads more »
quality and identify areas for improvement to implement practical solutions. Key Requirements Background in Python Development from an engineering or development environment Experience with Airflow, Cloud (AWS) and Pandas more »
data processing, analysis, and visualization libraries Experienced with SQL and Timeseries databases Skilled in AWS services: S3, EC2, RDS Knowledgeable in ETL tools like Airflow Proficient in Git, CI/CD, testing tools, and documentation best practices Adheres to quality engineering practices including TDD and BDD Nice to have more »
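Several of these listings cite TDD among their quality-engineering practices. As a minimal, hypothetical sketch of the practice (the function, data format, and spec below are invented for illustration, not taken from any listing), the test is written first to pin down the behaviour, then the implementation is made to satisfy it:

```python
# TDD-style sketch: test first, implementation second.
# resample_daily_mean and its spec are hypothetical examples.

def resample_daily_mean(readings):
    """Average time-series readings per day; readings are ("YYYY-MM-DDTHH:MM", value) pairs."""
    totals, counts = {}, {}
    for timestamp, value in readings:
        day = timestamp[:10]  # "YYYY-MM-DD" prefix identifies the day
        totals[day] = totals.get(day, 0.0) + value
        counts[day] = counts.get(day, 0) + 1
    return {day: totals[day] / counts[day] for day in totals}


def test_resample_daily_mean():
    # Written before the implementation, TDD-style: two readings on one
    # day average together; a lone reading passes through unchanged.
    readings = [
        ("2024-05-01T09:00", 10.0),
        ("2024-05-01T21:00", 20.0),
        ("2024-05-02T09:00", 5.0),
    ]
    assert resample_daily_mean(readings) == {"2024-05-01": 15.0, "2024-05-02": 5.0}


test_resample_daily_mean()
```

In practice the test would live in a separate file and run under pytest in the CI pipeline these listings mention.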
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional more »
in: Data Warehousing Data Engineering, overall Data Analytics Data Visualisation Proficiency in: Google Cloud (GCP) GCP BigQuery Python DBT or similar FastAPI or similar Airflow or similar Desirable: Google Apigee (as an application developer) Exposure to Machine Learning projects Exposure to DataOps Exposure to dataproc or similar more »
existing systems and ingestion pipelines. Requirements: Proven experience working with Python or Java or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation more »
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team Proven experience of working with customer data You've a strong commitment to accuracy more »
expertise in tools spanning: data warehousing, ETL, internal visualisation and analytics. Good examples are Snowflake, GCP, Azure Analytics, Sagemaker, Databricks, Tableau, PowerBI, Looker, Quicksight, Airflow, astronomer.io, Alteryx, Collibra. Hands-on experience and/or a detailed and deep understanding of the workflow and approaches of: BI analyst for visualisation more »
extensive experience having designed and scaled a Data Platform - Has strong Python skills - Has great SQL, preferably Snowflake - Has previous experience working with dbt & Airflow - Is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets - Cares deeply about the climate and ecosystems of more »
Experience using cloud technologies such as EMR, Lambda, EC2, and data pipelines. Experience leading data warehousing and analytics projects, including using technologies such as Airflow, Jenkins, Snowflake, and Kinesis. Experience with Agile, DevOps, and CICD frameworks in cloud-based environments. Exposure to at least one dashboarding tool like Tableau more »
few of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log more »
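Airflow, named as the workflow manager in this stack and throughout these listings, models a pipeline as a DAG of tasks. As a stdlib-only sketch of that idea (this is not Airflow's API, and the task names are invented), tasks declare their upstream dependencies and run in topological order:

```python
# Stdlib-only illustration of the DAG concept behind tools like Airflow:
# each task names its upstream dependencies, and tasks execute in an
# order where every dependency runs before its dependents.
from graphlib import TopologicalSorter


def run_pipeline(tasks, deps):
    """tasks: name -> callable taking the results dict; deps: name -> set of upstream names."""
    order = list(TopologicalSorter(deps).static_order())  # dependencies first
    results = {}
    for name in order:
        results[name] = tasks[name](results)
    return order, results


order, results = run_pipeline(
    tasks={
        "fetch": lambda r: [1, 2, 3],                      # pull raw data
        "clean": lambda r: [x * 2 for x in r["fetch"]],    # transform it
        "publish": lambda r: sum(r["clean"]),              # load/emit a result
    },
    deps={"fetch": set(), "clean": {"fetch"}, "publish": {"clean"}},
)
```

In Airflow itself the equivalent structure is declared with operators and `>>` dependencies inside a `DAG` definition, and the scheduler, not your code, decides when each task runs.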
Data-lake and Business Intelligence solutions. Experience as a data engineer: implementing data pipelines (using PySpark, Spark SQL, Scala, etc), orchestration tools/services (e.g. Airflow, Data Factory) and testing frameworks. 1 year plus of experience in a technical leadership role. Experience in one of the main cloud services (AWS, Google Cloud more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of more »
engineer who has hands-on experience in AWS to conduct end-to-end data analysis and data pipeline build-out using Python, Glue, S3, Airflow, DBT, Redshift, RDS, etc. Very solution-driven and highly collaborative, providing thought leadership and soliciting diverse opinions. Accountable for results. Experienced in leading a team of more »
DBT (Data Build Tool). Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an app developer). Familiarity with Airflow (as an app developer). Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. If you are passionate about data engineering more »
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack - Python, AWS, Airflow and DBT Must haves: A team player, happy to work with several teams, this is key as you will be reporting directly to the more »
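The ETL pattern this listing describes (extract, transform, load) can be sketched in plain Python. The CSV sample and the drop-malformed-rows rule below are hypothetical; a real pipeline in the stack named here would orchestrate steps like these with Airflow and push transforms into DBT:

```python
# Hedged ETL sketch with invented data: extract rows from CSV text,
# transform (type-cast, drop malformed rows), load into a dict "store".
import csv
import io

RAW = "id,amount\n1,10.5\n2,not_a_number\n3,4.0\n"  # hypothetical source extract


def extract(raw):
    """Parse CSV text into a list of string-valued row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))


def transform(rows):
    """Cast types and silently drop rows that fail to parse."""
    clean = []
    for row in rows:
        try:
            clean.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except ValueError:
            continue  # malformed row, e.g. non-numeric amount
    return clean


def load(rows, store):
    """Upsert rows into the target store, keyed by id."""
    for row in rows:
        store[row["id"]] = row["amount"]
    return store


store = load(transform(extract(RAW)), {})  # row 2 is dropped as malformed
```

The same three stages scale up directly: extract becomes an S3/Glue read, transform a DBT model, and load a Redshift write, with the orchestrator handling ordering and retries.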
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data heavy more »
Oxfordshire, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar Orchestration Tools) : Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar Reporting Tools) : Knowledge of Qlik Sense Cloud or similar reporting tools such more »
Manchester Area, United Kingdom Hybrid / WFH Options
Forsyth Barnes
data warehousing. Multiple years of experience with GCP, especially with core processing and orchestration products like BigQuery, DataFlow, DataFusion, DataStream, Cloud Functions, DataProc, and Airflow/Composer. Strong problem-solving skills and a meticulous approach to code reviews. Proven leadership qualities with the ability to uphold high standards within more »
Preston, Lancashire, United Kingdom Hybrid / WFH Options
Forsyth Barnes
warehousing. Multiple years of experience with GCP, especially with core processing and orchestration products like BigQuery, DataFlow, DataFusion, DataStream, Cloud Functions, DataProc, and Airflow/Composer. Strong problem-solving skills and a meticulous approach to code reviews. Proven leadership qualities with the ability to uphold high more »
London, England, United Kingdom Hybrid / WFH Options
Jobleads-UK
Continuous Delivery Continuous integration pipelines Strong Python AWS or Azure with large-scale streaming data (Pulsar, Kafka, Kinesis, etc) ETL management; structured or custom (Airflow, Luigi, etc) Bonus Robust experience managing and developing an engineering team Delta Lake or Iceberg Trino or Presto GraphQL Good salary, bonus, stock options more »