processing and analytics
Desired experience:
- Worked with Python 3.9+
- Familiar with Python test automation
- Experience with SQL and time-series databases
- Familiar with Parquet, Arrow, Airflow, Databricks
- Experience with AWS cloud services, such as S3, EC2, RDS, etc.
- Quality engineering best practices and tooling, including TDD and BDD
This is an …
data processing, data analysis and data visualisation
- SQL and time-series databases
- AWS cloud services, such as S3, EC2, RDS, etc.
- ETL tools, such as Airflow
- Git, CI/CD, testing tools, supporting documentation and best practices
- Best practice and tooling, including TDD and BDD
Domain and soft skills summary: office …
existing systems and ingestion pipelines.
Requirements:
- Proven experience working with Python, Java or C#
- Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive
- Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies
- A passion for automation …
how these and other technologies can be applied to business problems to generate value. We currently work in an AWS, Snowflake, Looker, Python and Airflow stack; you should be comfortable with these (or similar).
The person we’re looking for
We are looking for a self-starter who …
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks.
- Proven experience of leading a data science team
- Proven experience of working with customer data
- You have a strong commitment to accuracy …
extensive experience having designed and scaled a Data Platform
- Has strong Python skills
- Has great SQL, preferably Snowflake
- Has previous experience working with dbt & Airflow
- Is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets
- Cares deeply about the climate and ecosystems of …
Azure SQL Data Warehouse, or Amazon Redshift
- Support and learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow
- Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of the timing of data loads …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of …
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
Proven ability to leverage CI/CD tools to streamline data pipeline development and deployment. Experience designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks (familiarity is a plus). Understanding of data warehousing concepts and data modelling techniques. Experience with SQL …
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
to leverage CI/CD tools to streamline data pipeline development and deployment. Proven expertise in designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks. Strong understanding of data warehousing concepts and data modelling techniques. Experience with SQL and proficiency in writing …
data processing, analysis, and visualization libraries
- Experienced with SQL and time-series databases
- Skilled in AWS services: S3, EC2, RDS
- Knowledgeable in ETL tools like Airflow
- Proficient in Git, CI/CD, testing tools, and documentation best practices
- Adheres to quality engineering practices including TDD and BDD
Nice to have …
or warehouse.
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed.
Modern tech stack: Python, AWS, Airflow and dbt
Must-haves: A team player, happy to work with several teams; this is key, as you will be reporting directly to the …
building APIs & SQLAlchemy for database interactions.
- Strong experience in cloud-based development (AWS)
- Proficiency with both Docker & Kubernetes for containerisation & orchestration
- Understanding of Airflow & DAGs
- Experience in building applications using Kafka
- Solid OOP principles & design patterns
Permanent/Full-Time Employment. Hybrid working environment (2/3 days …
methodologies such as CI/CD, Application Resiliency, and Security
Preferred qualifications, capabilities, and skills:
- Skilled with Python or PySpark
- Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka)
- Experience with Big Data solutions or relational DBs
- Experience in the Financial Services industry is a bonus
in:
- Data Warehousing
- Data Engineering, overall
- Data Analytics
- Data Visualisation
Proficiency in:
- Google Cloud (GCP)
- GCP BigQuery
- Python
- dbt or similar
- FastAPI or similar
- Airflow or similar
Desirable:
- Google Apigee (as an application developer)
- Exposure to Machine Learning projects
- Exposure to DataOps
- Exposure to Dataproc or similar
Manchester Area, United Kingdom Hybrid / WFH Options
Forsyth Barnes
data warehousing. Multiple years of experience with GCP, especially with core processing and orchestration products like BigQuery, Dataflow, Data Fusion, Datastream, Cloud Functions, Dataproc, and Airflow/Composer. Strong problem-solving skills and a meticulous approach to code reviews. Proven leadership qualities with the ability to uphold high standards within …
Preston, Lancashire, United Kingdom Hybrid / WFH Options
Forsyth Barnes
warehousing.
- Multiple years of experience with GCP, especially with core processing and orchestration products like BigQuery, Dataflow, Data Fusion, Datastream, Cloud Functions, Dataproc, and Airflow/Composer.
- Strong problem-solving skills and a meticulous approach to code reviews.
- Proven leadership qualities with the ability to uphold high …
on refactoring or optimizing outdated approaches
- Good understanding of data security
You may have:
- Experience/knowledge of the care industry
- Experience with Python
- Airflow or equivalent orchestration tools
- Working experience with APIs and development
Please note that we are only able to accept applicants with the right to …
Mathematics, Finance, Accounting, Economics or a related field, or equivalent work experience (3+ years)
Experience in:
- Some knowledge of database orchestration technologies + ETL (Airflow, dbt, Databricks)
- Working understanding of financial concepts and systems
- Ability to recognize and diagnose potential errors or data inconsistencies between multiple reports
- Working knowledge …
expertise in tools spanning: data warehousing, ETL, internal visualisation and analytics. Good examples are Snowflake, GCP, Azure Analytics, SageMaker, Databricks, Tableau, Power BI, Looker, QuickSight, Airflow, astronomer.io, Alteryx, Collibra.
- Hands-on experience and/or a detailed and deep understanding of the workflow and approaches of:
  - a BI analyst for visualisation …
of SQL, with a vast amount of experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch)
Demonstrated experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks
Experience using programming languages (e.g. …
in dbt.
- Skills in Python/R.
- Experience with Fivetran, Prefect, Snowflake, and Periscope.
- Familiarity with writing ETL pipelines using SQL and Python, and orchestration tools like Airflow or Prefect.
- Background in experimentation.
- Experience in fast-paced, venture-backed startup environments.
At Fresha, we value passion and potential as much as specific skills. If you …
bonus
Components used in our Data Stack: Fivetran, Prefect, Snowflake, dbt and Periscope
- Experience in writing ETL pipelines with SQL and Python, using orchestration tools like Airflow or Prefect
- Experience working in fast-paced, venture-backed startup environments
At Fresha, we value passion and potential as much as specific skills. If you're …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics …