non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of dbt, Airflow and AWS is highly desirable. Strong background in developing, constructing, testing, and maintaining practical data architectures, and in driving improvements in data reliability, efficiency, and quality. A proven more »
of SQL with extensive experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch). Demonstrated experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks. Experience using programming languages (e.g. more »
bonus. Components used in our Data Stack: Fivetran, Prefect, Snowflake, dbt and Periscope. Experience in writing ETL pipelines with SQL and Python, using orchestration tools like Airflow or Prefect. Experience working in fast-paced, venture-backed startup environments. At Fresha, we value passion and potential as much as specific skills. If you're more »
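The "ETL pipelines with SQL and Python" requirement that recurs in these listings can be illustrated with a minimal, self-contained sketch. This uses Python's built-in sqlite3 purely as a stand-in for a warehouse such as Snowflake or BigQuery; the table and column names are invented for illustration, not taken from any listing:

```python
import sqlite3

def run_etl(conn):
    """Toy extract-transform-load: raw pence amounts in, clean GBP table out.

    sqlite3 stands in for a real warehouse; in production an orchestrator
    such as Airflow or Prefect would schedule and retry this step.
    """
    # Extract: land raw data (hypothetical orders feed).
    conn.execute("CREATE TABLE IF NOT EXISTS raw_orders (id INTEGER, amount_pence INTEGER)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                     [(1, 1250), (2, 399), (3, 10000)])
    # Transform + load: convert pence to pounds into a clean table.
    conn.execute("""
        CREATE TABLE orders AS
        SELECT id, amount_pence / 100.0 AS amount_gbp
        FROM raw_orders
    """)
    return conn.execute("SELECT id, amount_gbp FROM orders ORDER BY id").fetchall()

rows = run_etl(sqlite3.connect(":memory:"))
print(rows)  # [(1, 12.5), (2, 3.99), (3, 100.0)]
```

The shape is the same whatever the stack: a raw landing table, a SQL transform, and a clean output table, with the orchestrator only responsible for sequencing and scheduling.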
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics more »
Agile software development and system architecture within the Telco OSS domain, with preferred experience in Network GIS (Hexagon, IQ Geo) and workflow tooling (Appian, Apache Airflow). Strong understanding of platform and product dynamics, including Platform Engineering and its relevance to OSS. Extensive background in DevOps practices, encompassing more »
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure as code tools such as Terraform, Ansible, or CloudFormation. Experience in Apache Airflow, AWS Backup & S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins more »
Manchester, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
sees challenges as development opportunities, not problems. Desirable Skills: Experience of SAS Viya. Experience of SAS Visual Analytics. Experience of SQL Server. Experience with Apache Airflow. Experience using MS Dev Ops for workflow and CI/CD pipelines. Educated to degree standard more »
Python and DBT for data transformation. Experience in converting SAS-based modules to Python-based solutions. Familiarity with Snowflake for data management. Experience with Airflow or similar technologies is a plus. Desired: Experience with DBT and Snowflake is advantageous. more »
has multiple years of experience using Snowflake as a data tool and can hit the ground running. Experience: Snowflake. Financial services. Cloud (ideally Azure). Airflow would be an advantage more »
have Terraform experience. SQL & NoSQL experience. Have built out data warehouses & built data pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes & orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to £450 p/day (Outside IR35) and will more »
load data to Snowflake. Converting SAS-based modules to Python-based solutions. Candidates who have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies. Must have hands-on experience in DBT & SQL. Snowflake (added advantage more »
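The SAS-to-Python conversion asked for in several of these roles usually means re-expressing DATA step logic in Python. A hedged sketch, where the dataset, filter, and fee calculation are invented examples rather than anything from a specific role (in practice pandas would be the usual target, but plain Python shows the mapping):

```python
# A SAS DATA step such as:
#   data high_value;
#     set transactions;
#     where amount > 100;
#     fee = amount * 0.02;
#   run;
# maps onto a filter-plus-derived-column in Python. All names here
# (transactions, amount, fee) are illustrative, not from the listings.
transactions = [
    {"id": 1, "amount": 250.0},
    {"id": 2, "amount": 80.0},
    {"id": 3, "amount": 120.0},
]

high_value = [
    {**row, "fee": round(row["amount"] * 0.02, 2)}  # fee = amount * 0.02
    for row in transactions
    if row["amount"] > 100                          # where amount > 100
]

print(high_value)
```

The WHERE clause becomes the comprehension's filter and the assignment statement becomes a derived field, which is the core of most DATA step migrations.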
Data Engineer. Experience working with AI frameworks and libraries (PyTorch). Confidence collaborating in complex, cross-functional teams. Strong skills with: AWS, Python, Airflow, Snowflake. Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat more »
hands-on and will require exposure to some essential tools and a high level of financial knowledge. Good exposure to AWS technologies is essential; Airflow would be desirable. Experience with batch processing is required. Experience working with a Linux environment. Scripting exposure in Python, Bash or shell scripting. more »
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond
successful Lead Data Engineer will have: Experience leading a Data Engineering team. Extensive working experience with GCP, SQL and DBT. Proficient in: Kafka, Dataform, Airflow, Tableau, PowerBI, Redshift, Snowflake, Terraform and BigQuery. What's in it for the successful Lead Data Engineer: Hybrid working for a better work/ more »
Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, FiveTran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the more »
as thinking strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, Bigquery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial more »
4. Monitoring and Logging: Implement and maintain monitoring, logging, and alerting solutions. Key technologies: AWS, VPN, VPC Peering, EC2, S3, Lambda, Aurora, Docker/Kubernetes. Apache Airflow, AWS networking concepts such as VPN, VPC peering, subnets, security groups, NAT gateways. AWS CloudWatch or equivalent. Kafka or similar data streaming more »
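As a minimal illustration of the monitoring-and-logging requirement above, here is a stdlib-only Python sketch. In an AWS setting these records would typically be shipped to CloudWatch by an agent; the logger name and task are hypothetical examples, and the "alert" is just an ERROR-level log where a real hook (SNS, PagerDuty, etc.) would go:

```python
import logging

# Hypothetical logger name; a CloudWatch agent or similar would
# normally collect and forward these records.
logger = logging.getLogger("pipeline.monitoring")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s %(message)s"))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

def run_with_alerting(task_name, task):
    """Run a task, logging success and emitting an ERROR record on failure."""
    try:
        result = task()
        logger.info("task %s succeeded", task_name)
        return result
    except Exception:
        # Real alerting (SNS topic, pager, etc.) would hang off this path.
        logger.exception("task %s failed", task_name)
        raise

run_with_alerting("load_orders", lambda: 42)
```

The point is the separation of concerns: tasks stay oblivious to monitoring, while a thin wrapper owns logging and the alert path.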
trends and best practices. Qualifications: Expertise in Java and Python development (Essential). Experience with Spark or Hadoop (Essential). Knowledge of Trino or Airflow (Desirable). Proven ability to design and implement scalable and secure solutions. Excellent communication and collaboration skills. more »
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Set2Recruit
proficiency. Git proficiency. Linux use and admin. Experience deploying cloud services (AWS is a bonus). Experience with Docker and Kubernetes. Using frameworks such as Airflow. ML background: PyTorch for computer vision. This is a fully remote role which comes with: budget for WFH set-up. Stock options. 25 days more »
well as NoSQL databases. Familiarity with cloud services (preferably GCP) and understanding of its ETL frameworks. Desired: experience with data pipeline and workflow management (Airflow, Composer). Desired: familiarity with machine learning algorithms and data science principles is a plus. Qualifications: Degree in Computer Science, Engineering, or related field. more »
Southsea, Hampshire, United Kingdom Hybrid / WFH Options
Checkatrade
intelligence, and data warehousing. Proficiency in BI tools (e.g., PowerBI, ThoughtSpot). Expert in SQL and Python with knowledge of how to leverage dbt and Airflow. Data modelling and governance expertise. Knowledge of how to run experimentation platforms and CoE. Inspirational leadership styles with the ability to influence senior stakeholders more »