data and AI models. Data Engineer, Required Experience: data engineering experience (2+ years); cloud platform proficiency (e.g., AWS, Azure, GCP); data pipeline development (e.g., Airflow, Apache Spark); SQL proficiency and database design; knowledge of visualization tools (e.g., Tableau, Power BI, Looker). Data Engineer, Application Process: this is a 1-year contract …
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation. Experience with Apache Airflow, AWS Backup, and S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins …
have Terraform experience. SQL and NoSQL experience. Have built out data warehouses and data pipelines. Strong Databricks and Snowflake experience. Docker, ECS, Kubernetes, and orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to £450 p/day (outside IR35), and will …
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt. TC: £85,000 + bonus + up to 22% pension. Process: 2 stages. No CV? No problem. Email me at athomas@trg-uk.com
dbt (Data Build Tool). Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an app developer). Familiarity with Airflow (as an app developer). Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. If you are passionate about data engineering …
Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, Fivetran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering …
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, dbt, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the …
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech is also used every day, primarily Airflow, Spark, Hive, and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data-heavy …
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar orchestration tools): experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar reporting tools): knowledge of Qlik Sense Cloud or similar reporting tools such …
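Orchestration tools such as Dagster and Airflow ultimately run pipeline tasks in dependency order. As a rough, tool-agnostic sketch of that idea (the task names below are invented for illustration, not from any of these postings), the same structure can be expressed with Python's standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
# Real Dagster/Airflow DAGs declare the same shape through their own APIs.
deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "build_warehouse": {"extract_orders", "extract_customers"},
    "publish_report": {"build_warehouse"},
}

def run_order(dependencies):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(dependencies).static_order())

order = run_order(deps)
print(order)
```

Here both extract tasks are free to run first (in practice, concurrently), the warehouse build waits for them, and the report is published last; a scheduler adds retries, backfills, and cron-style triggers on top of exactly this ordering.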
web scraping and other data ingestion methods and tools. Knowledge of distributed computing frameworks (Hadoop, Spark, Hive, Presto). Experience with data orchestration tools (Airflow, Orchestra, Azkaban). Expertise in cloud data warehousing and core data modelling concepts. Proficiency in version control systems (Git) and experience with CI/CD …
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies: AWS, GCP, Azure. Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset …
Agile software development and system architecture within the Telco OSS domain, with preferred experience in Network GIS (Hexagon, IQGeo) and workflow tooling (Appian, Apache Airflow). Strong understanding of platform and product dynamics, including Platform Engineering and its relevance to OSS. Extensive background in DevOps practices, encompassing …
load data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates should have data management experience in Snowflake, with expertise in Python, dbt, Airflow, or similar technologies; dbt and Snowflake are an added advantage.
Python and dbt for data transformation. Experience in converting SAS-based modules to Python-based solutions. Familiarity with Snowflake for data management. Experience with Airflow or similar technologies is a plus. Desired: experience with dbt and Snowflake is advantageous.
tools. In this role, you'll lead the development of robust, fully tested data pipelines in Python using cutting-edge platforms like Dagster/Airflow and apply your expertise in real-time data streaming solutions using Kafka. You'll play a key role in expanding their on-prem data …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
in Python or PySpark, we encourage you to apply. Python App Developer Requirements: proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or relational databases. Demonstrated knowledge of software applications and technical processes within a cloud …
Data Engineer. Experience working with AI frameworks and libraries (PyTorch). Confidence collaborating in complex, cross-functional teams. Strong skills with AWS, Python, Airflow, Snowflake. Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat.
hands-on and will require exposure to some essential tools and a high level of financial knowledge. Good exposure to AWS technologies is essential; Airflow would be desirable. Experience with batch processing is required. Experience working in a Linux environment. Scripting exposure in Python, Bash, or shell scripting.
to-end from scoping, designing, coding, and release to continuous monitoring in a production environment •ELT pipeline: Experience with ELT pipelines and orchestration systems such as Airflow •Database systems: Experience working with one or more non-SQL databases such as Druid, Elasticsearch, and Neo4j •AWS: Experience deploying and managing applications …
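Several of these roles call out ELT pipelines: load raw data into the warehouse first, then transform it inside the database, with a tool like Airflow scheduling the steps. A minimal stdlib-only sketch of that pattern, using `sqlite3` as a stand-in warehouse (the table and column names are invented for illustration):

```python
import sqlite3

def extract():
    # In a real pipeline this would pull rows from an API or source system.
    return [("widget", 3), ("gadget", 4), ("widget", 2)]

def load(conn, rows):
    # Load the raw rows unchanged -- the "EL" half of ELT.
    conn.execute("CREATE TABLE raw_sales (item TEXT, qty INTEGER)")
    conn.executemany("INSERT INTO raw_sales VALUES (?, ?)", rows)

def transform(conn):
    # Transform inside the database -- the "T" runs as SQL on loaded data.
    conn.execute(
        "CREATE TABLE sales_by_item AS "
        "SELECT item, SUM(qty) AS total FROM raw_sales GROUP BY item"
    )

conn = sqlite3.connect(":memory:")
load(conn, extract())
transform(conn)
totals = dict(conn.execute("SELECT item, total FROM sales_by_item"))
print(totals)
```

In production the three functions would typically be separate tasks in an orchestrator DAG, with the transform step often generated by dbt rather than hand-written SQL.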
well as NoSQL databases. Familiarity with cloud services (preferably GCP) and understanding of their ETL frameworks. Desired: experience with data pipeline and workflow management tools (Airflow, Composer). Desired: familiarity with machine learning algorithms and data science principles is a plus. Qualifications: degree in Computer Science, Engineering, or a related field.
within the EU Fusion programme and connections to international HPC communities, showcasing contributions made to the field. Experience in workflow management systems such as Apache Airflow. Familiarity with Research Data Management methodologies, modern database technologies including SQL, NoSQL, and graph databases, and parallel file access technologies such as MPI …
4. Monitoring and Logging: Implement and maintain monitoring, logging, and alerting solutions. Key technologies: AWS, VPN, VPC peering, EC2, S3, Lambda, Aurora, Docker/Kubernetes, Apache Airflow; AWS networking concepts such as VPN, VPC peering, subnets, security groups, NAT gateways; AWS CloudWatch or equivalent; Kafka or similar data streaming …