in: Data Warehousing, Data Engineering, overall Data Analytics, Data Visualisation. Proficiency in: Google Cloud (GCP), GCP BigQuery, Python, DBT or similar, FastAPI or similar, Airflow or similar. Desirable: Google Apigee (as an application developer), exposure to Machine Learning projects, exposure to DataOps, exposure to Dataproc or similar …
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with Dremio is a nice-to-have. Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake …
Strong understanding of AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake …
pipelines. Know your way around Unix-based operating systems. Experience working with any major cloud provider (AWS, GCP, Azure). Fluency in English. Experience using Apache Airflow, Docker and Apache Spark. Benefits: salary of £40-50K per annum dependent on skills and experience, 25 days …
of frameworks like Dropwizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively with …
analysis, and software design. Travel up to 10%. What Will Help You On The Job: familiarity with running software services at scale; AWS infrastructure, Airflow, Kafka and data streaming using Spark/Scala; understanding of networking fundamentals (OSI layers 2-7); technical and software engineering background in the areas …
extensive experience having designed and scaled a Data Platform - Has strong Python skills - Has great SQL, preferably Snowflake - Has previous experience working with dbt & Airflow - Is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets - Cares deeply about the climate and ecosystems of …
building APIs & SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker & Kubernetes for containerisation & orchestration. Understanding of Airflow & DAGs. Experience in building applications using Kafka. Solid OOP principles & design patterns. Permanent/Full-Time Employment. Hybrid working environment (2/3 days …
of SQL with extensive experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch). Demonstrated experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks. Experience using programming languages (e.g. …
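As a purely illustrative aside on what "scheduling your own queries" with Airflow can look like: below is a minimal sketch of an Airflow DAG that runs a BigQuery query once a day, assuming the apache-airflow-providers-google package is installed; the DAG name, project, dataset and SQL are hypothetical placeholders, not taken from the advert.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_event_rollup",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # run the query once per day
    catchup=False,
) as dag:
    BigQueryInsertJobOperator(
        task_id="run_rollup_query",
        configuration={
            "query": {
                # Hypothetical query against a hypothetical dataset.
                "query": (
                    "SELECT DATE(ts) AS day, COUNT(*) AS events "
                    "FROM `my-project.analytics.events` GROUP BY day"
                ),
                "useLegacySql": False,
            }
        },
    )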
few of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log …
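For context on the "Kafka for data pipelines" part of that stack, here is a minimal consumer sketch, assuming the confluent-kafka Python package; the broker address, group id and topic name are hypothetical placeholders.

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # hypothetical broker
    "group.id": "example-pipeline",          # hypothetical consumer group
    "auto.offset.reset": "earliest",         # start from the oldest message
})
consumer.subscribe(["events"])               # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)     # wait up to 1s for a message
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Hand the payload to downstream processing here.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()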
have Terraform experience. SQL & NoSQL experience. Have built out data warehouses & data pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes & orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to £450 p/day (Outside IR35) and will …
Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, Fivetran, and Terraform. Join a dynamic team driving innovation and excellence in data engineering. …
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional …
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data-heavy …
London, England, United Kingdom Hybrid / WFH Options
Jobleads-UK
Continuous Delivery. Continuous integration pipelines. Strong Python. AWS or Azure with large-scale streaming data (Pulsar, Kafka, Kinesis, etc.). ETL management; structured or custom (Airflow, Luigi, etc.). Bonus: robust experience managing and developing an engineering team; Delta Lake or Iceberg; Trino or Presto; GraphQL. Good salary, bonus, stock options …
routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of dbt, Airflow and AWS is highly desirable. Strong background in developing, constructing, testing, and maintaining practical data architectures, and driving improvements in data reliability, efficiency, and quality. …
well as NoSQL databases. Familiarity with cloud services (preferably GCP) and understanding of its ETL frameworks. Desired: experience with data pipeline and workflow management (Airflow, Composer). Desired: familiarity with machine learning algorithms and data science principles. Qualifications: degree in Computer Science, Engineering, or related field. …
schemas to support business requirements. Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc. Implement data storage solutions using different types of databases, such as relational, non-relational, or cloud-based. Working collaboratively with the client …/Azure SQL, PostgreSQL). You have framework experience within either Flask, Tornado or Django, plus Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving …
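As an illustration of the kind of ingestion task tools like Luigi express, here is a minimal sketch, assuming the luigi package; the file paths and the cleaning step are hypothetical placeholders.

import datetime

import luigi


class CleanEvents(luigi.Task):
    """Read a raw CSV for one day, drop blank lines, write a cleaned copy."""

    date = luigi.DateParameter()

    def output(self):
        # Luigi uses the existence of this target to decide if the task ran.
        return luigi.LocalTarget(f"data/clean/events-{self.date}.csv")

    def run(self):
        with open(f"data/raw/events-{self.date}.csv") as src, \
                self.output().open("w") as dst:
            for line in src:
                if line.strip():
                    dst.write(line)


if __name__ == "__main__":
    luigi.build([CleanEvents(date=datetime.date(2024, 1, 1))],
                local_scheduler=True)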
and you’ll build the solution. They work in a mix of SQL and Python but also operate with modern tech such as Snowflake, Airflow, DBT, Kubernetes/Docker and cubeJS, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there’s a … to be a SQL Server expert with strong architecture skills and also strong skills with Python and DBT. More experience with things like Airflow, cubeJS and Snowflake, plus other data viz tools, would be a huge plus. Total compensation for this role is up to £250,000, split …
AWS preferred). Solid understanding of libraries like Pandas and NumPy. Experience in data warehousing tools like Snowflake, Databricks, BigQuery. Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools. Commercial experience with performant database programming in SQL. Capability to solve complex technical issues, comprehending risks prior to …
required: Python, SQL, Kubernetes. CI/CD experience with relevant tooling such as Jenkins, Docker or Terraform. Cloud services experience with AWS/Azure. Ideally: Airflow, Java. Experience working with front-office trading systems and financial market data. For more information on this role or any other contract/permanent …
processing and analytics. Desired experience: worked with Python 3.9+; familiar with Python test automation; experience with SQL and time-series databases; familiar with Parquet, Arrow, Airflow, Databricks; experience with AWS cloud services, such as S3, EC2, RDS, etc.; quality engineering best practice and tooling including TDD, BDD. This is an …
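On the Parquet/Arrow point, a minimal sketch of loading columnar data, assuming the pyarrow and pandas packages; the file name is a hypothetical placeholder.

import pyarrow.parquet as pq

table = pq.read_table("ticks.parquet")   # read Parquet into an Arrow table
print(table.schema)                      # inspect column names and types
df = table.to_pandas()                   # convert to pandas for analysis
print(df.head())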
data processing, data analysis and data visualisation; SQL and time-series databases; AWS cloud services, such as S3, EC2, RDS, etc.; ETL tools, such as Airflow; Git, CI/CD, testing tools, supporting documentation and best practices; best practice and tooling including TDD, BDD. Domain and soft skills summary: office …
engineer who has hands-on experience in AWS to conduct end-to-end data analysis and data pipeline build-out using Python, Glue, S3, Airflow, DBT, Redshift, RDS, etc. Very solution-driven and highly collaborative at providing thought leadership and soliciting diverse opinions. Accountable for results. Experienced in leading a team of …
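As a final illustrative aside on Python-driven AWS pipelines of this kind, here is a minimal sketch of kicking off an AWS Glue job with boto3; the region and job name are hypothetical placeholders.

import boto3

glue = boto3.client("glue", region_name="eu-west-2")
run = glue.start_job_run(JobName="daily-ingest")   # hypothetical Glue job
print("Started Glue run:", run["JobRunId"])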