Hadoop, Spark, Kafka). Experience with AWS and its data services (e.g., S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Knowledge of data orchestration tools (e.g., Prefect, Apache Airflow). Familiarity …
with ETL tools, Hadoop-based technologies (e.g., Spark), and data pipelines (e.g., Beam, Flink). Experience designing data lake and data warehouse solutions (e.g., BigQuery, Azure Synapse, Redshift). Hands-on experience with visualisation tools (e.g., Looker, Tableau, Power BI). Understanding of Agile methodologies such as Scrum. Knowledge of …
Spark or Apache Flink. Implement real-time data transformations and analytics using KSQL. Ensure integration between on-premises streaming solutions and cloud services (e.g., BigQuery, Looker). Lead the design and implementation of ETL processes, extracting, transforming, and loading data into a data warehouse. Ensure data integrity, consistency, and …
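For context on the kind of KSQL work described in the listing above, here is a minimal sketch of submitting a real-time transformation to a ksqlDB server over its REST API. The endpoint, stream, and column names are illustrative assumptions, not details from the listing.

```python
# Minimal sketch: registering a real-time transformation with ksqlDB.
# Assumes a ksqlDB server at KSQLDB_URL and an existing stream named
# orders_raw over a Kafka topic -- all names here are illustrative.
import json

import requests

KSQLDB_URL = "http://localhost:8088"  # hypothetical endpoint

# Persistent query: filter USD orders and derive a GBP amount in real time.
statement = """
    CREATE STREAM orders_gbp AS
      SELECT order_id, customer_id, amount * 0.79 AS amount_gbp
      FROM orders_raw
      WHERE currency = 'USD'
      EMIT CHANGES;
"""

response = requests.post(
    f"{KSQLDB_URL}/ksql",
    headers={"Content-Type": "application/vnd.ksql.v1+json"},
    data=json.dumps({"ksql": statement, "streamsProperties": {}}),
)
response.raise_for_status()
print(response.json())
```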
organize data from various sources, both internal and external, including cloud-based platforms and on-premises databases such as SQL, Microsoft Dynamics, MongoDB, GA4, BigQuery, and third-party connectors. Use SQL to query and manipulate large datasets efficiently, helping design schemas and tables within the data lake's medallion …
such as Python, SQL, or other data-related languages. Experience with cloud services (e.g., AWS, Google Cloud, Azure) and data storage solutions (e.g., Redshift, BigQuery). Familiarity with data engineering tools and frameworks (e.g., Apache Spark, Kafka, Airflow). Strong problem-solving skills and a passion for working with …
would be an advantage. Be a self-starter, comfortable driving self-learning by reading existing codebases. Experience with GCP, including tools such as BigQuery, Dataflow, Pub/Sub and Cloud Storage, or their equivalent AWS/Azure services, would be an advantage. Strong communication skills with the ability …
Databricks clusters, notebooks, jobs, and libraries. Deep understanding of Databricks architecture, configuration, and performance optimization techniques. Strong knowledge of GCP services, including Compute Engine, BigQuery, Cloud Storage, Dataflow, and Pub/Sub. Proficiency in scripting languages like Python or Scala for automation, Terraform, and data engineering tasks. Experience with …
solutions with Pub/Sub or Kafka, ensuring low-latency processing and real-time analytics capabilities. Data Warehouse Optimisation & Modelling: Set up and optimise BigQuery, SQL, DBT models, and Vertex AI workflows for advanced data transformations and machine learning model serving. Data Visualisation: Collaborate with stakeholders to define requirements and …
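As a rough illustration of the BigQuery optimisation work mentioned above, the sketch below creates a date-partitioned, clustered table with the google-cloud-bigquery client. The project, dataset, table, and field names are placeholders, not details from the listing.

```python
# Minimal sketch: a partitioned and clustered BigQuery table, a common
# first step when optimising warehouse cost and query performance.
# Project, dataset, table and column names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)
# Partition by day on the event timestamp and cluster on common filter columns.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["customer_id", "event_type"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")
```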
of experience working as a professional data engineer in industry. Expertise with Python coding and its type system. Expertise in writing SQL (GQL, PostgreSQL, and BigQuery are a plus). Experience with building both batch and streaming ETL pipelines using data processing engines. Experience with cloud development (we use GCP and …
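To make the batch/streaming ETL requirement above concrete, here is a minimal Apache Beam sketch in Python. The bucket paths and the purchase filter are assumptions for illustration; swapping the text source for a Pub/Sub source (with streaming enabled in the options) turns the same pipeline into a streaming job.

```python
# Minimal sketch: a batch ETL pipeline with Apache Beam. Replacing
# ReadFromText with ReadFromPubSub (and enabling streaming in the
# pipeline options) gives the streaming variant of the same transform.
# Bucket paths and field names are illustrative.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions()  # add --runner/--project flags as needed
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events-*.json")
            | "Parse" >> beam.Map(json.loads)
            | "KeepPurchases" >> beam.Filter(lambda e: e.get("type") == "purchase")
            | "Format" >> beam.Map(json.dumps)
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/purchases")
        )


if __name__ == "__main__":
    run()
```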
or a related field, with experience in a leadership or strategic role. Strong proficiency in SQL and experience with cloud data warehouses (e.g., Snowflake, BigQuery, or Redshift). Hands-on experience with data visualisation tools like Tableau, Looker, or Power BI, creating impactful dashboards and reports. Expertise in defining …
Bath, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
within a media, publishing, or marketing environment. Expertise with Google Analytics, Affiliate Networks, and first-party sales data (Amazon, branded content, etc.). Proficiency in BigQuery, Looker Studio, or similar visualization tools (Power BI, Tableau). Line management experience, or mentoring and coaching experience. Experience with data warehousing technologies like Snowflake …
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
on end-to-end architecture. Deep interest in how architectures can improve experiences and drive better business decisions. Experience with the Google Cloud Platform stack (BigQuery, Composer, Dataplex, Dataflow, Cloud Functions, Cloud Run, Pub/Sub, GCS, Vertex AI, GKE) or similar cloud platforms. Familiarity with open-source data-stack …
and are excited by the challenge of uncovering new patterns and insights to drive industry-leading research. Requirements: Basic proficiency in Python and SQL (BigQuery), with the ability to self-serve analysis queries and modify existing code for more complex and advanced data manipulation; preference given to candidates with …
datasets and cloud-based platforms (e.g., AWS, GCP). Self-sufficient with SQL, with experience in data pipelining and warehousing technologies. Experience with BigQuery and dbt is a bonus. Experience with econometrics, Bayesian ML, causal inference, and/or auction theory is highly desirable. Excellent problem-solving, analytical …
Bath, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
processes, and data modelling. Proficiency with Google Ad Manager, Google Analytics, and advertising media buying channels (e.g., real-time bidding, programmatic direct). Experience in BigQuery, Looker Studio, or similar visualization tools (Power BI, Tableau). Line management experience, or mentoring and coaching experience. Experience with data warehousing technologies like Snowflake …
Rotherham, South Yorkshire, United Kingdom Hybrid / WFH Options
Whitestone Resourcing Limited
data sources to a warehouse model. Experience in a data governance role, with practical examples of implementing data management frameworks. Google Cloud Platform (GCP), BigQuery, Looker, Power BI and other reporting technologies. Strategic, Analytical, problem-solving thinking. Leadership, Challenge & Communication. Commercial Understanding. Project Management. Innovation. Excellent written and visual presentation …
working in fintech. You have experience working in start-ups or scale-ups. You have knowledge of cryptocurrency. You have worked with DBT, Google BigQuery, Looker and Python. Most importantly, you will embody the core principles that everyone here at MoonPay lives by. Our "BLOCK Values" are at …
Bromley, Greater London, St Mary Cray, United Kingdom Hybrid / WFH Options
Ripple
in fast-paced, dynamic environments and possess the commercial awareness to see the bigger picture. Key Responsibilities: Develop, manage, and refine data models in BigQuery that align with business insights and operational efficiency. Lead the creation and optimisation of dashboards in Looker Studio, ensuring they deliver clear, actionable insights …
products and deep experience in one or more of the following categories: Serverless (Cloud Run, Cloud Functions); Kubernetes Engine, Compute Engine; VPC networks and Service Perimeters; BigQuery, Cloud SQL, Cloud Spanner and Firestore; API management, including Apigee or Cloud Endpoints; data analytics services, including Dataproc, Dataflow, Data Fusion and Cloud Composer …
Rotherham, South Yorkshire, United Kingdom Hybrid / WFH Options
Randstad Technologies Recruitment
on-site meetings. Skills required: 2-4 years of experience in data analysis, ETL processes, and working with reporting tools like Looker and Google BigQuery. Must have experience with Google Cloud Platform (GCP) tools and services for data management. Proven ability to reduce technical debt and enhance automation …
skills. You need to be able to independently write your own queries and check them for accuracy. Experienced in working with data warehouses such as BigQuery, Snowflake, and Redshift. Automation experience: dashboards and processes. Advanced knowledge of Tableau or similar BI and data visualization tools. Experience working at a fast …
sustain and implement existing ELT pipelines. Skills and Experience: Essential to have 4+ years' experience in: SQL; Python; DBT; cloud technologies such as Snowflake, BigQuery or Databricks; AWS/GCP. Desirable to have experience with: CI/CD pipelines; a Data Engineering background. Interview Process: There are 4 stages to …
Nottinghamshire, Nottingham, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
Expect variety – you’ll work across cloud platforms like Azure, AWS, and GCP, leveraging tools such as Databricks, Data Factory, Synapse, Kafka, Glue, Redshift, BigQuery, and more. About You: You’re an engineer at heart, with a passion for building efficient, scalable data systems. To succeed, you’ll need …
Birmingham, West Midlands (County), United Kingdom Hybrid / WFH Options
Xpertise Recruitment
Expect variety – you’ll work across cloud platforms like Azure, AWS, and GCP, leveraging tools such as Databricks, Data Factory, Synapse, Kafka, Glue, Redshift, BigQuery, and more. About You: You’re an engineer at heart, with a passion for building efficient, scalable data systems. To succeed, you’ll need …