City of London, London, United Kingdom (Hybrid / WFH options)
83zero Limited
supporting project delivery through involvement in project/sprint planning and QA. Also: knowledge of other cloud platforms; knowledge of Google data products and tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep); relevant certifications; Python; Snowflake; Databricks. To apply, please click the 'Apply' button …
technology approach combining talent with software and service expertise. Tasks & Responsibilities: Design, develop, and maintain scalable data pipelines on GCP using services such as BigQuery and Cloud Functions. Collaborate with internal consulting and client teams to understand data requirements and deliver data solutions that meet business needs. Implement data …
methodologies in the rapidly evolving field of data analytics. Tasks & Responsibilities: Design, develop, and maintain scalable data pipelines on GCP using services such as BigQuery and Cloud Functions. Collaborate with internal consulting and client teams to understand data requirements and deliver data solutions that meet business needs. Implement data …
Expertise in the design of data solutions for BigQuery. Expertise in logical and physical data modelling. Hands-on experience using Google Dataflow, GCS, Cloud Functions, BigQuery, Dataproc and Apache Beam (Python) to design data transformation rules for batch and streaming data. Solid Python programming skills, including Apache Beam (Python). …
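To illustrate the kind of work this listing describes, here is a minimal Apache Beam (Python) batch pipeline that applies a simple transformation rule and loads the result into BigQuery. The bucket, project, dataset and schema names are hypothetical, and this is only a sketch rather than any employer's actual pipeline.

```python
# Minimal Apache Beam (Python) sketch: read raw CSV from GCS, apply a
# transformation rule, and write the result to BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line):
    """Transformation rule: split a CSV line into a typed dict."""
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}


def run():
    options = PipelineOptions()  # add --runner=DataflowRunner etc. to run on Dataflow
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/raw/events.csv")
            | "ParseRows" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Run as-is this uses the local DirectRunner; passing --runner=DataflowRunner together with project, region and temp_location options would execute the same code on Dataflow.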
frameworks (Airflow, Dagster) to build scalable pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies such as open table formats (e.g. Apache Iceberg) and their impact on enterprise data strategies. Hands-on experience with data …
London, South East England, United Kingdom (Hybrid / WFH options)
Careerwise
for data analysis, machine learning, and data visualization. In-depth knowledge of cloud platforms (AWS, Azure, Google Cloud) and related data services (e.g., S3, BigQuery, Redshift, Data Lakes). Expertise in SQL for querying large datasets and optimizing performance. Experience working with big data technologies such as Hadoop, Apache …
ELT workflows. Strong analytic skills related to working with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g. Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as …
tools such as Tableau. Experience working with APIs. Experience working with large-scale spatial datasets (billions of rows) and performing geospatial analysis at scale using BigQuery GIS or similar tools. Experience with advanced analytical modelling techniques, including statistical analysis and predictive modelling, particularly applying these to large-scale datasets to …
data-related issues. Key Skills and Knowledge: 8+ years of experience in data architecture, data engineering, or cloud computing. Expertise in Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.). Strong experience with SQL, Python, and data modeling. Hands-on experience with ETL/ELT pipelines …
Sheets. Create clear, compelling reports that translate data into actionable business recommendations. Use Dataform to transform raw data into structured, usable datasets in our BigQuery data warehouse. Contribute to the development of source-of-truth data models. Maintain and optimize scalable data pipelines. Qualification & Experience. Must-haves: 3+ years …
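As a sketch of the transformation step mentioned above: in Dataform the logic would normally live in a SQLX model, but the equivalent SQL can be illustrated through the BigQuery Python client. All project, dataset and table names below are hypothetical.

```python
# Illustrative only: the kind of raw-to-structured transformation a Dataform
# model would manage, expressed here as SQL run via the BigQuery Python client.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

# Build a structured, aggregated table from a raw landing table.
query = """
CREATE OR REPLACE TABLE `example-project.analytics.orders` AS
SELECT
  order_id,
  customer_id,
  DATE(created_at) AS order_date,
  SUM(line_total) AS order_value
FROM `example-project.raw.order_lines`
GROUP BY order_id, customer_id, order_date
"""

client.query(query).result()  # wait for the transformation job to finish
```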
experience as a DevOps Engineer/Consultant with a history of successful client project delivery. Extensive hands-on experience with GCP services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, Dataproc, and Cloud Composer. Strong programming and scripting skills in languages like Python, Bash, or Go to automate …
also be applicable. Experience designing and implementing stream processing applications (Kafka Streams, ksqlDB, Flink, Spark Streaming). Experience with data warehousing tools like Snowflake/BigQuery/Databricks, and building pipelines on these. Experience working with modern cloud-based stacks such as AWS, Azure or GCP. Excellent programming skills with …
roles. Experience working with other data disciplines, such as data engineering and data analytics. Experience with SQL, dbt, modern cloud data warehouses (e.g. Snowflake, BigQuery), and self-service BI tooling (e.g. ThoughtSpot, Looker, Hex). Experience with Python, Jinja, Git, YAML, orchestration tools (e.g. Airflow, Dagster), and CI/CD …
of experience working as a professional data engineer in industry. Expertise with Python coding and its type system. Expertise in writing SQL (GQL, PostgreSQL, and BigQuery are a plus). Experience building both batch and streaming ETL pipelines using data processing engines. Experience with cloud development (we use GCP and …
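As a rough sketch of the streaming side of such pipelines, under assumed names: the snippet below consumes JSON events from a Pub/Sub subscription and streams each transformed row into BigQuery. The project, subscription and table identifiers are hypothetical.

```python
# Streaming ETL sketch: Pub/Sub subscription -> small transform -> BigQuery rows.
import json
from google.cloud import bigquery, pubsub_v1

bq = bigquery.Client()
TABLE_ID = "example-project.analytics.events"  # hypothetical destination table


def handle_message(message):
    """Transform one event and load it; ack only if the insert succeeds."""
    event = json.loads(message.data)
    row = {"user_id": event["user_id"], "amount": float(event["amount"])}
    errors = bq.insert_rows_json(TABLE_ID, [row])
    if not errors:
        message.ack()
    else:
        message.nack()


subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("example-project", "events-sub")
future = subscriber.subscribe(subscription, callback=handle_message)
future.result()  # block the main thread while messages stream in
```

A production pipeline would more likely run this logic in a managed engine such as Dataflow; the raw-client version above just keeps the sketch short.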
Processing: Utilise expert skills in Python and SQL to develop and optimise data models and processes. Experience with data management platforms such as Snowflake, BigQuery, MongoDB, DynamoDB, Redshift, and PostgreSQL will be crucial. Blockchain Solutions Engineering: Apply knowledge from previous engagements where blockchain technology, particularly in data handling and …
end data platforms that are scalable, secure, and cost-effective using cloud technologies (AWS, GCP, Azure). Implement hybrid cloud architectures (e.g. Snowflake, Redshift, BigQuery) and migrate legacy systems to modern cloud-based platforms. Lead the creation of Proofs of Concept (PoCs) and Minimum Viable Products (MVPs) for data …
measurement frameworks and using quantitative methods (including predictive modeling and statistical analysis) to validate results. Strong experience building analytics with cloud data warehouses (Google BigQuery, Snowflake) and modern data stack technologies such as dbt, Fivetran and Airflow. Demonstrable experience in partnering with various stakeholders and building data products that support many technical …
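To illustrate the orchestration layer these listings refer to, here is a minimal Airflow DAG, assuming a hypothetical ingest script and dbt project directory; it simply schedules an ingest step followed by a dbt build of the warehouse models.

```python
# Minimal Airflow DAG sketch: ingest raw data, then rebuild dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_analytics_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="trigger_ingest",
        bash_command="python /opt/pipelines/ingest.py",  # hypothetical ingest script
    )
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/analytics",  # hypothetical dbt project
    )
    ingest >> dbt_build
```

In practice the ingest task might be a Fivetran sync trigger or a dedicated operator; the BashOperator keeps the sketch self-contained.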
we use. Programming languages: SQL, LookML, Python. Development tools and frameworks: dbt, pandas, Dagster, Airbyte, dlt, data-diff, Elementary. Data lake and warehouse: GCS, BigQuery. Analytics: Looker, Looker Studio and geospatial analytics tools. How we reward our team: a dynamic working environment with a diverse and driven team; huge opportunity …
London, South East England, United Kingdom (Hybrid / WFH options)
SR2 | Socially Responsible Recruitment | Certified B Corporation™
AWS). // Excellent programming skills in Python. // SQL/NoSQL database management systems (PostgreSQL, MongoDB). // Familiar with BigQuery, Snowflake, Firebolt and/or Amazon (preferred). // Ability to work on messy, complex real-world data challenges. // Knowledge of …
while mentoring and coaching others to succeed. Have a strong background in building and managing data infrastructure at scale, with expertise in Python, SQL, BigQuery, AWS, dbt and Airflow. Have a strong background in data modelling and building scalable data pipelines. Are naturally curious and enthusiastic about experimenting with …