Wavicle is seeking an experienced BigQuery Platform Architect/DBA to provide architecture and implementation direction for projects leveraging BigQuery. The role will be the most senior expert for BigQuery, setting standards and direction for BigQuery usage, and acting as the strategist and trusted advisor on … BigQuery and related GCP services at clients. The role will also support pre-sales/sales as needed. What you will do: Set direction and standards for BigQuery environments, ensuring high availability, performance, and scalability. Provide direction and expertise on the design, implementation, and maintenance of applications leveraging BigQuery. … GCP-native tools (e.g., Cloud Functions, Dataflow). Document database designs, procedures, and best practices for the team. General Qualifications: Proven experience with Google BigQuery (3+ years preferred) and Google Cloud Platform services (5+ years). Strong proficiency in SQL and BigQuery-specific SQL functions and optimization. Familiarity with …
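For illustration only (not part of the posting), the BigQuery-specific SQL and optimization work referenced above might look like the following minimal Python sketch, assuming the google-cloud-bigquery client library and a hypothetical project, dataset, and date-partitioned table:

```python
from google.cloud import bigquery

# Hypothetical project/dataset/table names, used purely for illustration.
client = bigquery.Client(project="example-project")

# BigQuery-specific SQL: filter on the partitioning column so only the
# relevant partitions are scanned, and use APPROX_COUNT_DISTINCT to trade
# exactness for a much cheaper aggregation.
sql = """
    SELECT
      event_date,
      APPROX_COUNT_DISTINCT(user_id) AS approx_users
    FROM `example-project.analytics.events`
    WHERE event_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND CURRENT_DATE()
    GROUP BY event_date
    ORDER BY event_date
"""

# Dry run first to estimate bytes scanned before paying for the query.
dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
estimate = client.query(sql, job_config=dry_cfg)
print(f"Estimated bytes processed: {estimate.total_bytes_processed}")

# Run the query and iterate over the result rows.
for row in client.query(sql).result():
    print(row.event_date, row.approx_users)
```

The dry run is a common cost-control habit in BigQuery work: it reports the bytes a query would scan without actually executing it.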
years of experience working with sales and marketing systems preferred. Strong proficiency in SQL and experience with cloud-based data warehouses (e.g., Redshift, Snowflake, BigQuery). Hands-on experience with ETL tools such as Informatica, Talend, or AWS Glue. Proficiency in programming languages such as Python, Java, or Scala …
Jenkins). Working with distributed computing frameworks, such as Apache Spark. Cloud platforms (e.g., AWS, Azure, Google Cloud) and associated services (e.g., S3, Redshift, BigQuery). Familiarity with data modelling, database systems, and SQL optimisation techniques. Other things we’re looking for (key criteria): Knowledge of the UK broadcast …
on experience with cloud-based data solutions (AWS, Azure, or Google Cloud). Strong understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery). Experience working with structured and unstructured data. Knowledge of data governance, security, and compliance best practices. Education and Experience: Bachelor's degree in …
critical for structuring data for analysis. Proficiency in cloud platforms, such as AWS and GCP, with hands-on experience in services like Databricks, Redshift, BigQuery, and Snowflake, is highly valued. Advanced Python skills for data manipulation, automation, and scripting, using libraries like Pandas and NumPy, are necessary for effective …
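As a hedged, self-contained illustration of the Pandas/NumPy data-manipulation skills listed above (the input file and column names are hypothetical):

```python
import numpy as np
import pandas as pd

# Hypothetical CSV of raw transactions; columns are illustrative only.
df = pd.read_csv("transactions.csv", parse_dates=["created_at"])

# Basic cleaning: drop exact duplicates and coerce amounts to numeric.
df = df.drop_duplicates()
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

# Feature engineering: log-scale the amount and bucket rows by month.
df["log_amount"] = np.log1p(df["amount"].clip(lower=0))
df["month"] = df["created_at"].dt.to_period("M")

# Aggregate to a monthly summary ready for loading into a warehouse.
monthly = (
    df.groupby("month")
      .agg(total_amount=("amount", "sum"), orders=("amount", "size"))
      .reset_index()
)
print(monthly.head())
```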
needed. Requirements: Strong proficiency in Python for data engineering tasks. Experience with cloud platforms (e.g., AWS, Azure, or GCP), including services like S3, Lambda, BigQuery, or Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such as …
Platform. 3+ years of experience in data engineering or data science. Experience with data quality assurance and testing. Ideally, knowledge of GCP data services (BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage). Understanding of logging and monitoring using tools such as Cloud Logging, ELK …
technology approach combining talent with software and service expertise. Tasks & Responsibilities: Design, develop, and maintain scalable data pipelines on GCP using services such as BigQuery and Cloud Functions. Collaborate with internal consulting and client teams to understand data requirements and deliver data solutions that meet business needs. Implement data …
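A minimal sketch of the kind of GCP pipeline component described above, assuming a first-generation, Cloud Storage-triggered Cloud Function and hypothetical bucket and table names; it loads each newly finalized file into a BigQuery table via the google-cloud-bigquery client:

```python
from google.cloud import bigquery

# Hypothetical destination table; the name is illustrative only.
TABLE_ID = "example-project.raw_zone.orders"

def load_new_file(event, context):
    """Cloud Storage-triggered Cloud Function (1st-gen background signature):
    loads the newly finalized object into a BigQuery table."""
    client = bigquery.Client()
    uri = f"gs://{event['bucket']}/{event['name']}"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,          # assume the file has a header row
        autodetect=True,              # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(uri, TABLE_ID, job_config=job_config)
    load_job.result()  # wait for completion; raises on failure
    print(f"Loaded {uri} into {TABLE_ID}")
```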
processing frameworks (e.g., Apache Spark, Airflow, dbt). Hands-on experience with cloud-based data solutions (e.g., AWS, GCP, Azure) and data warehouses (e.g., BigQuery, Snowflake, Redshift). In-depth knowledge of SEO metrics, search algorithms, and technical SEO principles. Experience working with APIs from Google Search Console, Google …
Google Cloud) and distributed computing principles. Experience with ETL processes and building efficient data pipelines. Familiarity with data warehousing concepts and technologies (Redshift, Snowflake, BigQuery, etc.). Knowledge of containerization technologies (Docker, Kubernetes) is a plus. Experience with version control systems such as Git. Strong problem-solving skills, attention …
frameworks (Airflow, Dagster) to build scalable pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies like Open Table Format, Apache Iceberg, and their impact on enterprise data strategies. Hands-on experience with data …
in a similar role focused on designing data architectures and systems. Expertise in data modeling, database design, and data warehousing technologies (e.g., Snowflake, Redshift, BigQuery, etc.). Solid understanding of cloud platforms (AWS, Azure, Google Cloud) and cloud-based data storage and processing solutions. Strong proficiency in SQL and …
in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra). In-depth knowledge of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). Experience with big data platforms (e.g., Hadoop, Spark, Kafka). Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud …
in data engineering, with at least 2 years in a leadership or management role. Strong experience with data warehousing solutions (e.g., AWS Redshift, Google BigQuery, Snowflake). Expertise in data pipeline development, ETL tools, and frameworks (e.g., Apache Airflow, Talend, Fivetran). Proficiency in SQL and experience with Python …
Expertise in data architecture, data modelling, data analysis, and data migration. Strong knowledge of SQL, NoSQL, cloud-based databases (e.g., AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). Experience in ETL development, data pipeline automation, and data integration strategies. Familiarity with AI-driven analytics. Strong understanding of data governance …
a related role. Strong programming skills in Python and SQL. Excellent problem-solving and analytical skills. Experience with GA4 and GA knowledge. Experience with BigQuery, AWS, or other ETL tools and techniques. Experience with CDP and CEP platforms such as Segment, RudderStack, Braze, etc., to support the marketing team …
designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g., Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one …
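Purely as an illustrative sketch of the hands-on Spark experience requested above (the paths, columns, and aggregation are hypothetical), a typical PySpark batch job might look like:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical input path and columns, used only to illustrate a batch job.
spark = SparkSession.builder.appName("daily-orders-aggregation").getOrCreate()

orders = spark.read.json("s3a://example-bucket/raw/orders/2024-06-01/")

daily_revenue = (
    orders
    .filter(F.col("status") == "completed")                 # drop cancelled orders
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"),
         F.countDistinct("customer_id").alias("customers"))
)

# Write a partitioned Parquet dataset for downstream consumers.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/daily_revenue/"
)

spark.stop()
```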
a production environment. • 3+ years of experience in programming with Python. • 3+ years of hands-on experience utilizing Google Cloud Platform (GCP) services, including BigQuery and Google Cloud Storage to efficiently manage and process large datasets, as well as Cloud Composer and/or Cloud Run. • Experience with version …
/ELT pipelines and database technologies like PostgreSQL and MongoDB. Familiar with major cloud platforms and tools, ideally Amazon Web Services and Snowflake or BigQuery. Solid understanding of data transformation, advanced analytics, and API-based data delivery. Ability to work across departments with a collaborative, problem-solving mindset and …