…SQL expertise; designing and maintaining ETL/data pipelines; ideally proficiency across multiple cloud infrastructures, databases, and data warehousing solutions (AWS and GCP; Airflow 👍). Bonus points for: experience working in a small, fast-growing start-up and comfort navigating unstructured/fuzzy environments; experience with RudderStack, Expo and/…
…with cloud-based data warehouses (e.g., Databricks, Snowflake, Redshift). Ability to optimize queries and pipelines for efficiency and reliability. Bonus: experience with dbt, Airflow, or visualization tools like Tableau. Excellent communication skills and the ability to document technical solutions effectively. For this position you need to be eligible to …
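The query-optimization skill the listing above asks for can be illustrated with a minimal, hedged sketch using Python's built-in sqlite3 module standing in for a warehouse; the table and column names are invented for illustration, not taken from any posting:

```python
import sqlite3

# In-memory SQLite standing in for a warehouse table (schema is illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 100, "click" if i % 2 else "view") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

# Without an index the planner must scan every row to evaluate the filter.
scan_detail = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

# An index on the filtered column lets the planner seek directly instead.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
index_detail = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

print(scan_detail)   # typically reports a full-table SCAN
print(index_detail)  # typically reports a SEARCH using idx_events_user
```

The same scan-vs-seek distinction is what `EXPLAIN` surfaces in Snowflake, Redshift, or Databricks, just with engine-specific plan output.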
…experience in Snowflake architecture, including data loading, transformation, and performance tuning. Proficient in ETL processes using tools such as Informatica PowerCenter and BDM, AutoSys, Airflow, and SQL Server Agent. Experience with cloud platforms, preferably AWS. Strong knowledge of AWS cloud services, including EMR, RDS Postgres, Redshift, Athena, S3, and …
…with AWS services (S3, Glue, Lambda, SageMaker, Redshift, etc.). Strong Python and SQL skills; experience with PySpark a bonus. Familiarity with containerization (Docker), orchestration (Airflow, Step Functions), and infrastructure as code (Terraform/CDK). Solid understanding of the machine learning model lifecycle and best practices for deployment at scale. Excellent …
Master's or PhD in a relevant field (e.g., Computer Science, Data Science, Engineering, Applied Mathematics, Statistics). Proficiency in Python, SQL, AWS, Airflow, PySpark, PyTorch, NumPy, and related data technologies. Experience with cloud infrastructure, data pipelines, and machine learning model deployment. Proven experience leading diverse teams of …
…operational businesses. Experience using LLMs or AI tools to structure and extract meaning from unstructured data. Experience automating workflows and deploying model pipelines (e.g. Airflow, dbt, MLflow, or similar). Exposure to business planning, pricing, or commercial decision-making. Familiarity with geospatial data. Experience in fast-scaling startups or operational …
…Redis, etc.). Spark or other distributed big data systems (e.g. Hadoop, Pig, Hive). Stream-processing frameworks (e.g. Kafka). Data pipeline orchestration tools (e.g. Airflow, Prefect, Dagster). Job Requirements: Bachelor's/Master's in Computer Science, a related field, or equivalent work experience. Fluency in Portuguese and …
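Airflow, Prefect, and Dagster (all named above) share one core idea: a pipeline is a directed acyclic graph of tasks executed in dependency order. A minimal stdlib sketch of that scheduling contract, with made-up task names, assuming Python 3.9+ for `graphlib`:

```python
from graphlib import TopologicalSorter

# Toy pipeline: extract feeds transform and validate, which both feed load.
# Mapping of task -> set of upstream dependencies (names are illustrative).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# static_order() yields each task only after all of its dependencies,
# which is the guarantee an orchestrator's scheduler provides.
order = list(TopologicalSorter(dag).static_order())
for task in order:
    print(f"running {task}")
```

Real orchestrators add retries, backfills, and parallel execution on top of this ordering, but the DAG model is the shared foundation.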
How you will contribute in this role: architecting, building, and maintaining GenAI processing pipelines/workflows in Python via technologies such as serverless functions, Airflow, MLflow, SageMaker, Bedrock, or others. Working with Data and NLP Engineers and Data Scientists to automate experiments that bring ML and NLP research ideas …
…Nice-to-have: experience with marketing data or customer-level models (e.g. uplift, attribution, causal inference, campaign optimization). Familiarity with MLOps tools (e.g. MLflow, FastAPI, Airflow). Exposure to A/B testing and experimentation frameworks. WHY THIS ROLE IS DIFFERENT: This isn’t a narrow data science role — you won’t …
…Nice-to-have: experience with marketing data, customer-level modelling, or decision science (e.g. uplift, attribution, causal AI, optimization). Familiarity with MLOps tooling (MLflow, FastAPI, Airflow, etc.). Experience designing and interpreting A/B tests or other experimental frameworks. Background in consulting, agency, or fast-paced environments where autonomy and …
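The A/B-test design and interpretation skill listed above usually comes down to a two-proportion z-test on conversion rates. A hedged stdlib sketch; the counts below are invented for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference between two conversion rates,
    using the pooled proportion for the standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented example: 200/5000 control conversions vs 260/5000 in the variant.
z = two_proportion_z(200, 5000, 260, 5000)
print(round(z, 2))  # ≈ 2.86, above the 1.96 two-sided 5% threshold
```

Interpreting the result (significance threshold, one- vs two-sided, minimum detectable effect) is the part these roles emphasise; the statistic itself is this small.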
…management (GitLab). Implement monitoring using third-party systems (Checkmk, Grafana, Prometheus) and develop bespoke solutions. Work with GDFI Support to schedule production jobs (Tidal, Airflow, Cron). Take action to ensure our production-critical applications are available from pre-trading sessions throughout the trading day, with minimal downtime and disruption …
…technical foundation that enables our AI-driven workflows. Skills and Qualifications: experience building end-to-end platform solutions that integrate workflow orchestration systems (e.g. Airflow, Temporal, AWS Step Functions) with real-world business processes and data pipelines. Strong background in integration engineering and data modelling. Exceptional Python skills for …
…or the travel industry. Conducted and analysed large-scale A/B experiments. Experience mentoring team members. Experience with workflow orchestration technologies such as Airflow, Dagster, or Prefect. Experience with technologies such as Google Cloud Platform (particularly Vertex AI), Docker, and Kubernetes. Perks of joining us: company pension contributions …
…Tomcat, UNIX tools, Bash/sh, SQL, Python, Hive, Hadoop/HDFS, and Spark. Work within a modern cloud DevOps environment using Azure, Git, Airflow, Kubernetes, Helm, and Terraform. Demonstrate solid knowledge of computer hardware and network technologies. Experienced in writing and running SQL and Bash scripts to automate …
London, South East England, United Kingdom (Hybrid / WFH options)
Kantar Media
…data analytics platform on AWS, employing the AWS Cloud Development Kit (CDK). Construct resilient and scalable data pipelines using SQL/PySpark/Airflow to effectively ingest, process, and transform substantial data volumes from diverse sources into a structured format, ensuring data quality and integrity. Devise and implement …
…data quality concepts, methodologies, and best practices. Proficiency in SQL and data querying for data validation and testing purposes. Hands-on experience with Snowflake, Airflow, or Matillion would be ideal. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases, and data …
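The SQL-based data validation this listing describes can be sketched with stdlib sqlite3 running the kind of checks a tool like dbt or Matillion would schedule; the schema, rows, and check names below are illustrative assumptions:

```python
import sqlite3

# In-memory table with deliberately bad rows to trip each check.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, email TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 9.99, "a@x.com"), (2, 5.00, None), (2, -1.0, "b@x.com")],
)

# Typical data-quality assertions, each expressed as SQL that should
# return zero rows on clean data (the dbt-style "test" convention).
checks = {
    "no_duplicate_ids": "SELECT id FROM orders GROUP BY id HAVING COUNT(*) > 1",
    "no_null_emails": "SELECT id FROM orders WHERE email IS NULL",
    "no_negative_amounts": "SELECT id FROM orders WHERE amount < 0",
}

failures = {name: len(conn.execute(sql).fetchall()) for name, sql in checks.items()}
print(failures)
```

In production the same queries run against the warehouse on a schedule, and any non-zero count fails the pipeline run instead of being printed.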
…Office skills, especially MS Excel. Proficiency with SQL (particularly SQL Server or Oracle). Proficiency with Python. Experience with Azure (preferred but not required). Airflow/cloud computing (preferred but not required). Ability to lead and own workstreams from start to finish. The ideal candidate will be curious, persevering …
Astronomer designed Astro, an industry-leading, orchestration-first DataOps platform for data teams. Powered by Airflow, Astro accelerates building reliable data products that unlock insights, unleash AI value, and drive data-driven applications. We're a globally distributed and rapidly growing venture-backed team of learners, innovators, and collaborators.