and orchestration frameworks (Airflow, Dagster) to build scalable pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies such as open table formats (e.g., Apache Iceberg) and their impact on enterprise data strategies. Hands-on experience with …
have worked with cloud-based environments before (we use AWS). SQL: you have a good grasp of SQL, particularly with cloud data warehouses like Snowflake. Version control: you are proficient with Git. Soft skills: you are an excellent communicator, with the ability to translate non-technical requirements into clear, actionable …
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Capital One
for example Cloud Practitioner, Associate Architect or Associate Developer. Experience that would be advantageous: experience working with the following toolsets: Ab Initio, Python, Snowflake, Sterling File Transfer, Logz.io and New Relic, AWS services. Where and how you'll work: this is a permanent position and is based in our …
knowledge of data systems, including relational and non-relational data stores. Experience with big data tools and frameworks (e.g., Apache Spark, Kafka, Flink, or Snowflake). Nice to have: experience building or implementing data products, schema management, data contracts, data privacy regulations (e.g. GDPR). Experience leading projects, requirements gathering …
and Windows operating systems: security, configuration, and management. Database design, setup, and administration (DBA) experience with Sybase, Oracle, or UDB. Big data systems: Hadoop, Snowflake, NoSQL, HBase, HDFS, MapReduce. Web and mobile technologies, digital workflow tools. Site reliability engineering and runtime operational tools (agent-based technologies) and processes (capacity, change …
IT Management, and developers. Technical degree; Computer Science, Engineering, or Mathematics background highly valued. Experience using big data technologies such as AWS, Azure, Hadoop, or Snowflake is a significant plus. Our benefits: to help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment …
Team and drive solution delivery. Partner closely with global counterparts to facilitate regional implementation of global projects. Utilize available technology tools (e.g. Tableau, Sigma, Snowflake, Anaplan, Appian) within Blackstone to design and implement technology solutions for business stakeholders. Bridge cross-functional teams (client working groups, business owners, engineering, QA …
Dimensional (Kimball) data modelling.
• Proficiency in SQL (T-SQL, PL/SQL, Databricks SQL).
Desirable:
• Databricks (or an alternative modern data platform such as Snowflake).
• Experience working in a regulated environment and knowledge of the associated risk and compliance requirements.
• Oracle Database.
• MongoDB.
• Cloud Data Technologies (Mainly …
technologies, Athena, S3 and ideally dbt. Experience working with Python, demonstrating proficiency in data manipulation and analysis. Hands-on experience with Hadoop, Big Data, Snowflake, and SQL, showcasing your ability to handle large datasets and derive valuable insights. Please note that if you are NOT a passport holder of the …
stage. A very exciting opportunity! The following skills/experience are required: strong Data Architect background. Experience in data technologies including: Finbourne LUSID, Snowflake, Hadoop, Spark. Experience in cloud platforms: AWS, Azure or GCP. Previously worked in Financial Services: understanding of data requirements for equities, fixed income, private assets …
in AWS with a data focus. Experience in the front office of a financial company, ideally within commodities, is required. Additional technical experience with Snowflake and Terraform would be a huge plus. You’ll have significant input into building new tools and models and will see the impact of your …
75k PA

**Key Concepts:**
- Jira (experience using Jira for requirements and bug tracking)
- Testing with voluminous data
- Experience with S3 buckets, Airflow jobs, Snowflake, and CloudWatch
- ETL and data warehousing
- SQL
- Testing concepts
- AWS (Amazon Web Services) for testing
- Python testing

**Key Responsibilities & Requirements:**
1. **Software Quality Assurance Experience …
years in data engineering or data science consulting, with strong ETL pipeline development and cloud-based environment experience (Azure, AWS, Databricks, or Snowflake). Proficient in Python (including NumPy, pandas, scikit-learn), SQL, dimensional modeling, Power BI, Git, CI/CD, and VS Code/PyCharm. Proficiency in English and French …
based environment with multi-threaded Java and a good working knowledge of an RDBMS. Some experience with technologies such as MongoDB, Kafka, IBM MQ, Snowflake, or other leading-edge high-performance data and caching technologies would be helpful. The candidate should have strong analytical skills, strong software engineering skills, a …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
// Excellent programming skills in Python.
// SQL/NoSQL database management systems (PostgreSQL, MongoDB).
// Familiar with BigQuery, Snowflake, Firebolt and/or Amazon (preferred).
// Ability to work on messy, complex real-world data challenges.
// Knowledge of BI tools …
based technologies (Azure, AWS, or GCP). You'll design scalable data architectures, including data lakes, lakehouses, and warehouses, leveraging tools such as Databricks, Snowflake, and Azure Synapse. The ideal candidate will have a deep technical background in data engineering and a passion for leading the development of best-in …
identify wider business impact, risk and opportunities, making connections across key outputs and processes. You’ll also demonstrate: knowledge of data architecture, key tooling (Snowflake, AWS, Tableau) and relevant coding languages (SQL, SAS, Python); strong knowledge of data management principles; experience of translating data and insights for key stakeholders; knowledge …
pipelines, and doing transformation and ingestion in a certain tech suite. Minimum Requirements: Extensive experience in implementing solutions around the AWS cloud environment (S3, Snowflake, Athena, Glue). In-depth understanding of database structure principles. Strong knowledge of database structure systems and data mining. Excellent understanding of Data Modelling (ERwin …
data modeling, and business intelligence best practices. Knowledge of Agile, DevOps, Git, APIs, microservices, and data pipeline development. Familiarity with Spark, Kafka, or Snowflake is a plus. Desirable Certifications: Microsoft Certified: Fabric Analytics Engineer Associate. Why Join Us? Competitive salary up to £70,000 per year; opportunities for growth …