and AI models. Data Engineer Required Experience Data engineering experience (2+ years) Cloud platform proficiency (e.g., AWS, Azure, GCP) Data pipeline development (e.g., Airflow, Apache Spark) SQL proficiency, database design Visualization tools knowledge (e.g., Tableau, Power BI, Looker) Data Engineer Application Process This is a 1-year contract requirement more »
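The "SQL proficiency, database design" requirement above can be made concrete with a small, self-contained sketch. All table and column names here are invented for illustration, and Python's built-in sqlite3 stands in for a production database:

```python
# Illustrative only: a toy pipeline-style aggregation of the kind the
# "SQL proficiency" requirement points at. Schema is invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO orders (region, amount) VALUES
        ('EMEA', 120.0), ('EMEA', 80.0), ('APAC', 250.0);
""")

# Group-and-aggregate: the bread and butter of reporting/pipeline SQL.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('APAC', 250.0), ('EMEA', 200.0)]
```

The same GROUP BY / aggregate pattern carries over directly to the warehouse and visualization tools the listing names.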
tooling Scripting experience (Python, Perl, Bash, etc.) ELK (Elastic Stack) JavaScript Cypress Linux experience Search engine technology (e.g., Elasticsearch) Big Data technology experience (Hadoop, Spark, Kafka, etc.) Microservice and cloud-native architecture Desirable Skills Able to demonstrate experience of troubleshooting and diagnosing technical issues. Able to demonstrate excellent more »
London (city), London, England Hybrid / WFH Options
T Rowe Price
or similar, with 6+ years of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate would be more »
Newcastle Upon Tyne, United Kingdom Hybrid / WFH Options
NHS Counter Fraud Authority
science, e.g., SQL, R and Python alongside the ability to use tools and packages such as Alteryx, Jupyter Notebook, R Markdown, TensorFlow, Keras, PyTorch, Apache Spark etc. Practical proficiency in producing reproducible code and pipelines including documentation, governance and assurance frameworks, automation and code review using tools such more »
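The "reproducible code and pipelines" requirement above has many parts (documentation, governance, review); one small, checkable part is determinism. A minimal sketch, assuming only the Python standard library and an invented `sample_step` helper:

```python
# Reproducibility sketch: pinning a random seed so a sampling step in a
# pipeline yields identical output on every run. Function name and parameters
# are invented for illustration; stdlib only.
import random

def sample_step(data, k, seed=42):
    """Deterministically sample k items; documented and seeded for review."""
    rng = random.Random(seed)  # local RNG: no global-state side effects
    return rng.sample(data, k)

first = sample_step(list(range(100)), 5)
second = sample_step(list(range(100)), 5)
print(first == second)  # True: same seed, same output, run after run
```

Deterministic steps like this are what make code review and assurance frameworks meaningful, since a reviewer can re-run the pipeline and get the same artifact.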
working with AWS technologies such as Lambda, ECS Fargate, API Gateway, RDS, DynamoDB and EMR; building customer-facing applications and APIs; building data pipelines using Spark + Scala that process TB of data per day; working with customers to understand the business context of new features; participating in design reviews more »
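The "data pipelines using Spark + Scala" duty follows a classic map/shuffle/reduce shape. As a sketch only (real Spark code would use DataFrames or RDDs on a cluster; record fields here are invented), the stage structure looks like:

```python
# Toy map -> shuffle -> reduce over records, mirroring the stage structure of
# a Spark batch aggregation. Pure Python, illustrative only.
from collections import defaultdict

records = [
    {"user": "a", "bytes": 100},
    {"user": "b", "bytes": 250},
    {"user": "a", "bytes": 50},
]

# "Map" stage: emit (key, value) pairs.
pairs = [(r["user"], r["bytes"]) for r in records]

# "Shuffle" stage: group values by key (Spark does this across the cluster).
groups = defaultdict(list)
for key, value in pairs:
    groups[key].append(value)

# "Reduce" stage: aggregate each group.
totals = {key: sum(values) for key, values in groups.items()}
print(totals)  # {'a': 150, 'b': 250}
```

At TB-per-day scale the shuffle stage dominates cost, which is why partitioning and key design come up in this kind of role.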
Senior Data Engineer - Python/Hadoop/Spark - sought by leading investment bank based in London - Hybrid - contract *inside IR35 - umbrella* Key Responsibilities: Design and implement scalable data pipelines that extract, transform and load data from various sources into the data lakehouse. Help teams push the boundaries of analytical … processes, and data warehousing. Significant exposure and hands-on experience with at least two of the programming languages Python, Java, Scala and Go. Significant experience with Hadoop, Spark and other distributed processing platforms and frameworks. Experience working with open table/storage formats such as Delta Lake, Apache Iceberg or Apache Hudi. Experience of developing and managing real-time data streaming pipelines using change data capture (CDC), Kafka and Apache Spark. Experience with SQL and database management systems such as Oracle, MySQL or PostgreSQL. Strong understanding of data governance, data quality, data contracts, and data security best practices. Exposure more »
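The CDC streaming requirement in the listing above reduces, at its core, to folding a stream of insert/update/delete events into a keyed table. A minimal sketch, with Kafka and Spark out of scope and an invented event schema:

```python
# Toy change-data-capture (CDC) apply loop: a stream of change events is
# folded into a keyed table, as a Kafka + Spark CDC pipeline would do at
# scale. Event schema is invented for illustration.
events = [
    {"op": "insert", "key": 1, "value": {"name": "alice"}},
    {"op": "update", "key": 1, "value": {"name": "alicia"}},
    {"op": "insert", "key": 2, "value": {"name": "bob"}},
    {"op": "delete", "key": 2, "value": None},
]

table = {}
for e in events:
    if e["op"] in ("insert", "update"):
        table[e["key"]] = e["value"]  # upsert the latest value for the key
    elif e["op"] == "delete":
        table.pop(e["key"], None)     # tolerate deletes of unknown keys

print(table)  # {1: {'name': 'alicia'}}
```

Real CDC pipelines add ordering guarantees, idempotency and late-event handling on top of this apply loop, which is where Kafka partitioning and Spark structured streaming come in.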
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Experis
knowledge of security principles and best practices for cloud-based solutions. Preferred Skills : Certification in cloud platforms. Experience with big data technologies such as Apache Hadoop, Spark, or Kafka. Knowledge of data governance and compliance frameworks. Familiarity with DevOps practices and tools (e.g., Git, Jenkins, Terraform). All more »
scripting. * Minimum of 1 year's experience in investment banking or the financial sector. * Performance tuning of Oracle/MySQL/Hive SQL queries/Spark SQL statements. * Experience in working with large, multi-terabyte databases (3+ terabytes). * Minimum of 5 years' experience in the Big Data space (Hive, Impala … Spark SQL, HDFS etc.). * Any cloud experience (AWS/Azure/Google/Oracle). * Solid experience with Oracle objects (packages, procedures, functions). * Very clear concepts of Oracle architecture. * Very strong debugging skills. * Proficient in query tuning. * Detail-oriented. * Strong written and verbal communication skills. * Ability to work more »
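The query-tuning bullets above follow one workflow across Oracle, Hive and Spark SQL alike: read the execution plan, then change access paths. A small sketch using SQLite (via Python's stdlib) as a stand-in, since its EXPLAIN QUERY PLAN shows the same scan-vs-index distinction; table names are invented:

```python
# Query tuning in miniature: adding an index changes a full table scan into
# an indexed search, visible in the plan. SQLite stands in for Oracle/Hive;
# exact plan wording varies by database and version.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, symbol TEXT, qty INTEGER)")

plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE symbol = 'AAPL'"
).fetchall()
print(plan_before[0][3])  # plan text reports a full scan of trades

conn.execute("CREATE INDEX idx_symbol ON trades(symbol)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE symbol = 'AAPL'"
).fetchall()
print(plan_after[0][3])   # plan text now references idx_symbol
```

Reading the plan before and after each change is the discipline behind every item in the tuning bullet, whatever the engine.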
As a Data Architect, you'll lead the development of Java and Python projects, design API integrations using Spark, and collaborate with clients and internal teams to translate business requirements into high- and low-level designs. You'll also define architecture and technical designs, create data flows and integrations … users and client teams. Stay updated with the latest trends and best practices. Qualifications: Expertise in Java and Python development (Essential). Experience with Spark or Hadoop (Essential). Knowledge of Trino or Airflow (Desirable). Proven ability to design and implement scalable and secure solutions. Excellent communication and more »