Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
and experience of this is a strong preference; however, other cloud platforms like AWS/GCP are acceptable. Coding - experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. What’s next? If you believe you have the desired…
Bournemouth, Dorset, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, and Data Factory. Azure is preferable, but AWS and GCP experience…
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, and Data Factory. Azure is preferable, but AWS and GCP experience…
in Azure Cloud technologies, e.g. MLOps, MLflow, Azure Data Factory, Azure Functions, Databricks, Event Hubs, Microservices/APIs, Python/PySpark/R, or SQL. 2+ years of experience designing, developing, and implementing Big Data platforms using Azure Cloud architecture with structured and unstructured data…
related field. Certifications such as Azure Data Engineer Associate are desirable. Knowledge of data ingestion methods for real-time and batch processing. Proficiency in PySpark and debugging Apache Spark workloads. What’s in it for you? Annual bonus scheme (up to 10%), excellent pension scheme, flexible working, enhanced family…
the insurance domain is advantageous. Education: a degree in Computer Science, Data Science, Engineering, or a related discipline. Technical Skills: proficient in Python, SQL, PySpark, and Databricks. Demonstrated proficiency in modern NLP techniques and tools. Proven track record in developing and managing data quality metrics and dashboards. Experience collaborating…
supply. We’re seeking Data Engineers who are able to demonstrate skills/knowledge such as: must have programming experience in SQL and Python; PySpark and/or R programming skills are desirable. Experience with Azure Cloud Services, particularly Azure Data Factory, Databricks, SQL Server/SQL databases, and Storage…
West Midlands, Dudley, West Midlands (County), United Kingdom Hybrid / WFH Options
Concept Resourcing
standardise the Data Warehouse environment and solutions. Required Skills, Knowledge, and Experience: experience designing, developing, and testing Azure Data Factory/Fabric pipelines, including PySpark Notebooks and Dataflow Gen2 workflows. Previous experience with Informatica PowerCenter is desired. Significant experience in designing, writing, editing, debugging, and testing SQL code…
Proficiency in modern programming languages and database querying languages. Comprehensive understanding of the Software Development Life Cycle and Agile methodologies. Familiarity with Python or PySpark and cloud technologies like AWS, Kubernetes, and Spark is highly desirable. Ideal Candidate: someone with a knack for innovative solutions and a commitment to…
Cycle. Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security. Preferred qualifications, capabilities, and skills: skilled with Python or PySpark; exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka); experience with Big Data solutions or relational databases; experience in the Financial Services industry is…
applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python & SQL. Experience with big data technologies and tools, particularly Databricks and PySpark, is highly desirable. Experience in agile software development processes is a plus. Experience in insurance, cyber, or a related domain is ideal. Understanding of…
Exeter, Devon, South West, United Kingdom Hybrid / WFH Options
Staffworx Limited
Ability to optimise workflows and analysis for MapReduce processing. Experience with BI software (Power BI, Tableau, Qlik Sense). Any experience with data engineering, PySpark, Databricks, or Delta Lake is beneficial. Confident presenting complex problems in ways suitable to the target audience. Experience leading or managing a small analysis team. Familiarity working…
cloud platforms. Expertise on the Azure Cloud platform. Knowledge of orchestrating workloads on cloud. Ability to set and lead the technical vision while balancing business drivers. Strong experience with PySpark and Python programming. Proficiency with APIs, containerization, and orchestration is a plus. Qualifications: Bachelor’s and/or Master’s degree. About you: you are self-motivated…
Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
that are comfortable with terms like: Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLMs), NLP, AI, Machine Learning, MLOps, Python, PySpark, and Azure. Flexible Working: the role is 60% work from home. Sponsorship: sadly, sponsorship isn’t available for these roles. About You: Bachelor’s degree…
V2, Azure Databricks, Azure Function Apps & Logic Apps, Azure Stream Analytics, Azure Resource Manager skills (Terraform, Azure Portal, Az CLI, and Az PowerShell). Strong PySpark, Delta Lake, Unity Catalog, and Python skills, including the ability to write unit and integration tests in Python with unittest, pytest, etc. Strong understanding of…
a complete greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, and dbt. What you need to bring: 1-3 years' experience in building data pipelines; SQL experience in data warehousing; Python experience would…
South East London, England, United Kingdom Hybrid / WFH Options
MBN Solutions
understanding of Quality and Information Security principles. Experience with Azure and ETL tools such as ADF and Databricks. Advanced database and SQL skills, along with SQL, Python, PySpark, and Spark SQL. Strong understanding of data model design and implementation principles. Data warehousing design patterns and implementation. Benefits: £50-£60k DOE; mainly home-based working; twice a…
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
of this is a strong preference. However, other cloud platforms like AWS/GCP are acceptable. • Coding Languages - experience using Python with data (Pandas, PySpark) would be an advantage. Other languages such as C# would be beneficial but not essential. Their lovely offices are based in the West Midlands…
the Financial Services, with experience in Statistical Models, Computer Vision, Predictive Analytics, Data Visualization, Large Language Models (LLMs), NLP, AI, Machine Learning, MLOps, Python, PySpark, Azure, Agile, or MetaBase, then please apply. You can 📧 your CV to matt@hawksworthuk.com or message me on LinkedIn. Ideally you'll have plenty of…
South East London, England, United Kingdom Hybrid / WFH Options
Durlston Partners
on building a massively distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt, and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering, including: end-to-end…