Manchester, North West, United Kingdom Hybrid / WFH Options
Viqu Limited
Databricks Engineer – Remote – Manchester - 6 Months Contract (Initial) – Outside IR35. We are seeking 4 Databricks Engineers on a contract basis to assist our client as they migrate from legacy systems to AWS. The main focus will be to centralise the organisation's data warehouses and consolidate/centralise a number of … BI tools. Databricks Engineers should have extensive experience working with Databricks on recent projects and exposure to the latest Databricks features and toolsets. Role Responsibilities: Design and develop scalable data pipelines using Databricks. Implement data processing workflows in Python and SQL. Manage and optimise data storage solutions on … AWS. Ensure data quality and integrity across all stages of data processing. Develop and maintain documentation for data engineering processes. Utilise Databricks features, including Delta Live Tables (DLT), to simplify streaming and ensure data quality. Work with asset bundles in Databricks for streamlined data operations. Key Skills: Proven experience as more »
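The responsibilities above pair Python-and-SQL processing workflows with data-quality gates. A minimal sketch of that pattern, using the standard-library sqlite3 module in place of a warehouse; the table, columns, and quality rules are illustrative assumptions, not the client's schema:

```python
import sqlite3

def load_orders(rows):
    """Load raw (order_id, amount) rows into SQLite, rejecting records
    that fail basic quality checks, and return the loaded total plus
    the rejected rows for inspection.

    NOTE: table name, columns, and the quality rules below are
    hypothetical, chosen only to illustrate a quality gate.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL NOT NULL)"
    )
    clean, rejected = [], []
    for order_id, amount in rows:
        # Quality gate: ids and amounts must be present, amounts non-negative.
        if order_id is None or amount is None or amount < 0:
            rejected.append((order_id, amount))
        else:
            clean.append((order_id, amount))
    conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
    conn.commit()
    total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    conn.close()
    return total, rejected

total, rejected = load_orders([(1, 10.0), (2, -5.0), (3, 2.5)])
```

In a Databricks setting the same separation (validate, quarantine, then load) is what DLT expectations automate declaratively.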
Inmon and hybrid data warehouse design. Cloud data products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Experience developing in Databricks and coding with PySpark and Spark SQL. ETL coding standards: ensuring that code is standardised, self-documenting, and can be reliably tested. Knowledge of more »
and unit testing. Infrastructure-as-code expertise with CI/CD pipelines. Ability to communicate complex topics clearly. Nice to Have: Experience with Snowflake, Databricks, and GCP or Azure. Knowledge of streaming data architectures. Data security and compliance implementation. Machine Learning Operations (MLOps) experience. more »
Kimball & Inmon, and hybrid data warehouse design. Cloud data products such as: Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Experience developing in Databricks and coding with PySpark and Spark SQL. Proficient in ETL coding standards. Data encryption techniques and standards. Knowledge of relevant legislation such as: Data Protection more »
hybrid data warehouse design. Knowledge of cloud data products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Experience developing in Databricks and coding with PySpark and Spark SQL. Experience in design and development of complex data and analytics solutions in an iterative manner for large more »
in the Banking and Financial Services sector is advantageous. Deep knowledge of or experience with as many of the following as possible: Azure Cloud Data Components | Databricks | Python | PySpark | Terraform | APIs | Lakehouse | Data Mesh | NoSQL DBs | GitHub. Person Specification: Self-motivator with a desire to learn new skills and embrace new technologies more »
City of London, London, United Kingdom Hybrid / WFH Options
Concept Resourcing
Salesforce projects. Key Responsibilities: Architecture, modelling and leadership skills. Strong Azure data skills, including: Azure Data Factory V2, Azure Data Lake Storage V2, Azure Databricks, Azure Function Apps & Logic Apps, Azure Stream Analytics, Azure Resource Manager skills (Terraform, Azure Portal, Az CLI and Az PowerShell). Strong PySpark, Delta Lake, Unity more »
Senior Databricks Migration Consultant – Remote. This role demands in-depth knowledge of data engineering, cloud technologies (preferably AWS), and a successful record in enterprise-level data migrations into Databricks. You will ensure the efficient and secure transition of our data assets to the Databricks Lakehouse Platform, working closely with our … impacts, and project dependencies. - Develop detailed data migration strategies, including architectural plans, data quality evaluations, testing, and creating robust data pipelines and workflows following Databricks best practices. - Maintain data integrity, security, and compliance with regulations like GDPR. - Manage data migration from legacy analytics data warehouses, data lakes, and ETL tools … to Databricks within set deadlines. - Comfortable operating in all project phases, adjusting requirements, and supporting the transition from build and migration to production. - Serve as a Databricks SME, sharing knowledge, technical guidance, and troubleshooting. Basic skillset requirements: - Proven experience in AWS and/or Azure data migrations - Databricks architecture, components more »
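One concrete piece of the "maintain data integrity" responsibility in a migration like this is reconciling the migrated tables against the legacy source. A small sketch of a per-table row-count reconciliation check; the table names and counts are invented for illustration, and a real migration would also compare checksums or sampled values:

```python
def reconcile_counts(source_counts, target_counts):
    """Compare per-table row counts between a legacy warehouse and the
    migrated target; return a dict of tables whose counts disagree,
    mapped to (source_count, target_count).

    NOTE: a hypothetical integrity check, not a Databricks API.
    """
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)  # None if the table was not migrated
        if tgt != src:
            mismatches[table] = (src, tgt)
    # Tables that appear only in the target are also flagged.
    for table, tgt in target_counts.items():
        if table not in source_counts:
            mismatches[table] = (None, tgt)
    return mismatches

issues = reconcile_counts(
    {"sales": 1000, "customers": 250},
    {"sales": 1000, "customers": 249},
)
```

An empty result is the sign-off condition; any non-empty result names exactly which tables need investigation before cutover.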
achieving additional value through innovative modelling. Experience: Experience of advanced analytics tool stacks and developing pipelines from different data sources, APIs especially. Experience of Databricks is fundamental, working with data scientists and other stakeholders to develop sophisticated analytical solutions. Must have experience in setting up data pipelines (ELT) from more »
team. As a data-focused Technical Business Analyst you will need strong requirements-gathering and stakeholder-management skills, as well as hands-on experience with Azure, Databricks, Data Factory, Power BI and Azure SQL. Location : Remote (Occasionally reporting on site for meetings). Length : 6 Months+ Day Rate : £(Apply online only) per more »
Status: Inside IR35. Required experience will include: Expertise in designing and implementing data pipelines using Azure services such as Azure Data Factory, Azure Databricks and DBT. Hands-on experience with SQL database design. Experience of product lifecycle management principles and tools (e.g. DevOps, Terraform) and relational database manipulation more »
IR35. Required experience will include: 5 years+ QA/Quality Assurance experience. Experience of ETL processing/data warehousing testing. Excellent Azure experience, including Databricks and Data Factory. Hands-on experience with SQL or Azure SQL. Experience with automated testing using Python frameworks. Experience with Cucumber, SpecFlow and other frameworks. more »
storage Highly Preferred Technical Skills: Experience with Kubernetes for container orchestration and data platform scalability Expertise in database tuning for performance optimization Familiarity with Databricks for large-scale data processing If interested, please apply or message me directly at more »
and techniques. Experienced in using one or more analytical tools, e.g. R, Python, Tableau, Apache Spark, etc. Highly skilled in Azure Machine Learning and Azure Databricks. Knowledge and experience of how to maintain data, tools and processes to generate reproducible analysis and implement robust and valuable data solutions. If this role more »
Privacy technologies such as homomorphic encryption, SMPC, Federated Learning, TEE. Any prior experience with data management or tooling hugely beneficial (Data Lakes, Snowflake or Databricks). Experience working closely with engineering teams (particularly those working on Machine Learning and Cryptography products). Ability to deliver product presentations, and showcase software more »
Employment Type: Contract
Rate: From £1,100 to £1,300 per day Outside of IR35
North West London, London, United Kingdom Hybrid / WFH Options
Viqu Limited
Senior Data Engineer (Databricks) – Remote - 6 Months Contract (Initial) – Outside IR35. VIQU are seeking 4 Senior Data Engineers (Databricks) on a contract basis to assist our client as they migrate from legacy systems to AWS. The main focus will be to centralise the organisation's data warehouses and consolidate/centralise … a number of BI tools. Senior Data Engineers should have extensive experience working with Databricks on recent projects and exposure to the latest Databricks features and toolsets. Role Responsibilities: Design and develop scalable data pipelines using Databricks. Implement data processing workflows in Python and SQL. Manage and optimise … storage solutions on AWS. Ensure data quality and integrity across all stages of data processing. Develop and maintain documentation for data engineering processes. Utilise Databricks features, including Delta Live Tables (DLT), to simplify streaming and ensure data quality. Work with asset bundles in Databricks for streamlined data operations. Key Skills more »
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Viqu Limited
Senior Data Engineer (Databricks) – Remote – Leeds - 6 Months Contract (Initial) – Outside IR35. We are seeking 4 Senior Data Engineers (Databricks) on a contract basis to assist our client as they migrate from legacy systems to AWS. The main focus will be to centralise the organisation's data warehouses and consolidate/… centralise a number of BI tools. Senior Data Engineers should have extensive experience working with Databricks on recent projects and exposure to the latest Databricks features and toolsets. Role Responsibilities: Design and develop scalable data pipelines using Databricks. Implement data processing workflows in Python and SQL. Manage and … storage solutions on AWS. Ensure data quality and integrity across all stages of data processing. Develop and maintain documentation for data engineering processes. Utilise Databricks features, including Delta Live Tables (DLT), to simplify streaming and ensure data quality. Work with asset bundles in Databricks for streamlined data operations. Key Skills more »
City of London, London, United Kingdom Hybrid / WFH Options
TALENT INTERNATIONAL UK LTD
solutions using Azure services such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Create and optimize data processing workflows in Databricks using PySpark and Spark SQL. Ensure ETL coding standards are met, including self-documenting code and reliable testing. Apply best practice data encryption techniques and … warehouse design. Extensive experience with Azure data products including Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Proficient in developing with Databricks, PySpark, and Spark SQL. Strong understanding of ETL coding standards, including standardized, self-documenting code and reliable testing. Knowledge of data encryption techniques and standards. more »
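The listing above asks twice for ETL coding standards: standardised, self-documenting code with reliable testing. One common way to meet that bar is to write each transform as a small pure function that can be unit-tested without any cluster; this sketch assumes a hypothetical raw-event shape and target schema, invented here for illustration:

```python
from datetime import datetime

def standardise_record(raw):
    """Normalise one raw event into the warehouse schema.

    Written as a pure function (no I/O, no shared state) so it can be
    unit-tested in isolation, per the ETL coding standards described.
    The input keys and output schema are hypothetical examples.
    """
    return {
        "customer_id": int(raw["id"]),
        # Source dates arrive as UK-style dd/mm/yyyy strings (assumed).
        "event_date": datetime.strptime(raw["date"], "%d/%m/%Y").date().isoformat(),
        "amount_gbp": round(float(raw["amount"]), 2),
    }

row = standardise_record({"id": "42", "date": "05/03/2024", "amount": "19.999"})
```

In PySpark the same function can be applied per-row or rewritten column-wise, but keeping the logic in a plain, importable function is what makes the "reliable testing" requirement cheap to satisfy.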
South East London, London, United Kingdom Hybrid / WFH Options
The Bridge (IT Recruitment) Limited
on a long-term contract, inside IR35, on a remote basis. The key skills required for this Python Developer role are: Python, ETL, Azure Databricks, PySpark. If you do have the required skills for this remote Python Developer contract, please do apply. more »
CX with potential extension Requirements: 5+ Years as an Azure Data Architect Must have Active SC Clearance Background in financial services Experience working with Databricks Extensive data modelling knowledge Will have designed data lakes more »
the achievement of prioritised Use Cases and Business Objectives. Technical and Business Metadata Management: Develop a strategy for technical and business metadata capture on Databricks Unity Catalog and Purview, ensuring seamless integration between the two. Define the technical and business metadata required at different layers (Bronze, Silver, Gold). Implement more »
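The metadata-management brief above distinguishes required metadata by medallion layer (Bronze, Silver, Gold). A toy sketch of how such per-layer requirements might be encoded and checked; the field lists are assumptions for illustration only, not a Unity Catalog or Purview schema:

```python
# Hypothetical per-layer metadata requirements for a medallion architecture.
# Each layer demands progressively more business context, which is the
# usual rationale for splitting technical vs business metadata by layer.
REQUIRED_METADATA = {
    "bronze": {"source_system", "ingestion_timestamp"},
    "silver": {"source_system", "ingestion_timestamp", "schema_version"},
    "gold": {"source_system", "schema_version", "business_owner", "refresh_schedule"},
}

def missing_metadata(layer, metadata):
    """Return the required metadata keys absent for a table at this layer,
    sorted for stable reporting."""
    return sorted(REQUIRED_METADATA[layer] - set(metadata))

gaps = missing_metadata("gold", {"source_system": "crm", "schema_version": "v2"})
```

A check like this can run in CI against table definitions, flagging gaps before a dataset is promoted to the next layer.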
London, Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
competitive day rate of £250-£400, falling inside IR35 regulations. Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using AWS, Databricks, Python, Spark, and SQL. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Optimize and troubleshoot … Engineer or in a similar role. Strong proficiency in AWS services related to data engineering (e.g., S3, Redshift, Glue). Hands-on experience with Databricks for data processing and analytics. Proficient in Python programming for data manipulation and automation. Solid understanding of Apache Spark for big data processing. Strong SQL more »
Inmon, and hybrid data warehouse design principles. Utilize cloud data products such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Databricks and PySpark Development: Develop in Databricks with experience coding in PySpark and Spark SQL. Ensure ETL code is standardized, self-documenting, and can be reliably … warehouse design. Proficient in using cloud data products like Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Experienced in developing with Databricks and coding in PySpark and Spark SQL. Thorough understanding of coding standards for ETL processes. Knowledgeable about best practice data encryption techniques and standards. Familiar more »
Birmingham, West Midlands, West Midlands (County), United Kingdom
Randstad Technologies Recruitment
Azure Data Engineer - Birmingham - £500pd Outside IR35. I'm currently looking for a Data Engineer to join a company that is based in Birmingham. They need someone to start in the next few weeks. Location: Birmingham (on site 2 times a more »