Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
BJSS
challenges your customers face and how you can help them. You’ll bring broad experience of the data ecosystem, including: numerous data platforms, e.g. Databricks; data modelling approaches; public cloud services including AWS and Azure; and deep knowledge of, and focus on, security and privacy and their application to data platforms …
Birmingham, England, United Kingdom Hybrid / WFH Options
BJSS
You can expect to get involved in a variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
BJSS
You can expect to get involved in a variety of projects in the cloud (AWS, Azure, GCP), learning about and using data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. About You You're an engineer at heart and enjoy the challenge …
that’s working with some of the following to build high throughput, low latency, near real-time products: Java and Scala based Web Services, Databricks Data Lakes (Delta Lakes), AWS Kinesis and MSK, AWS ElasticSearch, AWS RDS, Apache Flink & Spark, scripting in Python, and infrastructure as code with Terraform. The …
data solutions from the earliest discovery phases of projects through prototyping, architectural design and delivery. You will be working with Azure tools such as Databricks and Data Factory, as well as Hadoop, to create big data environments which, in turn, will help businesses to gain greater insight into their big data … BI/Analysis Services/DAX Data Modelling/Data Warehouse Theory Nice to Have Azure Modern Data Platform Services (Data Lake, Data Factory, Databricks, Synapse) Azure DevOps/CI/CD/YAML/ARM/Terraform MSBI Traditional Stack (SQL, SSAS, SSIS, SSRS) Azure Automation/PowerShell Azure …
practices for data engineering processes. Requirements: Previous experience managing a team of data engineers. Proficiency in Azure (including services like Azure Data Factory, Azure Databricks, etc.). Strong programming skills in Python . Familiarity with data modeling, ETL processes, and data warehousing. Excellent communication and leadership abilities. Additional Information: This …
team. As a Data focused Technical Business Analyst you will have strong requirements-gathering and stakeholder-management skills, as well as hands-on experience with Azure, Databricks, Data Factory, Power BI and Azure SQL. Location : Remote (occasionally reporting on site for meetings). Length : 6 Months+ Day Rate : £(Apply online only) per …
for data management and you’ll help design and develop various Data solutions. Required Skills and Qualifications : Solid experience with Python/SQL; ADF, Databricks and Synapse knowledge required. Strong client-facing experience as a data engineer/consultant. Experience designing and developing ELT/ETL processes. Experience with Azure …
of Python code for you to maintain and optimize. You'll need hands-on coding skills for this role; Python experience solely within Databricks or ADF won't suffice. This is an exciting time to be part of an enterprise environment like this one, gaining invaluable experience and working …
and the business stakeholders Investigate and provide solutions for issues (bugs, data errors) Key skills Power BI – DAX, Power Query (M) Power Apps – advantageous Azure, Databricks Stakeholder management What’s on Offer Up to £90k salary package working in either Leeds, Manchester or London + flexible hours More exciting benefits Please apply …
and the business stakeholders Investigate and provide solutions for issues (bugs, data errors) Key skills Power BI – DAX, Power Query (M) Power Apps – advantageous Azure, Databricks Stakeholder management What’s on Offer Up to £90k salary, bonus package Hybrid working in either Leeds, Manchester or London + flexible hours More exciting …
Manchester Area, United Kingdom Hybrid / WFH Options
Corecom Consulting
and the business stakeholders Investigate and provide solutions for issues (bugs, data errors) Key skills Power BI – DAX, Power Query (M) Power Apps – advantageous Azure, Databricks Stakeholder management What’s on Offer Up to £90k salary (DOE), bonus package Hybrid working in either Leeds, Manchester or London + flexible hours More …
high availability Desired experience: Worked with Python 3.9+ Familiar with Python test automation Experience with SQL and time-series databases Familiar with Parquet, Arrow, Airflow, Databricks Experience with AWS cloud services, such as S3, EC2, RDS etc. Quality engineering best practice and tooling, including TDD, BDD. This is an inside IR35 …
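The listing above pairs SQL skills with time-series databases. A minimal stdlib-only sketch of the kind of windowed aggregation that combination implies, using SQLite in memory (the `readings` table and sample values are hypothetical, not from any listing):

```python
import sqlite3

# In-memory database standing in for a time-series store (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT, sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [
        ("2024-01-01T00:00:00", "a", 1.0),
        ("2024-01-01T00:30:00", "a", 3.0),
        ("2024-01-01T01:15:00", "a", 5.0),
    ],
)

# Bucket readings into hours and average each bucket.
rows = conn.execute(
    """
    SELECT strftime('%Y-%m-%dT%H:00:00', ts) AS hour,
           AVG(value) AS avg_value
    FROM readings
    GROUP BY hour
    ORDER BY hour
    """
).fetchall()
print(rows)  # → [('2024-01-01T00:00:00', 2.0), ('2024-01-01T01:00:00', 5.0)]
```

A dedicated time-series database would replace the `strftime` bucketing with a native time-bucket function, but the grouping pattern is the same.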
skills with SQL, working with large and complex data sets to extract insights and identify trends Exposure to cloud-based analytical platforms such as Databricks, Snowflake, Google BigQuery etc. A proven background in paid media measurement and web analytics for customer journey purposes Strong technical skills with Google Analytics (GA …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Experis
understanding of data management frameworks such as DCAM or DAMA and how they can be used to improve data quality Microsoft Azure stack and Databricks experience desirable Number of direct reports: 1+ Geographic area of impact: Global Size of budget: N/A Key stakeholders: Business stakeholders, Analytics, Data Governance & Quality …
Exeter, England, United Kingdom Hybrid / WFH Options
CA Tech Talent
A minimum of 5 years’ experience in a similar role; experience in the healthcare industry would be advantageous. Proven working knowledge of the following: Databricks, Python, Power BI, T-SQL, ETL, Agile methodologies and refactoring. Great communication skills; comfortable presenting yourself and information in presentations and/or meetings. Company Benefits …
experience in retail/marketing but not required. Candidates should be looking to work in a fast-paced, startup-feel environment. Tech across: Python, SQL, AWS, Databricks, PySpark, A/B Testing, MLflow, APIs. Apply below! CONTACT If you can’t see what you’re looking for right now, send us your CV anyway – we’re …
select, design, implement and manage data lake/data warehouse platforms. (Some of the following types of providers: AWS, Microsoft Azure, Google Cloud Platform, Databricks, Snowflake, Cloudera, Spark, MongoDB) Done this at companies handling high volumes of data, ideally in retail. Other sectors using high-volume data would also …
Job Title: Azure Data Engineer (Contract) Duration: 3 months Rate: £400 per day Location: Remote Skills: Databricks/Azure Data Lake/Kubernetes IR35 Determination: Outside IR35 Company Overview: We are seeking a skilled and experienced Azure Data Engineer to join our team on a contract basis. This is an … reliability, efficiency, and performance of our data solutions on the Azure platform. Responsibilities: Design, develop, and implement data solutions using Azure services, including Azure Databricks, Azure Data Lakes, and other cloud-native technologies. Work closely with cross-functional teams to understand data requirements and translate them into technical specifications. Build … Contract Details: Duration: 3 months Start Date: ASAP Rate: £400 Outside IR35 Data Engineer/Day Rate/Contractor/Azure/Data Lakes/Databricks/Remote/Outside IR35 …
to link technical solutions with business objectives to drive value. Exceptional communication, presentation, and negotiation skills. Certifications relevant to Azure, AWS, Google Cloud Platform, Databricks, or similar technologies are highly desirable. Willingness to travel as required to meet business needs. What We Offer: Remote Working: Flexible options to support your …
impacts, debt and risks. Technical knowledge or experience in the following data languages or toolsets: AutoSys, Azure Function App, Azure GIT, Azure Portal, C#, Databricks, GraphQL/Graph API, Informatica CDI, Informatica Power Center, Java, Javascript, Snowflake, PowerBI, PyRecs, Python, Selenium, Spark, and SQL Nice to have skills Ability to …
bring fresh ideas to the table. Key Responsibilities Engineer and orchestrate data flows & pipelines in a cloud environment using a progressive tech stack e.g. Databricks, Spark, Python, PySpark, Delta Lake, SQL, Logic Apps, Azure Functions, ADLS, Parquet, Neo4J, Flask Ingest and integrate data from a large number of disparate data … in training and learn new technologies and techniques Tackle challenging project tasks demonstrating ownership and responsibility Skills: Experience in processing data using Spark/Databricks or similar Experience working in a cloud environment (Azure, AWS, GCP) Experience in at least one of: Python (or similar), SQL, PySpark Experience in building …
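The role above centres on ingesting disparate sources into one pipeline and transforming the result. A stdlib-only sketch of that ingest-and-transform pattern, without Spark (the two feeds, the field names, and both helper functions are hypothetical illustrations):

```python
import csv
import io
import json

# Two hypothetical "disparate sources": a CSV feed and a JSON feed.
csv_feed = "id,amount\n1,10.5\n2,4.0\n"
json_feed = '[{"id": 3, "amount": 7.25}]'

def ingest(csv_text: str, json_text: str) -> list[dict]:
    """Normalise both feeds into a single list of uniform records."""
    records = [
        {"id": int(r["id"]), "amount": float(r["amount"])}
        for r in csv.DictReader(io.StringIO(csv_text))
    ]
    records += [
        {"id": int(r["id"]), "amount": float(r["amount"])}
        for r in json.loads(json_text)
    ]
    return records

def transform(records: list[dict]) -> dict:
    """A toy aggregation step: record count and total amount per run."""
    return {"count": len(records), "total": sum(r["amount"] for r in records)}

result = transform(ingest(csv_feed, json_feed))
print(result)  # → {'count': 3, 'total': 21.75}
```

In a Databricks/PySpark setting the same two steps would become DataFrame reads and a grouped aggregation writing to a Delta table; the normalise-then-aggregate shape carries over directly.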
in community events. Relevant training in the use of elements of Microsoft Azure Technologies (Azure Analysis Service, Data Factory, Data Lake Storage/Analytics, Databricks, Synapse, Purview, Fabric etc.) Experience Essential Significant experience in two or more of the following: Designing and Building Data Warehousing Solutions, SQL Server Integration Services … core BI and data warehouse design principles. Knowledge of latest Microsoft Cloud based technologies (Azure Analysis Service, Data Factory, Data Lake Storage/Analytics, Databricks, Synapse, Purview, Fabric etc.) Desirable Knowledge of cloud technologies and concepts Knowledge of NHS Information requirements, datasets, data dictionary. Knowledge of visualisation & dashboard design best …
months Skills required for the role: Big data experience Data inventory and data familiarisation Efficient data ingestion and ingestion pipelines Data cleaning and transformation Databricks (ideally with Unity Catalog) Python and PySpark CI/CD (ideally with Azure DevOps) Unit testing (PyTest) If you have the above experience and are …
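The final listing pairs data cleaning with PyTest unit tests. A minimal sketch of that combination (the `clean_record` function and its rules are hypothetical, invented for illustration):

```python
# A hypothetical cleaning step and a pytest-style unit test for it.
def clean_record(raw: dict) -> dict:
    """Trim whitespace from string fields and drop None values."""
    return {
        k: v.strip() if isinstance(v, str) else v
        for k, v in raw.items()
        if v is not None
    }

def test_clean_record():
    raw = {"name": "  alice ", "age": 30, "note": None}
    assert clean_record(raw) == {"name": "alice", "age": 30}

# pytest would discover test_clean_record automatically; call it here to run standalone.
test_clean_record()
```

Keeping each transformation a small pure function, as here, is what makes pipeline code testable under PyTest in the first place.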