Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Damia Group Ltd
Spark Scala Developer - Scala/Apache Spark - Hybrid/Leeds - £450-£550 Spark Scala Developer to join our client, one of the biggest financial services organizations in the world, with operations in more than 38 countries. It has an IT infrastructure of 200,000+ servers … 000+ database instances, and over 150 PB of data. As a Spark Scala Engineer, you will be responsible for refactoring legacy ETL code, for example DataStage into PySpark, using Prophecy low-code/no-code tooling and available converters. Converted code is causing failures/performance issues. Your responsibilities: As … a Spark Scala Engineer you will be working for the GDT (Global Data Technology) Team and will be responsible for: Designing, building, and maintaining data pipelines using Apache Spark and Scala Working on an enterprise-scale Cloud infrastructure and Cloud Services in one of the Clouds (GCP
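The kind of rule being migrated can be sketched in plain Python (all field names here are invented for illustration; the real jobs would express this as Spark DataFrame operations). Auto-converted DataStage code often lands as per-row Python UDF logic like the function below, and a typical performance fix is to re-express it with Spark's built-in Column functions so the optimizer can see it:

```python
# Hypothetical sketch: a row-level cleanup rule of the sort a DataStage job
# might encode, written as a plain Python function. Field names are invented.

def clean_account_row(row: dict) -> dict:
    """Normalise one record: trim strings, upper-case the currency code,
    and default a missing balance to 0.0."""
    return {
        "account_id": row["account_id"].strip(),
        "currency": (row.get("currency") or "GBP").strip().upper(),
        "balance": float(row.get("balance") or 0.0),
    }

# In PySpark, the same rule pushed down to built-in functions (faster than a
# Python UDF, because Spark can optimise the whole expression) would resemble:
#   df.select(
#       F.trim("account_id").alias("account_id"),
#       F.upper(F.trim(F.coalesce("currency", F.lit("GBP")))).alias("currency"),
#       F.coalesce(F.col("balance").cast("double"), F.lit(0.0)).alias("balance"),
#   )

if __name__ == "__main__":
    print(clean_account_row({"account_id": " A123 ", "currency": " gbp", "balance": None}))
```

Replacing converted per-row UDFs with built-in column expressions like the commented PySpark form is one of the standard remedies for the failure and performance issues the role describes.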
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Damia Group Ltd
Spark/PySpark Architect - 12 months+ - Inside IR35 - Hybrid working of 3 days on site in Leeds My client is a Global Consultancy looking for a number of Spark/PySpark Architects to join them on a long-term programme. As the Spark architect, you … Integration upgrade to PySpark Collaboration with multiple customer stakeholders Knowledge of working with Cloud Databases Excellent communication and solution presentation skills. Able to analyse Spark code failures through Spark plans and make correcting recommendations Able to review PySpark and Spark SQL jobs and make performance improvement recommendations … Able to understand DataFrames/Resilient Distributed Datasets (RDDs), understand any memory-related problems, and make corrective recommendations Able to monitor Spark jobs using wider tools such as Grafana to see whether there are cluster-level failures. As a Spark architect, who can demonstrate deep knowledge
in backend development or data engineering. Hands-on experience with AWS services like S3, EC2, and EMR. Proficiency in SQL and experience with CDAP, Spark, and Kafka. Experience building scalable ETL processes and workflows. Strong programming ability with Python, Java, and unit testing. Infrastructure-as-code expertise with CI
London, Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
rate of £250-£400, falling inside IR35 regulations. Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using AWS, Databricks, Python, Spark, and SQL. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Optimize and troubleshoot data … Glue). Hands-on experience with Databricks for data processing and analytics. Proficient in Python programming for data manipulation and automation. Solid understanding of Apache Spark for big data processing. Strong SQL skills for data querying, transformation, and analysis. Excellent problem-solving abilities and attention to detail. Ability
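The extract-transform-load pattern named in these responsibilities can be sketched with only the standard library (sqlite3 stands in for the real AWS/Databricks targets; the table and field names are invented for illustration):

```python
# Minimal ETL sketch: extract raw records, validate/transform them, and load
# the clean rows into a warehouse table. sqlite3 is used purely as a stand-in
# for the real cloud targets; all names here are illustrative.
import sqlite3

def extract():
    # Extract: in practice this would read from S3, a source database, etc.
    return [("2024-01-01", "orders", "120.50"), ("2024-01-02", "orders", "bad")]

def transform(rows):
    # Transform: parse amounts and drop records that fail validation.
    out = []
    for day, source, amount in rows:
        try:
            out.append((day, source, float(amount)))
        except ValueError:
            continue  # a real pipeline would route these to a rejects table
    return out

def load(rows, conn):
    # Load: write the cleaned records to the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_totals (day TEXT, source TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO daily_totals VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM daily_totals").fetchone())
```

In the Spark/Databricks setting the role describes, the same three stages map onto reading from cloud storage, DataFrame transformations, and a write to Delta or a SQL warehouse, with the validation step scaled out across the cluster.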
tooling Scripting experience (Python, Perl, Bash, etc.) ELK (Elastic stack) JavaScript Cypress Linux experience Search engine technology (e.g., Elasticsearch) Big Data Technology experience (Hadoop, Spark, Kafka, etc.) Microservice and cloud native architecture Desirable Skills Able to demonstrate experience of troubleshooting and diagnosis of technical issues. Able to demonstrate excellent
or similar technologies. Hands-on experience with AWS and Snowflake. Financial services industry experience (highly desirable). Experience with Big Data technologies such as Spark or Hadoop. Bachelor's degree in computer science, Engineering, or equivalent. Further information available upon application. ECS Recruitment Group Ltd is acting as an
data products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Knowledge of developing in Databricks and experience in coding with PySpark and Spark SQL. Experience in design and development of complex data and analytics solutions in an iterative manner for large enterprise business/data warehouse implementations
design. Cloud data products such as: Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Experience developing in Databricks and coding with PySpark and Spark SQL. Proficient in ETL coding standards. Data encryption techniques and standards. Knowledge of relevant legislation such as: Data Protection Act, EU Procurement Directives, Freedom
data analysis techniques Ability to produce clear graphical representations and data visualisations. Knowledge of data analysis tools, for example, R Programming, Tableau Public, SAS, Apache Spark, Excel, RapidMiner Knowledge of data modelling, data cleansing, and data enrichment techniques An understanding of data protection issues Experience of working with
products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Detailed knowledge of developing in Databricks and experience in coding with PySpark and Spark SQL. ETL coding standards: ensuring that code is standardised, self-documenting, and can be reliably tested. Knowledge of best practice data encryption techniques and
City of London, London, United Kingdom Hybrid / WFH Options
TALENT INTERNATIONAL UK LTD
such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Create and optimize data processing workflows in Databricks using PySpark and Spark SQL. Ensure ETL coding standards are met, including self-documenting code and reliable testing. Apply best practice data encryption techniques and standards to ensure … experience with Azure data products including Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Proficient in developing with Databricks, PySpark, and Spark SQL. Strong understanding of ETL coding standards, including standardized, self-documenting code and reliable testing. Knowledge of data encryption techniques and standards. Familiarity with
Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Databricks and PySpark Development: Develop in Databricks with experience coding in PySpark and Spark SQL. Ensure ETL code is standardized, self-documenting, and can be reliably tested. Apply best practice data encryption techniques and standards. Understand relevant national … data products like Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Experienced in developing with Databricks and coding in PySpark and Spark SQL. Thorough understanding of coding standards for ETL processes. Knowledgeable about best practice data encryption techniques and standards. Familiar with relevant legislation related to
As a Data Architect, you'll lead the development of Java and Python projects, design API integrations using Spark, and collaborate with clients and internal teams to translate business requirements into high- and low-level designs. You'll also define architecture and technical designs, create data flows and integrations … users and client teams. Stay updated with the latest trends and best practices. Qualifications: Expertise in Java and Python development (Essential). Experience with Spark or Hadoop (Essential). Knowledge of Trino or Airflow (Desirable). Proven ability to design and implement scalable and secure solutions. Excellent communication and