Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
Data Engineer • Location: Belfast based – Hybrid, flexible working • Salary: £42,500 - £50,000 • Package: 10% bonus + 11% pension. Overview: One of the UK's leading digital solution providers is searching for a Data Engineer to join their practice.
Senior Data Engineer Up to £70k plus bonus Manchester Are you looking to take your Data Engineering career to the next level? This company uses extremely modern technologies, and you can be certain you will grow within a technical environment.
I have placed quite a few candidates with this organisation now, and all have given glowing reviews. They're the leader in legal representation comparison and have rather interesting, unique data to work with. They are using modern Azure …
London. Responsibilities: Collaborate with cross-functional teams to gather requirements and implement solutions. Develop and maintain data processing applications using Python. Optimise and tune PySpark jobs for performance and scalability. Ensure data quality, reliability, and integrity throughout the data processing pipelines. Technical Requirements: Python: Proficiency in Python programming. Object-Oriented Design: Solid understanding of object-oriented principles and design patterns. PySpark: Experience with PySpark for data processing and analytics. Azure: Familiarity with Azure services and cloud platforms. Financial Services Background: Knowledge of financial markets, instruments, and related data.
such as Code Repo, Code Workbook, Pipeline Build, migration techniques, Data Connection and Security setup. Design and develop Data Pipelines, and have excellent skills in PySpark and Spark SQL, with hands-on code build and deployment in Palantir. Must lead a team of 6-7 technical associates with PySpark …
months to begin with and it's extendable. Location: Leeds, UK (min 3 days onsite). Context: Legacy ETL code (for example DataStage) is being refactored into PySpark using Prophecy low-code/no-code and available converters. Converted code is causing failures/performance issues. Skills: Spark Architecture – component understanding around Spark Data Integration (PySpark, scripting, variable setting etc.), Spark SQL, Spark Explain plans. Spark SME – Be able to analyse Spark code failures through Spark Plans and make correcting recommendations. Spark SME – Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations. Spark SME – Be able to understand Data Frames/Resilient Distributed Datasets and understand any … there are Cluster level failures. Cloudera (CDP) – Knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. Prophecy – High-level understanding of Low-Code No-Code Prophecy setup and its use to generate PySpark code.
Python, PySpark, AWS, Oracle, Kafka, Banking Become a key member of an agile team designing and delivering a market-leading, secure and scalable reporting product. The team is migrating from Oracle to AWS cloud, so experience with either is nice to have. You serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship can be considered at this time. Required skills: - Either Python or PySpark experience. Ideally, you will be working as either a Python Developer (3+ years) or a Data Engineer using PySpark. - Either AWS or Oracle. Candidates …