designing and constructing robust data pipelines using the best of the open-source data engineering and scientific Python toolset. Tech Stack: Airbyte, AWS Glue, Pandas, PySpark, Delta Lake, PostgreSQL. The team follows agile ways of working and you engage with various stakeholders across the business. The role is Hybrid more »
experience-related problems such as workforce management, demand forecasting, or root cause analysis Strong visualisation skills including experience with Tableau Familiarity with Databricks and PySpark for data manipulation and analysis Familiarity with Git-based source control methodologies, including branching and pull requests A self-starter, passionate about converting data more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
come with scaling a company • The ability to translate complex and sometimes ambiguous business requirements into clean and maintainable data pipelines • Excellent knowledge of PySpark, Python and SQL fundamentals • Experience in contributing to complex shared repositories. What’s nice to have: • Prior early-stage B2B SaaS experience involving client more »
STEM subject e.g. Mathematics, Statistics or Computer Science Experience in personalisation, segmentation with a focus on CRM Retail experience is a bonus Experience with PySpark/Azure/Databricks is a bonus Experience of management NEXT STEPS: If this role looks of interest, please reach out to Joseph Gregory. more »
Proficiency in modern programming languages and database querying languages. Comprehensive understanding of the Software Development Life Cycle and Agile methodologies. Familiarity with Python or PySpark and cloud technologies like AWS, Kubernetes, and Spark is highly desirable. Ideal Candidate: Someone with a knack for innovative solutions and a commitment to more »
vision to convert data requirements into logical models and physical solutions. with data warehousing solutions (e.g. Snowflake) and data lake architecture, Azure Databricks/PySpark & ADF. retail data model standards - ADRM. communication skills, organisational and time management skills. to innovate, support change and problem solve. attitude, with the ability more »
directly with clients. Supporting clients in platform discovery, integration, training, and collaboration on data science projects. Proficiency in technical skills, particularly Python, R, SQL, PySpark, and JavaScript. Assisting users in mastering the platform. Analysing diverse data and ML applications. Providing strategic insights to ensure customer success. Collaborating with customers more »
Python, PySpark, AWS, Oracle, Kafka, Banking Become a key member of an agile team designing and delivering a market-leading secure and scalable reporting product. The team is migrating from Oracle to AWS cloud so experience with either is nice to have. You serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship will be able to be considered at this time. Required skills: - Either Python or PySpark experience. Ideally, you will be working as either a Python Developer (3+ years) or Data Engineer using PySpark - Either AWS or Oracle. Candidates more »
experience in Power BI would also be useful. You will be an Engineer with past experience with Java, data, and infrastructure (DevOps). Java, Python, PySpark. Mechanisms: MongoDB, Redshift, AWS S3. Environments/Infra: AWS (required), [AWS Lambda, Terraform] (nice to have). Platforms: Creating data pipelines within Databricks or equivalent more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
contract position. If you possess a solid background in software application development, with experience in cloud or microservice architecture, and proficiency in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS more »
of 4 years' commercial experience Strong proficiency in AWS and relevant technologies Good communication skills Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY Please register your interest by more »
London. Responsibilities: Collaborate with cross-functional teams to gather requirements and implement solutions. Develop and maintain data processing applications using Python. Optimise and tune PySpark jobs for performance and scalability. Ensure data quality, reliability, and integrity throughout the data processing pipelines. Technical Requirements: Python: Proficiency in Python programming. Object … Oriented Design: Solid understanding of object-oriented principles and design patterns. PySpark: Experience with PySpark for data processing and analytics. Azure: Familiarity with Azure services and cloud platforms. Financial Services Background: Knowledge of financial markets, instruments, and related data. more »
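By way of illustration, a minimal PySpark sketch of the kind of job tuning and data-quality gating this role describes; the table, column names, tuning values and S3 paths are assumptions, not the client's actual pipeline.

from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("reporting-pipeline")
    # Example tuning knobs; real values depend on cluster size and data volume.
    .config("spark.sql.shuffle.partitions", "200")
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)

trades = spark.read.parquet("s3://example-bucket/trades/")  # hypothetical path

# Repartition on the aggregation key to reduce shuffle skew, then cache because
# the frame is reused by both the quality check and the aggregation.
trades = trades.repartition("trade_date").cache()

# Simple data-quality gate: fail fast if mandatory fields contain nulls.
bad_rows = trades.filter(F.col("trade_id").isNull() | F.col("notional").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed the null check")

daily = trades.groupBy("trade_date").agg(F.sum("notional").alias("total_notional"))
daily.write.mode("overwrite").parquet("s3://example-bucket/daily_notional/")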
and industry standards for the organization. Strong experience with Azure cloud services such as ADF, ADLS and Synapse. Proficiency in querying languages such as SQL, PySpark and Python, and familiarity with data visualization tools (e.g. Power BI). Strong communication skills to gather business requirements from stakeholders and propose best more »
development in JavaScript, ReactJS (HTML/CSS experience advantageous) Experience of wireframe/prototype methodologies and platforms (Sketch, Figma, etc.) Experience of Python/PySpark or other coding experience. Relevant experience in working within a complex, demanding UX/UI developer/designer environment. Relevant experience with JIRA and/ more »
Bournemouth, Dorset, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience more »
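As a rough illustration of the unit-tested Python/PySpark ETL work described above; the column names, cleaning rules and test values are hypothetical, not taken from the role.

from pyspark.sql import DataFrame, SparkSession, functions as F


def clean_premiums(df: DataFrame) -> DataFrame:
    # Drop rows without a policy id and normalise premium to a rounded double.
    return (
        df.filter(F.col("policy_id").isNotNull())
          .withColumn("premium", F.round(F.col("premium").cast("double"), 2))
    )


def test_clean_premiums():
    # Unit test against a throwaway local SparkSession.
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    raw = spark.createDataFrame(
        [("P1", "10.50"), (None, "3.20")], ["policy_id", "premium"]
    )
    out = clean_premiums(raw).collect()
    assert len(out) == 1
    assert out[0]["premium"] == 10.5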
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
LV= General Insurance
experience of writing clean, well-documented, and unit-tested Python ETL processes to read, transform, and verify data using tools such as SQL and PySpark. Cloud technologies and architecture including Databricks, Kubernetes, Data/Delta Lake, Azure Machine Learning, Data Factory. Azure is preferable, but AWS and GCP experience more »
V2, Azure Databricks, Azure Function Apps & Logic Apps, Azure Stream Analytics, Azure Resource Manager skills (Terraform, Azure Portal, Az CLI and Az PowerShell) Strong PySpark, Delta Lake, Unity Catalog and Python skills. Includes ability to write unit and integration tests in Python with unittest, pytest, etc. Strong understanding of more »
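A hedged pytest sketch of the unit and integration testing mentioned above, assuming the delta-spark package is available locally; the fixture wiring and table contents are illustrative only.

import pytest
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # Local SparkSession wired up for Delta Lake, shared across the test session.
    builder = (
        SparkSession.builder.master("local[1]")
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    )
    session = configure_spark_with_delta_pip(builder).getOrCreate()
    yield session
    session.stop()


def test_delta_round_trip(spark, tmp_path):
    # Integration-style check: write a Delta table and read it back.
    path = str(tmp_path / "events")
    spark.createDataFrame([(1, "a")], ["id", "value"]).write.format("delta").save(path)
    assert spark.read.format("delta").load(path).count() == 1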
supply. We’re seeking Data Engineers who are able to demonstrate skills/knowledge such as: Must have programming experience in SQL and Python. PySpark and/or R programming skills are desirable. Experience with Azure Cloud Services, particularly Azure Data Factory, Databricks, SQL Server/SQL databases, and Storage more »
Ability to optimise workflows and analysis for MapReduce processing Experience with BI software (Power BI, Tableau, Qlik Sense) Any experience with data engineering, PySpark, Databricks, Delta Lake beneficial Confident presenting complex problems in ways suited to the target audience Experience leading or managing a small analysis team Familiarity working more »
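For context, a small PySpark RDD sketch of the map-reduce pattern referenced above; the word-count logic and input path are illustrative only, not part of the role.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
lines = spark.sparkContext.textFile("s3://example-bucket/logs/")  # hypothetical path

counts = (
    lines.flatMap(lambda line: line.split())   # map: one record per token
         .map(lambda word: (word, 1))          # map: emit key-value pairs
         .reduceByKey(lambda a, b: a + b)      # reduce: sum the counts per key
)
print(counts.take(10))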
South East London, England, United Kingdom Hybrid / WFH Options
MBN Solutions
understanding of Quality and Information Security principles Experience with Azure, ETL tools such as ADF and Databricks Advanced database and SQL skills, along with SQL, Python, PySpark, Spark SQL Strong understanding of data model design and implementation principles Data warehousing design patterns and implementation Benefits: £50-£60k DOE Mainly home-based working. Twice a more »
scientific Python toolset. Our tech stack includes Airbyte for data ingestion, Prefect for pipeline orchestration, AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and version control in our … testing frameworks. Proficiency in Python and familiarity with modern software engineering practices, including 12factor, CI/CD, and Agile methodologies. Deep understanding of Spark (PySpark), Python (Pandas), orchestration software (e.g. Airflow, Prefect) and databases, data lakes and data warehouses. Experience with cloud technologies, particularly AWS Cloud services, with a more »
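A minimal Prefect 2.x flow sketch of the extract-transform-load orchestration pattern this stack describes; the task bodies, file path, table name and PostgreSQL connection string are placeholders rather than the team's actual pipeline.

import pandas as pd
from prefect import flow, task


@task(retries=2)
def extract() -> pd.DataFrame:
    # In the real stack this step would read data already landed by Airbyte.
    return pd.read_csv("s3://example-bucket/raw/orders.csv")  # hypothetical path


@task
def transform(df: pd.DataFrame) -> pd.DataFrame:
    return df.dropna(subset=["order_id"]).assign(amount=lambda d: d["amount"].astype(float))


@task
def load(df: pd.DataFrame) -> None:
    # Illustrative load into PostgreSQL via SQLAlchemy; the DSN is a placeholder.
    from sqlalchemy import create_engine
    engine = create_engine("postgresql+psycopg2://user:pass@host/db")
    df.to_sql("orders_clean", engine, if_exists="replace", index=False)


@flow(name="orders-pipeline")
def orders_pipeline():
    load(transform(extract()))


if __name__ == "__main__":
    orders_pipeline()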
experience leading a data engineering team. Key Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6 month initial contract with a trusted client of ours. more »
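A hedged Apache Airflow sketch (assuming a recent Airflow 2.x install with the TaskFlow API) of the orchestration piece listed above; the dbt command, paths and schedule are placeholders.

from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_warehouse_load():

    @task
    def extract_to_s3():
        # Placeholder extract step landing raw files in S3.
        print("extracting to s3://example-bucket/raw/")

    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",  # placeholder path
    )

    extract_to_s3() >> run_dbt


daily_warehouse_load()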
start interviewing ASAP. Responsibilities: Azure Cloud Data Engineering using Azure Databricks Data Warehousing Data Engineering Very strong with the Microsoft Stack ESSENTIAL knowledge of PySpark clusters Python & C# Scripting experience Experience of message queues (Kafka) Experience of containerization (Docker) FINANCIAL SERVICES EXPERIENCE (Energy/commodities trading) If you have more »