Mart. Utilize vector databases, Cosmos DB, Redis, and Elasticsearch for efficient data storage and retrieval. Demonstrate proficiency in Python, Spark, Databricks, PySpark, SQL, and ML algorithms. Implement machine learning models and algorithms using PySpark, scikit-learn, and other relevant tools. Manage Azure DevOps, CI/… environments, Azure Data Lake, Azure Data Factory, and microservices architecture. Experience with vector databases, Cosmos DB, Redis, Elasticsearch. Strong programming skills in Python, Spark, Databricks, PySpark, SQL, ML algorithms, Gen AI. Knowledge of Azure DevOps, CI/CD pipelines, GitHub, Kubernetes (AKS). Experience with MLOps tools such as …
in STEM subjects. Strong experience in data pipelines and deploying ML models. Preference for experience in retail/marketing. Tech across: Python, AWS, Databricks, PySpark, A/B testing, MLflow, APIs. Experience in feature engineering and third-party data. Apply below …
Mid- or senior-level Data Scientist. Solid knowledge of data engineering principles, including productionisation. Technical experience with some or all of the following: Python, PySpark, scikit-learn, pandas, Azure Data Services, Databricks. If this sounds of interest, please apply. …
not received on time. Communicating outages with the end users of a data pipeline. What We Value: Comfortable reading and writing code in Python, PySpark and Java. Basic understanding of Spark and an interest in learning the basics of tuning Spark jobs. Data pipeline monitoring team members should be able …
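The monitoring duty described here — flagging data that is "not received on time" — boils down to a freshness check against an expected arrival window. A minimal, library-free sketch of the idea (a real monitor would query pipeline metadata; all names here are hypothetical):

```python
# Minimal sketch of a data-freshness check for pipeline monitoring.
# A real system would look up the last arrival time from pipeline
# metadata; here timestamps are passed in directly.
from datetime import datetime, timedelta

def is_late(last_arrival: datetime, now: datetime,
            max_delay: timedelta) -> bool:
    """Return True when the feed has missed its expected window."""
    return now - last_arrival > max_delay

now = datetime(2024, 6, 1, 9, 0)
late = is_late(datetime(2024, 6, 1, 6, 0), now, timedelta(hours=2))
print(late)  # True: the data is 3 hours old against a 2-hour window
```

When the check fires, the team communicates the outage to the pipeline's end users, as the ad describes.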
performance, scalability, and reliability. Technical skills required: Redshift; Glue (inc. Glue Studio, Glue Data Quality, Glue DataBrew); Step Functions; Athena; Lambda; Kinesis; Python, Spark, PySpark, SQL. Your contributions as a Data Engineer will directly impact the organization's operations and revenue. In addition to a competitive annual salary, we …
of 4 years' commercial experience. Strong proficiency in AWS and relevant technologies. Good communication skills. Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform. Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY: Please register your interest by …
Senior Databricks Migration Consultant, Remote. This role demands in-depth knowledge of data engineering, cloud technologies (preferably AWS), and a successful record in enterprise-level data migrations into Databricks. You will ensure the efficient and secure transition of our data …
South East London, London, United Kingdom Hybrid / WFH Options
The Bridge (IT Recruitment) Limited
a long-term contract, inside IR35, on a remote basis. The key skills required for this Python Developer role are: Python, ETL, Azure Databricks, PySpark. If you do have the required skills for this remote Python Developer contract, please do apply. …
on very complex systems. Strong experience with computer vision. Longevity in their previous roles. Experience with remote sensing highly desirable. Stack: Python, PyTorch, Airflow, PySpark (equivalent tools are fine) …
engineering leaders/stakeholders in decision making and implementing the models into production. You will need hands-on skills in Python and PySpark, experience working in a cloud environment, and knowledge of development tools like Git or Docker. You can also expect to work with the latest …
experience as a Data Engineer. Strong proficiency in AWS and relevant technologies. Good communication skills. Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform. Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY: Please register your interest by …
to the table. Key Responsibilities: Engineer and orchestrate data flows and pipelines in a cloud environment using a progressive tech stack, e.g. Databricks, Spark, Python, PySpark, Delta Lake, SQL, Logic Apps, Azure Functions, ADLS, Parquet, Neo4j, Flask. Ingest and integrate data from a large number of disparate data sources. Design … Spark/Databricks or similar. Experience working in a cloud environment (Azure, AWS, GCP). Experience in at least one of: Python (or similar), SQL, PySpark. Experience in building data pipeline/ETL/ELT solutions. Ability and strong desire to research and learn new technologies and languages. Interest in …
Qualifications: PhD/degree/MSc in a highly numerate subject (e.g. Data Analytics, MORSE, Mathematics, Statistics, Physics). Mandatory experience: knowledge of Python/PySpark, SQL or R programming. Experience of applying appropriate analytical tools and methods to provide high-quality quantitative and qualitative analysis. Able to identify information …
Employment Type: Contract
Rate: £20 - £20.47/hour Hybrid Working, Flexible Working
Engineer you will be pivotal in designing, developing, and maintaining data architecture and infrastructure. The ideal candidate should have a strong foundation in Python, PySpark, SQL, and ETL processes, along with proven experience in implementing solutions in a cloud environment. Roles & Responsibilities: Experienced Data Engineer with a background in … and mastering, to management and distribution of large datasets. Mandatory skills: 6+ years of experience in designing, building, and maintaining data pipelines using Python, PySpark and SQL. Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS. Collaborate with data scientists …
City of London, London, United Kingdom Hybrid / WFH Options
TALENT INTERNATIONAL UK LTD
Azure services such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Create and optimize data processing workflows in Databricks using PySpark and Spark SQL. Ensure ETL coding standards are met, including self-documenting code and reliable testing. Apply best-practice data encryption techniques and standards … design. Extensive experience with Azure data products including Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Proficient in developing with Databricks, PySpark, and Spark SQL. Strong understanding of ETL coding standards, including standardized, self-documenting code and reliable testing. Knowledge of data encryption techniques and standards. …
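Several of these roles describe the same extract-transform-load shape: pull data from a source, filter and aggregate it, and write it to a warehouse table. As a rough conceptual illustration only — not the employer's code — here is a library-free Python sketch of that pattern; in the Databricks workflows above, the transform step would be PySpark DataFrame operations instead of dicts, and all names (`extract`, `transform`, `load`) are hypothetical:

```python
# Pure-Python sketch of an extract-transform-load step. Plain dicts
# stand in for DataFrame rows; the transform mirrors a PySpark-style
# filter followed by a group-by aggregation.
from collections import defaultdict

def extract():
    # Stand-in for reading from a source (e.g. Event Hubs, Data Lake).
    return [
        {"store": "A", "amount": 120.0, "valid": True},
        {"store": "A", "amount": 80.0, "valid": True},
        {"store": "B", "amount": 50.0, "valid": False},
        {"store": "B", "amount": 200.0, "valid": True},
    ]

def transform(rows):
    # Drop invalid rows, then total amount per store -- roughly
    # df.filter(...).groupBy("store").sum("amount") in PySpark.
    totals = defaultdict(float)
    for row in rows:
        if row["valid"]:
            totals[row["store"]] += row["amount"]
    return dict(totals)

def load(totals):
    # Stand-in for writing to a warehouse table.
    return sorted(totals.items())

result = load(transform(extract()))
print(result)  # [('A', 200.0), ('B', 200.0)]
```

The "self-documenting code and reliable testing" standard these ads mention amounts to keeping each of these steps small, named for what it does, and testable in isolation.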
and Infrastructure. Minimum 3 years of hands-on experience implementing AI/ML solutions and platform tooling for data science. Expert in Spark SQL and PySpark (Python and/or R), including experience with libraries such as pandas, scikit-learn, R (tidyverse, glm, caret, etc.), MLflow, experimentation …
Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity. Essential skills required: Azure – solid experience of the Azure Data ecosystem; Python – ESSENTIAL, as PySpark is used heavily (you will be tested on PySpark); Azure Synapse – ESSENTIAL, as it is used heavily; Spark; Azure Data Lake/Databricks/… architecture. Familiar with Synapse CI/CD. Azure Purview or another governance tool; familiar with building catalogs and lineage. … people comprising developers, data engineers, QA and DevOps. …
hybrid data warehouse design principles. Utilize cloud data products such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Databricks and PySpark development: develop in Databricks with experience coding in PySpark and Spark SQL. Ensure ETL code is standardized, self-documenting, and can be reliably … using cloud data products like Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Experienced in developing with Databricks and coding in PySpark and Spark SQL. Thorough understanding of coding standards for ETL processes. Knowledgeable about best-practice data encryption techniques and standards. Familiar with relevant legislation …
get the most from their data. They are looking for someone with core skills in SQL, complemented with Azure experience (Azure Data Factory, Databricks, PySpark, etc.). This is a very exciting time to join as they shake things up across the industry, so please get in touch asap to … Working with and modelling data warehouses. Skills & Qualifications: Strong technical expertise using SQL Server and Azure Data Factory for ETL. Solid experience with Databricks, PySpark, etc. Understanding of Agile methodologies, including use of Git. Experience with Python and Spark. Benefits: £65,000 - £75,000 + bonus. To apply for this …
engineering pipelines using Microsoft Azure technologies such as SQL Azure, Azure Data Factory, and Azure Databricks. Develop and optimize data solutions using Azure SQL, PySpark, and PySQL to handle complex data transformation and processing tasks. Implement and manage data storage solutions in One Lake and Azure SQL, ensuring data … integrity and accessibility. Proficient in Python, PySpark, and PySQL for data processing and analytics tasks. Experience with Power BI and other reporting and analytics tools. Demonstrated knowledge of OLAP, data warehouse design concepts, and performance optimization in database and query processing. Knowledge of .NET frameworks is highly preferred. Excellent …
Synapri collaborates with a large transport company which is seeking a Senior Data & Analytics Developer for a 12-month contract. You will be involved in the design, development and configuration of applications, components and tools according to the technical plans …
I have placed quite a few candidates with this organisation now; all have given glowing reviews about it. They're the leader in legal-representation comparison and have rather interesting, unique data to work with. They are using modern Azure …
the opportunity to develop data pipelines into Azure and AWS, as well as contributing to some SAS development on legacy projects when required. Python or PySpark and SQL will be your bread and butter, with any experience of Airflow being a great bonus. The core skillset: Python/PySpark …