engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure What will be your key responsibilities? A technical expert and leader on the more »
or similar technologies. Hands-on experience with AWS and Snowflake. Financial services industry experience (highly desirable). Experience with Big Data technologies such as Spark or Hadoop. Bachelor's degree in Computer Science, Engineering, or equivalent. Further information available upon application. ECS Recruitment Group Ltd is acting as an more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra • MySQL, SQL Server, Oracle, Snowflake, PostgreSQL and NoSQL databases • Programming skills in languages and frameworks such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits : • Base Salary more »
design and coding skills (e.g. Python, Scala, JavaScript). Experience with Microsoft or AWS data stack e.g. Microsoft Azure Data Lake, Hadoop (preferably with Spark), Cosmos DB, HDInsight/HBase, MongoDB, Redis, Azure Table/Blob stores etc. Exposure to tools like SAP technologies and Alteryx always useful. Experience more »
Platforms Must have 8+ years' experience with Relational Databases like Oracle, NoSQL Databases and/or Big Data technologies (e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other Open Source). Must have experience in Data Security Solutions (Identity and Access Management and Data Security Access Management) Must have 3+ more »
Engineer, with expertise developing scalable data pipelines. Strong object-oriented programming skills, particularly in Python. Experience with data lakes and data warehousing solutions (Spark, Dataflow, BigQuery). Knowledge of SQL and experience with relational databases, as well as NoSQL databases Familiarity with cloud services (preferably GCP) and understanding more »
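Several of these roles ask for object-oriented Python applied to data pipelines. A minimal sketch of what that looks like in practice (class and step names are illustrative, not taken from any listed employer's stack):

```python
from dataclasses import dataclass, field


@dataclass
class Pipeline:
    """Tiny object-oriented data-pipeline sketch: an ordered chain of
    transform steps, each a function from a list of records to a list
    of records."""
    steps: list = field(default_factory=list)

    def add(self, step):
        self.steps.append(step)
        return self  # return self to allow fluent chaining

    def run(self, records):
        for step in self.steps:
            records = step(records)
        return records


# Usage: filter then project, as a simple warehouse load step might.
rows = [{"id": 1, "amount": 50}, {"id": 2, "amount": 150}]
result = (Pipeline()
          .add(lambda rs: [r for r in rs if r["amount"] > 100])
          .add(lambda rs: [{"id": r["id"]} for r in rs])
          .run(rows))
# result == [{"id": 2}]
```

The same chained-transform shape scales up directly to Spark or Dataflow jobs, where each step becomes a distributed transformation instead of a list comprehension.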
Engineering experience Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift) Experience of analytics technologies (Spark, Hadoop, Kafka) Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
Bachelor's degree in Computer Science or a related field (Master's degree preferred) Nice to have: experience with LLMs, Vector Databases, AWS EMR, Spark, and Python Our commitment: Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to our more »
an Azure Solution Architect in Microsoft Data and AI. Extensive, hands-on, current experience in Azure data and AI technologies, like Fabric, Synapse, Databricks, Spark, Python, GitHub, Data Factory, Azure Data Lake, Power BI, Cognitive Services, Purview, etc. Strong data analysis and modeling skills. High-level understanding of Azure more »
classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau more »
quality of data. Key Requirements: Strong experience designing data pipelines/warehouses using AWS and Snowflake. Exposure to big data technologies such as Kafka, Spark, or Hadoop. Solid experience with Snowflake, including performance optimisation and cost management. Strong experience with SQL and Data modelling. Excellent understanding of AWS architecture more »
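Ads like this pair "strong SQL" with "data modelling": typically dimensional (star-schema) design queried with joins and aggregates. A stdlib-only sketch using sqlite3 as a stand-in for Snowflake (the table and column names are invented for illustration; the SQL shape is the same in a warehouse):

```python
import sqlite3

# Hypothetical star-schema fragment: one dimension, one fact table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_orders (order_id INTEGER, customer_id INTEGER, amount REAL);
INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
INSERT INTO fact_orders VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Classic warehouse query: join the fact to the dimension, aggregate.
rows = con.execute("""
    SELECT c.region, SUM(f.amount) AS total
    FROM fact_orders f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
# rows == [('APAC', 75.0), ('EMEA', 150.0)]
```

In Snowflake the performance-optimisation side the ad mentions is then about clustering keys, warehouse sizing and pruning rather than the SQL itself.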
multiple tasks and projects simultaneously. Preferred Qualifications AWS Certified Solutions Architect or other relevant AWS certifications. Experience with big data technologies such as Hadoop, Spark, or similar. Knowledge of data governance and data quality best practices. Familiarity with machine learning and AI concepts and tools. more »
Wandsworth, Greater London, Dundonald, United Kingdom
DataBuzz
Bricks setup using Terraform experience. * Experience of MLOps and DataOps. * Experience of using container technologies, cloud platforms (ideally AWS), and distributed processing frameworks like Spark and Dask. * Experience in JavaScript application development and UI design. * Expertise in developing mobile applications. * Familiarity with the agile software development process. If you more »
to 10% What Will Help You On The Job Familiarity with running software services at scale AWS Infrastructure, Airflow, Kafka and data streaming using Spark/Scala Understanding of networking fundamentals (OSI layers 2-7) Technical and software engineering background in the areas of cloud computing, enterprise computing, servers and more »
they are on the lookout for 2 AWS Data Engineers to come in on a contract basis. Key Skills/Requirements: Must have Python & Spark experience Must have strong AWS experience Must have Terraform experience SQL & NoSQL experience Have built out Data Warehouses & built Data Pipelines Strong Databricks & Snowflake more »
City of London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard Ltd
Experience with a JVM language (Kotlin, Java, Scala, Clojure) Knowledge of TypeScript and React is beneficial Exposure to data pipelines using technologies such as Spark and Kafka Experience with cloud services (ideally AWS) Hybrid working 1-2 days per week in Central London. £110,000 depending on experience. Please more »
data products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Knowledge of developing in Databricks and experience in coding with PySpark and Spark SQL. Experience in design and development of complex data and analytics solutions in an iterative manner for large enterprise business/data warehouse implementations more »
design. Cloud data products such as: Data Factory Event Hubs Data Lake Synapse Azure SQL Server Experience developing Databricks and coding with PySpark and Spark SQL. Proficient in ETL coding standards Data encryption techniques and standards Knowledge of relevant legislation such as: Data Protection Act, EU Procurement Directives, Freedom more »
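"Data encryption techniques and standards" in roles like this usually covers protecting sensitive fields at rest and in flight. One common element is deterministic pseudonymisation, so a field can still be joined on without being stored in the clear. A stdlib-only sketch (the key and field are placeholders; real keys belong in a secrets manager such as Azure Key Vault, never in code):

```python
import hmac
import hashlib

# Placeholder key for illustration only -- in production this comes from
# a managed secrets store, not source code.
KEY = b"replace-with-managed-secret"


def pseudonymise(value: str) -> str:
    """Keyed hash (HMAC-SHA256) of a sensitive field: deterministic, so
    the token can be used as a join key, but unrecoverable without KEY."""
    return hmac.new(KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()


token = pseudonymise("jane.doe@example.com")
# Same input + same key always yields the same 64-hex-character token.
```

Actual encryption of the stored value (as opposed to keyed hashing) would sit alongside this, typically handled by the platform (e.g. transparent encryption in Azure SQL or Data Lake) rather than application code.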
analytical abilities. Preferred Skills: Experience with cloud databases (e.g., AWS, Azure, Google Cloud Platform). Knowledge of big data tools and frameworks (e.g., Hadoop, Spark). Certification in database management (e.g., Microsoft Certified: Azure Data Engineer Associate). What We Offer: Competitive salary and comprehensive benefits package. Opportunity to more »
with JavaScript or Python Experience deploying software into the cloud and on-premise. Developing software products. Experience with EKS, Kubernetes, OpenSearch/ElasticSearch, MongoDB, Spark or NiFi. Experience with microservices architectures. Experience with AI/ML systems TO BE CONSIDERED…. Please either apply by clicking online or emailing more »
master and metadata management Experience with Azure SQL Database, Azure Data Factory, Azure Storage, Azure IaaS/PaaS related database implementations. Experience with Apache Spark and the new Fabric framework would be a plus. more »
as R, Python, Azure, Machine Learning (ML), and Databricks. Essential criteria and experience Proficiency in one or more analytical tools e.g. R, Python, Tableau, Apache Spark, etc. Proficiency in Azure Machine Learning and Azure Databricks. Pro-activity and a self-starting attitude. Excellent analytical and problem-solving ability. more »
for business improvements Lead a small team of data scientists on Neural Networks, LLMs (CNN & RNN), ML, & NLP NLP/AI/ML/Spark/Python/Data scientist/Machine Learning Engineer/OCR/Deep Learning Requirements Bachelor's degree or equivalent experience in quantitative field more »
products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Detailed knowledge of developing in Databricks and experience in coding with PySpark and Spark SQL. ETL coding standards: ensuring that code is standardised, self-documenting and can be reliably tested Knowledge of best practice data encryption techniques and more »
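The "standardised, self-documenting and reliably testable" ETL standard this listing describes comes down to small, pure transform functions with explicit names and docstrings, checkable with a plain assertion. A hypothetical sketch (function and field names are illustrative):

```python
def standardise_currency(rows, rate=1.0):
    """Convert each row's 'amount' to the reporting currency.

    Self-documenting per the coding standards above: explicit name,
    docstring, no hidden state or side effects -- so the step can be
    tested in isolation from any pipeline runtime.
    """
    return [{**row, "amount": round(row["amount"] * rate, 2)} for row in rows]


# A reliable test is just an assertion on a small fixture:
assert standardise_currency([{"amount": 10.0}], rate=1.27) == [{"amount": 12.7}]
```

In Databricks the same function body would typically become a PySpark column expression, but keeping the logic pure like this is what makes the step unit-testable outside the cluster.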