Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
the definition of Big Data architecture with different tools and environments: Cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience with Data Engineering and data quality tools (Informatica, Talend, etc.) Previous involvement working in a multilingual and multicultural environment
management of the platforms, working closely with our stakeholders and other teams. We would like to speak to people who have: Java or Python; Kafka; K8s, Docker. Lead the end-to-end technical delivery of large, complex projects with multiple internal and external stakeholders, identifying dependencies, risks, issues, costs
Proficiency in cloud platforms (AWS, GCP, Azure) and containerization (Docker) is a game-changer. Familiarity with caching technologies (Redis) and messaging/streaming tech (Kafka, RabbitMQ) will set you apart. Why You Belong Here Join a fast-growing FinTech company with a vibrant start-up culture and global reach.
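The caching pattern this listing alludes to can be sketched without a Redis server. The class below is a minimal in-process stand-in for illustration only: the key names and TTL values are hypothetical, and a real deployment would use a Redis client (SET with an expiry, then GET) rather than a local dict.

```python
import time

class TTLCache:
    """Minimal in-process cache with per-key expiry.

    Sketches the pattern a store like Redis provides; not a
    substitute for a shared, networked cache.
    """

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl_seconds):
        # Record when the entry stops being valid.
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction on read
            return default
        return value

cache = TTLCache()
cache.set("fx:GBPUSD", 1.27, ttl_seconds=30)  # hypothetical key/value
print(cache.get("fx:GBPUSD"))  # 1.27 while the entry is fresh
```

Lazy eviction (dropping an expired entry only when it is next read) keeps the sketch short; Redis itself combines lazy and periodic expiry.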
greenfield payments platform The candidate: 4+ years' experience as a Java Developer. Strong experience in Spring and Spring Boot. Experience in AWS. Strong experience with Kafka and microservice-based design. Experience working with a modern tech stack. Expert in domain-driven design and test automation. CI/CD pipeline experience
Knowledge and understanding of OTC product bookings (Interest Rate Swaps, Variance Swaps, CDS, etc.). Familiarity with C++ and Big Data tools such as Spark, Kafka, Elastic. Join us and be part of a team that values innovation, collaboration, and excellence. Take your career to new heights with a leading
commercial experience as a Data Engineer or similar role within large-scale environments dealing with large data sets. Expertise in SQL & dbt, and ideally Kafka. Significant Python coding skills. Containerisation experience (Docker, Kubernetes). Cloud computing experience (GCP/AWS/Azure). Strong preference for a Snowflake background but open
and manage multiple projects simultaneously. Experience working with AWS and its services (SageMaker or equivalent). Desirables: A financial technology background. Experience with SQS or Kafka. Prior work with CI/CD tools. Prior work with a transformation tool. The interview process will consist of 3 stages for this position
/Knowledge/Experience: Good experience of defining RFPs and RFIs. Hands-on experience in Java/Spring Boot or Node.js, the Express framework, Git, Kafka and CI/CD pipelines. Strong experience in AWS Cloud. AWS PaaS such as: KMS, Secrets Manager, Amazon DocumentDB, RDS Postgres, SQS. Automated testing
frameworks (Flask, Django, FastAPI, etc.). Knowledge of relational database technologies, e.g. Oracle, PostgreSQL. Experience developing applications using React, or experience with event-driven services (e.g. Kafka). Experience developing with SDLC tools: Git, JIRA, Artifactory, Jenkins/TeamCity, OpenShift/Kubernetes. Analytical thinker and team player with strong communication skills
creating data pipelines in a cloud (preferably AWS) environment. CI/CD experience. Containerization experience (Docker, Kubernetes, etc.). Experience with SQS/SNS, Apache Kafka, RabbitMQ. Other interesting/bonus skills: Airflow, Trino, Apache Iceberg, Postgres, MongoDB. You *must* be eligible to work in your chosen country of residence
in Kubernetes, Docker, Jenkins, and Ansible. Proficiency with monitoring tools like Prometheus and Grafana. Experience with Terraform and CloudFormation scripting. Knowledge of Apache Kafka for stream processing. Familiarity with various databases including MySQL, PostgreSQL, Redis, and Solr. Proficient with Git and Git workflows for version control. Experience in
well as experience creating, deploying, and managing containers for message brokering and real-time data streaming. Hands-on experience with technologies such as Apache Kafka and Azure Service Bus, capable of building and integrating scalable streaming applications. On offer: Remote working setup; Annual leave; Working week; Wellness incentive; Equipment
implementation and performance tuning of Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro). Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code
with one or more of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical-path approaches and how to iterate to build value, engaging actively with stakeholders at all stages. Able to deal with ambiguity and
per day. This will be working out of their London-based office, 2-3 days per week onsite. Technical skillset: Golang, Kafka, Kubernetes, SQL. CI/CD experience with relevant tooling such as Jenkins, Docker or Terraform. Cloud services experience, ideally with AWS. Bonus: Payments experience; Python experience. For
as NoSQL. You will have worked in Cloud environments previously and be familiar with cloud architecture and engineering with Azure and have experience with Kafka and Kubernetes. You will be familiar with architectural design patterns and how these are applied practically in code implementations and have worked in Agile
architecture and engineering, ideally on Microsoft Azure (Kubernetes Service, Container Apps, App Service, Functions, Event Grid and Service Bus). Experience with messaging platforms, ideally Kafka. Experience with containers, ideally Kubernetes. Architectural design patterns and how these are applied practically in code implementation. SDLC experience in an agile environment. If
Kubernetes, Terraform, CI/CD tools. Strong observability experience, ideally with more modern approaches like Prometheus, Grafana, OpenTelemetry. Comfortable with databases. Exposure to Kafka would be ideal.
role Passion for data analytics. Strong understanding of order book mechanics. Java, SQL, REST and WS APIs experience. Has built event-driven apps using Kafka. AWS or Google Cloud. Linux, Docker a plus. Experience in financial services. *Rust experience is a nice-to-have.* If you or anyone else
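"Event-driven apps using Kafka", as listings like this one put it, follow a publish/subscribe shape. The sketch below mimics only that shape in memory, with hypothetical topic and event names; it has none of Kafka's persistence, partitioning or delivery guarantees, which is exactly what you would be paying Kafka for in production.

```python
from collections import defaultdict

class EventBus:
    """Tiny in-memory publish/subscribe bus (illustration only)."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> [handlers]

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler registered on the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
fills = []
# Hypothetical topic: a consumer keeping a view of filled orders.
bus.subscribe("orders.filled", fills.append)
bus.publish("orders.filled", {"symbol": "AAPL", "qty": 100})
print(fills)  # [{'symbol': 'AAPL', 'qty': 100}]
```

The decoupling is the point: the publisher never references its consumers, so new consumers (risk checks, analytics) can be added without touching order-flow code.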
Liverpool, England, United Kingdom Hybrid / WFH Options
ShortList Recruitment Limited
knowledge of the LAMP stack. MVC frameworks (Laravel/Symfony). Knowledge of microservice architecture. HTML, CSS, JavaScript. MySQL/SQL, NoSQL. Desirable Skills: Angular, Kafka, AWS, Lambda, Docker/Kubernetes. The PHP Developer role is based in Liverpool but offers fully remote working. The role comes with a range
of cyber criminals. The ideal Software Engineer will have: Strong technical knowledge of Java. Experience with one or more of C#, React, SQL Server, Kafka, AWS, Ansible, Docker, JavaScript. Experience of team leadership/mentorship. Prior experience working within an agile environment. Software Engineer Key Details: Salary up to
Senior Software Engineer or Technical Leads 💻 NodeJS, Kafka, Cassandra, React, AWS, Microservices 🏠 Hybrid - Zone 1 office 💵 £90k - £125k base salary, plus equity and annual bonus. £160-£190k total comp. Evoke is working with a publicly traded technology unicorn business in the b2b SaaS space who are looking to rapidly
Surrey, England, United Kingdom Hybrid / WFH Options
The JM Longbridge Group
Python, and Node.js to streamline deployment processes. Manage and optimize relational databases (e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications: Bachelor's degree in computer
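The "data streaming solutions" and pipelines this listing mentions share one idea: records flow through a chain of stages, each transforming or filtering. A minimal sketch of that staging in plain Python generators is below; the record format and stage names are hypothetical, and real pipelines delegate this shape to Kafka consumers, Spark jobs, or dbt models at scale.

```python
def parse(lines):
    """Decode raw records; assumes a hypothetical 'name,value' format."""
    for line in lines:
        name, value = line.split(",")
        yield name, int(value)

def only_positive(records):
    """Filter stage: drop records with non-positive values."""
    for name, value in records:
        if value > 0:
            yield name, value

raw = ["sensor_a,5", "sensor_b,-3", "sensor_c,12"]  # made-up input
pipeline = only_positive(parse(raw))
print(list(pipeline))  # [('sensor_a', 5), ('sensor_c', 12)]
```

Because each stage is lazy, records stream through one at a time; nothing is materialised until the final consumer, which is the same memory-friendly property streaming frameworks provide across machines.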