Camden, London, United Kingdom Hybrid / WFH Options
Aristocrat
deep understanding of the inherent challenges. Familiarity with microservices architecture, API design, and integration patterns. Experience with event-driven architecture and messaging systems (e.g., Kafka, RabbitMQ, PubSub). Experience with OpenAPI and AsyncAPI for defining APIs, and exercising the APIs through appropriate tools (e.g., Postman). Solid knowledge of
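For illustration only, a minimal sketch of the event-driven messaging pattern this listing describes: a service publishes a domain event to a Kafka topic and interested consumers subscribe to it, rather than being called directly. The kafka-python client, broker address, topic name, and event payload below are assumptions, not details from the listing.

```python
# Hypothetical sketch: publishing a domain event to Kafka with the
# kafka-python client. Topic name, broker address, and event shape are
# illustrative assumptions only.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",               # assumed local broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Emit an event; downstream services subscribe to the topic instead of
# being called directly (the core of an event-driven integration).
producer.send("orders.created", {"order_id": 42, "status": "CREATED"})
producer.flush()  # block until the broker has acknowledged the send
```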
DB, and design effective database schemas and queries for high performance. Open Source Integration: Integrate open-source frameworks (e.g., Node.js, Angular, React, Spring Boot, Kafka, Docker) into enterprise applications to meet specific business needs. Collaborate with development teams to ensure open-source technologies are incorporated effectively while maintaining enterprise
pipelines. Experience in Python/Java, SQL and cloud data infrastructure (AWS, Azure or GCP). Familiarity with tools such as Kafka, Airflow, and Apache Spark, or similar technologies. HOW TO APPLY: Please register your interest by sending your CV to luke.frost@xcede.com for more info
Learning tech stack includes: Python, ML libraries (TensorFlow, PyTorch, scikit-learn, transformers, XGBoost, ResNet), geospatial libraries (shapely, geopandas, rasterio), AWS, Postgres, Apache Airflow, Apache Kafka, Apache Spark. Mandatory requirements: You have at least 5 years of experience in a Data Science role, deploying models into production; You have proven experience
Data Engineering and Analytics: Work closely with data teams to define robust data pipelines and scalable cloud-based data platforms using tools like Apache Kafka, Snowflake, or Databricks. Monitoring and Performance Tuning: Implement advanced monitoring and observability solutions using tools like Prometheus, Grafana, or Datadog to proactively identify and
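As a hedged illustration of the monitoring and observability piece, the sketch below exposes pipeline metrics in a form Prometheus can scrape (and Grafana can chart). The metric names, port, and workload are assumptions made up for the example.

```python
# Hypothetical sketch: instrumenting a pipeline loop with the
# prometheus_client library so Prometheus can scrape its metrics.
import time
import random
from prometheus_client import Counter, Histogram, start_http_server

RECORDS_PROCESSED = Counter("pipeline_records_total", "Records processed")
BATCH_LATENCY = Histogram("pipeline_batch_seconds", "Batch processing time")

start_http_server(8000)  # metrics served at http://localhost:8000/metrics

while True:
    with BATCH_LATENCY.time():          # observe how long each batch takes
        time.sleep(random.random())     # stand-in for real pipeline work
        RECORDS_PROCESSED.inc(100)      # count records handled in the batch
```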
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
processes to ensure efficient data ingestion, processing, and integration across various systems. Lead development and maintenance of a real-time data streaming platform using Apache Kafka, Databricks, etc. Ensure the integration of streaming data with batch processing systems for comprehensive data management. Utilize AWS data engineering services to build and
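A rough sketch, under assumed names, of one way streaming data can be bridged into a batch processing system: consume from a Kafka topic with the kafka-python client, buffer records, and flush them as micro-batches to a landing file. The topic, consumer group, batch size, and file name are illustrative only.

```python
# Hypothetical sketch: bridging a Kafka stream into batch storage by
# buffering messages and flushing them periodically as micro-batches.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events.raw",                                   # assumed topic
    bootstrap_servers="localhost:9092",             # assumed broker
    group_id="batch-bridge",                        # assumed consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

buffer = []
for message in consumer:
    buffer.append(message.value)
    if len(buffer) >= 1000:                         # micro-batch threshold
        with open("events_batch.jsonl", "a") as f:  # assumed landing file
            f.writelines(json.dumps(r) + "\n" for r in buffer)
        buffer.clear()
```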
Role: AWS+Java Developer Location: London, UK Job Description: AWS, Kubernetes, Docker, Istio, Terraform, Java, Spring, Spring Boot, Kafka, GitLab, Helm, Maven and Argo CD. AWS Services, Docker, Kubernetes, Istio, Java, Spring, Spring Boot, Kafka. This is because one of the major parts of the project is to migrate over
teams across multiple locations and act as the deputy to the Co-Founder/CTO. Key Technologies: Java, Python, Go & JavaScript Mongo, PostgreSQL, RabbitMQ, Kafka, Redis, ElasticSearch Kubernetes, Docker, AWS, GCP, Azure Prometheus, Grafana, Sentry, Nginx Key Experience: Software engineering background Ability to lead on architecture and infrastructure decisions
environments (GitHub, Bitbucket). Solid understanding of routing, switching, DNS, firewalls, load balancing, and global traffic management. Familiarity with NoSQL/SQL databases, queuing systems (Kafka, SQS), and designing for high availability and clustering. Skilled with tools like ELK, Fluentd, or CloudWatch for performance tuning, forensic analysis, and capacity planning
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Bowerford Associates
transformation and loading using leading cloud technologies including Azure and AWS. Leverage Big Data Technologies - you will utilise tools such as Hadoop, Spark and Kafka to design and manage large-scale data processing systems. Extend and Maintain the Data Warehouse - you will be supporting and enhancing the data warehouse
Python and JavaScript). Experience building frontend applications (we use TypeScript, React, and GraphQL). Experience with various database technologies and query languages (we use Neo4j, Kafka, MySQL, Snowflake, Elasticsearch, and more). Familiar with low-latency techniques to help improve page load time and reliability. Experience with microservices, APIs, and related
to maintain high levels of code, Agile (Scrum/Kanban), Test Driven Development, WCF, ASP.NET particularly MVC. Desirable Technical Skills - Experience of middleware e.g., NServiceBus, Kafka, TFVC/Git/GitHub, Design Patterns, Azure PaaS (API Management/LogicApps). If you are interested in this role then we’d
paramount towards our customer journey as well as working on new initiative programmes. Our Tech Stack includes: Ruby, Ruby on Rails, MongoDB, AWS, Docker, Kafka, Terraform, JavaScript, SCSS, React, and Webpack. As one of our Senior Software Engineers you'll: be hands-on in developing our products using best
Leeds, England, United Kingdom Hybrid / WFH Options
BAE Systems Digital Intelligence
a wide range of experience such as: Requirements Analysis; Test Analysis and definition; AWS, Azure; Linux, Docker, VMWare; Ansible, Foreman, Kubernetes, Jenkins, Bamboo, HDFS, Kafka, Avro; C/C++, Java, JavaScript; Networking (esp. security); Qt, React, Redux, JEE; Git, BitBucket, Jira, Jama. Entry Requirements: On track to achieve a
background in API design and management, with experience in REST, SOAP, GraphQL, and related technologies. Technical Skills: Experience with middleware technologies, message brokers (e.g., Kafka, RabbitMQ), and ESB (Enterprise Service Bus) architectures. Proficiency in integration patterns, such as publish/subscribe, message queuing, event-driven architectures, and orchestration/
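For the publish/subscribe pattern mentioned here, a minimal RabbitMQ sketch using the pika client is shown below: a fanout exchange copies each published message to every bound queue, so publishers never address subscribers directly. The exchange name and message body are invented for the example.

```python
# Hypothetical sketch: publish/subscribe over RabbitMQ with the pika client.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Declare a fanout exchange; publishers only know the exchange, not consumers.
channel.exchange_declare(exchange="customer-events", exchange_type="fanout")

# Each subscriber gets its own queue bound to the exchange.
result = channel.queue_declare(queue="", exclusive=True)
channel.queue_bind(exchange="customer-events", queue=result.method.queue)

# Publish once; every bound queue receives a copy of the message.
channel.basic_publish(exchange="customer-events", routing_key="",
                      body=b'{"event": "address_changed"}')
connection.close()
```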
to build platforms that can be used to provide business critical insights. Desired Skills ⚙️ Python; SQL (SQL Server, Azure SQL), NoSQL; Apache Airflow, Kubernetes; Kafka, Spark; GCP. Benefits 🏖 Quarterly bonuses; Private Health Care; Training programmes; Annual off-sites. If you are a skilled engineer (Python, SQL, Spark, Azure) who
team and liaise with other teams within the business and external vendors. Desired Skills ⚙️ Python; SQL (SQL Server, Azure SQL), NoSQL; Apache Airflow, Kubernetes; Kafka, Spark; GCP. Benefits 🏖 Quarterly bonuses; Private Health Care; Training programmes; Annual off-sites. If you are a skilled engineer (Python, SQL, Spark, Azure) who
programming languages - Scala, Java, or Python. Experience building and developing Cloud-based applications or services, preferably in AWS. Experience working with event streaming platforms (Kafka/Kinesis/SQS). Experience with distributed processing systems such as Apache Spark and/or Apache Flink. Ability to handle periodic on-call
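As an assumed, minimal example of combining an event streaming platform with a distributed processing system, the sketch below reads a Kafka topic with Spark Structured Streaming and writes the decoded payloads to Parquet. The broker, topic, and output paths are placeholders, and the spark-sql-kafka connector package is assumed to be available to the Spark session.

```python
# Hypothetical sketch: Spark Structured Streaming consuming a Kafka topic.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "events.raw")                     # assumed topic
    .load()
)

# Kafka rows arrive as binary key/value; cast the payload to a string
# before any downstream parsing or aggregation.
decoded = events.selectExpr("CAST(value AS STRING) AS payload")

query = (
    decoded.writeStream.format("parquet")
    .option("path", "/tmp/landing/")                       # assumed sink
    .option("checkpointLocation", "/tmp/checkpoints/")
    .start()
)
query.awaitTermination()
```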
with delivery partners and third-party applications. Proficiency in Java and Spring Boot is essential. Familiarity with the Spring Framework and tools like Apache Kafka is a bonus. Knowledge of financial billing domains/systems. Hands-on experience with microservices architecture, database programming, and event streaming in a cloud
Computer Science, Engineering, etc. Skills: Software Development experience in Python or Scala. An understanding of Big Data technologies such as Spark, messaging services like Kafka or RabbitMQ, and workflow management tools like Airflow. SQL & NoSQL expertise, ideally including Postgres, Redis, MongoDB, etc. Experience with AWS, and with tools like
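To illustrate what a workflow management tool like Airflow does, here is a minimal, assumed DAG with an extract task feeding a load task on a daily schedule (Airflow 2.x style). The DAG id, schedule, and task bodies are placeholders rather than anything from the listing.

```python
# Hypothetical sketch: a daily Airflow DAG with two dependent tasks.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system")   # placeholder extract step

def load():
    print("write rows into the warehouse")      # placeholder load step

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task   # load runs only after extract succeeds
```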
West Sussex, England, United Kingdom Hybrid / WFH Options
UST
and development best practices. Excellent leadership, communication, and interpersonal skills to foster a collaborative work environment. Desirable skills (not mandatory): Middleware expertise (e.g., NServiceBus, Kafka); Azure PaaS (API Management, LogicApps). Hurry & apply for a more detailed conversation with our Talent Acquisition team! #UST #SeniorDeveloper #DotNetDeveloper #SoftwareEngineering #HiringNow #IR35 #TechJobs
. Min. 3 years commercial experience as a Data Engineer post-academia. Exposure to the following is also desirable but not essential: DBT, Airflow, Kafka, Docker. HOW TO APPLY: Please register your interest by sending your CV to luke.frost@xcede.com or click the Apply Link for more info
Nottinghamshire, Nottingham, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
for large organizations. Expect variety – you’ll work across cloud platforms like Azure, AWS, and GCP, leveraging tools such as Databricks, Data Factory, Synapse, Kafka, Glue, Redshift, BigQuery, and more. About You: You’re an engineer at heart, with a passion for building efficient, scalable data systems. To succeed