computing engineers, and systems engineers, to ensure efficient data processing, secure storage, and insightful analysis. This position requires hands-on experience and expertise with Kafka, AWS, and MongoDB.
Responsibilities:
- Upgrade and/or maintain Kafka clusters in an AWS/Kubernetes cloud environment
- Design, develop, and deploy Kafka clusters in an AWS/Kubernetes cloud environment
- Manage schema evolution and security using Kafka Schema Registry and Kafka Security Manager (KSM)
- Oversee the assessment, design, building, and maintenance of scalable platforms
Requirements:
- Kafka experience: Kafka Connect, Kafka Streams, and other Kafka ecosystem … tools
- Kafka knowledge: Kafka Security Manager, Kafka Schema Registry, and Kafka architecture and internals
- Experience with Apache NiFi
- Experience with AWS
- Experience with MongoDB and Redis
- Experience with agile development methodologies
- Data pipeline experience: you will be involved with acquiring and prepping data …
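The Schema Registry responsibility above typically centres on compatibility-checked schema evolution: under the registry's default BACKWARD compatibility mode, adding a field with a default value is an accepted change. A minimal illustration (the `Metric` record and its fields are hypothetical, not taken from the posting):

```json
{
  "type": "record",
  "name": "Metric",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "value", "type": "double"},
    {"name": "unit", "type": "string", "default": "ms"}
  ]
}
```

Because `unit` carries a default, consumers compiled against this newer schema can still read records written before the field existed, which is what makes the change backward compatible.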
Team Player: Ability to work effectively in a collaborative team environment, as well as independently. Preferred Qualifications: Experience with big data technologies (e.g., Hadoop, Spark, Kafka). Experience with AWS and its data services (e.g., S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake).
been around emitting metrics from the Calculation Engine to put more data in the hands of our clients. In processing these metrics, we use Kafka Streams for aggregation and Kafka connectors to persist the data. As a Senior Developer, you will be responsible for leading the design and … using promises/futures (e.g., CompletableFuture). Extensive experience with multi-threaded applications. Deep understanding of event-driven and streaming microservices. Extensive experience using Kafka, leveraging Kafka Connect and Kafka Streams. Experience with container technologies such as Docker, Podman, and Kubernetes, as well as package managers like …
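The posting above asks for asynchronous composition with promises/futures such as CompletableFuture. A minimal stdlib-only sketch of non-blocking fan-out and combination; the metric names and the fetch logic are illustrative stand-ins, not details from the posting:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class MetricFanOut {

    // Stand-in for an asynchronous I/O call (e.g. a metrics query);
    // here it just computes the name's length on a worker thread.
    static CompletableFuture<Integer> fetchMetric(String name) {
        return CompletableFuture.supplyAsync(name::length);
    }

    // Fan out one future per metric, then fold the results together
    // with thenCombine, without blocking on any individual fetch.
    static CompletableFuture<Integer> totalOf(List<String> names) {
        CompletableFuture<Integer> total = CompletableFuture.completedFuture(0);
        for (String n : names) {
            total = total.thenCombine(fetchMetric(n), Integer::sum);
        }
        return total;
    }

    public static void main(String[] args) {
        // join() blocks only at the very end, once all fetches are in flight.
        System.out.println(totalOf(List.of("latency", "throughput")).join());
    }
}
```

The same thenCombine pattern scales to combining results of real remote calls; only the terminal join() (or a thenAccept callback) touches the completed value.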
tasks efficiently. Experience with agile development methodologies. Preferred Qualifications Top Secret Security Clearance with SCI eligibility. Familiarity with event-driven architectures and messaging systems (Kafka, RabbitMQ). Experience with feature stores and model registries. Familiarity with big data technologies (Spark, Hadoop). Knowledge of monitoring and logging tools for …
or ArangoDB Experience with SDLC tools, such as JIRA, Confluence, GitLab, and Bitbucket Preferred Technical Skills: Data processing tools such as Apache NiFi, Apache Kafka, or Celery Machine learning tools such as TensorFlow, PyTorch, scikit-learn, or spaCy CI/CD tools such as Jenkins or GitLab CI Deployment …
DevOps Tools: Jenkins, Git, TeamCity, GitLab CI/CD Programming & Scripting Languages: Go (Golang), Python, Ruby, Java, Bash Database & Middleware: MySQL, PostgreSQL, Tomcat, ZooKeeper, Kafka About You Basic Qualifications: This position requires a TS/SCI with CI polygraph security clearance. Applicants must already possess a valid and active …
Helm, Kubernetes, and Spring Boot or similar Experience with Java, Python, SQL Server-side data technologies like Hadoop, Accumulo, GeoMesa, Postgres, Elasticsearch; Graphite, Grafana, Kafka, Storm, Flink, Spark Understanding of programming principles, such as object-orientation and use of design patterns DevOps tools, including Jenkins, GitLab, Nexus, and SonarQube
educator of modern approaches to software development and testing, and agile working principles. Experience with big data technologies (Apache Spark), data streaming (Apache Kafka), and workflow orchestration (Apache Airflow, Dagster), as well as machine learning and data processing frameworks (TensorFlow). Experience with Docker and containerization, Kubernetes. Know…
In-depth knowledge of tools such as Terraform, Helm, kubectl, and HashiCorp Vault. Deep understanding of event-driven and streaming microservices. Extensive experience using Kafka and cloud-native messaging systems (AWS SQS/SNS, Google Pub/Sub, or equivalent). Familiar with asynchronous programming using promises/…
developing and maintaining REST-based interfaces. Experience with container orchestration frameworks such as Docker and Kubernetes. Bonus if you have experience with networking, Redis, Kafka, Grafana, ELK stack (Elasticsearch, Logstash, Kibana). AWS Certification (Developer, DevOps, and/or Architect, etc.) A passion for creating beautiful, engaging, intuitive, efficient …
Lead Data Engineer (MongoDB, Kafka, Java)
Salary: Competitive plus generous benefits package
Location: Hybrid working with occasional travel to key sites across the UK (London, Bristol, Gloucester, Edinburgh)
About the Role: A leading financial services organization is undergoing a major digital transformation, placing technology at the heart of its … version control systems like Git. Proven ability to design scalable, real-time data applications.
Technical Expertise: Must have modern, recent experience with MongoDB and Kafka in a data engineering capacity
Real-Time Data Applications
Data Management: MongoDB, Cassandra/ScyllaDB
Data Integration: Kafka, Kafka Streams, Java, APIs …
Manchester, England, United Kingdom Hybrid / WFH Options
Anson McCade
and orchestrate solutions using Docker, Kubernetes, and AWS. Work with databases including Elasticsearch, Postgres, RDS, and more. Optimise performance and ensure quality using SonarQube, Kafka, and the ELK stack. What We’re Looking For: 5+ years of hands-on Java development experience with leadership capabilities. Strong expertise in Java, multi…
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Anson McCade
and orchestrate solutions using Docker, Kubernetes, and AWS. Work with databases including Elasticsearch, Postgres, RDS, and more. Optimise performance and ensure quality using SonarQube, Kafka, and the ELK stack. What We’re Looking For: 5+ years of hands-on Java development experience with leadership capabilities. Strong expertise in Java, multi…
M1, Manchester, United Kingdom Hybrid / WFH Options
Avanti Recruitment
Docker/OpenStack Experience with RESTful APIs Build tools: Maven/Gradle CI/CD (Jenkins/PaaS/GitLab) Desirable Skills: GraphQL Angular Kafka or similar Benefits: Salary up to £80k (possible flex DOE) Nest Pension 28 days holiday (inclusive of bank holidays) Fully remote working across the …
EC2, Barbican, Greater London, United Kingdom Hybrid / WFH Options
Avanti Recruitment
Docker/OpenStack Experience with RESTful APIs Build tools: Maven/Gradle CI/CD (Jenkins/PaaS/GitLab) Desirable Skills: GraphQL Angular Kafka or similar Benefits: Salary up to £80k (possible flex DOE) Nest Pension 28 days holiday (inclusive of bank holidays) Fully remote working across the …
B1, Birmingham, West Midlands (County), United Kingdom Hybrid / WFH Options
Avanti Recruitment
Docker/OpenStack Experience with RESTful APIs Build tools: Maven/Gradle CI/CD (Jenkins/PaaS/GitLab) Desirable Skills: GraphQL Angular Kafka or similar Benefits: Salary up to £80k (possible flex DOE) Nest Pension 28 days holiday (inclusive of bank holidays) Fully remote working across the …
front-end development in modern technologies like React, JavaScript, and Node.js. Expertise in backend development (e.g., MySQL, Java Spring Boot, Kubernetes, Docker, Kafka, Python). Proficiency in developing cloud-based, Dockerized applications and microservices. Experience with Agile methodologies/DevOps environments, including CI/CD pipelines. Basic understanding …
to build and deploy working prototypes GEOINT AI/ML pipelines Geospatial data processing Cloud technologies such as Kubernetes, Helm Data brokers such as Kafka and RabbitMQ Knative CI/CD pipelines and tooling (GitLab CI/CD, ArgoCD, CircleCI, Jenkins) RESTful APIs Database technologies (PostgreSQL, Redis, or other …
Desired Knowledge, Skills, and Abilities: • Experience with Python and bash scripting to automate tasks • Experience working with streaming data and automation systems such as Kafka, NiFi, RabbitMQ, or similar • Experience with deploying to container orchestration platforms such as Kubernetes, OpenShift, Rancher, and Docker Swarm • Experience working with S3-compliant …
of infrastructure automation tools such as Terraform, Ansible, CloudFormation, etc. Experience with data processing frameworks/tools/platforms such as Databricks, Apache Spark, Kafka, Flink, and AWS cloud services for batch and stream processing. Experience containerizing analytical models using Docker and Kubernetes or other container orchestration platforms.