and CI/CD processes. Experience working with a large-scale legacy software system. Experience with tools such as Confluence, Eclipse, Jira, Jenkins, JUnit, Kafka, and Spring Boot. Clearance Level: TS/SCI. This position requires an active DoD clearance (Secret, Top Secret, or Top Secret/SCI) or the …
technologies and distributed systems such as Hadoop, HDFS, Hive, Spark, Databricks, and Cloudera. Experience developing near real-time event-streaming pipelines with tools such as Kafka, Spark Streaming, and Azure Event Hubs. Excellent experience in the data engineering lifecycle; you will have created data pipelines which take data through all layers …
to adapt to schedule changes as needed. Must be able to travel CONUS up to 25% of the time. Preferred Requirements: Experience with NiFi, Kafka, AWS infrastructure, and K8s. Experience with cloud-based technologies (AWS, Azure). Experience with distributed databases, NoSQL databases, and full-text search engines (e.g. …
the ELK stack. Set up and manage the CI/CD pipeline using Bitbucket, Maven, Terraform, Jenkins, Ansible/Packer, and Kustomize. Work with Kafka and SQS for queuing solutions and implement scheduling using Jenkins/Ansible. Use a combination of Cucumber, JUnit, Selenium, and Postman for comprehensive testing. Qualifications: …
Belfast, City of Belfast, County Antrim, United Kingdom Hybrid / WFH Options
Aspire Personnel Ltd
the ELK stack. Set up and manage the CI/CD pipeline using Bitbucket, Maven, Terraform, Jenkins, Ansible/Packer, and Kustomize. Work with Kafka and SQS for queuing solutions and implement scheduling using Jenkins/Ansible. Use a combination of Cucumber, JUnit, Selenium, and Postman for comprehensive testing. Qualifications: Minimum …
in security. Tools that identify and track security vulnerabilities, such as Fortify and Black Duck. 3 years of experience with Docker/Kubernetes. Exposure to Kafka/IBM MQ. 2 years of experience working with Oracle Cloud; certification preferred. Education/Certifications: Please indicate whether education and/…
technologies and distributed systems such as Hadoop, HDFS, Hive, Spark, Databricks, and Cloudera. Experience developing near real-time event-streaming pipelines with tools such as Kafka, Spark Streaming, and Azure Event Hubs. Excellent experience in the data engineering lifecycle, having created data pipelines which take data through all layers from generation …
Curious and excited to learn new technologies. Experience with deployment technologies (Kubernetes (K8s), Helm, Istio, Rancher). Experience using open-source technologies (NiFi, GeoServer, Kafka, Grafana, Loki, Prometheus, etc.). Understanding of AWS services and cloud environments. Knowledge of CI/CD pipelines and tooling (GitLab CI/CD …
4+ years of visual/UI testing and REST Assured/OpenAPI testing. Experience testing event-driven services, message queues, and event brokers (Kafka). Proficient in performance-testing tools (JMeter, k6, NeoLoad, LoadRunner). Skilled in writing SQL/NoSQL queries for data verification. In-depth knowledge of Selenium …
unit testing, system testing, integration, security, and performance testing. Relentless focus on delivering business value through sound engineering methods and principles. Working experience with Kafka for event-driven communication between services. Experience with cloud applications. Docker-based container development and deployment on Kubernetes. Understanding of CI/CD methodologies …
Chantilly, Virginia, United States Hybrid / WFH Options
Edgesource
standing of big-data search tools (Airflow, PySpark, Trino, OpenSearch, Elastic, etc.). Desired Qualifications (NOT REQUIRED): Experience with Docker, Jenkins, Hadoop/Spark, Kibana, Kafka, NiFi, and/or Elasticsearch. Working at Edgesource: As an ISO 9001:2015 certified and CMMI Level 3 appraised small business, Edgesource specializes in …
Qualifications, Capabilities and Skills: Understanding of distributed systems and microservices architecture. Understanding of cloud technologies (AWS, GCP, Azure, etc.). Understanding of messaging frameworks (Kafka, RabbitMQ, etc.). Experience automating deployment, releases, and testing in continuous integration/continuous delivery pipelines. About Us: J.P. Morgan is a global leader …
pipelines for data applications and infrastructure. Expertise with cloud platforms such as AWS, GCP, or Azure (preferably AWS). Experience with dbt for data transformations and Kafka (or other streaming technologies) is a strong plus. Proficiency in data modeling and designing scalable data architectures to support analytics and operational use cases …
public cloud, including AWS, Microsoft Azure, or Google Cloud. Experience with distributed data and computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka. Experience working on real-time data and streaming applications. Experience with UNIX and Linux, including basic commands and shell scripting. Experience with NoSQL implementation …
public cloud, including AWS, Microsoft Azure, or Google Cloud. Experience with distributed data and computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka. Experience working on real-time data and streaming applications. Experience with NoSQL implementation, including MongoDB or Cassandra. Experience with data warehousing using AWS Redshift …
s degree Nice If You Have: 5+ years of experience with distributed data and computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka. 5+ years of experience working on real-time data and streaming applications. 5+ years of experience with NoSQL implementation, including MongoDB or Cassandra. 5+ …
Integration. Experience with NoSQL databases, including MongoDB. Experience with containerization technologies such as Docker and containerd. Familiar with messaging frameworks such as RabbitMQ and Kafka. Familiar with CI/CD principles, methodologies, and tools such as Bamboo and GitLab CI. Familiar with IaC principles, methodologies, and tools such as …
Systems and Virtualisation (Windows and Linux). Infrastructure as Code and Operational Automation (e.g. Terraform, Ansible). Message Queueing and Streaming Fabrics (e.g. AMQP, Kafka, Kinesis). Docker and Kubernetes. Scripting (Shell and PowerShell). Basic Coding with a bias for Infrastructure (Python, Go, C#). IAM Policy and …
similar. Key Technologies We Use (not necessarily required for the role): Google Cloud, Google Cloud Composer, BigQuery, Spark, Solr, Elasticsearch, Druid, PostgreSQL, ScyllaDB, Redis, Kafka, Flink, Docker, Kubernetes, Kibana, Jenkins, Prometheus, Grafana, GitHub, C++, Python, Scala, Compiler Explorer. What Blis Can Offer: We want you to be well and …
of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 4+ years of experience working on real-time data and streaming applications. 4+ years of experience with NoSQL implementation (Mongo, Cassandra …