maintaining data pipelines. Proficiency in JVM-based languages (Java, Kotlin), ideally combined with Python and experience in Spring Boot. Solid understanding of data engineering tools and frameworks such as Spark, Flink, Kafka, dbt, Trino, and Airflow. Hands-on experience with cloud environments (AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL and …
e.g., Hadoop, Spark). Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow, etc. Good knowledge of stream and batch processing solutions such as Apache Flink and Apache Kafka. Good knowledge of log management, monitoring, and analytics solutions like Splunk, Elastic Stack, New Relic, etc. Given that this is just a short snapshot of the role …
in data engineering, data architecture, or a similar role, with at least 3 years in a lead capacity. Proficient in SQL, Python, and big data processing frameworks (e.g., Spark, Flink). Strong experience with cloud platforms (AWS, Azure, GCP) and related data services. Hands-on experience with data warehousing tools (e.g., Snowflake, Redshift, BigQuery), Databricks running on multiple cloud …
technologies (e.g., Hadoop, Spark). Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow. Good knowledge of stream and batch processing solutions such as Apache Flink and Apache Kafka. Good knowledge of log management, monitoring, and analytics solutions like Splunk, Elastic Stack, New Relic. Given that this …
Luton, England, United Kingdom Hybrid/Remote Options
easyJet
CloudFormation. Understanding of ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data lineage frameworks like Great Expectations …
and guide implementation teams • Deep understanding of Kafka internals, KRaft architecture, and Confluent components • Experience with Confluent Cloud, Stream Governance, Data Lineage, and RBAC • Expertise in stream processing (Apache Flink, Kafka Streams, ksqlDB) and event-driven architecture • Strong proficiency in Java, Python, or Scala • Proven ability to integrate Kafka with enterprise systems (databases, APIs, microservices) • Hands-on experience with …
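To illustrate the kind of stream-processing expertise this listing calls for, here is a minimal sketch using PyFlink (Flink's Python API). The event values and job name are hypothetical, and a production job would read from and write to Kafka rather than an in-memory collection:

```python
from pyflink.datastream import StreamExecutionEnvironment

# Local environment for the sketch; a real job would attach Kafka
# sources/sinks and run on a Flink cluster.
env = StreamExecutionEnvironment.get_execution_environment()
env.set_parallelism(1)

# (region, order_amount) events standing in for a Kafka topic.
orders = env.from_collection([("eu", 42.0), ("us", 10.5), ("eu", 7.25)])

# Keyed running aggregation: cumulative revenue per region.
(orders
    .key_by(lambda event: event[0])
    .reduce(lambda a, b: (a[0], a[1] + b[1]))
    .print())

env.execute("running_revenue_per_region")
```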
Kinesis) Knowledge of IaC (Terraform, CloudFormation) and containerisation (Docker, Kubernetes) Nice to have: Experience with dbt, feature stores, or ML pipeline tooling Familiarity with Elasticsearch or real-time analytics (Flink, Materialize) Exposure to eCommerce, marketplace, or transactional environments
Azure and distributed systems. Preferred Skills Kubernetes & Helm: Deploying and managing containerized applications at scale with reliability and fault tolerance. Kafka (Confluent): Familiarity with event-driven architectures; experience with Flink or KSQL is a plus. Airflow: Experience configuring, maintaining, and optimizing DAGs. Energy or commodity trading: Understanding the data challenges and workflows in this sector. Trading domain knowledge: Awareness …
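For the Airflow DAG work mentioned in this listing, a minimal sketch of a two-task DAG, assuming Airflow 2.x; the DAG id, task names, and callables are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw trades from an upstream system.
    print("extracting trades")


def load():
    # Placeholder: write curated data to the warehouse.
    print("loading curated trades")


with DAG(
    dag_id="curated_trades",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```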
Strong experience working with SQL and databases/engines such as MySQL, PostgreSQL, SQL Server, Snowflake, Redshift, Presto, etc. Experience building ETL and stream processing pipelines using Kafka, Spark, Flink, Airflow/Prefect, etc. Familiarity with data science stack: e.g. Jupyter, Pandas, Scikit-learn, Dask, PyTorch, MLflow, Kubeflow, etc. Strong experience using AWS/Google Cloud Platform (S3S …
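To make the ETL pipeline experience above concrete, a minimal PySpark batch sketch; the bucket paths and column names are hypothetical assumptions:

```python
from pyspark.sql import SparkSession, functions as F

# Local session for the sketch; a real pipeline would run on a cluster.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw order events (path and schema are hypothetical).
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: daily revenue per country.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write a partitioned Parquet table for downstream consumers.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)

spark.stop()
```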
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
Vivedia Ltd
pipelines, data modeling, and data warehousing. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and …
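Since this listing names Prefect alongside Airflow, here is a minimal Prefect 2.x sketch of a two-task flow; the flow name, task bodies, and data are hypothetical:

```python
from prefect import flow, task  # Prefect 2.x API


@task
def extract() -> list[dict]:
    # Placeholder for pulling rows from a source system.
    return [{"order_id": 1, "amount": 42.0}]


@task
def transform(rows: list[dict]) -> float:
    # Toy transformation: total revenue across the extracted rows.
    return sum(row["amount"] for row in rows)


@flow(name="daily-revenue")  # hypothetical flow name
def daily_revenue_flow() -> None:
    rows = extract()
    total = transform(rows)
    print(f"total revenue: {total}")


if __name__ == "__main__":
    daily_revenue_flow()
```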
/React.js, or Node.js 2+ years of experience working with big data technologies (e.g. Hadoop, Spark, Presto) 2+ years of experience working on streaming data applications (e.g. Kafka, Kinesis, Flink, or Spark Streaming) 4+ years of experience in open source frameworks At this time, Capital One will not sponsor a new applicant for employment authorization, or offer any immigration …
City Of London, England, United Kingdom Hybrid/Remote Options
Bondaval
or similar) from a good university highly desirable. Nice to Have: Familiarity with message brokers (Kafka, SQS/SNS, RabbitMQ). Knowledge of real-time streaming (Kafka Streams, Apache Flink, etc.). Exposure to big-data or machine-learning frameworks (TensorFlow, PyTorch, Hugging Face, LangChain). Experience with real-time streaming technologies (Kafka, Apache Storm). Understanding of infrastructure …
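As a small illustration of the message-broker familiarity above, a sketch of a Kafka consumer using the kafka-python client; the topic name, broker address, and consumer group are hypothetical:

```python
import json

from kafka import KafkaConsumer  # kafka-python client

# Topic, broker address, and group id are hypothetical.
consumer = KafkaConsumer(
    "policy-events",
    bootstrap_servers="localhost:9092",
    group_id="risk-service",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Each message is handled as it arrives; a real service would add
# error handling, retries, and a dead-letter topic.
for message in consumer:
    event = message.value
    print(f"partition={message.partition} offset={message.offset} event={event}")
```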
contract definition, clean code, CI/CD, path to production. Worked with AWS as a cloud platform. Extensive hands-on experience with modern data technologies: ETL tools (e.g. Kafka, Flink, dbt, etc.), data storage (e.g. Snowflake, Redshift, etc.) and IaC (e.g. Terraform, CloudFormation). Software development experience with one or more languages (e.g. Python, Java, Scala, Go). Pragmatic approach …
Java, Scala). Experience working in financial services or large enterprise environments. Demonstrated ability to lead distributed engineering teams effectively. Deep understanding of data architecture, streaming technologies (e.g., Kafka, Flink), and cloud platforms (e.g., AWS, Azure, GCP). Excellent communication and stakeholder management skills. ABOUT CAPGEMINI Capgemini is a global business and technology transformation partner, helping organizations to accelerate …
to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and systems to ensure team alignment and knowledge sharing. Your Qualifications Experience: Professional experience … pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge: Exposure to frontend frameworks like React …
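To illustrate the asynchronous-programming familiarity this listing mentions, a minimal asyncio sketch using only the standard library; the record-fetching function is a hypothetical stand-in for real I/O such as a database query or HTTP call:

```python
import asyncio


async def fetch_record(record_id: int) -> dict:
    # Stand-in for an I/O-bound call (database query, HTTP request, etc.).
    await asyncio.sleep(0.1)
    return {"id": record_id, "status": "ok"}


async def main() -> None:
    # Run the I/O-bound calls concurrently instead of one after another.
    results = await asyncio.gather(*(fetch_record(i) for i in range(5)))
    print(results)


if __name__ == "__main__":
    asyncio.run(main())
```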
about complex problems at high scale. Ability to work collaboratively in a team environment and communicate effectively with other teams across Cloudflare. Experience with data streaming technologies (e.g., Kafka, Flink) is a strong plus. Experience with various logging platforms or SIEMs (e.g., Splunk, Datadog, Sumo Logic) and storage destinations (e.g., S3, R2, GCS) is a plus. Experience with Infrastructure …