Production experience with KSQL for real-time data analytics and ClickHouse for fast analytics on streaming data. Experience with stream processing tools like Apache Kafka, Apache Flink, or Apache Spark Structured Streaming. Strong experience with ETL/ELT tools (e.g., Airflow, ADF, Glue, NiFi) and orchestration. … maintain scalable data storage solutions with ClickHouse for fast analytics on streaming data. Design and implement high-throughput, low-latency streaming data pipelines using Apache Kafka. Oversee the development of stream processing applications using Apache Spark or Apache Flink. Implement real-time data transformations and analytics using more »
solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Knowledge of data orchestration tools (e.g., Prefect, Apache Airflow). Familiarity with CI/CD pipelines and DevOps practices. Familiarity with Infrastructure-as-Code tools (e.g., Terraform, AWS CDK). Experience with more »
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience working with process scheduling platforms like Apache Airflow. Open to working with GS proprietary technologies like Slang/SECDB. An understanding of compute resources and the ability to interpret performance metrics more »
Strong communication skills and the ability to work in a team. Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS Experience with Kubernetes deployment architectures Apache NiFi experience Experience building trading controls within an investment bank ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to more »
and Kafka. Technical Skills: Proficiency in data modelling (ERD, normalization) and data warehousing concepts. Strong understanding of ETL frameworks and tools (e.g., Talend, Informatica, Apache NiFi). Knowledge of programming languages such as SQL, Python, or Java. Experience with BI tools (e.g., Power BI, Tableau) and data visualisation best more »
team of developers globally. The platform is a greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace and Apache Ignite. The platform runs in a hybrid mode, both on-premises and in AWS, utilizing technologies such as EKS, S3 and FSx. The main purpose more »
Experience with data orchestration tools (e.g., Airflow, Prefect, Dagster) Knowledge of Postgres, GraphQL, and other data manipulation tools Familiarity with big data concepts, including Apache Iceberg and data lakes Experience with geospatial data formats (Parquet/GeoParquet, GeoJSON, Shapefiles) Proficiency in DevOps tools (Git, Docker, Jenkins) Understanding of networking more »
West Midlands, England, United Kingdom Hybrid / WFH Options
Atreides
and Python. Nice to have: Scala, Golang. Experience with Postgres, GraphQL, and other data manipulation tools. Knowledge of big data tools and environments, including Apache Iceberg and other data lake concepts. Familiarity with geospatial data formats such as Parquet/GeoParquet, GeoJSON, and Shapefiles. Experience with DevOps tools such as more »
Proficiency in data visualization libraries (Plotly, Seaborn). Solid understanding of database design principles and normalization. Experience with ETL tools and processes, and with Apache Oozie or similar workflow management tools. Understanding of Machine Learning and AI concepts is a plus. Leadership & Interpersonal Skills: Proven track record of managing more »
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
experience. Snowflake experience. Proficiency across an AWS tech stack. DevOps experience building and deploying using Terraform. Nice to Have: DBT, Data Modelling, Data Vault, Apache Airflow. Interviews are ongoing; don't miss your chance to secure a role working with cutting-edge technology while maintaining an exceptional work-life balance. Contact more »
Solid experience with Python or Java for data manipulation. Understanding of data modeling, data warehousing, and database optimization. Familiarity with tools like Apache Spark, Airflow, Kafka, or similar technologies. Strong problem-solving skills with a focus on data quality and performance. Experience working in Agile environments and more »
working in fintech or trading industries. Experience in object-oriented development with strong software engineering foundations. Experience with data-engineering cloud technologies such as Apache Airflow, K8s, ClickHouse, Snowflake, Redis, cache technologies, and Kafka. Experience with relational and non-relational DBs. Proficient in SQL and query optimization. Experience with more »
as code principles and tools. Cloud Experience: Experience with cloud platforms (AWS, Azure, GCP, etc.). Big Data: Experience with big data frameworks like Apache Spark. Preferred Qualifications: Automation Tools: Experience with automation tools like Jenkins, GitLab CI/CD, CircleCI, Terraform, Ansible, Puppet, or similar. Big Data and more »
West Midlands, England, United Kingdom Hybrid / WFH Options
Aubay UK
migration projects, particularly large-scale migrations to distributed database platforms. Hands-on experience with big data processing technologies, including Spark (PySpark and Spark Scala) and Apache Airflow. Expertise in distributed databases and computing environments. Familiarity with Enterprise Architecture methodologies, ideally TOGAF. Strong leadership experience, including managing technology teams and delivering more »
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
with Python and SQL, experience with version control and contributing to a shared codebase. You have experience with modern data tools and technologies including Apache Spark; Azure Databricks experience would also be great. You have strong knowledge of data management principles and best practices. You have experience with more »
in Python/Java, SQL and cloud data infrastructure (AWS, Azure or GCP). Familiarity with tools such as Kafka, Airflow, Apache Spark or similar technologies. HOW TO APPLY: Please register your interest by sending your CV to luke.frost@xcede.com for more info more »
with cloud services (e.g., AWS, Google Cloud, Azure) and data storage solutions (e.g., Redshift, BigQuery). Familiarity with data engineering tools and frameworks (e.g., Apache Spark, Kafka, Airflow). Strong problem-solving skills and a passion for working with large datasets. A degree in Computer Science, Data Engineering, Mathematics more »
Manchester, North West, United Kingdom Hybrid / WFH Options
Circle Group
source technologies, particularly PHP, JavaScript, jQuery and Linux. Knowledge of WebSockets, Redis Pub/Sub, and message brokers like RabbitMQ. Experience with LAMP (Linux, Apache, MySQL, PHP) infrastructure. Swoole or Ratchet for async PHP processing, and familiarity with event-driven architectures and RESTful API optimization would be a bonus more »
autonomously on business critical projects, and collaborate with others throughout our team and user base. Our tech stack includes TypeScript, Python, Node, WebAssembly, WebGL, Apache Arrow, DuckDB, Kubernetes and React. For the best possible user experience, we have developed various technologies in-house, including a custom WebGL rendering engine more »
part of the equation. Ideally, you will have extensive hands-on experience with the following: Data Architecture; Spark; Kafka; Databricks; SQL; Python; Delta; Orchestration (Apache Airflow/Prefect/Azure Data Factory/Glue); Storage and modelling (Data Lake, Data Warehouse, Lakehouse). Additional experience with the following would be more »
City Of Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes). Ability to create more »
FOR THE SENIOR SOFTWARE ENGINEER TO HAVE: Cloud-based experience. Microservice architecture or serverless architecture. Big Data/messaging technologies such as Apache NiFi/MiNiFi/Kafka. TO BE CONSIDERED: Please either apply by clicking online or email me directly. For further information please more »
level knowledge of Java. Expert-level experience with HTTP, RESTful web services and API design. Expert-level Gradle. Messaging technologies (Kafka). Experience with Apache Ignite or GridGain (highly beneficial). Experience with Reactive Streams. Advanced understanding of OAuth2, JWT, Spring Security. Desirable Skills: A good working knowledge of a more »
problems. Our Machine Learning tech stack includes: Python, ML libraries (TensorFlow, PyTorch, scikit-learn, transformers, XGBoost, ResNet), geospatial libraries (shapely, geopandas, rasterio), AWS, Postgres, Apache Airflow, Apache Kafka, Apache Spark. Mandatory requirements: You have at least 5 years of experience in a DS role, deploying models into more »