with Big Data technologies (Dask, Spark), Docker, Kubernetes, Helm, and Terraform. Strong SQL skills, experience with data warehousing and relational/NoSQL databases (Impala, Hive, Delta, Databricks, Postgres, Mongo). Experience creating pipelines supporting ML/statistical algorithms and working with data scientists. Familiarity with message queues (Kafka, RabbitMQ more »
ensure code reliability through testing. Mentor junior engineers and contribute to continuous learning within the team. Technical Stack: Frontend: React.js, Redux; Backend: Python; Databases: Hive, MongoDB, SQL Server; ETL Pipelines: Airflow, Spark, dbt; Other: Docker, Git, test-driven development. Requirements: 5+ years of full-stack development experience in Python more »
TDD, source management, continuous integration, and releasing software to live environments. Knowledge of large-scale data processing technology, for example: Arrow, Hadoop, Cassandra, Spark, Hive, or Databricks. Strong experience of batch ETL, data warehousing, database querying and building secure and scalable applications in a cloud-based environment, preferably AWS more »
environments. Knowledge of data science, ML and AI tools (e.g. Jupyter, spaCy, Transformers). Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch. Experience with business intelligence and visualization tools like Tableau, Kibana, and Power BI. Collaboration skills. TO BE CONSIDERED…. Please either apply more »
of an agile team of developers in credit risk regulatory and compliance data delivery projects. Additional relevant technical skills such as SQL skills in Hive, Impala, and Teradata, and experience in AWS or other cloud platforms are also highly valued. This role will be based in our Northampton office. more »
works: Microservices, Event Driven Architecture, Azure API Management * Experience with working in Agile/Lean environments * Proficient in a variety of database technologies, e.g. Hive, Impala, NoSQL, Oracle * Experience using data visualisation tools such as PowerBI * Experience of working with industry models for insurance or banking Insurance Data Architect more »
this role, such as development experience in Ab Initio, with the full software development lifecycle (SDLC) in Hadoop & Cloud and a working knowledge of Teradata, Hive, and Impala databases. Additional relevant skills in application design, development, and implementation of DW solutions using Ab Initio are highly valued. This role will be more »
and controls to ensure and maintain optimal platform performance and workload management. Leverage the latest capabilities from an evolving technology stack – NiFi/Kafka, Hive/Impala, Ranger/Atlas, HBase/Phoenix, Terraform/IaC. Guiding the implementation of solutions on CDP in the cloud (AWS/Azure more »
Camberley, Surrey, United Kingdom Hybrid / WFH Options
Venus Recruitment Ltd
hands-on data-driven work. Strong communication and teamworking abilities. Good numerical skills. Advanced Excel and Power BI. Proficiency using query languages such as SQL and Hive. Scripting and programming skills. Benefits include: 25 days paid holiday plus bank holidays, hybrid working, Group Pension Scheme, Life Assurance 3 x basic salary more »
the following additional languages: Java, C#, C++, Scala. Familiarity with Big Data technology in cloud and on-premises environments: Hadoop, HDFS, Spark, NoSQL databases, Hive, MongoDB, Airflow, Kafka, AWS, Azure, Docker or Snowflake. Good understanding of object-oriented programming (OOP) principles & concepts. Familiarity with advanced SQL techniques. Familiarity with … data visualization tools such as Tableau or Power BI. Familiarity with Apache Flink or Apache Storm. Understanding of DevOps practices and tools for CI/CD pipelines. Awareness of data security best practices and compliance requirements (e.g., GDPR, HIPAA). To Qualify: You should be willing to relocate more »