Azure SQL Data Warehouse, Azure Data Lake, AWS S3, AWS RDS, AWS Lambda or similar. Have experience with open-source big data products, e.g. Hadoop, Hive, Pig, Impala or similar. Have experience with open-source non-relational or NoSQL data repositories such as MongoDB, Cassandra, Neo4j or similar.
Power BI and Sigma. • Experience with programming languages such as Python, R, and/or Julia. • Familiarity with data processing frameworks like Spark or Hadoop is a plus. • Solid understanding of statistical analysis techniques, data mining methods, and machine learning algorithms. • Strong analytical and problem-solving skills.
with a statistical programming language (Python or R) and experience with libraries specifically for Machine Learning or Data Analytics; knowledge of Hadoop/MapReduce is a plus. Hands-on experience building and delivering large-scale enterprise systems/products. Experience attracting, hiring and retaining top engineering talent.
and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau).
Analytics. Strong SQL and Python skills. Experience with data modeling, ETL processes, and data warehousing. Knowledge of big data technologies such as Spark and Hadoop is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Experience in the healthcare sector is a plus.
programming language, ideally Python but can also be Java or C/C++. SQL experience. Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau). Get in touch with Ella Alcott - Ella@engagewithus.com
Surrey, England, United Kingdom Hybrid / WFH Options
The JM Longbridge Group
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications: Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience).
PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. - Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications: - Bachelor's Degree in Computer Science or Engineering. - Experience with cloud technologies, particularly Azure and AWS.
Employment Type: Permanent
Salary: £52000 - £62000/annum Bonus + Full Benefits
industry experience. Experience in distributed system design. Experience with Pure/Alloy. Working knowledge of open-source tools such as AWS Lambda, Prometheus, Spark; Hadoop or Snowflake knowledge would be a plus. Additional Information Location: This role can be delivered on a hybrid basis from one of these offices: Dublin
phases of projects through prototyping, architectural design and delivery. You will be working with Azure tools such as Databricks and Data Factory, as well as Hadoop, to create big data environments which, in turn, will help businesses gain greater insight into their big data repositories. RESPONSIBILITIES Working on projects
About the role: A Payments FinTech is currently seeking a Data Engineering Lead (Python, Hadoop & SQL) to lead and mentor a talented team of data engineers and scientists as they look to simplify the bank by developing innovative data-driven solutions, allowing them to be commercially successful through insight.
in Computer Science, Mathematics, Statistics or a similar engineering discipline. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop and Splunk. Data manipulation using tools like MapReduce or SQL. Experience with network security products and solutions. HTML, CSS and JavaScript experience. Python.
and analytical abilities. Preferred Skills: Experience with cloud databases (e.g., AWS, Azure, Google Cloud Platform). Knowledge of big data tools and frameworks (e.g., Hadoop, Spark). Certification in database management (e.g., Microsoft Certified: Azure Data Engineer Associate). What We Offer: Competitive salary and comprehensive benefits package.
on experience with analytic tools like R and Python, and visualization tools like Tableau and Power BI. Exposure to cloud platforms and big data systems such as Hadoop HDFS and Hive is a plus. Ability to work with IT and Data Engineering teams to help embed analytic outputs in business processes.
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases • Programming languages such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits: • Base Salary: £45,000 - £75,000 (DoE) • Discretionary Bonus: Circa 10% per annum • DV Bonus: Circa £5,000 • Flex Fund: £5,000 • Health: Private
experience. Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift). Experience of analytics technologies (Spark, Hadoop, Kafka). Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts.
scaling, and troubleshooting of cloud systems. - Operational experience running a 24x7 production infrastructure at scale. - Proficiency working with data structures, schemas, and technologies like Hadoop, Hive, Redis, and MySQL. - Experience in using cloud-native services like GKE, EKS, AWS/GCP load balancing, and AWS/GCP cloud storage platforms
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Bright Purple
have: A passion for manipulation and visualization of data. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop/Splunk. Experience with network security products and solutions. Experience with Python, HTML, CSS and JavaScript.