Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bright Purple
have: A passion for data manipulation and visualization. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop/Splunk. Experience with network security products and solutions. Ability to work with Python, HTML, CSS and JavaScript.
in Computer Science, Mathematics, Statistics or a similar engineering discipline. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop, Splunk. Data manipulation using tools like MapReduce or SQL. Experience with network security products and solutions. HTML, CSS and JavaScript experience. Python.
manage multiple tasks and projects simultaneously. Preferred Qualifications: AWS Certified Solutions Architect or other relevant AWS certifications. Experience with big data technologies such as Hadoop, Spark, or similar. Knowledge of data governance and data quality best practices. Familiarity with machine learning and AI concepts and tools.
Birmingham, England, United Kingdom Hybrid / WFH Options
⭕️ Nimbus®
such as Python, C#, .NET and/or JavaScript is highly desirable. Experience with cloud platforms (e.g., Azure) and data technologies (e.g., SQL, NoSQL, Hadoop, Spark). PLEASE NOTE: You must have either UK citizenship or permanent leave to remain in the UK.
ideas • Ability to set the direction and deliver on a vision, with forward planning to achieve results • Technical knowledge of big data platforms (e.g., Hadoop and Hive), as well as knowledge of ML, data science and advanced modelling techniques, technologies, and programming languages
following: .NET (VB, C#, ASP.NET, .NET Core) MVC Framework Python JavaScript (React, Bootstrap frameworks) Database design SQL/SQL Server NoSQL technologies, e.g., MongoDB, Hadoop, etc. Your Experience: If you're the right person for the role, you'll bring experience of working on a range of applications.
Data Analytics stack (IS, AS, RS) Power BI, DAX MDS Azure Data Lakes Supporting: Azure ML .NET/HTML5 Azure infrastructure R, Python PowerShell Hadoop, Data Factory Principles: Data Modelling Data Warehouse Theory Data Architecture Master Data Management Data Science WHY ADATIS? There's a long list of reasons.
following: .NET (VB, C#, ASP.NET, .NET Core) MVC Framework Python JavaScript (React, Bootstrap frameworks) Database design SQL/SQL Server NoSQL technologies, e.g., MongoDB, Hadoop, etc. If you're the right person for the role, you'll bring experience of working on a range of applications across the development lifecycle.
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed).
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure. Programming language experience, e.g. Java, Python, Node.js or SQL. Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop. If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed).
warehousing. Min 7 yrs with Python. Big Data & data lake solutions: PostgreSQL, ClickHouse or Snowflake etc. Cloud infrastructure (AWS services). Data processing pipelines using Kafka, Hadoop, Hive, Storm, or ZooKeeper. Hands-on team leadership. The Reward: Joining a fast-growth, successful blockchain business. The role offers fully remote work.
Sittingbourne, Kent, South East, United Kingdom Hybrid / WFH Options
Southern Housing
requirements into system functional and non-functional requirements. Experience with popular database programming languages including SQL and PL/SQL, possibly extending to NoSQL/Hadoop-oriented and non-relational databases. In your supporting statement, it is important that you address how you meet each of the above criteria.
the space of data and AI technologies and business scenarios. Strong understanding of cutting-edge and legacy Big Data and AI technologies such as Hadoop, Spark, OpenAI and Claude, as well as architectures and domains such as Computer Vision, NLP, Neural Networks, Machine Learning, Generative AI, and Data Warehousing.
Experience of Data Lake/Hadoop platform implementation. Hands-on experience in implementation and performance tuning of Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro). Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code in one or more programming languages (Java, Python, etc.). Preferred Qualifications: Master's or PhD in Computer Science, Physics, Engineering or Maths. Hands-on experience leading large-scale global data warehousing and analytics projects.
instrumental in ensuring the security and efficiency of the data handling and reporting processes. Key Responsibilities: Data Processing: Utilize Apache Spark, AWS RDS, and Hadoop to process large datasets efficiently and securely. Reporting: Generate comprehensive and insightful reports using Tableau. Business Rules Management: Implement and manage business rules.
EC1N, Farringdon Without, Greater London, United Kingdom
Damia Group Ltd
your key responsibilities will be to: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. Skills/Experience Required of the SC Cleared DevSecOps Engineer: Strong operational procedures knowledge.
Employment Type: Permanent
Salary: £50,000 - £65,000/annum, 15% cash flex and 10% bonus
Red Hat Decision Central Key Responsibilities: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central.
HAVE SC CLEARANCE a. Overall 5-7 years of IT experience b. Strong experience with DevOps principles c. Strong experience in Spark, Tableau, Hadoop, PL/SQL d. Good experience working with the AWS platform e. Good exposure to ITIL processes, including incident, problem and change management etc.
reliability and availability of the company's software products. Data Processing Pipelines: You'll design and implement data processing pipelines using technologies like Kafka, Hadoop, Hive, Storm, or ZooKeeper, enabling real-time and batch processing of data from the blockchain. Hands-on Team Leadership.
Data Lake/Hadoop platform implementation. Good level of hands-on experience in implementation and performance tuning of Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro). Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code in one or more programming languages (Java, Python, etc.). Preferred Qualifications: Master's or PhD in Computer Science, Physics, Engineering or Maths. Hands-on experience leading large-scale global data warehousing and analytics projects.
Experience with relational databases like MySQL, PostgreSQL or others, and experience with NoSQL like Redis, MongoDB or others; * Experience with Big Data technologies like the Hadoop ecosystem is a plus. * Excellent writing, proofreading and editing skills. Able to create documentation that expresses cloud architectures using text.
Key Skills: 3+ years of Python experience. Highly statistical and analytical. Exposure to Google Cloud Platform (BigQuery, GCS, Datalab, Dataproc; Cloud ML desirable). Spark & Hadoop experience. Strong communication skills. Good problem-solving skills. Qualifications: Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering) … and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau). This is a permanent position, and offers flexibility with hybrid working, 2-3 days per week in the office, depending on workload.
3 Development resources (London) with experience in designing and building platforms, and supporting applications both in cloud environments and on-premises. These resources are expected to be open-source contributors to Apache projects.