…roles. AWS Proficiency: Strong understanding of AWS services (e.g., EC2, S3, Lambda, SageMaker, ECS) and cloud infrastructure management. Programming and ML Frameworks: Proficiency in Python and experience with ML frameworks such as scikit-learn, TensorFlow, or PyTorch. CI/CD Experience: Experience with CI/CD tools and practices (e.g., …)
…equivalent experience. Five (5+) years of experience. Extensive experience with automation tools such as Terraform, Chef, or Ansible. Scripting experience with the following languages: Python, Ruby, Bash. Experience with modern Git repository technologies (GitHub, Bitbucket, GitLab). Experience with CI/CD technologies (Jenkins, GitLab CI). Problem-solving skills and attitude. …
…various levels of technical and non-technical depth. BS/MS, equivalent experience, or equivalent military experience required. Experience with scripting, including Python, JSON, YAML, and Bash, plus 3-5 years of relevant experience with strong communication and customer service skills. Additional Information: The Team. Our technical support …
…storage, and virtualization technologies. Experience with cloud platforms (e.g., AWS, Azure, GCP) and their UNIX/Linux offerings. Proficiency in scripting languages (e.g., Bash, Python, Perl). Familiarity with configuration management tools (e.g., Ansible, Puppet, Chef). Knowledge of container technologies (e.g., Docker, Kubernetes). Experience with monitoring tools (e.g., Nagios, Zabbix, Prometheus, …)
…building and implementing data science and machine learning solutions to tackle business problems. Comfort with rapid prototyping and disciplined software development processes. Experience with Python, ML libraries (e.g., spaCy, NumPy, SciPy, Transformers), data tools and technologies (Spark, Hadoop, Hive, Redshift, SQL), and toolkits for ML and deep learning (SparkML, …)
…the adoption of AI-driven strategies across BIC operations. Identify opportunities for AI integration to enhance BIC research processes. Use advanced analytics tools (e.g., Python, R, SQL, Tableau, Power BI) to analyse, visualise, and interpret data. Support efforts to continuously innovate and improve analytical processes using ML, ensuring the delivery …
…the best results. Experience Needed/Required: A BSc in Computer Science (or an equivalent advanced qualification) or an equivalent level of experience. Proficiency in Python and some shell scripting experience. Familiarity with Linux containerisation concepts. Willingness and ability to work to an Agile methodology. Experience with CI/…
…date knowledge of security practices, standards, and protocols related to API development. Deep knowledge and use of at least one common programming language (e.g., Python, Scala, Java), including toolchains for documentation, testing, and operations/observability. Experience with API management platforms and tools (e.g., Apigee, AWS API Gateway, Postman). …
…level, and we are looking for data engineers who have a variety of skills, including some of the below: strong proficiency in Python; extensive experience with cloud platforms (AWS, GCP, or Azure); experience with data warehousing and lake architectures, SQL and NoSQL databases, and distributed computing frameworks (Spark, Kinesis, …)
…performance computing frameworks (MPI, OpenMP, CUDA, Triton); cloud computing on hyperscaler platforms (e.g., AWS, Azure, GCP); building machine learning models and pipelines in Python using common libraries and frameworks (e.g., NumPy, SciPy, Pandas, PyTorch, JAX), especially for deep learning applications; C/C++ for computer vision, geometry processing, or …
…Private Cloud). Experience working with containerisation technologies (Docker, Kubernetes). Experience working with microservice, analytics, and messaging architectures (Quantexa, Kafka, Node.js/Python microservices). Experience working in CI/CD and DevOps environments. Experience with relational and non-relational databases. Recent experience programming in one or more …
…production. Understanding of data governance, security, and compliance frameworks, including GDPR and ISO 27001. Knowledge of data visualisation tools (e.g., Power BI, Tableau, QuickSight, Python visualisations) is beneficial. Soft Skills and Leadership: Ability to engage C-level stakeholders, translating complex technical concepts into business value. Strong analytical and problem-solving …
…principles enabling you to build industry-leading products, we'd love to hear from you! Requirements: Proficiency in at least one programming language (e.g., Python, C#, JavaScript, Kotlin, Java, Go). Experience with Gen AI tools (e.g., LangGraph, CrewAI, Hugging Face, OpenAI APIs) and ML frameworks such as PyTorch/…
Leonardo (Bristol, Avon, South West, United Kingdom; Hybrid/WFH options)
…in secure environments. Experience in managing large-scale, real-time data pipelines and ensuring their performance and reliability. Strong scripting and programming skills in Python, Bash, or other relevant languages. Working knowledge of cloud platforms (AWS, Azure, GCP) with a focus on data security and infrastructure as code. Excellent communication …
Employment Type: Permanent, Part Time, Work From Home
Nice to have: Bachelor's or advanced degree in Computer Science, Engineering, Mathematics, natural sciences, or a related field. Basic development experience in Java, Python, Go, or C#. Prior experience using Git. Exposure to hybrid cloud/on-prem environments, High-Performance Computing (HPC), and job schedulers (e.g., Slurm, …)
…platforms (AWS, Google Cloud, Azure) and experience with microservices, containerisation (Docker, Kubernetes), DevOps, and CI/CD pipelines. You have hands-on experience with Python, React, TypeScript, and Kubernetes. Excellent problem-solving skills, with the ability to lead, inspire, and motivate a team. Strong communication and leadership skills, with the …
…for managing cloud resources. Experience setting up and managing GPU-accelerated environments for large-scale model inference. Programming & Frameworks: Strong programming skills in Python and good knowledge of Scala and/or Java. Experience with ML libraries. Solid understanding of distributed processing systems such as Spark. Experience building production-grade …
…be beneficial. Experience using agile delivery tools such as JIRA, Pivotal, Collab, and Confluence. Engineering experience with the likes of SQL, SSIS, Python, Java, Scala, XML/FpML, and Power BI. Data architecture and data lineage, including an understanding of AI. Testing/quality engineering: experience with test automation …
Working within the client's test and government framework, you will be responsible for creating the automation framework to support all data warehousing progression and regression testing and for fixing defects. Responsibilities: Analyse and document test data, results, and recommendations. …
…of experience as a Data Scientist or Data Analyst. Experience in data mining. Understanding of machine learning and operations research. Knowledge of SQL and Python; familiarity with Python ML frameworks is a plus; familiarity with Java is a big plus. Experience using the ELK Stack (Elasticsearch, Logstash, Kibana). Experience using business …