software development Experience with, or a strong passion to learn more about:
- Software development and testing in languages such as C#, Go, Java, C++, Python, TypeScript
- Containerization, DevOps, and cloud platforms such as Azure or AWS
- K8s provisioning, configuration, and operation
- Logging, monitoring, and observability tooling
- CI/CD best …
roles.
- AWS Proficiency: Strong understanding of AWS services (e.g., EC2, S3, Lambda, SageMaker, ECS) and cloud infrastructure management.
- Programming and ML Frameworks: Proficiency in Python and experience with ML frameworks such as scikit-learn, TensorFlow, or PyTorch.
- CI/CD Experience: Experience with CI/CD tools and practices (e.g. …
equivalent experience
- Five (5+) years of experience
- Extensive experience with automation tools such as Terraform, Chef, or Ansible
- Scripting experience with the following languages: Python, Ruby, Bash
- Experience with modern Git repo technologies (GitHub, Bitbucket, GitLab)
- Experience with CI/CD technologies (Jenkins, GitLab CI)
- Problem-solving skills and attitude …
various levels of technical and non-technical depth
- BS/MS, equivalent experience, or equivalent military experience required
- Scripting experience, including Python and Bash, plus familiarity with JSON and YAML
- 3-5 years of relevant experience with strong communication and customer service skills
Additional Information
The Team
Our technical support …
storage, and virtualization technologies
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their UNIX/Linux offerings
- Proficiency in scripting languages (e.g., Bash, Python, Perl)
- Familiarity with configuration management tools (e.g., Ansible, Puppet, Chef)
- Knowledge of container technologies (e.g., Docker, Kubernetes)
- Experience with monitoring tools (e.g., Nagios, Zabbix, Prometheus …
building and implementing data science and machine learning solutions to tackle business problems. Comfort with rapid prototyping and disciplined software development processes. Experience with Python, ML libraries (e.g. spaCy, NumPy, SciPy, Transformers), data tools and technologies (Spark, Hadoop, Hive, Redshift, SQL), and toolkits for ML and deep learning (SparkML …
the adoption of AI-driven strategies across BIC operations. Identify opportunities for AI integration to enhance BIC research processes. Use advanced analytics tools (e.g., Python, R, SQL, Tableau, Power BI) to analyse, visualise, and interpret data. Support efforts to continuously innovate and improve analytical processes using ML, ensuring the delivery …
the best results. Experience Needed/Required:
- A BSc (or advanced qualification) in Computer Science, or an equivalent level of experience.
- Proficiency in Python and some shell scripting experience.
- Familiarity with Linux containerization concepts.
- Willingness and ability to work to an Agile methodology.
- Experience with CI/…
date knowledge of security practices, standards, and protocols related to API development. Deep knowledge and use of at least one common programming language: e.g., Python, Scala, Java, including toolchains for documentation, testing, and operations/observability. Experience with API management platforms and tools (e.g., Apigee, AWS API Gateway, Postman).
build and maintain a system and culture that supports and implements SLOs. Familiarity with Docker & Kubernetes, specifically EKS & ECS. Familiarity with a programming language such as Python or C#/.NET. Additional Qualifications: Proven experience in monitoring, analyzing, and optimizing the performance of large-scale distributed systems in a cloud environment. Proven experience …
level, and we are looking for data engineers who have some of the skills below.
- Strong proficiency in Python
- Extensive experience with cloud platforms (AWS, GCP, or Azure)
- Experience with:
  - Data warehousing and lake architectures
  - SQL and NoSQL databases
  - Distributed computing frameworks (Spark, Kinesis …
performance computing frameworks (MPI, OpenMP, CUDA, Triton); cloud computing (on hyper-scaler platforms, e.g., AWS, Azure, GCP); building machine learning models and pipelines in Python, using common libraries and frameworks (e.g., NumPy, SciPy, Pandas, PyTorch, JAX), especially including deep learning applications; C/C++ for computer vision, geometry processing, or …
production. Understanding of data governance, security, and compliance frameworks, including GDPR and ISO 27001. Knowledge of data visualisation tools (e.g., Power BI, Tableau, Quicksight, Python visualisations) is beneficial. Soft Skills And Leadership Ability to engage C-level stakeholders, translating complex technical concepts into business value. Strong analytical and problem-solving …
metrics, data preprocessing, and feature engineering. Proven experience building and deploying RAG systems and/or LLM-powered applications in production environments. Proficiency in Python and ML libraries such as PyTorch, Hugging Face Transformers, or TensorFlow. Experience with vector search tools (e.g., FAISS, Pinecone, Weaviate) and retrieval frameworks (e.g., LangChain …
principles enabling you to build industry-leading products, we'd love to hear from you!
Requirements
- Proficiency in at least one programming language (e.g., Python, C#, JavaScript, Kotlin, Java, Go).
- Experience with Gen AI tools (e.g., LangGraph, CrewAI, Hugging Face, OpenAI APIs) and ML frameworks such as PyTorch/…
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Leonardo
in secure environments. Experience in managing large-scale, real-time data pipelines and ensuring their performance and reliability. Strong scripting and programming skills in Python, Bash, or other relevant languages. Working knowledge of cloud platforms (AWS, Azure, GCP) with a focus on data security and infrastructure as code. Excellent communication …
Employment Type: Permanent, Part Time, Work From Home
Nice to have:
- Bachelor's or advanced degree in Computer Science, Engineering, Mathematics, natural sciences, or a related field.
- Basic development experience in Java, Python, Go, or C#
- Prior experience using Git
- Exposure to hybrid cloud/on-prem environments, High Performance Computing (HPC), and job schedulers (e.g. Slurm …
platforms (AWS, Google Cloud, Azure) and experience with microservices, containerisation (Docker, Kubernetes), DevOps, and CI/CD pipelines. You have hands-on experience with Python, React, TypeScript, and Kubernetes. Excellent problem-solving skills, with the ability to lead, inspire, and motivate a team. Strong communication and leadership skills, with the …
for managing cloud resources. Experience with setting up and managing GPU-accelerated environments for large-scale model inference. Programming & Frameworks: Strong programming skills in Python and good knowledge of Scala and/or Java. Experience with ML libraries. Solid understanding of distributed processing systems like Spark. Experience building production-grade …
be beneficial. Experience of using agile delivery tools such as JIRA, Pivotal, Collab, Confluence. Experience of engineering based on the likes of SQL, SSIS, Python, Java, Scala, XML/FpML and Power BI. Data architecture, data lineage including an understanding of AI. Testing/quality engineering; experience of test automation …
Working within the client's test and government framework, you will be responsible for creating the automation framework to support all data warehousing progression and regression testing, and for fixing defects.
Responsibilities
- Analyse and document test data, results, and recommendations.