Newcastle upon Tyne, Tyne & Wear, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
world of AI/ML. Key experience desired/what you will learn: Design, develop, and maintain scalable data pipelines and ETL processes leveraging GCP, AWS or Azure services. Data Platforms: Databricks, Redshift, Snowflake or BigQuery. SQL or Python development. Experience with Java, Scala, C#, and JavaScript would be advantageous.
pipelines. You have working knowledge of database software and SQL. You have working knowledge of cloud technologies relevant to the above (e.g. Azure, AWS, GCP). You have working knowledge of IaC technologies (e.g. Terraform, Pulumi, etc.). You have working knowledge of typical data formats (e.g. JSON, YAML, CSV, etc.).
GCP Consultant, Data Engineer (GCP, Spark, Kafka) - remote. GCP Consultant, Senior Data Engineer with strong Spark and engineering workstream … in design and build of data platform including data ingestion, streaming and data warehousing. 6-month rolling contracts, inside IR35, remote. Experience: GCP (Google Cloud Platform), Spark, Kafka, Dataproc, Dataflow, Cloud Composer, data engineering, data pipelines, Kafka Connect & Streams. Contract will fall within scope of IR35, so umbrella company will …
performance computing, large-scale data engineering, and full-stack web development. Proficient in machine learning and analytics applications. Experience with cloud infrastructure (GCP, AWS, Azure), DevOps tools (Docker, Terraform, Kubernetes), and data lakehouses (e.g., Databricks). Familiarity with CI/CD pipelines and automated testing frameworks. Leadership …
thinking, and willing to roll up your sleeves and get stuck in! Senior-level technical capability with strong problem-solving skills. Proven experience with GCP as either a DevOps Engineer or SRE. A good depth of experience using Terraform and Ansible. Strong proficiency in programming languages such as Python, Go …
City of London, London, United Kingdom Hybrid / WFH Options
Ramsay Health Care
data for ML/predictive and prescriptive etc. modelling. What you'll bring with you: Experience of building data pipelines in a cloud platform (Azure, GCP, AWS). 2 years minimum experience with Databricks. Highly proficient in data modelling and data analysis. High degree of knowledge in data structures and data architecture …
Greater London, England, United Kingdom Hybrid / WFH Options
Ramsay Health Care
for ML/predictive and prescriptive etc. modelling. What you'll bring with you: Experience of building data pipelines in a cloud platform (Azure, GCP, AWS). 2 years minimum experience with Databricks. Highly proficient in data modelling and data analysis. High degree of knowledge in data structures and data architecture …
Proven experience in software development using JavaScript or TypeScript. Experience with testing software (automated tests or Test-Driven Development). Previous experience working with AWS, GCP, Microsoft Azure or another cloud service. Have a passion for continued learning of the latest JavaScript and ES versions and trends. What you'll get …
and architectural frameworks (e.g. MVC, SOA, microservices). Strong understanding of software development processes, including agile methodologies. Experience with cloud computing platforms (e.g. AWS, Azure, GCP). Excellent communication and interpersonal skills. Ability to work independently and as part of a team. Strong problem-solving and analytical skills. Experience with DevOps practices …
Languages such as: Python, PowerShell, Golang. Experience with container platforms (EKS, Kubernetes, and Docker). Experience with more than one hyper-scaler platform (AWS, Azure, GCP). Experience with HashiCorp tooling (Consul, Vault, Packer). Windows/Linux OSs. About Us: Claranet was founded at the beginning of the dot.com bubble in …
and BigQuery are a plus). Experience with building both batch and streaming ETL pipelines using data processing engines. Experience with cloud development (we use GCP and Terraform), including reference architectures and developing specialized stacks on cloud services. A strong background in data modeling, data structures, and large-scale data manipulation …
tests to drive the design of the application. You have experience with building deployment pipelines and continuous delivery on cloud platforms (we use Google Cloud Platform, Docker, Terraform and Kubernetes). Bonus points for: You have an entrepreneurial streak, with some great examples of how you saw an opportunity and …
service reliability. Technical Skills: 3+ years in an SRE or related role in a high-availability environment. Advanced understanding of cloud platforms (AWS, GCP or Azure), containerization technologies (Docker, Kubernetes) and microservices architecture. Experience with Linux, including server setup, package deployment, configuration, troubleshooting, and security. Understanding of networking (TCP …
Edinburgh, City of Edinburgh, United Kingdom Hybrid / WFH Options
Cathcart Technology
possible level. You'll ideally have good experience with the following: ** Strong programming skills, ideally with Java ** Software Architecture ** Cloud Services (AWS, Azure or GCP) ** Proven experience in a similar role. The following is highly desirable: ** Apache Kafka ** Front-end experience (ReactJS with TypeScript) ** DevOps tooling (Docker, Kubernetes, Terraform). They …
is a major plus. Understanding of Snowflake roles and user security. Understanding of Snowflake capabilities like Snowpipe, etc. SQL scripting. Cloud experience on AWS (Azure, GCP are nice to have as well). Python scripting is a plus. £115,000 - £135,000 a year. The compensation range for this role will be …
environment. Continuous learning mindset, with a willingness to stay updated on new technologies and industry trends. Qualifications: Certification in relevant cloud technologies (AWS, Azure, GCP) at an expert/professional level is highly desirable, e.g. AWS Cloud Practitioner. Certification in Kubernetes administration is desirable: Certified Kubernetes Administrator (CKA), Certified Kubernetes …
Westlake, Texas, United States Hybrid / WFH Options
Fidelity TalentSource LLC
scripting skills preferred. Exposure to on-prem and public cloud, as well as hands-on experience working with public cloud (AWS/Azure/GCP). Proficient with analytics and monitoring tools such as Grafana and Datadog. Experience supporting 24/7, continuous-availability production and managed environments. Solid understanding of software …
also had: Preferably, experience with Python and ML technologies. Experience in Big Data and the Hadoop ecosystem. Implementation experience in one of the cloud platforms - GCP, Azure or AWS. About working for us: Our focus is to ensure we're inclusive every day, building an organisation that reflects modern society and …
serverless architectures using AWS Lambda. Experience with containerisation tools like Docker. Knowledge of microservices architecture and design patterns. Familiarity with other cloud providers like GCP or Azure is a plus. AWS Certified DevOps Engineer or other relevant certifications. Soft Skills: Strong problem-solving skills and ability to troubleshoot complex systems.
developing using Node.js and TypeScript. Experience designing and scaling databases (SQL, PostgreSQL, Firebase). Familiarity designing and writing RESTful APIs. Experience developing cloud-based solutions (GCP). Knowledge of practices such as TDD, BDD and CI/CD. Experience working in an Agile environment. Strong commitment to code quality and best practices …
engineering, including familiarity with git, CI/CD, testing and releasing. Proven ability to work with large datasets and cloud-based platforms (e.g., AWS, GCP). Self-sufficient with SQL, as well as experience with data pipelining & warehousing technologies. Experience with BigQuery and dbt is a bonus. Experience with econometrics …
Green Bay, Wisconsin, United States Hybrid / WFH Options
Genesis10
DevOps or automation tools (e.g., Terraform, PowerShell, Python). Preferred Experience: Deep understanding of Snowflake architecture, capabilities, and best practices. Cloud platforms (AWS, Azure, GCP) and their services relevant to Snowflake (e.g., S3, Blob Storage, BigQuery). Salary: $110-130k/year + benefits. If you have the described …