Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Highly desirable: hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink, etc.). Proven expertise in designing and constructing data lakes and data warehouse solutions utilising technologies more »
science, Information Technology, or a related field. Experience with containerisation and orchestration technologies (e.g., Docker, Kubernetes). Knowledge of big data technologies and frameworks (e.g., Hadoop, Spark). Familiarity with other cloud platforms (e.g. AWS, Google Cloud) and PaaS providers (e.g. Snowflake). Knowledge of the Inner Source or Open Source paradigm and way of more »
Experience with data structures/algorithms, building data platforms, data lakes and Business Intelligence solutions. Experience as a data engineer: implementing data pipelines (using PySpark, Spark SQL, Scala, etc.), orchestration tools/services (e.g. Airflow, Data Factory) and testing frameworks. 1+ years of experience in a technical leadership role. Experience in more »
data access, and data storage techniques. Excellent problem-solving skills and the ability to think algorithmically. Desirable Skills: Knowledge of big data technologies (Hadoop, Spark, Kafka) is highly desirable. Familiarity with data governance and compliance requirements. more »
with ETL processes and tools. Knowledge of cloud platforms (e.g., GCP, AWS, Azure) and their data services. Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus. Understanding of AI tools like Gemini and ChatGPT is also a plus. Excellent problem-solving and communication skills. Ability to work more »
as TensorFlow, PyTorch, or Scikit-learn. Strong knowledge of statistical modelling, data mining, and data visualization techniques. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure). Strong problem-solving skills and the ability to think critically and creatively. Excellent analytical skills with more »
value through improved data handling and analysis. Responsibilities: Build predictive models using machine-learning techniques that generate data-driven insights on modern data platforms (Spark, Hadoop and other map-reduce tools); develop and productionise containerised algorithms for deployment in hybrid cloud environments (GCP, Azure); connect and blend data from more »
least one cloud platform (preferably GCP). BSc/MSc in computer science, maths, physics or another STEM subject. Basic knowledge of statistics and machine learning. Experience with Spark, Apache services, ETL tools, data visualization and dashboards. Experience with streamed data processing, parallel compute, and/or event-based architectures. Experience with web-scraping more »
or hedge fund industry. Technical Skills: Proficiency in Python and SQL. Experience with relational and NoSQL databases. Knowledge of big data frameworks (e.g., Hadoop, Spark, Kafka). Understanding of financial markets and trading systems. Strong analytical, problem-solving, and communication skills. Familiarity with DevOps tools and practices. This is more »
/CD, Applicant Resiliency, and Security. Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational databases · Experience in the Financial Services industry is a bonus. more »
leading business intelligence platform (e.g. Microsoft, Crystal, Qlik, SAP, Tableau). Good understanding of open source, big data, and cloud data platforms (e.g. Hadoop, Spark, Hive, Pentaho, AWS, Azure); given a business problem, you can analyse and evaluate options and recommend solutions. Proven experience in designing, building and maintaining more »
Learn, TensorFlow, PyTorch). Solid understanding of ML and data pipeline architectures and best practices. Experience with big data technologies and distributed computing (e.g., Spark, Hadoop) is a plus. Proficient in SQL and experience with relational databases. Strong analytical and problem-solving skills, with a keen attention to detail. more »
etc). Experience with SQL and query design on large, complex datasets. Experience with cloud and big-data tools and frameworks like Databricks/Spark, Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as more »
Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
of applying data technologies to solve problems, and you can expect to work with a range of technologies including dbt, Kotlin/Java, Python, Apache Spark and Kafka. Join us as a Principal Software Engineer and, as well as shaping and creating the foundations for insight-driven, market-leading … delivery chain, from data to products. You will have an understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop. Comfortable presenting technical ideas to non-technical colleagues. Experience mentoring and coaching and sharing technical expertise. Strong teamwork ethic, communication and collaboration skills. Support with … our recruitment process by evaluating candidates at all stages. Although not essential, helpful experience includes: messaging systems such as Apache Kafka or Google Pub/Sub; Docker/container orchestration; working experience with Google Cloud. Every candidate brings a unique mix of skills and qualities to the table. We're all about inclusivity more »
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have: Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure. What will be your key responsibilities? A technical expert and leader on the more »
or similar technologies. Hands-on experience with AWS and Snowflake. Financial services industry experience (highly desirable). Experience with Big Data technologies such as Spark or Hadoop. Bachelor's degree in Computer Science, Engineering, or equivalent. Further information available upon application. ECS Recruitment Group Ltd is acting as an more »
Guildford, England, United Kingdom Hybrid / WFH Options
Hawksworth
warehousing and ETL frameworks. Proficiency in working with relational databases (e.g., Oracle, PostgreSQL), Parquet/Delta files and big data technologies (e.g. Synapse, Hadoop, Spark, Kafka). Knowledge of Microsoft Azure and associated data services is a nice-to-have. Strong analytical and data interpretation skills, with the ability to more »
Bedford, Bedfordshire, United Kingdom Hybrid / WFH Options
Understanding Recruitment
frameworks (TensorFlow, PyTorch, etc.). MLOps experience. Nice to have: familiarity with Git or other version control systems; Computer Vision library exposure; understanding of big data technologies (Hadoop, Spark, etc.); experience with cloud platforms (AWS, GCP or Azure). This is a fully remote role, but may require very occasional travel (once a month more »
data lake/warehouse/hub built in GCP. You are confident using the full suite of Google data products, IaC, CI/CD, Spark and Kafka. Our core toolbox includes Google Cloud Big Data technologies, Scala, Java & Python, and Jenkins, amongst others. We value first-principles reasoning to select more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra • MySQL, SQL Server, Oracle, Snowflake, PostgreSQL and NoSQL databases • Programming languages such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits : • Base Salary more »
to: Backend technology, Python. Databases like MSSQL. Front-end technology, Java. Cloud platform, AWS. Programming language, JavaScript (React.js). Big data technologies such as Hadoop, Spark, or Kafka. What We Need from You: Essential Skills: A degree in Computer Science, Engineering, or a related field, or equivalent experience. Proficiency in more »