stakeholders. Team Player: Ability to work effectively in a collaborative team environment, as well as independently. Preferred Qualifications: Experience with big data technologies (e.g., Hadoop, Spark, Kafka). Experience with AWS and its data services (e.g., S3, Athena, AWS Glue). Familiarity with data warehousing solutions (e.g., Redshift, BigQuery …
Oracle, SQL Server, PostgreSQL) and data warehousing technologies. Experience with cloud-based data solutions (AWS, Azure, GCP). Familiarity with big data technologies like Hadoop, Spark, and Kafka. Technical Skills: Proficiency in data modelling (ERD, normalization) and data warehousing concepts. Strong understanding of ETL frameworks and tools (e.g., Talend …
computational modelling and deeply appreciate the challenges. You have written RESTful APIs and/or web apps. You have implemented "Big Data" processing setups (e.g., the Hadoop/Spark ecosystem, Databricks, Cassandra, etc.). You can code to an advanced level in Python. You are competent at coding in VBA. You have …
Advanced working knowledge in SQL and relational databases (e.g., Microsoft SQL Server, Oracle). Big Data Technologies: Familiarity with big data technologies such as Hadoop, Spark, or Kafka is a plus. Data Warehousing: Experience in data warehousing, including data modeling and implementing data pipelines. Data Management: Multi-skilled experience …
or Tableau. Experience with real-time analytics and event-driven architectures using tools such as Apache Kafka. Background in big data technologies such as Hadoop, HBase, or Cassandra.
technical pre-sales, including crafting and presenting innovative proposals to clients. Desirable Skills: Google Cloud or AWS Solutions Architect certification. Familiarity with ETL tools, Hadoop-based technologies (e.g., Spark), and data pipelines (e.g., Beam, Flink). Experience designing data lake and data warehouse solutions (e.g., BigQuery, Azure Synapse, Redshift …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Be Technology
GPT-4. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud for AI/ML deployments. Knowledge of big data technologies like Spark, Hadoop, or Kafka is a plus. Proficient in version control tools like Git and CI/CD pipelines for machine learning workflows. Other Requirements: A …
building and optimising data pipelines and distributed data systems. - Strong expertise in cloud platforms (AWS, GCP, or Azure) and modern data technologies (Spark, Kafka, Hadoop, or similar). - Proficiency in programming languages such as Python, Scala, or Java. - Experience working on AI/ML-driven platforms, with knowledge of …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Bowerford Associates
optimal data extraction, transformation and loading using leading cloud technologies including Azure and AWS. Leverage Big Data Technologies - you will utilise tools such as Hadoop, Spark and Kafka to design and manage large-scale data processing systems. Extend and maintain the data warehouse - you will be supporting and enhancing …
similar. Experience with machine learning frameworks (e.g., TensorFlow, Scikit-learn). Strong knowledge of SQL and database management. Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure). Soft Skills: Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work …
and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau). Natural Language Processing/Deep Learning/NLP/Python/Leadership
Work On: Data Warehousing ETL (Extract, Transform, Load) Processes Data Modelling and Database Management Data Pipeline Development Data Quality Assurance Big Data Technologies (e.g., Hadoop, Spark) Data Visualization Roles & Responsibilities: Collaborate with experienced data engineering professionals and global team members. Participate in designing and implementing data warehousing solutions. Develop …
Nottinghamshire, Basford, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
scikit-learn). Advanced knowledge of programming languages such as Python, R, or Scala. Experience working with large-scale data platforms (e.g., Spark, Snowflake, Hadoop). Hands-on expertise with cloud services like AWS, Azure, or GCP. Strong analytical skills with a deep understanding of statistics and optimisation techniques.
with rapid prototyping and disciplined software development processes. Experience with Python, ML libraries (e.g., spaCy, NumPy, SciPy, Transformers, etc.), data tools and technologies (Spark, Hadoop, Hive, Redshift, SQL), and toolkits for ML and deep learning (SparkML, TensorFlow, Keras). Demonstrated ability to work on multi-disciplinary teams with diverse …
Stevenage, England, United Kingdom Hybrid / WFH Options
Capgemini Engineering
visualization tools (e.g., Matplotlib, Seaborn, Tableau).
• Ability to work independently and lead projects from inception to deployment.
• Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable.
• MSc or PhD in Computer Science, Data Science, Artificial Intelligence, or related field is …
Manchester Area, United Kingdom Hybrid / WFH Options
Made Tech
infrastructure underpinning data systems through a DevOps approach Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop Good understanding of the possible architectures involved in modern data system design (e.g. Data Warehouse, Data Lakes and Data Meshes) and the different use …
IT industry (experience in the Banking & Financial Services sector preferred). Key skills: Databases (RDBMS, NoSQL, open source, proprietary), SQL expertise, StreamSets, Kafka, Big Data (Hadoop, Spark), Python, Data on Cloud, Test Management tools (Octane), JIRA. Project experience in building Data solutions, experience in utilizing and feeding data from data …
experience in data engineering, including working with AWS services. Proficiency in AWS services like S3, Glue, Redshift, Lambda, and EMR. Knowledge of Cloudera-based Hadoop is a plus. Strong ETL development skills and experience with data integration tools. Knowledge of data modeling, data warehousing, and data transformation techniques. Familiarity …
and implementing systematic trading systems Experience in developing and deploying RESTful APIs and microservices for model serving Familiarity with Big Data technologies such as Hadoop and Spark Experience with data visualization and reporting tools Familiarity with databases and query languages for data extraction and transformation Understanding of and experience …
such as scikit-learn, TensorFlow, or PyTorch - Solid understanding of statistical analysis and data mining techniques - Familiarity with big data technologies like Spark or Hadoop - Experience with data visualization tools (e.g., Matplotlib, Seaborn, Plotly) Nice to Have: - Experience with cloud platforms (AWS, GCP, or Azure) - Knowledge of NLP or …