pipelines Know your way around Unix-based operating systems Experience working with any major cloud provider (AWS, GCP, Azure) Fluency in English Experience using Apache Airflow Experience using Docker Experience using Apache Spark Benefits: Salary £40-50K per annum dependent on skills and experience 25 Days
analysis, and software design Travel up to 10% What Will Help You On The Job Familiarity with running software services at scale AWS Infrastructure, Airflow, Kafka and data streaming using Spark/Scala Understanding of networking fundamentals (OSI layers 2-7) Technical and software engineering background in the areas
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday
have a valid visa as we are not able to sponsor Technical Stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow Skills: years of experience in Python scripting and in developing applications in Python; exposure to Python-oriented algorithm libraries such as NumPy
how these and other technologies can be applied to business problems to generate value. We currently work in an AWS, Snowflake, Looker, Python and Airflow stack; you should be comfortable with these (or similar). The person we’re looking for We are looking for a self-starter who
and Data Science Closely collaborate with data scientists, product and engineers to innovate and refine the next ML initiatives Good knowledge of Python, SQL, Apache Airflow, Docker, NoSQL Proficiency using tools like Terraform for Infrastructure-as-Code and GCP infrastructure management. Salary Range and Benefits: We are paying
required: Python SQL Kubernetes CI/CD experience with relevant tooling such as Jenkins, Docker or Terraform Cloud services experience with AWS/Azure Ideally: Airflow Java Experience working with front office trading systems and financial market data For more information on this role or any other contract/permanent
processing and analytics Desired experience: Worked with Python 3.9+ Familiar with Python test automation Experience with SQL and time-series databases Familiar with Parquet, Arrow, Airflow, Databricks Experience with AWS cloud services such as S3, EC2, RDS etc. Quality engineering best practice and tooling including TDD, BDD This is a
methodologies such as CI/CD, Application Resiliency, and Security Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational databases. · Experience in the Financial Services Industry is a bonus.
building APIs & SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker & Kubernetes for containerisation & orchestration. Understanding of Airflow & DAGs. Experience in building applications using Kafka. Solid OOP principles & design patterns. Permanent/Full-Time Employment. Hybrid working environment (2/3 days
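Several of these listings ask for an "understanding of Airflow & DAGs". The core idea is that a pipeline is a directed acyclic graph of tasks run in dependency order. As a hedged sketch of that concept only (using the standard library, not Airflow's actual API, and with a made-up task graph), dependency-respecting ordering looks like this:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical task graph: each task maps to the set of tasks it depends on.
# This mirrors how an Airflow DAG sequences work, without Airflow itself.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so every task appears after its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

In Airflow proper, the same structure is declared with operators and `>>` dependency arrows inside a `DAG` object; the scheduler then performs an ordering equivalent to the one above.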
in: Data Warehousing Data Engineering, overall Data Analytics Data Visualisation Proficiency in: Google Cloud (GCP) GCP BigQuery Python DBT or similar FastAPI or similar Airflow or similar Desirable: Google Apigee (as an application developer) Exposure to Machine Learning projects Exposure to DataOps Exposure to Dataproc or similar
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation
in a Data Engineering role Strong SQL and Python development skills Hands-on experience with cloud-based data warehousing technologies (e.g., Snowflake, DBT, Fivetran, Airflow) Effective communication skills for both technical and non-technical audiences Analytical mindset with attention to detail High energy, enthusiasm, and passion for learning in
Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis, Quicksight. We also widely use other tech such as Snowflake, DBT, Databricks, Informatica, Matillion, Airflow, Tableau, Power BI etc. The Lead Data Architect will liaise with clients to define requirements, refine solutions and ultimately hand over to our own
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team Proven experience of working with customer data You've a strong commitment to accuracy
Azure SQL Data Warehouse, or Amazon Redshift Support and learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of the timing of data loads
quality and identify areas for improvement to implement practical solutions. Key Requirements Background in Python Development from an engineering or development environment Experience with Airflow, Cloud (AWS) and Pandas
DBT (Data Build Tool): Strong skills in managing transformations and data pipelines. Python: Expertise in scripting, automation, and data manipulation. Beneficial Experience: Dagster/Airflow: Managing complex workflows. Qlik Sense Cloud/Tableau: Data visualization and reporting. Fivetran/AirByte: Efficient data ingestion. AWS: Familiarity with cloud infrastructure. CI
data processing, analysis, and visualization libraries Experienced with SQL and time-series databases Skilled in AWS services: S3, EC2, RDS Knowledgeable in ETL tools like Airflow Proficient in Git, CI/CD, testing tools, and documentation best practices Adheres to quality engineering practices including TDD and BDD Nice to have
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the
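The "ETL processes" these roles describe reduce to three stages: extract from a source, transform (filter, cast, clean), load into a target. A minimal self-contained sketch, with an invented CSV source and an in-memory list standing in for the warehouse table (all names here are illustrative, not any employer's actual pipeline):

```python
import csv
import io

# Hypothetical raw source data, as it might arrive from an upstream export.
RAW = "name,amount\nalice,10\nbob,-3\ncarol,7\n"

def extract(text):
    # Parse CSV text into a list of row dicts.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Clean step: keep only positive amounts, cast strings to int.
    return [{"name": r["name"], "amount": int(r["amount"])}
            for r in rows if int(r["amount"]) > 0]

warehouse = []  # stands in for the target warehouse table

def load(rows):
    warehouse.extend(rows)

load(transform(extract(RAW)))
print(warehouse)
```

In an Airflow deployment, each of these three functions would typically become its own task so failures can be retried per stage rather than rerunning the whole flow.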
GCP) is highly preferred (experience with other cloud platforms like AWS or Azure is also considered). Familiarity with data pipeline scheduling tools like Apache Airflow. Ability to design, build, and maintain data pipelines for efficient data flow and processing. Understanding of data warehousing best practices and experience in
MLOps) is a plus. Tools currently being used: - Python 3, NumPy, SciPy, XGBoost - CI/CD: GitHub Actions, Jenkins, Docker - MLOps: DVC, MLflow - BI: Terraform, Airflow, BigQuery - LLMs: GPT, Claude And this is what you’ll get in return: Salary up to £120,000 depending on experience Share Program Flexible