Software Engineer Data Enterprise Engineering Developer Programmer AWS GCP Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Apache Iceberg Arrow DBT gRPC protobuf Snowflake TypeScript Manager Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by … led small teams on the delivery of projects using AWS Glue, Dremio and Agile. The following is DESIRABLE, not essential: Snowflake, Spark, Airflow, Apache Iceberg, Arrow, DBT; Trading/Front Office finance; some appreciation of asset classes such as Fixed Income, Equities, FX or Commodities. Role: Python Software Engineer Team Lead (Architecture Programmer …) required by my …
help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, coupled with onboarding new, disparate data sets sourced from …
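Several of these listings describe the same core task: standardised ingress of vendor data before it lands in Snowflake or S3. As a minimal, hypothetical sketch (stdlib only; function and column names are illustrative, not from any listing), a "standardised ingress" step often amounts to parsing a vendor file and normalising its column names before staging:

```python
import csv
import io
import re

def normalise_column(name: str) -> str:
    """Lower-case a vendor column name and collapse non-alphanumerics to underscores."""
    return re.sub(r"[^a-z0-9]+", "_", name.strip().lower()).strip("_")

def ingest_csv(raw: str) -> list[dict]:
    """Standardised ingress step: parse vendor CSV text into rows keyed by
    normalised column names, ready for a staging layer (e.g. S3 upload)."""
    reader = csv.DictReader(io.StringIO(raw))
    fields = {f: normalise_column(f) for f in (reader.fieldnames or [])}
    return [{fields[k]: v for k, v in row.items()} for row in reader]

# Example: a vendor file with inconsistent header casing and punctuation.
rows = ingest_csv("Trade Date,Asset-Class\n2024-01-02,FX")
# rows == [{"trade_date": "2024-01-02", "asset_class": "FX"}]
```

The point of normalising at the boundary is that every downstream DBT model can then rely on one column-naming convention regardless of which vendor supplied the data.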
for the entire organisation and proper data governance. Utilise and improve our current AWS-based data platform. Work with our tech stack, which includes dbt/DuckDB for transformation, Kafka/RabbitMQ as a streaming platform, Delta Lake as a data format, Dagster for managing data assets, and Terraform, Kubernetes, and …
good grasp of frameworks like Dropwizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively …
Experience with building and optimizing data pipelines for large-scale datasets. Solid understanding of data modeling concepts and ETL processes. Experience with DBT (Data Build Tool). Desirable Skills: Experience with FastAPI. Familiarity with Python UI framework packages. Knowledge of Apigee (as an …
Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
and scalable web hosting and data platforms. Our platform is a layer on top of core Open Source technologies such as Kubernetes, Istio, Airflow, dbt, running in Public Cloud. It is the glue that allows our teams to deploy into production environments 100s of times per day with the least …
Product Manager (Product Owner Business Analysis Data Lake Data Mesh SQL Architecture Big Data AWS Python Dremio Apache Iceberg Arrow DBT Tableau PowerBI Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund Buy Side) required by my asset management … management) Role: Product Manager (…) required by my …
Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
are at the cutting edge of applying data technologies to solve problems and you can expect to work with a range of technologies including dbt, Kotlin/Java, Python, Apache Spark and Kafka. Join us as a Principal Software Engineer and, as well as shaping and creating the foundations for insight …
to ML operations (training, serving, monitoring), and microservice/web application engineering. Our tech stack: Azure Data Lake, Azure Databricks (Notebooks, APIs, Workflows), MLflow, dbt, Harness CI/CD, OpenShift, Docker, GitHub, JIRA. DISCOVER your opportunity: What will your essential responsibilities include? · Set engineering standards and be accountable for their …
building Data Platform Products, including a cutting-edge data lakehouse platform and cloud-native streaming technologies using Kafka, data integrations across our payments partners, dbt and Airflow. In collaboration with the Data Platforms PM, support the teams with prioritisation, capacity planning, tracking value and success criteria, and systems thinking to drive …
s largest clients. Develop solutions to parse and process tabular data from PDF and HTML documents. Maintain, support and expand existing data pipelines using DBT, Snowflake and S3. Implement standardised data ingress/egress pipelines. Onboard new, disparate data sets, sourced from many and varied data vendors, covering all asset …
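The listing above mentions parsing tabular data out of HTML documents. A minimal sketch of that task using only the Python standard library (no pandas or lxml assumed; class and function names are illustrative):

```python
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collect each HTML table row as a list of cell strings."""
    def __init__(self) -> None:
        super().__init__()
        self.rows: list[list[str]] = []
        self._cells: list[str] = []
        self._buf: list[str] = []
        self._in_cell = False

    def handle_starttag(self, tag: str, attrs) -> None:
        if tag in ("td", "th"):          # a new cell begins
            self._in_cell = True
            self._buf = []

    def handle_data(self, data: str) -> None:
        if self._in_cell:                # accumulate text inside the cell
            self._buf.append(data)

    def handle_endtag(self, tag: str) -> None:
        if tag in ("td", "th"):          # cell finished: store its text
            self._cells.append("".join(self._buf).strip())
            self._in_cell = False
        elif tag == "tr" and self._cells:  # row finished: store its cells
            self.rows.append(self._cells)
            self._cells = []

def parse_html_tables(html: str) -> list[list[str]]:
    """Return every table row in `html` as a list of cell strings."""
    parser = TableExtractor()
    parser.feed(html)
    return parser.rows

table = "<table><tr><th>Asset</th><th>Price</th></tr><tr><td>EURUSD</td><td>1.09</td></tr></table>"
# parse_html_tables(table) == [["Asset", "Price"], ["EURUSD", "1.09"]]
```

In practice a library such as pandas (`read_html`) would usually be preferred; the sketch just shows the shape of the problem without assuming one is installed.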
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, DBT, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: Legal …
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must-haves: a team player, happy to work with several teams; this is key, as you will be reporting directly to the CTO. 2+ …
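Several of these roles pair Airflow with DBT for orchestration. As a toy illustration of the core idea — running tasks in dependency order, which is what an Airflow DAG encodes — here is a plain-Python sketch (this is not Airflow's API; the three-step pipeline and its task names are hypothetical):

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks: dict, deps: dict) -> list[str]:
    """Run callables in dependency order, the way an orchestrator such as
    Airflow schedules DAG tasks. `deps` maps a task name to its upstream names."""
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()
    return order

# Hypothetical three-step pipeline: extract -> dbt transform -> publish.
log: list[str] = []
tasks = {
    "extract": lambda: log.append("extract"),
    "dbt_run": lambda: log.append("dbt_run"),   # in real life: shell out to `dbt run`
    "publish": lambda: log.append("publish"),
}
deps = {"dbt_run": {"extract"}, "publish": {"dbt_run"}}
run_pipeline(tasks, deps)
# log == ["extract", "dbt_run", "publish"]
```

An actual Airflow deployment adds scheduling, retries, and observability on top of exactly this ordering guarantee; the DBT step is typically a single task that invokes `dbt run` against the warehouse.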
Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as DBT, Fivetran, etc. Understanding of Agile delivery best practice. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong …
demonstrate extensive experience having designed and scaled a Data Platform; has strong Python skills; has great SQL, preferably Snowflake; has previous experience working with dbt & Airflow; is passionate about solving complex data problems and interested in working with rich and diverse climate datasets; cares deeply about the climate and ecosystems of …
who has hands-on experience in AWS to conduct end-to-end data analysis and data pipeline build-out using Python, Glue, S3, Airflow, DBT, Redshift, RDS, etc. Very solution-driven, and highly collaborative at providing thought leadership and soliciting diverse opinions. Accountable for results. Experienced in leading a team of data engineers …
and non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of dbt, Airflow and AWS is highly desirable. Strong background developing, constructing, testing, and maintaining practical data architectures, and driving improvements in data reliability, efficiency, and quality. A proven …