Required: expertise with Core Java, particularly multithreading; some Python is also acceptable. Advanced SQL. Experience with cloud technologies is a plus (AWS, Snowflake, etc.). Familiarity with equities and equity derivatives within a real-time electronic trading environment is required. Strong communication skills; ability to liaise with investment professionals …
Experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments …
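The streaming requirement above centres on per-key aggregation over an event stream. A minimal, library-free sketch of that idea (the function name `count_by_key` and the sample `trades` list are illustrative, not taken from any listing; a real deployment would use the Kafka Streams `groupByKey().count()` DSL or a Flink keyed stream):

```python
from collections import defaultdict

def count_by_key(events):
    """Aggregate a stream of (key, value) events into per-key counts --
    the same grouping a Kafka Streams groupByKey().count() performs."""
    counts = defaultdict(int)
    for key, _value in events:
        counts[key] += 1
    return dict(counts)

# Hypothetical trade events keyed by ticker symbol.
trades = [("AAPL", 101.2), ("MSFT", 330.1), ("AAPL", 101.3)]
print(count_by_key(trades))  # {'AAPL': 2, 'MSFT': 1}
```

The point of the pattern is that state lives per key, so the aggregation can be partitioned and scaled horizontally, which is exactly what Kafka's topic partitioning exploits.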
Experience working in an electronic/systematic trading or investment firm. Experience working directly with Portfolio Managers, Traders, Quants, and/or Researchers. AWS, Snowflake. JavaScript, TypeScript, HTML5, React. .NET, C#, Java, JEE, Jakarta EE, Spring, object-relational mappers (ORMs). RESTful web services. Microservices implementations. Data visualisation. Role Description …
permanent. Looking for: 3+ years of professional data engineering experience. Proficiency in Python and Java 11+. Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. Hands-on experience with AWS. Ability to work effectively with both business and …
skills in Python and Java 11+, with a good grasp of frameworks like Dropwizard. Lakehouse Architectures: familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: hands-on experience with AWS, especially S3, ECS, and EC2/…
for achieving project success. Key Responsibilities: Software Development: write high-quality, maintainable code using languages such as Python and SQL. Establish data tools like Snowflake and Azure Data Lake Storage (ADLS) Gen2. Utilize Power BI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. …
Preferred Qualifications. If we had our say, we'd also look for: ETL/ELT tools (Ab Initio, DataStage, Informatica); cloud tools and databases (AWS, Snowflake); other programming languages (Unix scripting, Python, etc.); leveraging a CI/CD framework for data integration; open source; experience working in cloud platforms (AWS, GCP, Azure …
modelling and Data Vault 2.0 architectures). Key Responsibilities: build and maintain scalable data pipelines written in Python and SQL and run on AWS/Snowflake; take ownership of data quality within projects; manage and educate a range of stakeholders when gathering requirements and delivering data projects; build effective and collaborative …
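The "data pipelines written in Python and SQL" plus "ownership of data quality" responsibilities above combine cleanly into one load-then-validate step. A rough sketch under stated assumptions: `load_and_check`, the `staging` table, and the columns are all invented for illustration, and the in-memory sqlite3 database stands in for a warehouse like Snowflake:

```python
import sqlite3

def load_and_check(rows):
    """Load raw rows into a staging table, run a simple data-quality
    check (no NULL ids), and return the validated total amount."""
    con = sqlite3.connect(":memory:")  # stand-in for a warehouse connection
    con.execute("CREATE TABLE staging (id INTEGER, amount REAL)")
    con.executemany("INSERT INTO staging VALUES (?, ?)", rows)
    bad = con.execute(
        "SELECT COUNT(*) FROM staging WHERE id IS NULL"
    ).fetchone()[0]
    if bad:
        raise ValueError(f"{bad} rows failed the NOT NULL check on id")
    total = con.execute("SELECT SUM(amount) FROM staging").fetchone()[0]
    con.close()
    return total

print(load_and_check([(1, 10.0), (2, 2.5)]))  # 12.5
```

Failing fast on the quality check, before any downstream model reads the table, is the "ownership of data quality" part: bad rows stop the pipeline rather than silently propagating.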
experience with BI tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: analyze, organize, and prepare raw data for modeling and data analytics; architect and assist in building data systems …
General knowledge of relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery). Excellent scripting skills (e.g., Python, SQL). Strong analytical and problem-solving skills. Excellent communication and interpersonal skills, with the ability to …
Manchester, North West, United Kingdom Hybrid / WFH Options
True Worth Consulting Ltd
Experience with RESTful API frameworks. Development Opportunities: successful candidates will have the opportunity to pursue advanced certifications in key technologies such as dbt, Snowflake, and Azure SQL. There is potential for career advancement to Data Architect or product engineering roles, focusing on solving complex client challenges. Desired Values: we …
Bachelor's degree in Computer Science, Engineering, Mathematics, or related fields. 5+ years of work experience in designing, developing, and troubleshooting complex SQL queries, including Snowflake, Oracle, MS SQL Server, PostgreSQL, and MySQL. 3+ years of hands-on experience in ETL/ELT tools, including dbt, Airflow, or similar platforms. 3+ …
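"Complex SQL queries" in listings like the one above usually means window functions. One such pattern, latest-row-per-group via `ROW_NUMBER()`, ports unchanged across Snowflake, PostgreSQL, and SQL Server; the `orders` table and its data below are invented for illustration, with Python's bundled sqlite3 (which supports window functions from SQLite 3.25 onward) used as a local test bed:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, placed_at TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'acme', '2024-01-01', 50.0),
  (2, 'acme', '2024-02-01', 75.0),
  (3, 'biz',  '2024-01-15', 20.0);
""")
# ROW_NUMBER() ranks each customer's orders newest-first, so rn = 1
# selects the most recent order per customer.
rows = con.execute("""
SELECT customer, amount FROM (
  SELECT customer, amount,
         ROW_NUMBER() OVER (PARTITION BY customer ORDER BY placed_at DESC) AS rn
  FROM orders
)
WHERE rn = 1
ORDER BY customer
""").fetchall()
print(rows)  # [('acme', 75.0), ('biz', 20.0)]
```

Compared with a correlated `MAX(placed_at)` subquery, the window-function form scans the table once and handles ties deterministically via the `ORDER BY` inside the `OVER` clause.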
Analytics. Experience of working with large, complex data warehouses and/or data lakes. Familiarity with cloud-based analytics platforms such as AWS, Azure, Snowflake, Google Cloud Platform (BigQuery), Spark, and Splunk. Proficiency in SQL and experience using one or more of the following languages: R, Python, Scala, and …
of AWS infrastructure, including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis, and QuickSight. We also widely use other tech such as Snowflake, dbt, Databricks, Informatica, Matillion, Airflow, Tableau, Power BI, etc. The Lead Data Architect will liaise with clients to define requirements, refine solutions, and ultimately hand …
Advanced knowledge of data visualization tools and dashboard design experience essential (Tableau preferred). Experience in the use of large databases and data warehouses required (Snowflake preferred). Experience with transitioning and deploying data science and quantitative models to a production environment required, along with exposure to the Agile development process; ability to articulate …
languages or toolsets: AutoSys, Azure Function App, Azure Git, Azure Portal, C#, Databricks, GraphQL/Graph API, Informatica CDI, Informatica PowerCenter, Java, JavaScript, Snowflake, Power BI, PyRecs, Python, Selenium, Spark, and SQL. Nice-to-have skills: ability to propose and estimate the financial impact of architectural alternatives; existing knowledge of …
West Bend, Wisconsin, United States Hybrid / WFH Options
Delta Defense
able to apply them. 2+ years dbt experience required. 5+ years of experience in data engineering required. Prior experience executing within cloud data warehouses (Snowflake, Redshift, BigQuery) required. 2+ years of experience in a business intelligence role desired. Ability to work efficiently with AWS cloud technologies such as S3, EMR …
Spark experience. Must have strong AWS experience. Must have Terraform experience. SQL & NoSQL experience. Have built out data warehouses and data pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes, and orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to …