Software Development Engineer, Open Data Analytics Fundamentals Team. AWS Utility Computing (UC) provides product innovations, from foundational services such as Amazon Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2) to consistently released new products that set AWS's services and features apart … Internet of Things (IoT), Platform, and Productivity Apps services in AWS, including support for customers who require specialized security solutions for their cloud services. The Amazon Web Services Open Data Analytics (ODA) organization is looking for exceptional engineers to help in our mission to provide the world's best cloud … Big Data processing platform and services such as EMR and Athena. Amazon Elastic MapReduce (EMR) is the industry-leading cloud big data platform for petabyte-scale data processing, interactive analytics, and machine learning using open-source frameworks such as Apache Spark, Trino, Hadoop, Hive, and HBase. Amazon Athena …
market data. Experience with developing dashboards and other data visualization applications with Plotly, Matplotlib, Bokeh, Dash, etc. Experience using AWS technologies such as S3, Athena, SQS, Batch, and Lambda. Experience with DevOps practices using containerization and orchestration technologies (e.g. Docker/Kubernetes). …
Develop new tools and infrastructure using Python (Flask/FastAPI) or Java (Spring Boot) and a relational data backend (AWS – Aurora/Redshift/Athena/S3). Support users and operational flows for quantitative risk, senior management and portfolio management teams using the tools developed. Qualifications/Skills Required: …
pipelines and data models. Understanding of API protocols and standards, including REST and GraphQL. Hands-on experience with AWS services such as S3, Lambda, Athena, EC2, SQS, RDS, DynamoDB. Experience with CI/CD pipelines, automated testing, Git, containerization, and IaC tools like Terraform. Solid understanding of agile methodologies …
and automated deployments. Have excellent knowledge of AWS services (ECS, IAM, EC2, S3, DynamoDB, MSK). Our Technology Stack: Python and Scala; Starburst and Athena; Kafka and Kinesis; DataHub; MLflow and Airflow; Docker and Terraform; Kafka, Spark, Kafka Streams and KSQL; dbt; AWS, S3, Iceberg, Parquet, Glue and …
stack, including (but not limited to) Spark/Hadoop, Kafka, Aerospike/DynamoDB. Experience with AWS tech stack, including but not limited to EMR, Athena, EKS. Expert knowledge of multi-threading, memory model, etc. Understanding of database fundamentals and MySQL knowledge. Experience with CI/CD tools such as Jenkins, Graphite …
categories: Backend: Java, Node.js, C#, Python, PHP, Scala, Power Platform. Frontend: React, JavaScript, TypeScript, Angular. Data: PostgreSQL, Microsoft SQL Server, MongoDB, Apache Kafka, Neo4j, Amazon Athena. DevOps: AWS, Kubernetes, Azure, Jenkins, Docker, Ansible, Terraform, Dynatrace. Responsibilities: As part of the team, your day-to-day responsibilities will include …
with AWS. Strong skills in Python or Scala and experience with SQL databases. Hands-on experience with AWS services (S3, Lambda, Glue, Redshift, Athena, etc.). Knowledge of data modelling, pipeline orchestration (Airflow), and cloud infrastructure (IaC – Terraform, CloudFormation). Experience working in financial services or FinTech is …
Who are you? A Data Engineer with 5+ years of experience. Essential Skills and Experience: Hands-on experience with AWS services, including Lambda, Glue, Athena, RDS, and S3. Strong SQL skills for data transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience with any flavour of …
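As a minimal sketch of the Pandas-style transformation, cleaning, and loading work this listing describes (the column names and cleaning rules below are illustrative assumptions, not taken from any real pipeline):

```python
import pandas as pd

# Illustrative raw extract; in practice this might arrive from Athena or S3.
raw = pd.DataFrame({
    "trade_id": [1, 2, 2, 3],
    "amount": ["100.5", "200.0", "200.0", None],
    "currency": ["gbp", "usd", "usd", "eur"],
})

def clean_trades(df: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate, coerce types, normalise codes, and drop unusable rows."""
    out = df.drop_duplicates(subset="trade_id").copy()
    out["amount"] = pd.to_numeric(out["amount"], errors="coerce")
    out["currency"] = out["currency"].str.upper()
    return out.dropna(subset=["amount"]).reset_index(drop=True)

cleaned = clean_trades(raw)
```

Keeping the transformation in a pure function like `clean_trades` makes it straightforward to unit-test independently of whichever store the data is loaded into.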
you'll need Required: Experience designing and building high-throughput, scalable, resilient and secure data pipelines. Experience with the following AWS technologies: Glue jobs, Athena, S3, Step Functions, Lambda, RDS (Aurora Postgres), DMS, Redshift, QuickSight, Kinesis Firehose. Expert knowledge of SQL. Hands-on experience with Kafka. Hands-on experience …
testing, deployment systems, and configuration as code. Experience with cloud services such as AWS, GCP, or equivalent (preference for AWS - S3, IAM, SNS, SQS, Athena, Glue, Kinesis). Familiarity with infrastructure as code, preferably Terraform. Knowledge of columnar databases, such as Snowflake. Experience in developing and optimising CI/…
/containerization. Experience with EKS/Kubernetes and DevOps tools, particularly GitHub Actions. Understanding of AWS infrastructure and services (S3, EKS, IAM, Storage Gateway, Athena). Proficient in Terraform, or a solid understanding of Infrastructure as Code (IaC). Desirable Skills: Expertise in Snowflake concepts such as resource monitors …
services in production. Exposure to some of the following technologies (or equivalent): Apache Spark, AWS Redshift, AWS S3, Cassandra (and other NoSQL systems), AWS Athena, Apache Kafka, Apache Flink, AWS, and service-oriented architecture. What you'll get: Full responsibility for projects from day one, a collaborative team, and …
these can be practically applied. Experience with Python or other scripting languages. Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or the Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker). About Our Process: We can be flexible with the …
Collaborate with AWS field BD, pre-BD, training and support teams to help partners and internal customers learn and use AWS services such as Athena, Glue, Lambda, S3, DynamoDB NoSQL, Relational Database Service (RDS), Amazon EMR and Amazon Redshift. Deliver on-site technical engagements with partners and … Engaging with the internal customer's business and … please visit this link for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race …
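Helping partners get started with Athena usually begins with the standard boto3 call for submitting a query. The sketch below uses the real `start_query_execution` API, but the database name and S3 output location are placeholders, and the `client` parameter is an injection point added here for testability:

```python
def run_athena_query(sql, database, output_s3, client=None):
    """Submit a query via the Athena API and return its execution ID.

    `database` and `output_s3` are caller-supplied placeholders; pass a
    stub `client` for testing, or omit it to use real AWS credentials.
    """
    if client is None:
        import boto3  # real use requires AWS credentials and region config
        client = boto3.client("athena")
    resp = client.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    return resp["QueryExecutionId"]
```

The query runs asynchronously: results land in the S3 output location, and the execution ID can be polled with `get_query_execution` until the query completes.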
Science, Computer Science, or related fields. Experience working as a Data Engineer, preferably in a similar industry or consultancy setting. AWS cloud technologies, including Athena, S3, and ideally dbt. Experience working with Python, demonstrating proficiency in data manipulation and analysis. Hands-on experience with Hadoop, Big Data, Snowflake, and …
and performing transformation and ingestion in a specific tech suite. Minimum Requirements: Extensive experience in implementing solutions around the AWS cloud environment (S3, Snowflake, Athena, Glue). In-depth understanding of database structure principles. Strong knowledge of database structure systems and data mining. Excellent understanding of Data Modelling & Kinesis. …
Requirements: 3+ years of hands-on software engineering experience in production environments. Critical Skills: Expertise in AWS cloud architecture and services (ECS, EC2, S3, Athena, Glue, etc.). Strong foundation in C#, Python, Docker, and API development. Familiarity with Search and ML/NLP concepts, and experience working with ML teams. …
exceptional service. Qualifications: The ideal candidate will have: Expertise in both streaming and traditional ETL data architectures. Strong knowledge of cloud technologies (AWS S3, Athena, EMR, Google BigQuery, Snowflake, Azure Data Lake, ...) and containerization (Docker, Kubernetes). Proficiency in Python. Proven achievements with Platform Engineering tools and methodologies …
presenting your ideas and work to both technical and non-technical colleagues. Nice to have: Experience with SQL using databases such as MySQL, PostgreSQL, Athena, and Redshift. Familiarity with cloud-based platforms like AWS and CI/CD processes using tools like GitHub Actions. Some knowledge of geospatial analysis …
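The SQL skills these listings ask for are largely engine-agnostic: the same aggregate-style query runs on MySQL, PostgreSQL, Athena, or Redshift with minor dialect differences. A self-contained illustration using Python's built-in sqlite3 (the table and data are invented for the example):

```python
import sqlite3

# In-memory database so the example needs no external engine; the SQL
# itself would look the same against MySQL, PostgreSQL, or Athena.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("EU", 10.0), ("EU", 5.0), ("US", 7.5)],
)
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
# rows == [("EU", 15.0), ("US", 7.5)]
```

The main dialect differences show up in functions and DDL (partitioning, distribution keys), not in core SELECT/GROUP BY logic.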
ideally having worked with peers of different levels to complete projects collaboratively. Our technology stack: Python (including FastAPI, OpenTelemetry, procrastinate, SQLAlchemy, Uvicorn), Postgres, MySQL, Athena, Liquibase, Retool, Docker, AWS. Who you are: A professional history in software engineering with a deep knowledge of the technologies in our stack. Proven …
Support report developers (Tableau tools preferable, or equivalent). End-to-end BI development (requirements to report delivery). AWS technology stack development (S3, Redshift, RDS, Athena, etc.). Big Data platform development (BigQuery). Working with Insight Analysts/Data Scientists and their environments, e.g. R, Jupyter, Scala. Some form of non…