Job Title: IBM MDM Architect Location: Stratford-upon-Avon, UK Duration: 6-month FTC/contract Working Mode: Hybrid (2-3 days onsite per week) Job Description: The job description for an IBM MDM Architect includes the following qualifications and More ❯
/CD (e.g. GitHub and Jenkins). Monitor and tune performance of Spark jobs; optimize data partitions and caching strategies. Experience and exposure to converting legacy ETL tools such as DataStage and Informatica into Prophecy pipelines using the Transpiler component of Prophecy. Required skills & experience: 2+ years of hands-on experience with the Prophecy (using PySpark) approach; 5+ years of experience in data … Spark, Databricks, Scala/PySpark or SQL. Strong understanding of ETL/ELT pipelines, distributed data processing and data lake architecture. Exposure to ETL tools such as Informatica, DataStage or Talend is an added advantage. Experience with Unity Catalog, Delta Lake and modern data lakehouse concepts. Strong communication and stakeholder management skills. Collecting, aggregating, matching, consolidating, quality-assuring More ❯
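To illustrate the Spark partition and caching tuning responsibilities mentioned in the listing above, here is a minimal, hedged PySpark sketch; the table names, join key, and partition count are hypothetical examples and not part of the original posting.

```python
# Minimal PySpark sketch of partition and caching tuning; table names, the join
# key, and the partition count are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("partition-tuning-sketch").getOrCreate()

orders = spark.read.table("sales.orders")        # hypothetical large fact table
customers = spark.read.table("sales.customers")  # hypothetical dimension table

# Repartition the fact table on the join key to spread the shuffle evenly,
# then cache it because it is reused by several downstream aggregations.
orders = orders.repartition(200, "customer_id").cache()

# The first action materialises the cache; later queries reuse the cached partitions.
daily_totals = (
    orders.join(customers, "customer_id")
          .groupBy("order_date")
          .sum("amount")
)
daily_totals.write.mode("overwrite").saveAsTable("sales.daily_totals")
```

Repartitioning on the join key reduces shuffle skew, and caching only pays off when the DataFrame is reused by more than one downstream action.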
disparate data sets and communicate clearly with stakeholders Hands-on experience with cloud platforms such as AWS, Azure, or GCP Familiarity with traditional ETL tools (e.g., Informatica, Talend, Pentaho, DataStage) and data warehousing concepts Strong understanding of data security, compliance, and governance best practices Experience leading or influencing cross-functional teams in a product or platform environment Strong stakeholder More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Capgemini
queues, and topics. Proficiency in SQL queries, functions, and procedures across Big Data platforms, Oracle, SQL Server, ERP solutions, and cloud providers. Useful experience with tools like Informatica, Talend, DataStage, or similar. Your Security Clearance: Obtaining Security Check (SC) clearance is required, which involves residency in the UK for the last 5 years and other criteria. The process includes More ❯
London, England, United Kingdom Hybrid / WFH Options
EXL
ERwin, ER/Studio, or PowerDesigner, ensuring scalability, performance, and maintainability. ETL/ELT Frameworks: Design and build robust data pipelines with Cloud Composer, Dataproc, Dataflow, Informatica, or IBM DataStage, supporting both batch and streaming data ingestion. Data Governance & Quality: Implement data governance frameworks, metadata management, and data quality controls using Unity Catalog, Profisee, Alation, DQ Pro, or similar More ❯
Awareness of industry standards, regulations, and developments. Ideally, you’ll also have: Experience with Relational Databases and Data Warehousing concepts. Experience with Enterprise ETL tools such as Informatica, Talend, DataStage, or Alteryx. Project experience with technologies like Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross-platform experience. Team building and leadership skills. You must be: Willing to work on client More ❯
awareness of industry standards, regulations, and developments. Ideally, you'll also have: Experience of Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, DataStage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross- and multi-platform experience. Team building and leading. You must be More ❯
or AWS Glue or an equivalent ETL tool, covering everything from extracting the source to building complex mappings. Strong experience in data migration and reporting using ETL tools (Informatica, Boomi, DataStage, Matillion, etc.) and reporting tools (Power BI, Tableau). Experience with Snowflake storage and databases. Thorough experience writing complex SQL. Thorough experience translating business requirements into technical requirements and More ❯
About Us We are an innovative technology services company; since 2001 we have been providing mission-critical systems and their support for some of the leading financial institutions around the world. The solutions provided by us are part of millions of More ❯
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
models are optimized for performance, scalability, and maintainability. Data Integration & Pipeline Engineering: Design and implement resilient ETL/ELT workflows using technologies such as BigQuery, Dataflow, Informatica, or IBM DataStage, supporting real-time and batch processing. Governance & Data Quality Management: Establish data governance frameworks, including metadata management and quality assurance, using platforms like Unity Catalog, Alation, Profisee, or DQ More ❯
optimized for performance, scalability, and long-term maintainability. Data Integration & Pipeline Engineering: Design and implement resilient ETL/ELT workflows using technologies such as BigQuery, Dataflow, Informatica, or IBM DataStage, supporting both real-time and batch data processing. Governance & Data Quality Management: Establish comprehensive data governance frameworks, including metadata management and quality assurance, using platforms like Unity Catalog, Alation More ❯
We seek passionate testers with expertise in leading teams and in designing, developing, and maintaining data quality solutions. The ideal candidate should have strong expertise in ETL framework testing (preferably Talend or DataStage), BI report testing (preferably Power BI, Cognos), cloud technologies (preferably Azure, Databricks), SQL/PLSQL coding, and Unix/Python scripting. Key Responsibilities Lead and mentor a team of … advanced SQL, stored procedures, and database design principles. Utilize Unix/Python scripting for data validation and process automation. Contribute to developing ETL solutions using tools like Talend or DataStage, or to building stored procedures, scheduling workflows, and integrating data pipelines. Additionally, develop BI reports using tools like Power BI or Cognos. Implement best practices in data governance, compliance, and … related field. 5+ years of experience leading test teams, mentoring engineers, and driving test strategies. 5+ years of experience in data testing using ETL tools like Talend, DataStage, or equivalent. 5+ years of experience in automated & data quality testing using tools like TOSCA, ICEDQ or equivalent frameworks. 5+ years of advanced SQL/PL SQL experience (Snowflake a More ❯
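As a hedged illustration of the Unix/Python data validation scripting referenced in this listing, the sketch below reconciles row counts between a source and a target table; sqlite3 is used purely as a self-contained stand-in for any DB-API driver, and the table names are hypothetical.

```python
# Minimal row-count reconciliation between a source and a target table, the kind
# of check the Unix/Python validation scripting above typically automates.
# sqlite3 is a stand-in for any DB-API 2.0 driver; table names are hypothetical.
import sqlite3


def row_count(conn, table: str) -> int:
    # Table names are assumed to come from trusted configuration, not user input,
    # because DB-API placeholders cannot parameterise identifiers.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def reconcile(conn, source: str, target: str) -> bool:
    src, tgt = row_count(conn, source), row_count(conn, target)
    print(f"{source}: {src} rows | {target}: {tgt} rows")
    return src == tgt


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        "CREATE TABLE src_orders(id INTEGER);"
        "CREATE TABLE tgt_orders(id INTEGER);"
        "INSERT INTO src_orders VALUES (1),(2),(3);"
        "INSERT INTO tgt_orders VALUES (1),(2),(3);"
    )
    assert reconcile(conn, "src_orders", "tgt_orders")
```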
Amazon Redshift, or Google BigQuery Proficiency in creating visualizations using Power BI or Tableau Experience designing ETL/ELT solutions with tools like SSIS, Alteryx, AWS Glue, Databricks, IBM DataStage Strong analytical and technical skills Good communication skills, both verbal and written, with experience working closely with business users Experience in the hospitality industry is beneficial WHAT'S IN More ❯
technologies is a strong plus (e.g. HTTPS, REST, XML, JSON, etc.). - Java/Perl/Ruby/Python scripting language experience. - Experience with third-party ETL tools (IBM DataStage or Data Manager, Informatica, etc.). - Familiarity with AWS solutions such as EC2, DynamoDB, S3, Redshift, and Aurora. - Master’s degree in Information Systems, Computer Science, Finance, Accounting, Economics More ❯
more scripting language (e.g., Python, KornShell) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool like Informatica, ODI, SSIS, BODI, DataStage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application More ❯
Job Type: Contract Job Location: Wimbledon, UK Job Description: For this role, senior experience of data engineering and of building automated data pipelines on IBM DataStage & DB2, AWS and Databricks - from source to operational databases through to the curation layer - is expected, using the latest modern cloud technologies; experience of delivering complex pipelines will be significantly valuable to how to … scaling Databricks environments for ETL, data science, and analytics use cases. AWS Cloud: Extensive experience with AWS services such as S3, Glue, Lambda, RDS, and IAM. IBM Skills: DB2, DataStage, Tivoli Workload Scheduler, UrbanCode. Programming Languages: Proficiency in Python, SQL. Data Warehousing & ETL: Experience with modern ETL frameworks and data warehousing techniques. DevOps & CI/CD: Familiarity with More ❯
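As a hedged sketch of the source-to-curation pipeline work described in this listing, the PySpark snippet below reads raw files from an S3 landing area, applies basic cleansing, and writes a curated Delta table on Databricks; the bucket path, column names, and table names are hypothetical.

```python
# Minimal sketch of a source-to-curation step on Databricks with an S3 landing
# area; the bucket path, column names, and target table are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curation-sketch").getOrCreate()

# Ingest raw CSV files landed in S3 (IAM/instance-profile access assumed).
raw = spark.read.option("header", "true").csv("s3://landing-bucket/orders/")

# Basic cleansing: type casting, null filtering, and de-duplication on the key.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .dropDuplicates(["order_id"])
)

# Write the curated layer as a Delta table for downstream analytics.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```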
Amazon Redshift, and Google BigQuery. Building visualizations using Power BI or Tableau. Experience in designing ETL/ELT solutions, preferably using tools like SSIS, Alteryx, AWS Glue, Databricks, IBM DataStage. Solid analytical and technical skills. Good communication skills, both verbal and written; this role will be working closely with business users. Experience working in the hospitality industry beneficial WHAT More ❯
SQL, DDL, MDX, HiveQL, SparkSQL, Scala) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool like Informatica, ODI, SSIS, BODI, DataStage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application More ❯
governance, quality, and privacy (e.g. GDPR compliance) Data Transformation & Pipelines Proficiency in ELT/ETL processes Strong experience in data ingestion, transformation & orchestration technology (ETL tools such as Informatica, DataStage, SSIS, etc.) or open-source tools such as Meltano, Airbyte, and Airflow Proven experience with DBT (data build tool) Analytics & Dashboarding Proficiency with business intelligence tools (Power BI, Tableau, SAP BI, or More ❯
languages like Python or KornShell Unix experience Troubleshooting data and infrastructure issues Preferred Qualifications Experience with Hadoop, Hive, Spark, EMR Experience with ETL tools like Informatica, ODI, SSIS, BODI, DataStage Knowledge of distributed storage and computing systems Experience with reporting and analytics platforms We promote an inclusive culture and provide accommodations for applicants with disabilities. For more information, visit More ❯
Data Migration Developers x 2 Salary: £50,000 - £70,000 + company benefits Location - remote UK Full time/permanent vacancy You must be a sole British National and be able to obtain SC and NPPV clearance. JOB PURPOSE Working More ❯
understanding of data governance, quality, and privacy (e.g. GDPR compliance) Proficiency in ELT/ETL processes Strong experience in data ingestion, transformation & orchestration technology (ETL tools such as Informatica, DataStage, SSIS, etc.) or open-source tools such as Meltano, Airbyte, and Airflow Proven experience with DBT (data build tool) Proficiency with business intelligence tools (Power BI, Tableau, SAP BI, or similar). More ❯
write, and read fluently in English PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool like Informatica, ODI, SSIS, BODI, DataStage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application More ❯
more scripting language (e.g., Python, KornShell) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool like Informatica, ODI, SSIS, BODI, DataStage, etc. - Knowledge of cloud services such as AWS or equivalent Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and More ❯