
Working Student, Data Architecture

Merck
On-site
Barcelona, Spain
Engineering and Data


Your Role

Ready to shape the future of data in R&D? As a Working Student in Data Architecture, you will support the team in designing efficient, scalable data architectures and pipelines that enable analytics and AI across Pharmaceutical R&D. Working closely with R&D stakeholders, you will help translate functional requirements into technical concepts and implement components together with data engineers. You will gain hands‑on project experience across the R&D data value chain and work in a global, agile environment.

Key responsibilities

  • Support design and implementation of data models, data pipelines and ETL/ELT flows that feed reporting, analytics and AI solutions.
  • Assist in building and operating data repositories (e.g., Snowflake) and cloud-based data processing (AWS services) together with senior architects and engineers.
  • Collaborate with product owners, data engineers, business analysts and domain users to translate requirements and ensure correct solution delivery.
  • Help create and maintain architecture blueprints, standards, data dictionaries and best practices for the team.
  • Support automated CI/CD for data pipelines and integration tests (Azure DevOps / Git / warehouse automation tools).
  • Assist in data quality checks, basic troubleshooting and third‑level support tasks for R&D data systems.
  • Contribute to documentation, knowledge sharing and operational run‑books.
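To give a flavour of the pipeline work described above, here is a minimal ETL sketch in Python. All table and column names (`raw_assays`, `curated_assays`, the unit-conversion rule) are hypothetical illustrations, not an actual Merck pipeline; it uses an in-memory SQLite database as a stand-in for a warehouse such as Snowflake.

```python
import sqlite3

def run_etl(conn):
    """Minimal ETL sketch: extract raw assay rows, normalize units, load a curated table."""
    cur = conn.cursor()

    # Extract: a raw landing table (hypothetical schema and data)
    cur.execute("CREATE TABLE raw_assays (compound TEXT, value REAL, unit TEXT)")
    cur.executemany(
        "INSERT INTO raw_assays VALUES (?, ?, ?)",
        [("CMP-1", 1200.0, "nM"), ("CMP-2", 2.5, "uM")],
    )

    # Transform: normalize every measurement to nanomolar
    rows = cur.execute("SELECT compound, value, unit FROM raw_assays").fetchall()
    clean = [(c, v * 1000.0 if u == "uM" else v) for c, v, u in rows]

    # Load: a curated table for reporting and analytics consumers
    cur.execute("CREATE TABLE curated_assays (compound TEXT, value_nm REAL)")
    cur.executemany("INSERT INTO curated_assays VALUES (?, ?)", clean)
    conn.commit()
    return clean

if __name__ == "__main__":
    print(run_etl(sqlite3.connect(":memory:")))  # [('CMP-1', 1200.0), ('CMP-2', 2500.0)]
```

In practice such logic would live in a managed tool (e.g., AWS Glue, Informatica, or dbt models) rather than hand-rolled scripts, but the extract–transform–load shape is the same.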

Experience

  • Familiarity with or exposure to Snowflake and AWS data services (Glue, Lambda, DMS) is expected; these are core to our Data & AI ecosystem.
  • Hands‑on or coursework knowledge of ETL/ELT concepts and tooling such as Informatica (PowerCenter / IICS) and warehouse automation tools (e.g., dbt, Coalesce) is a strong advantage.
  • Understanding of data modelling patterns, data flows and the end‑to‑end lifecycle from ingestion to consumption; experience with testing data integrations is beneficial.
  • Good SQL skills; basic Python experience is desirable.
  • Basic understanding of CI/CD concepts and experience (or interest) with Azure DevOps / Git for pipelines and deployments.
  • Fluent English and the ability to work in virtual, cross‑functional agile teams.

What we offer

  • Real project responsibility and mentorship from senior data architects; exposure to Snowflake, AWS and enterprise data tooling.
  • Opportunity to work across global teams and directly contribute to data solutions that accelerate R&D.


Apply now and become part of a team dedicated to Awakening Discovery and Elevating Humanity!