
Associate Director, Data Access Engineering

5-day week · Generous PTO · Hybrid · Hyderabad, India

Job Description

The Opportunity

Join a global biopharma company with a 130-year legacy and a mission to achieve new milestones in healthcare. Be part of a technology-driven, data-led organization supporting a diversified portfolio of medicines, vaccines, and animal health products. Work alongside passionate teams that use data, analytics, and insights to drive decisions and tackle some of the world’s greatest health threats.

Our Technology Centers are globally distributed hubs that enable our digital transformation and business outcomes across IT. They bring together diverse teams to collaborate, share best practices, and deliver solutions that save and improve lives.

This role is based at our Hyderabad Tech Center and follows a hybrid working model (3 days onsite, 2 days remote). Candidates are expected to reside within commuting distance of the Hyderabad office.

Role Overview

As an Associate Director, Data Access Engineering in the Central Data & Analytics Office, you will provide engineering leadership for enterprise‑scale data access controls and security capabilities across the DnA ecosystem. Embedded in Core Data & Engineering and working with global teams, you will design, extend, and standardize reusable data access patterns, engineering standards, and integrations that ensure secure, auditable, and policy‑aware data consumption across platforms, products, and AI‑enabled workloads.

You are accountable for ensuring these capabilities are production‑ready, reliable, and operable end‑to‑end: enforcing access, consent, and purpose controls consistently; accelerating onboarding of new platforms, data products, and AI agents; and operating data access services safely at enterprise scale.

What will you do

  • Provide technical direction for enterprise data access engineering capabilities, balancing security, usability, and operational scalability.
  • Own the engineering design, build, deployment, and operation of enterprise data access capabilities, ensuring they function reliably at enterprise scale.
  • Partner with data engineering, platform, and product teams to integrate secure data access controls into enterprise data platforms, products, ingestion frameworks, and orchestration layers.
  • Automate monitoring and operational telemetry for data access services to ensure scalability, performance, auditability, and reliability.
  • Ensure engineering feasibility, quality, and operability are embedded into planning and delivery decisions across initiatives.
  • Provide L3 production support and act as an engineering subject matter expert to support adoption and troubleshooting.
  • Define and maintain SLOs, SLIs, and operational runbooks to ensure reliable, compliant operation of data access services at enterprise scale.
  • Evaluate and validate new features of COTS products.
  • Drive cost‑ and performance‑efficient use of products and services, supporting rightsizing, autoscaling, and sustainable consumption patterns.
  • Define and automate scalable onboarding patterns for new integration types.
  • Embed automated access, security, and compliance checks into data and platform delivery pipelines using engineering‑first (policy‑as‑code) approaches.
  • Develop and champion engineering standards and best practices for secure data access, reducing duplication and improving delivery efficiency across squads and chapters.
  • Shape secure, scalable access patterns for emerging workloads such as AI agents and automated consumers as part of the enterprise data ecosystem.
  • Mentor engineers and technical leads on secure data access patterns and operational practices.
  • Collaborate across a matrixed organization to ensure engineering priorities are aligned with product roadmaps.
  • Act as a people manager by coaching and developing team members, setting clear goals and expectations, providing regular feedback, and supporting performance and career growth.
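To make the policy‑as‑code approach named above concrete, here is a minimal, purely illustrative sketch of an attribute‑based access control (ABAC) check in Python. The attribute names (`role`, `region`, `purpose`) and the policy shape are hypothetical examples, not the company's actual implementation or any specific product's API:

```python
from dataclasses import dataclass

# Hypothetical ABAC sketch: a policy is just code that evaluates
# user and resource attributes plus a declared purpose of use.

@dataclass
class AccessRequest:
    user_attrs: dict      # e.g. {"role": "analyst", "region": "EU"}
    resource_attrs: dict  # e.g. {"region": "EU", "classification": "restricted"}
    purpose: str          # declared purpose of the data consumption

def make_policy(allowed_roles, allowed_purposes):
    """Build a policy function that grants access only when the user's
    role and declared purpose are on the allow-lists AND the user's
    region matches the data's residency region."""
    def policy(req: AccessRequest) -> bool:
        return (
            req.user_attrs.get("role") in allowed_roles
            and req.purpose in allowed_purposes
            and req.user_attrs.get("region") == req.resource_attrs.get("region")
        )
    return policy

clinical_policy = make_policy(
    allowed_roles={"data_scientist", "analyst"},
    allowed_purposes={"research"},
)

req = AccessRequest(
    user_attrs={"role": "analyst", "region": "EU"},
    resource_attrs={"region": "EU", "classification": "restricted"},
    purpose="research",
)
print(clinical_policy(req))  # True: role, purpose, and region all match
```

Because the policy is ordinary versioned code, it can be reviewed, tested, and deployed through the same CI/CD pipelines as any other engineering artifact, which is the essence of the policy‑as‑code approach the role describes.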

What should you have

  • Bachelor’s degree in Information Technology, Computer Science, or another technology stream.
  • 11+ years of relevant experience in IT disciplines, including software and data engineering, data and analytics, technical leadership, and solution architecture, ideally within a drug development or life sciences context.

Primary skills (must-have)

  • Experience designing and operating policy‑based and attribute‑based data access control (PBAC / ABAC), including fine‑grained, auditable enforcement across platforms.
  • Hands‑on experience working with enterprise data access control platforms (e.g., Immuta or similar), including policy authoring and integration with data platforms and identity systems.
  • Ability to design data access models aligned with Data‑as‑a‑Product principles, including clear ownership, reuse boundaries, and lifecycle considerations at enterprise scale.
  • Demonstrated experience delivering large‑scale data access and information management solutions enabling secure, self‑service data consumption.
  • Depth across modern data platforms, including Databricks (Delta Lake, Unity Catalog), cloud data warehouses/lakehouses, and object storage (e.g., Amazon S3).
  • Experience in software/data engineering practices (including versioning, release management, deployment of datasets, agile & related software tools).
  • Hands‑on experience with CI/CD pipelines, Git‑based tooling (GitHub Actions, Jenkins, Azure DevOps), and automation using infrastructure‑ or policy‑as‑code to deliver secure, scalable platforms.
  • Strong cloud‑native engineering background; AWS certification is a plus.
  • Proven ability to collaborate with engineering, platform, product, and data stakeholder communities in complex enterprise environments.
  • Ability to define and influence engineering standards and reusable patterns across multiple teams or products.
  • Strong leadership, collaboration, and communication skills with the ability to align the organization on complex technical decisions.
  • A product- and customer-centric approach and innovative thinking.
  • Strong SQL and Python foundation with familiarity in common data engineering patterns (e.g., ETL/ELT, data modeling).
  • Experience using AI‑assisted engineering tools to support software development, testing, documentation, and operational workflows.
  • Experience as a people manager in a matrixed organization.
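One common instance of the fine‑grained, auditable enforcement mentioned in the skills above is a row‑filtered view: consumers query the view, never the base table. The sketch below uses Python's built‑in sqlite3 module only to keep the example self‑contained; the table and column names are hypothetical, and real platforms (e.g., Unity Catalog or Immuta‑governed warehouses) provide their own mechanisms for this pattern:

```python
import sqlite3

# Illustrative fine-grained access pattern: expose a row-filtered view
# instead of the base table. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient_events (id INTEGER, region TEXT, event TEXT);
INSERT INTO patient_events VALUES
  (1, 'EU', 'dose_administered'),
  (2, 'US', 'dose_administered'),
  (3, 'EU', 'adverse_event');

-- Consumers in the EU entitlement group are granted SELECT on the
-- view only; the base table stays inaccessible to them.
CREATE VIEW patient_events_eu AS
  SELECT id, event FROM patient_events WHERE region = 'EU';
""")

rows = conn.execute("SELECT id, event FROM patient_events_eu").fetchall()
print(rows)  # [(1, 'dose_administered'), (3, 'adverse_event')]
```

The same idea generalizes: the filter predicate becomes a reusable, centrally governed policy rather than ad hoc WHERE clauses scattered across consumer queries.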

Secondary skills (nice to have)

  • Prior experience in a global or matrixed organization with multiple stakeholder groups.
  • Technical understanding of data lineage, metadata management, and data catalog platforms (e.g., Collibra, Alation), including end‑to‑end lineage, metadata ingestion, and integration patterns.
  • Exposure to data observability and/or data quality tooling or frameworks.
  • Knowledge of orchestration and scheduling tools (e.g., Airflow, Prefect, Azure Data Factory).
  • Experience supporting or enabling self‑service platforms and driving adoption through documentation, standards, and enablement.
  • Awareness of emerging data access patterns for AI workflows.
  • Technical familiarity with backend service development (Java and the Spring Boot framework).
  • Experience working with containerized applications and Docker container images.

Who we are

We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What we look for

Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today.

#HYDIT2025

Required Skills:

AWS DevOps, Data Engineering, Data Infrastructure, Data Visualization, Design Applications, DevOps, Software Configurations, Software Development, Software Development Life Cycle (SDLC), Solution Architecture, System Designs, System Integration, Testing

Preferred Skills:

Current Employees apply HERE

Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully

Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company.  No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status:

Regular

Relocation:

VISA Sponsorship:

Travel Requirements:

Flexible Work Arrangements:

Not Applicable

Shift:

Valid Driving License:

Hazardous Material(s):

Job Posting End Date:

05/20/2026

*A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply no later than the day BEFORE the job posting end date.