We interviewed seven enterprise pharma companies. All of them expressed significant frustration with how the website review process works, driven by compliance requirements, stakeholder volume, and company policies.
Our goal: understand how pharma teams handle website feedback, and where the process breaks down. All quotes are anonymized by role and company type.

“We have to use PDF to send website feedback every time. Whether it’s brand reviewing it or medical legal and regulatory.”
Across seven companies, feedback arrives via email, PDF, Excel, Slack, meetings, and more — simultaneously. None of it connects.

“Some stakeholders provide an Excel sheet with a list of issues and a screenshot. Some provide a Word document. Others take a PDF and annotate all the changes with PDF comments.”
Based on our conversations, we mapped the typical review cycle for a pharma website update. The exact stages vary by company, but the pattern below was consistent across all seven.
Most teams we spoke with rely on development and design agencies that handle part or all of the web implementation. In some cases, web teams have to coordinate hundreds of agencies.
Due to SSO configurations, security policies, and the sheer churn of agency relationships, it’s often impossible to give agencies access to internal systems like Jira. This makes it very hard to track feedback and statuses across teams.
“I’d get feedback in four different places and have to piece it all together.”
This stage almost always runs multiple rounds. Brand leads feel strong ownership of their websites, but they often struggle to deliver feedback in a developer-friendly way.
Feedback tends to be sent via different channels and formats (email, messages, meetings) and is often incomplete or vague.
“Brand leads insist on clicking ‘resolve’ themselves because they don’t trust the developer’s definition of ‘done.’”
“They follow various steps: some send Excel, some Word documents, some take a PDF and annotate it.”
“Brand leads often give feedback in terms of what they see — ‘this looks wrong,’ ‘change this text’ — without the technical context a developer needs: which device, which browser, which resolution, which component.”
This is the highest-stakes stage. No content goes live without MLR approval.
MLR approves a static document, but the live site is dynamic. Any drift between the approved PDF and what actually goes live — even minor formatting changes — puts the company out of compliance.
“If a reference number loses its superscript between the MLR-approved version and the live site, we’ve got an issue.”
UAT is focused on catching environmental differences: a component that renders differently, an asset that doesn’t load, a configuration that doesn’t carry over.
In many teams, brand and MLR have to review the UAT version too, which can essentially repeat the whole review cycle.
“Once the brand approves the staging version, we move to UAT and then we have to send that out for review too.”
Preserve a final, approved artifact of the published content for FDA audit purposes. This is the one stage where a PDF is genuinely, legally required.
“We use PDFs in all feedback stages but we only need it at the very end.”
But there’s a way to unify, standardize, and automate this process.
Brand teams, MLR reviewers, and external agencies submit feedback directly on the staging site. Full context captured automatically. Status visible to every party.
Reviewer opens the staging site. No login.
Point at any element. Leave feedback right there.
Screenshot, browser, OS — attached instantly.
One ticket, one backlog. Zero copy-paste.
This is what Marker.io does.

“We are managing feedback for 100+ sites across brand teams, QA vendors, and our global dev team. Marker.io gave us one place to centralize everything. The Jira integration alone solved problems I’d been living with for years. We’re finally able to close the loop without chasing people across spreadsheets.”
Alex Zenios — Web Migration, Design & Engagement, Amgen

If this research describes your team’s reality, try Marker.io free.
Start a free trial →

Here’s what that looks like in practice.
Before:
Website feedback arrives via PDF, email, or spreadsheets
No technical context to properly reproduce issues
Difficult to collaborate with agencies
MLR and brand teams leave feedback in different places

With Marker.io:
Feedback is captured on the website and automatically sent to your tools (e.g. Jira)
All context (browser, device, URL) added automatically
All parties can be onboarded in 5 minutes
All teams leave structured feedback in one place