What Our Auditors Are Finding Lately: 8 Trends Across GMP and GCP Audits (Q2 2025)
A look back at our recent audit reports and mock inspections.
About once a quarter (and more often if possible), we’ll publish an analysis of the trends we’re seeing in our audit work for clients, so that paid subscribers can learn from, and act proactively on, what we’re finding across the industry. This is the first of many planned installments reviewing the audit results from the last few months.
We’re lowering the paywall a bit on this first issue to give you a clear idea of what paid subscribers will receive each time; subsequent issues will have a higher paywall.
For free subscribers, the article cuts off before the full findings list and before a set of 10 questions every firm should be able to answer with documentary evidence at hand. If the answer is “not right now,” you may have a vulnerability worth closing before your next inspection. Graduate to a paid subscription here.
Over the past several months, our teams have conducted a wide range of audits: supplier qualifications, internal QMS reviews, routine GMP vendor audits, mock pre-approval inspections, and GCP investigator site audits. These covered everything from large, global manufacturing facilities to smaller packaging sites and clinical trial centers.
While most facilities were generally in control—and no critical findings were recorded—nearly every audit revealed areas where compliance slipped, often in familiar ways.
What’s in this analysis
This trends write-up uses only the audit reports in this project set from late spring–summer 2025. The set includes audit engagements across distinct contexts:
Mock pre-approval/readiness audits at drug-product manufacturing sites.
A corporate/internal QMS audit at a sponsor HQ overseeing a combination-product portfolio.
Supplier qualification audits at a clinical-supply/secondary-packaging facility.
ICH GCP investigator site audits.
No company or site names are used here, and we haven’t added any facts beyond what those reports contain. Confidentiality is paramount to us.
Below, we go beyond the top-line themes and dig into each area we actually saw in the reports. For each, you’ll find what we observed, how often we saw it, representative examples (generalized to protect identity), and prescriptive actions you can take.
1. Documentation and change control were the biggest gaps
In roughly 70% of audits, we observed some form of documentation gap:
Investigations and change controls exceeded procedural timelines without documented extensions or with extension requests submitted long after deadlines had already passed.
Change control packages lacked objective closure evidence—for example, implemented actions were described but no supporting records were attached.
Even otherwise strong sites stumbled over basic document control issues, such as SOPs with unclear terminology (e.g., undefined categories like “Redundant”) or organizational charts that were not properly controlled.
At one site, management reviews were conducted monthly in practice but did not clearly include complaints or recalls in scope; these were loosely categorized as “other issues.”
Simple omissions like these, plus batch records with incomplete fields, tripped up even strong sites. FDA inspectors still look first at whether your documentation demonstrates control. If you can’t show a clean, closed trail of evidence, they will assume the underlying system is weak.
Action items:
Build “stop-the-clock” rules into deviations/OOS/change control SOPs (who may grant an extension, when, and what scientific/operational justification is required).
Require real-time (not retrospective) extension entries with justification and a new due date.
Trend cycle times and overdue rates monthly and escalate repeat offenders to management review; a minimal tracking sketch follows this list.
Verify closure evidence before disposition (e.g., no “same-day” change closures with unchecked effectiveness).
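To make the trending item concrete, here is a minimal sketch in Python (pandas). The file name and columns are assumptions for illustration: a hypothetical QMS export called deviation_log.csv with record_id, opened, due, and closed date columns. A real export will use different field names and need more cleanup, but the shape of the metric is the same.

```python
# Minimal sketch: monthly cycle-time and overdue-rate trending from a QMS log export.
# Assumes a hypothetical CSV with columns: record_id, opened, due, closed (dates; closed blank while open).
import pandas as pd

df = pd.read_csv("deviation_log.csv", parse_dates=["opened", "due", "closed"])
today = pd.Timestamp.today().normalize()

# Cycle time in days for closed records (NaN while a record is still open).
df["cycle_days"] = (df["closed"] - df["opened"]).dt.days

# Flag records that closed after their due date, or that are still open past due.
df["overdue"] = df["closed"].fillna(today) > df["due"]

monthly = (
    df.assign(month=df["opened"].dt.to_period("M"))
      .groupby("month")
      .agg(records=("record_id", "count"),
           median_cycle_days=("cycle_days", "median"),
           overdue_rate=("overdue", "mean"))
)
print(monthly.round(2))  # feed this table into the monthly management review
```

Even a spreadsheet equivalent of this table, reviewed monthly, is enough to surface the repeat offenders before an inspector does.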
2. Inconsistent or incomplete training records
About half of the audits flagged training as a weakness. Some sites had job-specific curricula that were ambiguous—forms listed a job title with an asterisk, but it wasn’t clear which individuals were actually included when multiple staff shared the same title.
Others had simple lapses, like an assistant manager who had never completed a required GMP training module. Annual GMP refreshers were sometimes required on paper but not consistently documented in the system.
These are small but telling signals to FDA reviewers. A defensible training program needs more than attendance sheets.
Action items:
Tie curricula directly to individual staff, not just job titles.
Track completions through a validated LMS with overdue alerts and escalation paths; a minimal gap-check sketch follows this list.
Require annual GMP refreshers to be documented with pass/fail assessments.
Audit training files periodically during internal audits to confirm completeness.
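As a starting point for the first two items, here is a minimal gap-check sketch in Python (pandas). The file names and columns are assumptions for illustration: curriculum.csv mapping each employee_id to the module_ids that individual must complete, and completions.csv recording what has actually been done. A validated LMS report would replace both inputs.

```python
# Minimal sketch: flag staff with required training modules missing or lapsed.
# Assumes hypothetical exports: curriculum.csv (employee_id, module_id) and
# completions.csv (employee_id, module_id, completed_on, expires_on).
import pandas as pd

required = pd.read_csv("curriculum.csv")
done = pd.read_csv("completions.csv", parse_dates=["completed_on", "expires_on"])
today = pd.Timestamp.today().normalize()

# Keep only completions that are still valid (no expiry, or expiry in the future).
valid = done[done["expires_on"].isna() | (done["expires_on"] >= today)]

# Anti-join: required assignments with no valid completion on record.
gaps = required.merge(
    valid[["employee_id", "module_id"]].drop_duplicates(),
    on=["employee_id", "module_id"],
    how="left",
    indicator=True,
)
gaps = gaps[gaps["_merge"] == "left_only"].drop(columns="_merge")
print(gaps.sort_values(["employee_id", "module_id"]))  # one row per missing or lapsed module
```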
3. Investigations and CAPAs were too slow—or missing!
Several audits uncovered late or incomplete investigations.
In one case, multiple NCRs stayed open for months and were closed long past the 30-day window without justification. Extension requests were sometimes filed long after the deadlines had already passed, undercutting the credibility of the process.
At another site, an out-of-spec chromatographic result wasn’t recognized until weeks after the test had been run—too late to meaningfully re-examine original solutions.
Even the Quality Unit itself wasn’t immune. At one facility, QA approved an investigation that had been processed on the wrong form. While a replacement investigation was later initiated, no CAPA addressed the oversight that allowed QA to miss the error in the first place.
Action items:
Track investigation cycle times as a KPI and escalate overdue records to management review.
Require extension requests to be filed and approved before due dates, with documented scientific rationale.
Train analysts to recognize OOS results in real time, before solutions and equipment states are lost.
Define CAPA triggers for QA oversight failures, making sure errors in QA processes are investigated and addressed.
4. Facility and equipment qualification gaps kept appearing
Facility readiness and equipment qualification were another recurring issue, especially at sites preparing for expansion or inspection.
We saw HVAC and environmental monitoring qualifications still incomplete, cleaning validations not finalized, raw material qualifications pending, and serialization equipment not yet installed.
At one site, staff were signing water loop logs verifying UV lamp function, but auditors noted it was unclear how they confirmed the lamps were actually on.
In another case, daily balance checks relied on a single weight well outside the typical operating range, missing proper bracketing. Uncovered clean equipment stored in hallways was also observed.
Even if the underlying systems are strong, these details erode confidence. The FDA expects qualification to be buttoned up well before inspection, and the fixes below are straightforward.
Action items:
Consolidate all open qualification activities into a master quality plan with owners and deadlines.
Define precise verification methods for utilities and include them in SOPs.
Use bracketing weights that reflect actual use ranges in daily balance checks (for example, if samples typically run 50 mg to 20 g, verify with weights at or just beyond both ends of that range rather than a single large weight).
Enforce coverage and segregation requirements for clean equipment.
5. Supplier and vendor oversight was underdeveloped
Roughly one-third of audits pointed to gaps in supplier and vendor oversight.
Vendor lists sometimes contained errors or lacked QA signatures.
Requalification intervals stretched too long compared to industry norms.
In one internal audit, a translation vendor used for complaint handling wasn’t on the approved vendor list at all, and the agreement in place failed to address HIPAA, GDPR, or APPI requirements for handling personal health data.
The FDA increasingly expects vendor oversight to mirror internal controls. That means maintaining controlled vendor lists, setting requalification cycles based on risk, executing robust quality agreements, and ensuring data-privacy provisions are in place for any vendor handling regulated data.
Action items:
Control your vendor lists under QA with required signatures for approval.
Align requalification intervals with industry standards (typically every 2–3 years).
Execute quality and data privacy agreements with every vendor handling GMP or patient data.
Reconcile vendor spend against approved lists to detect unqualified “shadow vendors”; a minimal reconciliation sketch follows.
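The reconciliation in that last item is essentially an anti-join between two exports. Here is a minimal sketch in Python (pandas), assuming hypothetical spend.csv (vendor_name, amount) and avl.csv (vendor_name, status) files; a real accounts-payable extract will need more normalization than shown here (legal versus trading names, subsidiaries, currencies).

```python
# Minimal sketch: flag payments to vendors that are not on the QA-approved vendor list.
# Assumes two hypothetical exports: spend.csv (vendor_name, amount) and avl.csv (vendor_name, status).
import pandas as pd

spend = pd.read_csv("spend.csv")
avl = pd.read_csv("avl.csv")

# Normalize names so trivial formatting differences don't hide a mismatch.
spend["key"] = spend["vendor_name"].str.strip().str.casefold()
avl["key"] = avl["vendor_name"].str.strip().str.casefold()
approved = set(avl.loc[avl["status"].str.casefold() == "approved", "key"])

# Sum spend going to vendors with no approved entry on the list.
shadow = (
    spend[~spend["key"].isin(approved)]
    .groupby("vendor_name", as_index=False)["amount"].sum()
    .sort_values("amount", ascending=False)
)
print(shadow)  # candidate "shadow vendors" for QA follow-up
```

Anything that lands on the shadow list should either be qualified or have its spend explained before the next audit.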
6. APRs and management reviews were sometimes late or incomplete
Two sites in this dataset had issues with higher-level oversight. One facility still had overdue annual product reviews despite multiple CAPAs meant to address the problem. Another held management reviews regularly but did not consistently include complaints and recalls in scope, relegating them to “other issues.”
At a newer company preparing for its first pre-approval inspection (PAI), management review SOPs were in place, but no meetings had actually been held.
Companies need to treat APRs as hard compliance deadlines, not soft deliverables, and make sure management review agendas explicitly include complaints, recalls, CAPAs, and audit findings.
For start-ups, even a first-pass management review is better than none.
Action items:
Treat APR deadlines as compliance-critical and track them in dashboards.
Make sure management review agendas explicitly include all quality elements: complaints, recalls, CAPAs, and audit trends.
Hold at least one documented management review before any regulatory inspection.
7. Data integrity and lab practices were vulnerable
Laboratory systems showed several vulnerabilities. At one site, there was no governing SOP for analytical method validation, even though the reports themselves were thorough; the lack of a defined template created inconsistencies from one report to the next.
Elsewhere, logbooks were not consistently reviewed and signed, and equipment clocks were found set to the wrong time and date. In production, operators sometimes left checkboxes blank for critical sampling points like beginning, middle, and end of fill, undermining good documentation practice (GDP) expectations.
These may seem minor, but they are the exact types of gaps inspectors latch onto. They raise questions about the integrity of data across the entire system. The remedies are usually straightforward: issue SOPs that define expectations for method validation, implement routine supervisory logbook reviews, require regular instrument clock checks, and reinforce GDP principles during training.
Action items:
Issue SOPs defining expectations for analytical method validation reports.