
What Our Auditors Are Finding Lately: 10 Trends Across GxP Audits

Written by The FDA Group | January 22, 2026

Over the second half of 2025, our auditors conducted a diverse portfolio of engagements spanning pharma manufacturing, clinical supply chains, medical device operations, biologics production, and lab services across three continents.

These were deep dives into quality systems at companies ranging from emerging biotechs preparing for their first FDA pre-approval inspection to established global CDMOs handling hundreds of client audits per year.

What struck us wasn’t the critical findings (there were none). Rather, it was the consistency of the minor and major observations. The same themes appeared again and again: documentation gaps that made otherwise solid processes look questionable, timeline management that seemed reasonable in practice but failed on paper, and procedural ambiguities that left companies exposed to regulatory risk they hadn’t fully recognized.

This report synthesizes those findings into practical guidance. We’ve anonymized every example to protect our clients, but we’ve preserved enough operational context that you should be able to see yourself—or your suppliers, or your CMOs—in these scenarios.

Talk to us if you need auditing, mock inspection, remediation, or other RA/QA/Clinical support.

What’s in this dataset

This trends report uses only the audit reports in our project set from July to November 2025. The set includes audit engagements across several distinct contexts:

  • Mock Pre-Approval Inspections (PAI): Five-day mock PAIs at U.S.-based pharma sites preparing for ANDA submission, including DPI and SMI manufacturing.
  • Routine GMP Vendor Audits (API and Finished Dosage): Audits of high-volume generic manufacturers in India producing oral solids for the U.S. market under USFDA and MHRA approvals.
  • Supplier Qualification Audits: First-time and re-qualification audits at CDMOs in China providing API synthesis, drug product manufacturing, and ADC conjugation.
  • GLP-to-GMP Gap Assessments: Gap assessments at R&D laboratories transitioning to GMP-compliant batch release testing, evaluated against 21 CFR Part 211 and ICH Q7 to support Phase 3 and commercial readiness.
  • cGMP Packaging and Labeling Audits: Audits of clinical and commercial packaging and labeling operations supporting multinational trials.
  • Medical Device and Primary Packaging Audits: Audits of ISO 13485-certified device and packaging component manufacturers, including combination products and primary packaging, assessed against ISO 13485, ISO 15378, and 21 CFR Part 820.
  • Clinical Trial Sponsor Oversight (BIMO): Mock BIMO inspections for IDE studies evaluating sponsor oversight, monitoring, data integrity, and device accountability under 21 CFR Parts 812 and 820.
  • Secondary Packaging and Distribution Audits: Monitoring audits of logistics providers handling temperature-controlled storage, packaging, and global distribution under EU GDP and 21 CFR Part 211, with emphasis on thermal mapping and deviation management.

Again, no company or site names are used in the trend descriptions below, and we haven’t added any facts beyond what those reports contain. Confidentiality is paramount to us.

Stats and trends at a glance

The dataset:

  • 33 audits analyzed across July–November 2025
  • 5 countries: United States, India, China, Ireland, Netherlands
  • Audit types: GMP vendor qualification, mock PAI, gap assessments, ISO 13485 medical device, clinical packaging/GDP, mock BIMO, biologics/ADC CDMO
  • Regulatory frameworks: 21 CFR 210/211, 21 CFR 820, 21 CFR 58, ISO 13485, ISO 15378, EU Annex 13, ICH Q7/Q9/Q10

Finding severity breakdown:

  • Critical findings: 0 (0%)
  • Major findings: ~12 (~15%)
  • Minor findings: ~50 (~60%)
  • Recommendations: ~25 (~25%)

Top finding categories (by % of audits affected):

  • Documentation and change control — 70%
  • Supplier/vendor oversight — 45%
  • Investigation and CAPA timeliness — 40%
  • Facility and equipment gaps — 35%
  • Complaint handling backlogs — 25%
  • Data integrity vulnerabilities — 25%
  • APR/management review gaps — 20%

Repeat findings:

  • 20% of audits had findings that recurred from prior years
  • Recurring issues included: complaint backlogs, facility door gaps, PM backlogs, late QA approvals
  • Root cause: CAPA effectiveness checks not performed or not defined

Geographic performance:

  • China: Cleanest outcomes; 3 of 5 audits had zero major findings
  • India: Consistent; 1–3 minor findings typical; administrative gaps (expired licenses/certificates)
  • United States: Widest variance; startup facilities showed qualification debt
  • Ireland: Middle of the pack; procedural alignment gaps between global and local SOPs
  • Netherlands: Logistics scope only; no significant findings

Patterns to watch:

  1. Supplier notification timelines missing from SOPs — quality agreements require it, procedures don’t address it
  2. Retroactive change control extensions — deadlines missed, extensions approved 30+ days later with no justification
  3. No transport validation — ~30% of facilities rely on logistics providers without product-specific validation
  4. Startup qualification debt — HVAC, EM, media fills pending without consolidated project plans
  5. CAPA effectiveness checks skipped — corrections completed, but no verification they actually worked

What’s working:

  • Zero critical findings across the entire dataset
  • Strong environmental monitoring at established CDMOs
  • Mature document control at ISO-certified facilities
  • Effective audit trail implementations in validated electronic systems
  • Open, responsive engagement from audit hosts at closing meetings

Below, we expand on the themes we actually saw in the reports. For each, you'll find: what we observed, how often, representative examples (generalized to protect identity), and prescriptive actions you can take.

1. Documentation and change control remained the biggest gap

In roughly 70% of audits (the same as our prior analysis), we observed some form of documentation gap.

Documentation deficiencies appeared more than any other finding category—not because facilities lacked procedures, but because the procedures weren’t consistently followed or didn’t address common edge cases.

What we found:

  • Timeline management failures: At multiple sites, investigations and change controls exceeded their procedural timelines without any documented extension request. In one case, a change control initiated in February with a 10-day target closure wasn’t closed until April—nearly two months late—with no extension request ever submitted. The SOP simply didn’t address what to do when a deadline couldn’t be met.
  • Retroactive extensions that undermined credibility: At another facility, an extension request for an investigation was approved more than 30 days after the original due date had already passed. While the investigation itself was thorough, the retroactive extension approval (with no scientific justification for the delay) would be difficult to defend to an inspector asking, “Why wasn’t this flagged earlier?”
  • Uncontrolled templates for controlled activities: Management review meetings at one site were documented using PowerPoint presentations that weren’t controlled documents, even though the governing SOP referenced specific templates that should be used. The content of the reviews was appropriate, but the format created a document control gap that could lead to version confusion or incomplete archival.
  • Missing signatures on critical documents: Business continuity plans and emergency drill reports at one site were complete in content, but didn’t have signatures and dates from the reviewers and approvers who should have formally signed off. This turns a good practice into an undocumented one.
  • Incorrect or missing data fields: Equipment cleaning records at one site showed “N/A” entered in fields where room numbers were required. The cleaning was performed correctly, but the documentation didn’t demonstrate where the equipment was cleaned—a basic traceability failure.
  • Forms without attribution: One site’s Deviation Assistance Tool—a form used to document and investigate deviations—did not capture who filled out the document or when. The completed forms were thorough, but without names and dates, the audit trail was incomplete.

FDA investigators routinely pull change control and investigation records and calculate cycle times. Overdue records without documented extensions suggest a facility that either lacks oversight or tolerates delays.

Warning letters frequently cite "failure to thoroughly investigate unexplained discrepancies" and "failure to establish and follow written procedures"—exactly the kinds of findings that emerge when documentation discipline breaks down.

A few recommendations:

  • Build explicit “stop-the-clock” rules into your deviation, OOS, and change control SOPs. Define who may grant an extension, when (before the deadline, not after), what justification is required, and what the maximum allowable extension period is.
  • Require real-time extension entries with documented scientific or operational rationale and a new committed due date.
  • Audit your SOP-to-template alignment. Every SOP that references a template should reference a controlled template with a document number, and that template should be easily retrievable.
  • Implement periodic “documentation hygiene” checks—monthly or quarterly reviews of open records approaching deadlines, with escalation to management for any records past 80% of their target timeframe.
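
The 80% escalation trigger is easy to automate. Here’s a minimal sketch, assuming a simple export of open records from your eQMS; the record fields, IDs, and target timelines below are illustrative, not a prescribed format:

```python
from datetime import date

# Illustrative export of open quality records (change controls, deviations, OOS).
# Field names and values are hypothetical; adapt them to your own eQMS export.
open_records = [
    {"id": "CC-2025-014", "opened": date(2025, 2, 3), "target_days": 10},
    {"id": "DEV-2025-102", "opened": date(2025, 9, 1), "target_days": 30},
    {"id": "OOS-2025-007", "opened": date(2025, 10, 20), "target_days": 20},
]

ESCALATION_FRACTION = 0.8  # escalate once 80% of the target timeline is used

today = date.today()
for rec in open_records:
    age_days = (today - rec["opened"]).days
    used = age_days / rec["target_days"]
    if used >= 1.0:
        status = "OVERDUE - extension (with justification) required"
    elif used >= ESCALATION_FRACTION:
        status = "ESCALATE - notify quality management"
    else:
        status = "on track"
    print(f"{rec['id']}: {age_days} days open ({used:.0%} of target) -> {status}")
```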

2. Investigations and CAPAs were too slow (or missing entirely)

We observed this in roughly 50% of these audits. Even when investigations were eventually completed correctly, the timing often created problems. And in some cases, situations that clearly warranted a CAPA investigation never actually triggered one.

What we found:

  • Chronic overdue investigations: At one site, the NCR (Non-Conformance Report) log showed multiple investigations exceeding the 30-day SOP-mandated closure window, with some remaining open for months. Nine NCR records were found to be over 60 days old with no closure documentation uploaded. A pattern like this usually points to a systemic backlog rather than isolated exceptions.
  • OOS results not recognized in real time: At one lab we audited, an HPLC-based test for delivered dosage uniformity produced an OOS result, but the investigation wasn’t opened until more than a month later. The analyst had processed the data and moved on without recognizing the OOS. By the time it was discovered during data review, the original test solutions had expired and could no longer be evaluated. This is a good example of a pretty fundamental training gap we encounter quite a bit: analysts must be able to recognize OOS results immediately and escalate them within one business day.
  • QA oversight failures not treated as CAPA-worthy: At one facility, QA approved an investigation that had been processed on the wrong form. When the error was discovered, a replacement investigation was completed correctly, but no CAPA was opened to address the systemic question: how did QA miss this error in the first place? QA oversight lapses should always trigger their own investigation and corrective action, not just be quietly fixed.
  • Deviations closed without meaningful CAPAs: At one site, the list of deviations over 12 months filled 3.5 pages, while the list of CAPAs over the same period was only half a page. Simple improvements—“revise the work instruction,” “retrain the operator”—were implemented informally without opening CAPA records, even though the deviation management procedure allowed for it. This pattern suggests CAPA avoidance rather than genuine continuous improvement.
  • Long-running investigations: One major process incident remained under investigation for more than 18 months after the original event, suggesting resource constraints or unclear ownership.

FDA's OOS guidance (May 2022) emphasizes the importance of timely investigation, specifically noting that delays can compromise the ability to meaningfully investigate.

Warning Letters frequently cite "failure to complete investigation within [X] days as required by your procedure"—a finding that's trivially easy for investigators to document by comparing record dates to SOP requirements.

A few recommendations:

  • Make sure you’re tracking investigation and CAPA cycle times as a KPI reported to management review. You should also have clear escalation triggers (e.g., 80% of target timeline) that require management attention before deadlines are missed.
  • Train analysts explicitly on real-time OOS recognition. Use visual aids in the laboratory showing what an OOS result looks like for common test methods. Require analysts to flag results before leaving for the day.
  • Define CAPA triggers for QA oversight failures: any error in QA review that reaches downstream processes should automatically generate a CAPA evaluation.
  • Review your deviation-to-CAPA ratio. When very few CAPAs are opened relative to the number of deviations, it may indicate that CAPAs are being avoided rather than appropriately applied. This is a great metric surprisingly few teams keep in their reporting.
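
To make the cycle-time KPI and the deviation-to-CAPA ratio concrete, here is a minimal sketch assuming a flat export of closed deviation records; the field names and the 10:1 flag threshold are illustrative assumptions, not regulatory limits:

```python
from datetime import date
from statistics import mean

# Hypothetical closed-deviation export; adapt field names to your quality system.
closed_deviations = [
    {"id": "DEV-001", "opened": date(2025, 3, 1), "closed": date(2025, 3, 28), "capa_opened": False},
    {"id": "DEV-002", "opened": date(2025, 4, 2), "closed": date(2025, 5, 20), "capa_opened": True},
    {"id": "DEV-003", "opened": date(2025, 6, 5), "closed": date(2025, 7, 1), "capa_opened": False},
]

cycle_times = [(d["closed"] - d["opened"]).days for d in closed_deviations]
capa_count = sum(1 for d in closed_deviations if d["capa_opened"])

print(f"Average deviation cycle time: {mean(cycle_times):.1f} days")
print(f"Deviations: {len(closed_deviations)}, CAPAs opened: {capa_count}")

# Many deviations closed with few or no CAPAs can signal CAPA avoidance
# (the 10:1 threshold here is purely illustrative).
ratio = len(closed_deviations) / capa_count if capa_count else float("inf")
if ratio > 10:
    print("High deviation-to-CAPA ratio: check whether fixes are bypassing the CAPA system.")
```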

3. Facility and equipment qualification gaps kept appearing

Observed in approximately 40% of these audits, these findings were most prominent at sites undergoing expansion, preparing for new product launches, or transitioning between development and commercial phases.

What we found:

  • Interconnected qualification activities running without coordination: At one facility preparing for commercial manufacturing, the following activities were all in an incomplete or planned state: HVAC qualification, environmental monitoring performance qualification, media fills, facility cleaning procedures, equipment cleaning and decontamination validation, raw material qualification, isolator qualification, temperature and humidity monitoring systems, computer system validation procedures, calibration and maintenance programs, and supplier qualification. While each activity had some level of planning, they weren’t consolidated under a formal Quality Plan with clear owners, dependencies, and deadlines.
  • HVAC delays cascading through other systems: At the same facility, HVAC qualification was delayed by facility power upgrades, which in turn were delayed by permitting with the local utility. This single dependency held up the environmental monitoring qualification, which in turn prevented baseline establishment for microbiological monitoring during production.
  • Calibration and PM stickers showing signs of neglect: At multiple sites, calibration stickers were curled, faded, or difficult to read. In one case, a door inspection sticker showed a due date that had passed without the inspection being completed—and without an extension being documented. At another facility, 201 open preventive maintenance (PM) work orders were identified in SAP for the current year, with some PM due dates in the system not aligned with actual performance dates.
  • Labeling inconsistencies: Storage areas, equipment tags, reagent labels, and shelf designations at some facilities used handwritten labels on colored tape that could smudge or become illegible over time. This creates risk for both traceability (which equipment was cleaned?) and cross-contamination (which storage area is this?).
  • Equipment without identification: At one medical device facility, a piece of manufacturing equipment lacked the required equipment asset ID tag, even though the SOP required all active equipment to be labeled with identification.
  • Environmental monitoring without baseline: At one site, previous efforts to establish environmental monitoring baselines had shown passing particulate levels (ISO 7/8) but multiple hits for spore-forming microorganisms. However, there was no baseline data showing passing microbiological results in the current production configuration.

21 CFR 211.42(a) requires that buildings be of “suitable size, construction and location to facilitate cleaning, maintenance, and proper operations.”

FDA investigators often look for evidence that equipment is appropriately qualified before use and that environmental conditions are controlled. A site with visible gaps in qualification (incomplete protocols, missing baselines, undated stickers) signals potential control problems throughout the operation.

A few recommendations:

  • We suggest most teams consolidate all open qualification activities into a single Quality Plan with clear ownership, deadlines, and dependency mapping. Track this plan in your management reviews.
  • Define precise verification methods for utilities and equipment calibration, and include those methods in SOPs. Don’t assume operators know how to verify calibration status—write it down!
  • Implement a labeling standard that requires durable, printed labels with document numbers where appropriate. Eliminate handwritten labeling on equipment and storage areas used for GxP operations.
  • Establish clear rules for what happens when qualification activities are delayed: who is notified, how is the delay documented, and what interim controls apply.

4. Supplier and vendor oversight was underdeveloped

The gaps here (present in 35% of these audits) weren’t about whether suppliers were qualified, but about whether the procedural infrastructure around supplier oversight was adequate to manage ongoing relationships and change notifications.

What we found:

  • Client notification timelines not specified in SOPs: At multiple CDMO sites, SOPs for supplier change notification did not clearly define the timeframes for notifying clients. In one case, the quality agreement with the client required notification of supplier changes within 5 business days, but the internal SOP was silent on timing. This mismatch creates risk: the CDMO may comply with its own SOP while violating the quality agreement.
  • Stage-specific requirements missing: One SOP addressed notification requirements for PPQ and commercial projects, but didn’t clearly apply to clients with products still in clinical development. A change affecting a Phase 2 clinical product might not be subject to the same notification rigor as a commercial product, even though the clinical sponsor needs the information.
  • Quality agreements without regulatory specificity: Some quality agreements we reviewed didn’t clearly specify which regulatory standards (GMP, GDP, IPEC, etc.) applied to the supplier relationship. This creates ambiguity about what “compliance” means under the agreement.
  • Major changes not communicated: At one API manufacturing site, a major change—incorporating nitrosamine impurity testing into the API specification—was not communicated to the customer as required by the quality agreement. At another site, changes related to processes, test specifications, and product specifications were implemented without customer notification, even though the quality agreement explicitly required notification of “major changes.”
  • Approved supplier list gaps: At one site, a third-party logistics provider that had been used for years (with a quality agreement executed for each shipment) had never been formally added to the Approved Supplier List. No transport validation studies or seasonal challenge studies had been performed.
  • Suppliers “under evaluation” used for production: At some sites, API suppliers listed as “under evaluation” or “provisional” were still being used for production batches, creating a disconnect between the qualification status and actual use.

A few recommendations:

  • Audit your SOPs against your quality agreements. Every requirement in a quality agreement should trace to a specific SOP with clear procedural steps for compliance.
  • Define stage-specific notification requirements (development, clinical, commercial) so that clients at all stages receive appropriate change communication.
  • Make sure your quality agreements specify applicable regulatory standards (GMP, GDP, ICH Q7, IPEC, etc.) so that both parties understand compliance expectations.
  • Implement a mechanism to flag when suppliers rated as “under evaluation” or “provisional” are being used for production—this should trigger escalation and accelerated qualification.
  • Validate transportation and distribution: conduct seasonal challenge studies, add logistics providers to your ASL, and ensure temperature excursions trigger documented deviation handling.

5. Annual product reviews and management reviews were late or incomplete

The pattern here was consistent in 30% of the firms we audited: companies understood the requirement for annual reviews but hadn’t implemented them with the rigor that regulatory expectations demand.

What we found:

  • APQRs organized by strength rather than product: At multiple sites, Annual Product Quality Reviews covered only individual strengths (e.g., 20mg tablets) rather than providing a comprehensive assessment across all strengths of a product. This approach can miss patterns that only become visible when data from multiple strengths is evaluated together. FDA’s 21 CFR 211.180(e) requires at least annual review of a representative number of batches—the emphasis is on comprehensive product understanding, not strength-by-strength compliance.
  • Critical quality events omitted: In one APQR, an OOS result and a market complaint were identified during the review period but neither was included in the APQR’s OOS or complaint sections. This omission creates an inaccurate picture of product performance.
  • Misaligned review periods: At one site, the APQR review period was July 2024 to June 2025, but the water system and environmental monitoring trends were reported for January 2024 to December 2024. This misalignment means the APQR doesn’t fully cover quality-relevant data from the actual review period.
  • Management review meetings never held: At one company preparing for its first PAI, management review SOPs were in place, but no management review meetings had actually been held. The procedure existed, but it had never been executed.
  • Management review inputs incomplete: At one site, management review slides covered most required inputs, but there was no discussion of the metric of overdue complaints (complaint aging), even though thousands of overdue complaints were observed. This omission suggests the management review wasn’t capturing the most material quality concerns.
  • Uncontrolled formats: As noted above, management review documentation at some sites used uncontrolled PowerPoint presentations rather than controlled templates.

21 CFR 211.180(e) requires annual review of “quality standards of each drug product to determine the need for changes in drug product specifications or manufacturing or control procedures.”

APQR findings frequently appear in Warning Letters when reviewers can demonstrate that known quality issues weren’t addressed or that required elements were missing. Management review is the mechanism by which senior leadership demonstrates awareness of quality status—incomplete reviews suggest leadership isn’t engaged.

A few recommendations:

  • Treat APQR deadlines as compliance-critical: track them in dashboards, assign clear ownership, and escalate approaching deadlines to management.
  • Make sure your APQRs encompass a holistic review across all strengths and markets. Create templates that require cross-referencing between strengths.
  • Align review periods: all data sources feeding into an APQR should cover the same time period.
  • Hold at least one documented management review before any regulatory inspection using controlled templates that capture all required inputs and outputs.
  • Include complaint aging metrics in management review and require action plans when aging trends deteriorate.

6. Data integrity and lab practices showed some vulnerabilities

In about 30% of these audits, lab findings were related to procedural gaps rather than actual data manipulation, but procedural gaps create the conditions where data integrity problems can go undetected.

What we found:

  • No SOP for analytical method validation: At one site conducting method validation activities, there was no governing SOP defining how validation should be performed, what reports should contain, or what acceptance criteria applied. The individual validation reports reviewed were thorough and aligned with USP <1225> principles, but inconsistencies between reports suggested the lack of a unifying procedure.
  • Composite samples prepared before identification testing approved: At another site, composite samples were drawn before individual identification tests on all containers had been reviewed and approved. This sequence creates risk: if an identification failure is found after the composite has been prepared, the composite may be contaminated.
  • Missing signature logs: At one laboratory transitioning to GMP operations, there was no signature log—or alternative mechanism—to make sure that handwritten initials on data worksheets could be attributed to specific individuals. The recommendation was either to create a signature log or to require analysts to print, sign/initial, and date the first page of each worksheet, with initials only on subsequent pages.
  • Sample labeling gaps: At one site, labeling procedures didn’t distinguish between “unopened expiry” and “opened expiry” dates (a critical distinction for stability samples and reagents). Test Request Forms lacked fields for sample storage location and storage conditions.
  • Audit trail review gaps: At one facility, data integrity assessments had been conducted equipment-by-equipment rather than site-wide, creating potential gaps. One piece of production equipment with user-specific access, passwords, and an audit trail was not subject to periodic audit trail review because it was classified as “not a computer system”—even though it had the characteristics of a computerized system.
  • Clock accuracy issues: At one site, equipment was found to be 9 minutes slow. While the data integrity assessment was performed appropriately, the clock accuracy issue highlights the importance of verifying time synchronization across GxP systems (a minimal check is sketched after this list).
  • Data worksheets with watermark overlaps: At one laboratory using Veeva for document management, the system’s watermarks sometimes overlapped with data fields, making it difficult to read recorded values. While not a data integrity issue per se, this creates unnecessary ambiguity in records.
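
The 9-minutes-slow clock above is a good reminder that time synchronization should be verified, not assumed. For networked GxP hosts, here is a minimal sketch, assuming the third-party ntplib package is installed and a public NTP server is reachable; standalone instruments still need a manual check against a reference clock:

```python
import ntplib  # third-party package: pip install ntplib
from datetime import datetime, timezone

MAX_OFFSET_SECONDS = 60  # illustrative acceptance limit; define yours in a data integrity SOP

client = ntplib.NTPClient()
response = client.request("pool.ntp.org", version=3)
offset = response.offset  # seconds the local clock differs from NTP time

print(f"Checked {datetime.now(timezone.utc).isoformat()}: local clock offset {offset:+.1f} s")
if abs(offset) > MAX_OFFSET_SECONDS:
    print("Clock drift exceeds the limit - document a deviation and investigate synchronization.")
```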

We frequently see FDA citations for lab data integrity issues, including failure to review audit trails, inadequate investigation of anomalous results, and incomplete documentation of who performed testing.

The FDA’s 2018 guidance, Data Integrity and Compliance With Drug CGMP: Questions and Answers, is essential reading on this topic.

A few recommendations:

  • Make sure you have an SOP defining expectations for analytical method validation reports: required elements, acceptance criteria, report structure, and review/approval workflow.
  • Sequence sampling activities correctly: no composite sampling until individual container identification testing is complete and approved.
  • Implement signature logs or ensure that all laboratory analysts are identifiable from data worksheets—either through signature logs or by requiring full signatures on the first page of each worksheet.
  • Make sure your data integrity assessments cover any equipment with user-specific access, passwords, and audit trails, regardless of how the equipment is classified.
  • Enhance your labeling and sample management procedures to capture all required information, including storage location, storage conditions, and both unopened and opened expiry dates.

7. Complaint handling and QA oversight need to be tightened

We saw these problems in about 25% of our audits. Complaint handling procedures generally existed, but their execution revealed gaps in timeliness and completeness.

What we found:

  • Complaints not submitted within SOP timeframes: At one site, the SOP required complaints to be submitted to QA within one business day. One complaint was opened on September 12, 2024, but wasn’t submitted to QA until September 24—12 days later—with no documented justification for the delay.
  • Massive complaint backlogs: At one medical device facility, ~2,700 out of ~3,600 open complaints were older than 30 days from date of receipt, with no reply to the customer—a repeat finding from the previous year’s audit. This backlog is both an obvious regulatory and customer-relationship risk and one of the more common reasons teams engage us for staff augmentation.
  • Required form sections left blank: At one site, the QA Follow-Up Section of complaint forms was routinely not completed on any of the forms reviewed. A change control was subsequently opened during the audit to remove this section from the form as redundant—but the fact that it was required and routinely left blank represented a procedural compliance failure.
  • QA oversight errors not triggering CAPA: As noted earlier, when QA approved an investigation processed on the wrong form, the error was corrected but no CAPA was initiated to address why QA missed the error in the first place.
  • Complaint trending not reviewed: At one site, management review slides did not include metrics on complaint aging or overdue complaints, even though thousands of overdue complaints existed.

21 CFR 211.198 requires written procedures for handling written and oral complaints and requires that complaints be reviewed by the quality control unit. 21 CFR 820.198 (for devices) requires similar procedures.

FDA investigators often pull complaint logs and calculate aging, so a backlog of overdue complaints is one of the most visible indicators of a quality system under stress.

A few recommendations:

  • Enforce intake timelines through system alerts and escalation. If complaints aren’t logged within one business day, someone should be notified.
  • Lock complaint forms electronically so they cannot be closed without required sections being completed—or formally remove sections that are no longer required.
  • Track complaint aging as a KPI reported to management review. Establish acceptable thresholds and escalation procedures when thresholds are exceeded.
  • Treat QA oversight lapses—errors that QA should have caught but didn’t—as CAPA-worthy events requiring investigation and corrective action.

8. Batch record completeness issues persisted

Observed in approximately 25% of audits, these were often minor findings, but their persistence suggests that GDP training and batch record review need reinforcement.

What we found:

  • Checkpoint fields not completed: At one site, operators did not consistently check designated boxes to indicate whether samples were taken at Beginning of Fill (BOF), Middle of Fill (MOF), or End of Fill (EOF). The samples were taken correctly, but the documentation didn’t demonstrate the timing.
  • Timestamps missing: Sample withdrawal times were not documented in batch records, making it impossible to verify the sequence of sampling activities.
  • Reconciliation gaps: API quantities were not appropriately reconciled in warehouse records during dispensing and sampling activities.
  • Missing variable data on labels: At one site, a complaint investigation revealed that variable data (expiration date, lot number, protocol number) on pouch labels was supposed to be handwritten per the procedure in place at the time—but one pouch shipped without this information. The root cause was human error, but the corrective action was to revise procedures to require preprinted labels rather than handwritten variable data.
  • Split electronic batch records: At another site, electronic batch records were scattered across multiple systems and locations rather than being consolidated in a single reviewable package. This fragmentation makes batch review more difficult and increases the risk of missing something.

21 CFR 211.188 requires that batch production and control records include “documentation that each significant step in the manufacture, processing, packing, or holding of the batch was accomplished.” Incomplete batch records—even for minor fields—create an overall impression of documentation weakness that can color an investigator’s assessment of the entire operation.

A few recommendations:

  • Reinforce your GDP training with specific emphasis on “why” rather than just “what”—operators should understand that incomplete checkboxes create traceability gaps, not just procedure violations.
  • Implement batch record review checklists that verify all required fields are completed before the record advances to the next review stage.
  • Ensure material reconciliation procedures are followed and verified during QA review. Discrepancies should trigger investigation before batch release.
  • Eliminate handwritten variable data on labels where possible; preprinted labels with verification checks are more reliable.

9. Sampling procedures required enhancement

This was an issue in about 20% of the audits. Sampling findings often related to misalignment between what suppliers provided and what regulatory requirements demanded.

What we found:

  • Pre-selected samples from suppliers: At one site, API samples were pre-selected by suppliers and sent separately in loose quantities, rather than being drawn from all received containers per √n+1 or other statistically valid sampling rules (a quick worked example follows this list). This approach may be convenient, but it doesn’t meet 21 CFR 211.84 requirements for representative sampling.
  • Identification testing only on pre-supplied samples: At the same site, identification testing was performed only on the pre-supplied sample quantity, not on materials from all containers. This means containers in the shipment could potentially contain different materials without detection.
  • Sampling SOPs without statistical justification: Some sampling procedures specified sample sizes without documented risk-based justification or reference to applicable statistical principles.
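
For reference, the √n + 1 rule mentioned above simply means sampling one more container than the square root of the number of containers received, rounded up. A quick worked sketch (the rule itself, or any alternative plan, still needs documented justification in your sampling SOP):

```python
import math

def containers_to_sample(n_received: int) -> int:
    """Containers to sample under the common sqrt(n) + 1 rule (rounded up)."""
    return math.ceil(math.sqrt(n_received)) + 1

for n in (4, 30, 100):
    print(f"{n} containers received -> sample {containers_to_sample(n)}")
# 4 -> 3, 30 -> 7, 100 -> 11
```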

Remember that 21 CFR 211.84(b) requires that “representative samples of each shipment of each lot shall be collected for testing or examination.” The regulation explicitly permits reduced testing schemes based on appropriate criteria, but those criteria must be documented.

Accepting supplier-selected samples without performing your own sampling creates a compliance gap that FDA investigators have specifically targeted.

A few recommendations:

  • Make sure your sampling procedures follow regulatory requirements (21 CFR 211.84) for representative sampling from all containers.
  • If reduced sampling is used, document the risk-based justification including statistical rationale, supplier history, and applicable regulatory provisions.
  • Require identification testing on samples drawn from containers—not on samples pre-provided by suppliers.

10. Deviation and CAPA timeline management needs better definition

This finding relates to the gap between internal timelines and external obligations—something we found in about 20% of these audits.

What we found:

  • Internal closure timelines without client notification timelines: At multiple CDMO sites, deviation management SOPs specified that deviations should be closed within 30 calendar days—but did not define timeframes for delivering final investigation reports including CAPA plans to impacted clients. This means a CDMO could close a deviation internally while the client waits indefinitely for the investigation report they need for their own regulatory submissions.
  • Quality agreement requirements not reflected in SOPs: Quality agreements often contain specific notification requirements (e.g., “notify within 2 business days of any deviation potentially impacting product SISPQ”). When these requirements aren’t mirrored in internal SOPs, the risk of inadvertent non-compliance increases.
  • No escalation paths for chronically delayed deviations: Some procedures specified closure timelines but didn’t define what happens when those timelines are repeatedly missed—no escalation to management, no resource reallocation triggers, no root cause analysis of the delay pattern itself.

While the FDA doesn’t specify exact timeframes for deviation closure, the expectation is that investigations are “timely” relative to the nature of the deviation. More importantly, contractual obligations under quality agreements create legal exposure that extends beyond FDA enforcement.

A few recommendations:

  • Define timeframes in SOPs for delivering final deviation investigation reports and CAPA plans to impacted clients—not just for internal closure.
  • Align internal procedures with your quality agreement requirements. Conduct periodic audits to verify this alignment.
  • Make sure you have escalation procedures for chronically delayed deviations, including management notification and resource assessment.

A quick recap (and what it means for you)

Across this dataset, the same problem areas surfaced repeatedly: documentation and change controls, investigation timeliness, equipment qualification, supplier oversight, annual product reviews, data integrity in laboratories, complaint handling, and batch record completeness. These aren’t new issues by any means. They’re the exact areas inspectors target first because they’re visible, auditable, and often indicative of broader quality system health.

The encouraging finding is that most facilities were generally in control. We recorded no critical findings. The concerning finding is that the gaps we did observe were remarkably consistent across geographies, company sizes, and regulatory frameworks. Companies that think they’re different often aren’t.

If you can’t produce:

  • Closed change controls with documented extensions (requested before deadlines)
  • Complete training records linked to role-based curricula
  • Timely investigations with real-time OOS recognition
  • Qualified utilities and equipment with current calibration
  • Current vendor files with signed quality agreements specifying applicable standards
  • On-time APRs covering all strengths and markets
  • Reliable laboratory data with clear audit trail practices
  • Complaint records showing timely intake and complete QA follow-up
  • Complete batch records with all checkboxes and timestamps

...then you’ve got vulnerabilities worth addressing now, before your next inspection.

A few questions to pressure-test your systems

Looking across these findings, here are 10 questions every firm should be able to answer with documentary evidence at hand. If the answer is “not right now,” you may have a vulnerability worth closing before your next inspection.

  1. Can you show a closed change control package that includes objective evidence of implementation—not just a description—and can you demonstrate that any extensions were requested and approved before the original deadline?
  2. If an investigation or change control goes past its due date, do you have a real-time extension approval with scientific or operational justification on file, or would you be left explaining a retroactive request?
  3. Does your training curriculum map directly to each individual’s role, and can you show timely completion records for every employee, including annual GMP refreshers and job-specific qualifications?
  4. For your last three OOS investigations, can you demonstrate that they were opened within one business day of the analyst recognizing the OOS result—not when the data was discovered during subsequent review?
  5. When you check your utilities and equipment logs, can staff explain exactly how they verify calibration status—and is that method written into an SOP that anyone can follow?
  6. Can you produce a controlled Approved Supplier List with current QA approvals, and can you show that every supplier in active use is covered by a signed quality agreement that specifies applicable regulatory standards (GMP, GDP, IPEC, etc.)?
  7. Are your Annual Product Quality Reviews up to date, covering all strengths and markets for each product, and do your management review minutes explicitly cover complaint aging, CAPA effectiveness, and audit trends?
  8. Do your laboratory procedures include an SOP for analytical method validation that defines report structure and acceptance criteria, and can you demonstrate that composite samples are only prepared after individual container identification testing is complete and approved?
  9. If we pulled a random batch record today, would every required checkpoint—including BOF/MOF/EOF sample verifications, sample withdrawal times, and material reconciliation—be completed, initialed, and dated?
  10. Can you demonstrate that every complaint in the past six months was logged within one business day, includes documented QA follow-up, and is reflected in your management review metrics?



Need an auditing, mock inspection, or remediation partner? Let’s talk.

If these findings sound uncomfortably familiar, you’re not alone. Even well-run organizations stumble on documentation, training, investigations, or oversight when the pressure of daily operations takes precedence. The difference between a clean inspection and a Form 483 often comes down to whether you’ve pressure-tested these systems before regulators do.

That’s where we come in. We connect you with former FDA investigators and seasoned industry experts who specialize in:

  • Auditing — Supplier qualifications, internal audits, and global GxP audits tailored to your operations.
  • Mock inspections — Pre-approval and surveillance-style inspections that reveal vulnerabilities before the agency does.
  • Remediation support — Targeted consulting and staff augmentation to close gaps, draft SOPs, retrain teams, and strengthen your QMS.

Whether you need a single subject matter expert, a full audit team, or on-demand support to execute corrective actions, we can match you with the right expertise quickly.

Let’s talk about how we can help you identify and close compliance gaps—so your next inspection outcome is never left to chance. Drop us a line to start the conversation.