FDA 483 Observations on Data Integrity: The 7 Most Common

The most frequent FDA 483 findings on data integrity and cross-system gaps—what went wrong, what good looks like, and why the same observations keep recurring.

pharmaceutical compliance · FDA 483 · data integrity · ALCOA+ · FDA inspection · warning letter

FDA Form 483 observations are not randomly distributed. The same data integrity findings appear across inspection after inspection, company after company, year after year. Understanding the pattern is the first step to getting out of it.

FDA publishes 483 observations and warning letters publicly. Analysis of data integrity findings from the last five years reveals seven observations that account for a disproportionate share of citations. Each one reflects a systemic gap, not an isolated mistake. Each one is preventable.


Background: What a 483 observation means

A Form 483, “Inspectional Observations,” is a list of conditions or practices observed during an FDA inspection that the investigator believes may constitute violations of the Federal Food, Drug, and Cosmetic Act or related regulations. A 483 is not a final agency determination; it is a formal notification that requires a written response.

A warning letter is the escalation. Warning letters are issued when a company’s 483 response is inadequate or when the violations are serious enough that FDA needs to compel corrective action. For data integrity findings, warning letters often include language requiring third-party audits, data integrity remediation programs, and in some cases, import alerts.

The seven findings below are drawn from publicly available 483 observations and warning letters, analyzed for frequency and root cause.


Finding 1: Audit trails not enabled or not reviewed

What was cited: Systems that create, modify, or delete GxP records had audit trail functionality disabled, partially disabled, or enabled but never reviewed as part of the quality review process.

What went wrong: Audit trail enablement was treated as a one-time configuration task during validation. No one owned ongoing verification that audit trails remained enabled after system updates, migrations, or user administration changes. Separately, even where audit trails were active, quality reviewers did not review them as part of batch or deviation review workflows—the audit trail existed but served no operational function.

What good looks like: Audit trail status is verified on a defined schedule (typically quarterly) and documented. Quality reviews include a step requiring the reviewer to access and evaluate the audit trail for the records under review. Audit trail review is documented in the review record itself.
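
As a minimal sketch of what a scheduled verification could look like, the snippet below records each system's audit trail status as a dated, attributable check and treats any disabled trail as an error to escalate. The system names and status inputs are illustrative assumptions; in practice each flag would come from the system's own administration console or configuration export.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AuditTrailCheck:
    system: str
    enabled: bool
    checked_on: date
    checked_by: str

def verify_audit_trails(status_by_system: dict[str, bool], reviewer: str) -> list[AuditTrailCheck]:
    """Record each GxP system's audit trail status as a dated, attributable check."""
    checks = [
        AuditTrailCheck(name, enabled, date.today(), reviewer)
        for name, enabled in status_by_system.items()
    ]
    disabled = [c.system for c in checks if not c.enabled]
    if disabled:
        # A disabled audit trail is a deviation, not a note for next quarter.
        raise RuntimeError(f"Audit trail disabled on: {', '.join(disabled)}")
    return checks

# Illustrative input -- in practice, pulled from each system's admin interface.
quarterly_record = verify_audit_trails({"LIMS": True, "CDS": True, "eQMS": True}, reviewer="j.doe")
```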

Regulatory basis: 21 CFR Part 11.10(e); FDA Data Integrity Guidance (2018), Section IV.


Finding 2: Shared login credentials

What was cited: Multiple personnel used a single shared username and password to access systems generating GxP records. In some cases, a department had one login used by everyone in the group.

What went wrong: Shared credentials are fundamentally incompatible with 21 CFR Part 11, which requires that electronic records be attributable to the individual who created them. When five analysts share one login, an audit trail entry that reads “modified by labuser01” cannot be attributed to any specific person. The ALCOA+ requirement for “Attributable” cannot be met.

Shared credentials persist because individual account management is administratively burdensome, particularly in legacy LIMS environments. System administrators take shortcuts. The shortcuts become practice.

What good looks like: Every person who accesses any GxP system has a unique, individual user account. Account creation and termination are linked to HR processes. No shared accounts exist. Account access reviews occur on a defined schedule.
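
A periodic access review can also catch the telltale signature of credential sharing: one account active on two workstations at the same time. The sketch below is a hypothetical illustration; the event schema and workstation names are assumptions, not any particular system's log format.

```python
from collections import defaultdict

# Each event: (username, workstation, login_epoch, logout_epoch). Illustrative schema.
events = [
    ("labuser01", "WS-104", 0, 3600),
    ("labuser01", "WS-212", 1200, 4000),  # overlaps the session above
    ("a.smith", "WS-104", 5000, 7000),
]

def concurrent_sessions(events):
    """Flag accounts with overlapping sessions on different workstations,
    a strong signal that a credential is being shared."""
    by_user = defaultdict(list)
    for user, workstation, start, end in events:
        by_user[user].append((workstation, start, end))
    flagged = set()
    for user, sessions in by_user.items():
        for i, (ws_a, start_a, end_a) in enumerate(sessions):
            for ws_b, start_b, end_b in sessions[i + 1:]:
                if ws_a != ws_b and start_a < end_b and start_b < end_a:
                    flagged.add(user)
    return flagged

print(concurrent_sessions(events))  # {'labuser01'}
```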

Regulatory basis: 21 CFR Part 11.300; 21 CFR Part 11.10(d); ALCOA+ Attributable.


Finding 3: Data deleted or altered without audit trail entries

What was cited: Original data was deleted, overwritten, or modified without the change being captured in the system’s audit trail. In some instances, investigators found deleted analytical runs, overwritten chromatography data, or batch record entries modified after the fact with no record of the original value.

What went wrong: In validated systems, this finding typically indicates that audit trail controls were circumvented—either through direct database access, use of backup/restore operations that bypassed the application layer, or use of system administrator accounts that had been configured to exclude audit trail logging. In less sophisticated environments, it indicates that paper-based thinking was applied to electronic records: analysts treated electronic data the same way they would cross out and initial a paper record, not understanding that an electronic change requires a different control.

What good looks like: No pathway exists to modify GxP records without creating an audit trail entry. Database access is restricted such that application-layer controls cannot be circumvented. System administrator accounts are subject to the same audit trail requirements as all other users. Deletion of any record requires documented justification and does not remove the record from the system—it changes the record’s status.
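
One way to make "deletion changes status" concrete is to give the record no other pathway. The hypothetical record class below retires rather than removes: a justification is mandatory, the status changes, and an audit trail entry is appended.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    action: str
    performed_by: str
    at: datetime
    reason: str

@dataclass
class GxpRecord:
    record_id: str
    value: str
    status: str = "active"
    audit_trail: list[AuditEntry] = field(default_factory=list)

    def retire(self, user: str, reason: str) -> None:
        """'Deleting' a record changes its status and appends to the trail;
        nothing here removes data or bypasses logging."""
        if not reason:
            raise ValueError("Documented justification is required to retire a record")
        self.status = "retired"
        self.audit_trail.append(AuditEntry("retire", user, datetime.now(timezone.utc), reason))

record = GxpRecord("CHR-0042", "12.7 mg/mL")
record.retire("j.doe", "Duplicate injection; superseded by CHR-0043")
```

The point is what the class lacks: there is no method that erases data or writes without logging.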

Regulatory basis: 21 CFR Part 11.10(e); 21 CFR 211.68(b).


Finding 4: Incomplete or missing data from failed analytical runs

What was cited: Failing or anomalous analytical results were not retained. Systems showed evidence of test runs that did not appear in the final data set: chromatography injections that were not reported, balance readings that were discarded, instrument runs that were aborted without documentation. Only the passing results were reported.

What went wrong: Pressure to release product translated into informal practices of “cleaning up” data sets before formal review. Analysts discarded out-of-specification results, re-ran analyses without documenting the original results as out-of-specification events, or configured instruments to suppress anomalous readings automatically. In all cases, the practice violated the requirement that all original data be retained and that out-of-specification results trigger formal investigation.

What good looks like: Every analytical run, including failures, aborted runs, and invalid runs, is retained in the original electronic record. Out-of-specification results trigger a documented investigation process before any conclusion is drawn. Instrument configuration settings are part of the validation record and cannot be changed without change control.
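
As an illustration of "retain everything, investigate before concluding," the sketch below treats each run as an immutable record and blocks any disposition while out-of-specification or undocumented aborted runs exist. The specification limit and status values are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: a captured run result is never edited in place
class AnalyticalRun:
    run_id: str
    result: float
    status: str  # "complete", "aborted", or "invalid" -- all retained

SPEC_MAX = 0.50  # hypothetical specification limit

def disposition(runs):
    """Every run stays in the record; an out-of-specification result blocks
    any conclusion until a documented investigation is complete."""
    oos = [r.run_id for r in runs if r.status == "complete" and r.result > SPEC_MAX]
    if oos:
        return f"OOS investigation required for: {oos}"
    if any(r.status != "complete" for r in runs):
        return "Aborted or invalid runs present; document justification before release"
    return "All results within specification"

runs = [
    AnalyticalRun("R1", 0.48, "complete"),
    AnalyticalRun("R2", 0.57, "complete"),  # out of specification -- retained, not discarded
]
print(disposition(runs))
```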

Regulatory basis: 21 CFR 211.194; 21 CFR Part 11.10(e); OOS Guidance (2006).


Finding 5: Electronic signatures not linked to meaning

What was cited: Electronic signatures did not indicate the meaning of the signature—what the signer was approving or attesting to. In some cases, an “approved” signature was used for both review and final approval, making it impossible to determine which quality decision the signature represented.

What went wrong: Electronic signature implementations were set up to capture that a record was signed, but not what the signature meant. 21 CFR Part 11.50 is explicit: signed electronic records must clearly indicate the printed name of the signer, the date and time when the signature was executed, and the meaning (such as review, approval, responsibility, or authorship) associated with the signature.

What good looks like: Each signature event in the system is associated with a defined meaning that is displayed to the signer at the time of signing. The meaning is captured in the audit trail. Different workflow steps use signatures with distinct, defined meanings. Signers cannot proceed without acknowledging the meaning of what they are signing.
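
A signature object that cannot be constructed without a meaning makes the requirement structural rather than procedural. In the hypothetical sketch below, the meaning labels are examples drawn from the regulation's own list; the record identifier is invented.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class SignatureMeaning(Enum):
    AUTHORSHIP = "Authored by"
    REVIEW = "Reviewed by"
    APPROVAL = "Approved by"
    RESPONSIBILITY = "Responsible for"

@dataclass(frozen=True)
class ElectronicSignature:
    signer_name: str
    signed_at: datetime
    meaning: SignatureMeaning  # required: a signature cannot be created without one
    record_id: str             # ties the signature to its record

def sign(record_id: str, signer_name: str, meaning: SignatureMeaning) -> ElectronicSignature:
    """Capture name, timestamp, and meaning together at the moment of signing."""
    return ElectronicSignature(signer_name, datetime.now(timezone.utc), meaning, record_id)

signature = sign("BR-2024-0117", "J. Doe", SignatureMeaning.APPROVAL)
```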

Regulatory basis: 21 CFR Part 11.50; 21 CFR Part 11.70.


Finding 6: System access not removed upon personnel departure

What was cited: Former employees retained active system access weeks or months after their departure. In several cited cases, access logs showed logins under former employees' credentials after their documented last day of employment.

What went wrong: IT off-boarding and HR processes were decoupled from quality system administration. When an employee left, their HR record was updated but the system access deprovisioning was manual, slow, or missed entirely. In some organizations, access was deprovisioned from some systems but not others—a former employee’s Veeva access was removed but their LIMS credentials remained active.

What good looks like: Personnel departure triggers an automated or procedure-controlled access deprovisioning workflow that covers all GxP systems simultaneously. Access deprovisioning is completed before or on the last day of employment. A periodic access review (at minimum annually) identifies and removes any residual access. Deprovisioning is documented.
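
The sketch below shows one shape such a workflow could take: a single off-boarding call that attempts deprovisioning on every registered system and documents the result for each, so a partial off-boarding is a visible failure rather than a silent one. The `disable_account` hook is a placeholder, not a real API.

```python
from datetime import date

def disable_account(system: str, user_id: str) -> None:
    """Placeholder for each system's real admin API call; assumed to raise on failure."""

def offboard(user_id: str, last_day: date, systems: list[str]) -> dict[str, str]:
    """Attempt deprovisioning on every registered system and record the outcome
    per system -- no system is skipped silently."""
    outcome = {}
    for system in systems:
        try:
            disable_account(system, user_id)
            outcome[system] = f"disabled effective {last_day.isoformat()}"
        except Exception as exc:
            outcome[system] = f"FAILED ({exc}); escalate before the last day"
    return outcome

print(offboard("jdoe", date(2025, 3, 31), ["Veeva", "SAP", "TrackWise", "LIMS"]))
```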

Regulatory basis: 21 CFR Part 11.10(d); 21 CFR Part 11.300(d).


Finding 7: Metadata not preserved or not accessible

What was cited: Raw electronic data files were stored in formats or locations that made metadata inaccessible. In some cases, data had been exported from its original instrument software, converted to a different format, and stored without the associated metadata—sample weights, instrument settings, run parameters—that formed part of the original complete record.

What went wrong: The original instrument software generated a proprietary file format containing both the result and the associated metadata. When data was transferred to a document management system, only a static export (PDF or printed report) was retained. The dynamic original data file—the file that would allow an investigator to independently verify the result—was not preserved or was not retrievable.

What good looks like: Original data is retained in a format that preserves dynamic data elements. The format is validated. Metadata is captured as part of the data record, not as a separate document. If data migration occurs, the migration is validated and the integrity of metadata transfer is verified and documented.
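
Checksum comparison is one common way to demonstrate that a migration preserved the complete record. The hypothetical sketch below hashes the result together with its metadata before and after a stand-in migration step; the field names are illustrative.

```python
import hashlib
import json

def record_checksum(payload: dict) -> str:
    """Stable hash of a result plus its metadata, for migration verification."""
    canonical = json.dumps(payload, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

# Illustrative complete record: the result and the metadata that qualifies it.
source = {
    "result": 0.42,
    "sample_weight_mg": 101.3,
    "instrument": "HPLC-07",
    "run_parameters": {"flow_mL_min": 1.0, "column_temp_C": 30},
}

migrated = json.loads(json.dumps(source))  # stand-in for the actual migration step

# Verification: the migrated record, metadata included, must hash identically.
assert record_checksum(source) == record_checksum(migrated), "Metadata lost or altered in migration"
```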

Regulatory basis: 21 CFR Part 11.10(b); FDA Data Integrity Guidance (2018), Section III definition of “complete data.”


The pattern behind the findings

These seven findings share a structural characteristic. They are not caused by malicious intent or by absence of a written SOP. They are caused by gaps between documented procedures and operational reality—usually gaps that developed gradually through informal workarounds, under-resourced administration, or systems that were validated once and then treated as static.

The companies that receive these 483 observations typically have quality systems that look compliant on paper. The finding is that the paper does not reflect the practice.

Closing that gap requires ongoing verification, not one-time validation. The question is not “did we configure this correctly when we went live?” The question is “is this still working correctly today, and can we demonstrate it?”



See BioWise in action

Ask a question that spans your entire compliance stack and get an answer in seconds. BioWise connects Veeva, SAP, TrackWise, and your LIMS — without replacing any of them.