Comprehensive Audit Checklist for the Product Development Department

These development-auditor checklists provide a structured, phase-appropriate audit framework for Product Development covering Formulation Development, Analytical Development, and Development QA. They focus on end-to-end traceability from QTPP/CQA/CPP and risk assessments through lab batch records, method development/validation/transfer, stability studies, and tech transfer readiness. The questions are designed to expose common hidden gaps such as weak change control, incomplete documentation, inadequate data integrity/audit trail review, uncontrolled retesting, and insufficient controls for sterile and potent (women's hormone) development work.


Auditor 1 — Formulation Development (FD) — 50 Points

1) Is there a defined project initiation + governance?

1.1 Is there a project charter with scope (tablet/capsule/eye drops/injection/hormone)?
1.2 Roles/responsibilities (FD/AD/DQA/RA/Production) defined?
1.3 Milestones and decision gates documented (prototype, scale-up, TT)?
1.4 Meeting minutes/action tracker maintained?

2) Is QTPP (Quality Target Product Profile) defined and controlled?

2.1 QTPP includes dosage form, strength, route, container, shelf-life target?
2.2 Patient/safety needs addressed (sterile attributes, hormone potency risks)?
2.3 QTPP revision control exists (who can change and why)?
2.4 QTPP linked to CQA/CPP selection?

3) Are CQA (Critical Quality Attributes) identified and justified?

3.1 CQAs listed for each product type (e.g., dissolution for tablets; sterility for injections)?
3.2 Justification documented (risk assessment / prior knowledge)?
3.3 CQAs linked to test methods and acceptance criteria?
3.4 CQA list updated after learning (new impurities, stability issues)?

4) Is risk management (ICH Q9 / FMEA) used properly?

4.1 Risk assessment done early (materials/process/packaging)?
4.2 Risk scoring logic documented (severity/occurrence/detectability)?
4.3 Risk controls assigned (mitigation plan + owners)?
4.4 Risk review done after failures/deviations?
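When checking 4.2, the auditor can ask to see the scoring logic reproduced as a calculation. A minimal FMEA sketch is below; the 1–5 scales, the RPN threshold of 27, and the failure-mode name are illustrative assumptions, not site-specific values.

```python
# Minimal FMEA risk-scoring sketch (illustrative only: the 1-5 scales,
# the threshold, and the failure mode are assumptions, not site values).
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int       # 1 (negligible) .. 5 (patient harm)
    occurrence: int     # 1 (rare) .. 5 (frequent)
    detectability: int  # 1 (always detected) .. 5 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk Priority Number = Severity x Occurrence x Detectability
        return self.severity * self.occurrence * self.detectability

def needs_mitigation(fm: FailureMode, threshold: int = 27) -> bool:
    # Example rule: mitigate when RPN exceeds threshold or severity is maximal
    return fm.rpn > threshold or fm.severity == 5

fm = FailureMode("API segregation during blending", severity=4, occurrence=3, detectability=3)
print(fm.rpn, needs_mitigation(fm))  # 36 True
```

The audit point is not the arithmetic but whether the scale definitions and the mitigation threshold are written down and applied consistently.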

5) Is API characterization adequate for development?

5.1 API polymorph/PSD/solubility/hygroscopicity data available?
5.2 API variability (supplier/lots) assessed for impact on formulation?
5.3 API storage/handling requirements defined (light/moisture/temp)?
5.4 Potent/hormone API special handling documented?

6) Are excipient selection & justification documented?

6.1 Excipient function and grade justified (compendial/DMF status)?
6.2 Compatibility screening done (binary mixes, stress storage)?
6.3 Supplier variability risk assessed (different grades/vendors)?
6.4 Excipients for sterile products meet sterile-grade requirements where needed?

7) Is compatibility study design scientifically sound?

7.1 Conditions (temp/RH/light) justified and recorded?
7.2 Timepoints planned and met?
7.3 Acceptance criteria defined (impurity increase, appearance, pH shift)?
7.4 Conclusions supported by data (not assumptions)?

8) Are prototype formulations controlled and traceable?

8.1 Each prototype has unique code/version and change history?
8.2 Lab batch record exists for each prototype?
8.3 Raw material lots used are traceable?
8.4 Samples retained for reference/comparisons?

9) Are lab batch records complete (GDP compliant)?

9.1 Weights, equipment IDs, timings, steps recorded contemporaneously?
9.2 Deviations from procedure recorded with reason and impact?
9.3 Yield calculations and reconciliation recorded?
9.4 Review/approval of lab records defined (supervisor/DQA)?

10) Is development equipment suitable and maintained?

10.1 Equipment list (mixer, homogenizer, granulator, etc.) controlled?
10.2 Calibration/verification status (balances, thermometers) current?
10.3 Cleaning records maintained (especially for hormone/potent)?
10.4 Equipment use log supports traceability to batches?

11) Are weighing/dispensing controls adequate in FD labs?

11.1 Material labels include name/code, lot, status, expiry/retest?
11.2 Dispensing area controls mix-ups (one material at a time)?
11.3 Use of controlled balances/verified weights?
11.4 Leftover material return/disposal controlled?

12) Is cross-contamination prevention effective in FD labs?

12.1 Segregation between hormone/potent and non-potent work?
12.2 Dedicated tools/consumables for hormone products?
12.3 Cleaning verification approach defined (visual/swab where needed)?
12.4 Waste segregation and disposal documented?

13) For Women's Hormone/potent products, is containment adequate?

13.1 HBEL/PDE awareness translated into lab controls?
13.2 Containment equipment used (downflow booth, negative pressure)?
13.3 PPE requirements defined and followed (double gloves, respirator if required)?
13.4 Spill response and decontamination procedure available?

14) Are process parameters captured during development?

14.1 Mixing speeds/times/temperatures documented?
14.2 Order of addition controlled and justified?
14.3 Hold times documented and assessed?
14.4 Critical steps identified (sieving, filtration, pH adjustment)?

15) Is DoE (Design of Experiments) used appropriately (if used)?

15.1 DoE plan defines factors/responses/ranges and rationale?
15.2 Randomization/replicates included where needed?
15.3 Data analysis documented (model fit, residuals)?
15.4 Conclusions translated into control strategy?
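For 15.1, the run plan should follow mechanically from the declared factors and levels. A two-level full-factorial sketch is shown below; the factor names and ranges are hypothetical examples, not a recommended design.

```python
# Sketch of a 2-level full-factorial DoE run plan (factor names and
# ranges are hypothetical examples, not a recommended design).
from itertools import product

factors = {
    "binder_pct":   (1.0, 3.0),   # (low, high)
    "mix_time_min": (5, 15),
    "lub_pct":      (0.5, 1.0),
}

# Every combination of low/high levels: 2^k runs for k factors
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # 2^3 = 8 runs
```

An auditor can compare the executed batch list against such a generated plan to confirm no runs were silently dropped.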

16) Are CPP (Critical Process Parameters) identified and linked?

16.1 CPPs mapped to CQAs (e.g., granulation endpoint → dissolution)?
16.2 CPP ranges justified (prior knowledge/DoE)?
16.3 Monitoring methods defined (in-process tests)?
16.4 CPP changes controlled via change control?

17) Is scale-up strategy defined from lab to pilot?

17.1 Scale-up rationale documented (geometric similarity, mixing energy)?
17.2 Pilot batch plans exist (equipment mapping)?
17.3 Differences between lab and pilot steps identified and controlled?
17.4 Risks at scale noted and mitigated?

18) Is technology transfer (TT) readiness planned early?

18.1 TT checklist exists (process, materials, specs, methods)?
18.2 Critical knowledge captured (what failed, what worked)?
18.3 Process instructions clear enough for Manufacturing?
18.4 TT package review/approval roles defined?

19) For tablets/capsules: is dissolution performance addressed in FD decisions?

19.1 Formulation choices linked to dissolution goals?
19.2 Disintegration vs dissolution relationship evaluated?
19.3 Lubricant level/PSD impact studied?
19.4 Robustness to process variation assessed?

20) For tablets/capsules: is blend uniformity / content uniformity risk addressed?

20.1 Mixing strategy and sampling plan defined?
20.2 Segregation risk evaluated (PSD/density differences)?
20.3 Low-dose/hormone products have enhanced controls?
20.4 Acceptance criteria defined for development stage?
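Where 20.4 relies on the compendial approach, the stage-1 acceptance value from USP <905> (n = 10, k = 2.4) can be sketched as below; the ten results are made-up example values in % of label claim, and development-stage criteria may legitimately differ.

```python
# Sketch of the USP <905> acceptance value (AV) for content uniformity,
# stage 1 (n=10, k=2.4). Data are made-up example results (% of claim).
import statistics

def acceptance_value(results, k=2.4, lo=98.5, hi=101.5):
    xbar = statistics.mean(results)
    s = statistics.stdev(results)
    m = min(max(xbar, lo), hi)   # reference value M clamped to [98.5, 101.5]
    return abs(m - xbar) + k * s

data = [99.2, 100.4, 98.8, 101.0, 99.6, 100.1, 98.9, 100.7, 99.4, 100.3]
av = acceptance_value(data)
print(round(av, 2))  # passes stage 1 if AV <= L1 = 15.0
```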

21) For Eye Drops: are pH/osmolality/viscosity targets defined?

21.1 Targets justified for comfort/stability/compatibility?
21.2 Buffer selection and concentration rationale documented?
21.3 Viscosity agent selection justified and controlled?
21.4 In-use performance considerations addressed?

22) For Eye Drops: is drop size/drop rate controlled?

22.1 Dropper/nozzle selection rationale documented?
22.2 Drop weight/volume tested and recorded?
22.3 Container closure compatibility verified?
22.4 Variation across component lots evaluated?

23) For Eye Drops: is preservative selection justified (if multi-dose)?

23.1 Preservative type and level justified?
23.2 Preservative compatibility with formulation and container assessed?
23.3 PET (Preservative Efficacy Test) plan exists (as applicable)?
23.4 Neutralization strategy defined for microbiological tests?

24) For injections: is sterilization strategy defined?

24.1 Terminal sterilization vs sterile filtration rationale documented?
24.2 If sterile filtration: filter selection (0.22 µm) justification?
24.3 Filter integrity test requirements defined (pre/post)?
24.4 Bioburden/hold times assessed?

25) For sterile products: is container closure selection justified?

25.1 Vial/stopper/seal compatibility studied?
25.2 Extractables/leachables risk assessed at dev stage?
25.3 CCIT strategy considered (even if later validation)?
25.4 Component lot traceability maintained?

26) Are in-process tests defined for development batches?

26.1 Which checks are done (pH, viscosity, assay quick checks)?
26.2 Criteria defined (even if wider early-stage)?
26.3 Out-of-range handling documented (rework rules)?
26.4 Results recorded and reviewed?

27) Are rework/reprocess rules defined in development?

27.1 What adjustments are allowed (pH adjust, remix, refilter)?
27.2 Who approves adjustments and documents rationale?
27.3 Limits on number of reworks to avoid “testing into compliance”?
27.4 Impact on stability/quality assessed?

28) Is development stability program set up properly?

28.1 Protocol defines conditions (ICH), pull points, packaging?
28.2 Samples representative (final/closest-to-final pack)?
28.3 Excursions handled with impact assessment?
28.4 Stability data trends reviewed and actions taken?

29) Is in-use stability considered for Eye Drops (if applicable)?

29.1 In-use period target defined and justified?
29.2 Micro risk controls assessed (preservatives/packaging)?
29.3 Study design includes opening/handling simulation?
29.4 Acceptance criteria defined and reviewed?

30) Is photostability considered when relevant?

30.1 Risk assessed (light-sensitive APIs/excipients)?
30.2 Study design and packaging protection evaluated?
30.3 Labelling/storage instruction impact assessed?
30.4 Results drive packaging choice?

31) Are packaging compatibility studies done early enough?

31.1 Interaction with plastics (adsorption, leaching) assessed for liquids?
31.2 Foil/film moisture barrier needs evaluated for tablets?
31.3 Label/ink interactions considered (if relevant)?
31.4 Conclusions documented with evidence?

32) Are hold time studies considered (bulk/solution)?

32.1 Hold times defined for bulk blend/granules/solutions?
32.2 Conditions during hold controlled and recorded?
32.3 Micro risks considered for aqueous solutions?
32.4 Hold time exceedance triggers a deviation?

33) Is documentation of learning/knowledge management strong?

33.1 Development reports summarize experiments and decisions?
33.2 Failed trials captured (not hidden) with lessons learned?
33.3 Decision rationale traceable (why formula changed)?
33.4 Reports reviewed/approved per SOP?

34) Are outsourced development activities controlled (CRO/CMO)?

34.1 Vendor qualification and quality agreement in place?
34.2 Defined scope and data ownership?
34.3 Raw data availability and review process?
34.4 Sample chain of custody controlled?

35) Are samples managed properly in development?

35.1 Sample inventory log exists (what/where/qty)?
35.2 Sample labeling prevents mix-ups (project/batch/version)?
35.3 Storage conditions controlled (2–8°C/light protection)?
35.4 Sample disposal/retention rules defined?

36) Are deviations recorded for development activities?

36.1 Clear triggers for deviation (missed step, wrong parameter, excursion)?
36.2 Deviations include impact assessment and actions?
36.3 Overdue deviations tracked and escalated?
36.4 Recurrence prevention (CAPA) documented?

37) Are CAPA created when needed (not only “note and move on”)?

37.1 Root cause analysis used (5-Why/fishbone)?
37.2 Actions assigned with owners and due dates?
37.3 Effectiveness check defined (evidence of improvement)?
37.4 CAPA closure approved by DQA?

38) Is change control applied to formulation/process changes?

38.1 Changes recorded with reason and risk assessment?
38.2 Change approval required before execution?
38.3 Impact on specs/methods/stability assessed?
38.4 Change history traceable across versions?

39) Is training/competency maintained for FD staff?

39.1 Training matrix for equipment/processes exists?
39.2 OJT/qualification before independent work?
39.3 Refresher training schedule?
39.4 Training effectiveness checked (errors/trends)?

40) Are computerized systems/ELN controlled (if used)?

40.1 User access controls (unique logins)?
40.2 Audit trail enabled and reviewed?
40.3 Data backup/archival available?
40.4 Template/version control for electronic records?

41) Are raw materials for development controlled like GMP where required?

41.1 Status labels and expiry/retest controlled?
41.2 Approved suppliers preferred and documented?
41.3 Small-lot dispensing traceability?
41.4 Storage conditions monitored?

42) Are sterile development clean practices followed (where applicable)?

42.1 Clean area behavior and cleaning logs maintained?
42.2 Bioburden controls for solutions established?
42.3 Filtration handling prevents contamination?
42.4 Micro interface defined (sampling, testing, release gates)?

43) Is formulation selection decision documented (why final formula chosen)?

43.1 Criteria include CQAs, manufacturability, stability, cost?
43.2 Comparative data tables available?
43.3 Risk assessment updated with final choice?
43.4 Sign-off by cross-functional team?

44) Are development specifications defined and versioned?

44.1 Interim specs exist for prototypes (stage appropriate)?
44.2 Specs link to analytical methods?
44.3 Change control for spec updates?
44.4 Transition to commercial spec plan exists?

45) Is cleaning and lab housekeeping adequate in FD areas?

45.1 Cleaning schedules and logs maintained?
45.2 Potent/hormone cleaning controls stricter and documented?
45.3 Material segregation and “one at a time” practice?
45.4 Waste bins labeled and removed on schedule?

46) Is data integrity (ALCOA+) maintained in lab notebooks?

46.1 Contemporaneous entries (no rewriting later)?
46.2 Corrections GDP compliant (single line, date, sign, reason)?
46.3 No loose papers without attachment control?
46.4 Supervisor review frequency and evidence?

47) Are project deliverables archived and retrievable?

47.1 Final reports stored in controlled repository?
47.2 Version history retained?
47.3 Retrieval demonstrated quickly during audit?
47.4 Retention periods defined?

48) Is there control for near-miss in development (mix-up, wrong version)?

48.1 Near-miss log maintained?
48.2 Root cause and actions documented?
48.3 Trending of repeated near-misses?
48.4 Training/SOP updated from lessons learned?

49) Are safety/EHS requirements integrated (especially hormone/potent)?

49.1 Hazard assessments available?
49.2 Exposure controls/PPE training documented?
49.3 Spill kit availability and drill evidence?
49.4 Waste disposal compliant with hazardous rules?

50) Is FD ready for tech transfer with a complete package?

50.1 Process description clear and reproducible?
50.2 Critical materials list + supplier info included?
50.3 CPP/CQA control strategy proposed?
50.4 FD sign-off and DQA review recorded?


Auditor 2 — Analytical Development (AD) — 50 Points

1) Is there an Analytical Development strategy per project?

1.1 Target Method Profile (TMP) defined (purpose, sensitivity, speed)?
1.2 Method scope covers assay, impurities, dissolution, KF, GC where needed?
1.3 Stage-appropriate lifecycle plan (dev → validation → transfer)?
1.4 Roles and review responsibilities documented?

2) Are method development records complete and traceable?

2.1 Lab notebook/ELN captures experiments and decisions?
2.2 Failed trials documented (not hidden)?
2.3 Clear rationale for parameter choices (column, pH, mobile phase)?
2.4 Supervisor review evidence?

3) Are reference standards/impurity standards controlled?

3.1 Primary standard traceability (COA, storage, expiry)?
3.2 Working standards qualified and documented?
3.3 Potency/correction factors applied correctly?
3.4 Solution stability/expiry defined for standards?

4) Are critical reagents/solvents controlled?

4.1 HPLC/GC grade verification and labeling?
4.2 Volumetric solution standardization records?
4.3 “Top-up” prohibited and monitored?
4.4 Expired reagents disposal documented?

5) Are instruments qualified for development testing?

5.1 HPLC/GC/KF/Dissolution qualification and calibration status?
5.2 PM and breakdown logs maintained?
5.3 Balance calibration and daily checks?
5.4 Temperature devices (ovens/fridges) verified?

6) Are chromatography system suitability requirements defined for dev methods?

6.1 SST criteria defined (RSD, tailing, plates, resolution)?
6.2 SST failure handling documented?
6.3 Carryover checks and blanks used?
6.4 Standard bracketing strategy defined?
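Item 6.1 can be verified against the raw injection data. The sketch below automates the common checks; the limits shown (RSD ≤ 2.0%, tailing ≤ 2.0, resolution ≥ 2.0) are typical defaults, not necessarily the registered method criteria, and the areas are invented.

```python
# Sketch of automated SST checks on replicate standard injections.
# Limits shown are common defaults, not necessarily the method's
# registered criteria; the peak areas are invented example data.
import statistics

def pct_rsd(areas):
    return 100 * statistics.stdev(areas) / statistics.mean(areas)

def sst_pass(areas, tailing, resolution, rsd_max=2.0, t_max=2.0, rs_min=2.0):
    return pct_rsd(areas) <= rsd_max and tailing <= t_max and resolution >= rs_min

areas = [10251, 10234, 10290, 10215, 10268]  # five replicate injections
print(round(pct_rsd(areas), 2), sst_pass(areas, tailing=1.3, resolution=3.1))
```

A useful follow-up question for 6.2: what happens to the data when `sst_pass` is false but the run continued anyway?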

7) Is integration/reprocessing controlled (data integrity risk)?

7.1 Integration guidelines exist?
7.2 Manual integration allowed only with justification?
7.3 Audit trail reviewed (who changed what/when)?
7.4 Deleted injections documented and justified?

8) Are forced degradation studies adequate (stability-indicating proof)?

8.1 Stress conditions cover acid/base/oxidation/heat/light?
8.2 Mass balance considered?
8.3 Degradant separation demonstrated?
8.4 Conclusions documented and approved?
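The mass-balance concept in 8.2 can be expressed as a simple reconciliation: assay loss under stress should be roughly accounted for by the impurity gain. The numbers below are illustrative, and any tolerance on recovery is a study-specific judgment, not a fixed rule.

```python
# Sketch of a mass-balance check for a forced-degradation sample:
# assay drop should be roughly accounted for by impurity increase.
# All values are illustrative example data.
def mass_balance(initial_assay, stressed_assay, initial_imps, stressed_imps):
    assay_loss = initial_assay - stressed_assay
    imp_gain = stressed_imps - initial_imps
    # % of the original material accounted for after stress
    recovery = 100 * (stressed_assay + stressed_imps) / (initial_assay + initial_imps)
    return assay_loss, imp_gain, recovery

loss, gain, rec = mass_balance(99.8, 89.5, 0.2, 9.8)
print(round(loss, 1), round(gain, 1), round(rec, 1))  # 10.3 9.6 99.3
```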

9) Is specificity demonstrated (placebo/interference)?

9.1 Placebo interference checked for current formulation?
9.2 Impurity peaks resolved from API peak?
9.3 Preservatives/excipients interference checked (eye drops)?
9.4 Filter/diluent peaks ruled out?

10) Is sample preparation robust and controlled?

10.1 Extraction time/sonication controlled?
10.2 Filter compatibility/adsorption study available?
10.3 Sample solution stability established?
10.4 Dilution scheme error-proofed (checklists)?

11) Are method validation parameters planned stage-appropriately?

11.1 Accuracy/precision plans defined?
11.2 Linearity/range planned with levels and replicates?
11.3 LOD/LOQ determination approach defined?
11.4 Robustness study plan exists?
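For 11.3, one ICH Q2 approach estimates LOD and LOQ from a calibration line as 3.3σ/S and 10σ/S (σ = residual standard deviation, S = slope). A self-contained sketch, using entirely hypothetical calibration data:

```python
# Sketch of LOD/LOQ estimation from a calibration line (one ICH Q2
# approach: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope).
# Concentrations and responses are hypothetical example data.
import statistics

conc = [0.05, 0.10, 0.20, 0.40, 0.80]   # e.g. fraction of working concentration
resp = [520, 1015, 2040, 4010, 8070]    # peak areas

n = len(conc)
mx, my = statistics.mean(conc), statistics.mean(resp)
slope = sum((x - mx) * (y - my) for x, y in zip(conc, resp)) / \
        sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx
residuals = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5  # residual SD

lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(round(lod, 4), round(loq, 4))
```

The audit question is whether the chosen approach (residual SD, intercept SD, or signal-to-noise) is stated in the plan and applied as written.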

12) Are development reports reviewed and approved?

12.1 Protocols and reports controlled by document system?
12.2 Deviations during validation documented?
12.3 Acceptance criteria justified?
12.4 QA/DQA review sign-offs present?

13) Are GC residual solvents methods controlled (if applicable)?

13.1 Headspace parameters locked and justified?
13.2 Leak checks/crimp integrity controls?
13.3 Calibration curve acceptance criteria defined?
13.4 Reinjection policy controlled?

14) Are KF moisture methods controlled?

14.1 Drift/blank limits defined?
14.2 Reagent factorization records?
14.3 Moisture pickup prevention in sample handling?
14.4 OOT trending for moisture?

15) Is dissolution method development scientifically justified?

15.1 Medium selection and sink conditions justified?
15.2 Apparatus (paddle/basket) selection justified?
15.3 Filter compatibility confirmed?
15.4 Discriminatory ability evaluated (process/formulation changes)?
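The sink-condition justification in 15.1 is usually a one-line calculation: the medium should be able to dissolve a multiple (conventionally at least 3×) of the dose. The factor 3 is a convention, not a regulation, and the values below are illustrative.

```python
# Sketch of a sink-condition check for dissolution medium selection.
# The factor 3 is a common convention, not a regulation; values are
# illustrative example data.
def sink_ratio(solubility_mg_ml, volume_ml, dose_mg):
    # How many doses the medium could dissolve at saturation
    return solubility_mg_ml * volume_ml / dose_mg

ratio = sink_ratio(solubility_mg_ml=0.5, volume_ml=900, dose_mg=50)
print(ratio, ratio >= 3)  # 9.0 True
```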

16) Are dissolution equipment controls adequate during development?

16.1 Mechanical calibration evidence?
16.2 Vessel verification/PVT if applicable?
16.3 Timer accuracy and sampling discipline?
16.4 Cleaning/carryover prevention?

17) Are impurity profiles managed and trended?

17.1 Unknown peaks handling SOP?
17.2 Reporting thresholds defined?
17.3 Impurity reference standards controlled?
17.4 Trending across prototypes and stability timepoints?

18) Is method suitable for Women's Hormone/potent products?

18.1 Sensitivity/LOQ adequate for low-dose?
18.2 Cross-contamination prevention in sample prep?
18.3 Dedicated consumables or cleaning verification?
18.4 Analyst PPE and safety controls?

19) Are stability sample testing methods consistent and controlled?

19.1 Same method version used across time?
19.2 Reinjection windows controlled?
19.3 Stability OOT trending performed?
19.4 Data packages reviewed and approved?

20) Are OOS/OOT handled correctly in AD work?

20.1 Phase-I lab investigation documented?
20.2 Retesting rules controlled (not testing into compliance)?
20.3 Root cause and CAPA recorded where needed?
20.4 QA visibility on critical OOS?

21) Are deviations recorded for analytical work?

21.1 Triggers defined (wrong standard, instrument issues, late testing)?
21.2 Impact assessment documented?
21.3 Overdue deviation tracking?
21.4 CAPA effectiveness checks?

22) Are method changes controlled via change control?

22.1 Rationale for change documented?
22.2 Impact assessed on past results and stability?
22.3 Training performed before implementing?
22.4 Version history traceable?

23) Is method transfer readiness assessed?

23.1 Transfer protocol template exists?
23.2 Critical parameters identified?
23.3 Acceptance criteria for transfer defined?
23.4 Training plan for receiving lab included?

24) Are raw data packages complete and traceable?

24.1 Sequence, SST, chromatograms, calculations included?
24.2 Audit trail snapshots included where needed?
24.3 Reviewer checklist used?
24.4 Archival and retrieval tested?

25) Is computerized system access controlled?

25.1 Unique user IDs enforced?
25.2 Role-based permissions?
25.3 Audit trail enabled and reviewed?
25.4 Backup/restore process verified?

26) Are Excel templates validated and controlled (if used)?

26.1 Validation report exists?
26.2 Formula lock and access restriction?
26.3 Version control prevents local copies?
26.4 QA approval for changes?

27) Are calculations independently verified?

27.1 Second-person check required?
27.2 Units/rounding rules defined?
27.3 Potency/moisture corrections applied consistently?
27.4 Transcription reconciliation step exists?
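Item 27.3 can be probed with a worked example: when a standard's potency is stated on the anhydrous basis, the weighed amount must be corrected for both potency and water before use in the calculation. The values below are illustrative.

```python
# Sketch of a standard-weight correction for potency and moisture
# (as-is weighing against an anhydrous-basis potency). Values are
# illustrative example data.
def effective_std_mg(weighed_mg, potency_pct_anhydrous, water_pct):
    # mg of active actually delivered when the standard is weighed "as is"
    return weighed_mg * (potency_pct_anhydrous / 100) * (1 - water_pct / 100)

eff = effective_std_mg(25.0, potency_pct_anhydrous=99.2, water_pct=0.8)
print(round(eff, 3))  # 24.602
```

A second-person check (27.1) should reproduce this number independently, not just initial the first analyst's spreadsheet.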

28) Are sample/standard storage conditions controlled?

28.1 Fridge/freezer monitoring?
28.2 Light protection where needed?
28.3 Labeling includes prep date/expiry?
28.4 Disposal of expired solutions documented?

29) Are lab housekeeping and segregation adequate?

29.1 Solvent segregation and labeling?
29.2 Waste solvent handling compliant?
29.3 Potent/hormone segregation?
29.4 Cleaning schedules recorded?

30) Are training/authorization controls strong?

30.1 Training matrix per instrument/method?
30.2 Qualification before independent work?
30.3 Refresher training schedule?
30.4 Analyst error trending for retraining?

31) Are outsourced analytical activities controlled (CRO)?

31.1 Vendor qualification and quality agreement?
31.2 Raw data ownership and review?
31.3 Sample chain of custody?
31.4 Deviation/OOS communication timelines?

32) Are reagents/media for microbiological tests in AD scope controlled (if applicable)?

32.1 Labeling and expiry controls?
32.2 Storage conditions monitored?
32.3 Method suitability defined?
32.4 Review/approval defined?

33) Are placebo and formulation changes reflected in method specificity?

33.1 Placebo composition kept current?
33.2 Specificity reassessed after formulation change?
33.3 Forced degradation repeated if needed?
33.4 Change documented via change control?

34) Are carryover and contamination controls adequate?

34.1 Carryover checks included in sequences?
34.2 Needle wash settings controlled?
34.3 Blank acceptance criteria defined?
34.4 Actions taken when carryover observed?

35) Are solution stability studies adequate?

35.1 Standard and sample stability tested across expected run time?
35.2 Storage condition defined (room temp/fridge)?
35.3 Reinjection limits defined?
35.4 Deviations for exceeded reinjection window?

36) Are column and consumables managed?

36.1 Column ID and history tracked?
36.2 Storage conditions for columns?
36.3 Column change impact assessed?
36.4 Lot-to-lot consumable variability considered?

37) Is the method robust to small variations?

37.1 Deliberate variations tested (pH, flow, temp)?
37.2 Acceptance criteria defined?
37.3 Conclusions documented?
37.4 Robustness issues feed back to FD/process?

38) Is reporting consistent and controlled?

38.1 Report templates version controlled?
38.2 Correct units and rounding used?
38.3 Reviewer checklist includes spec comparison?
38.4 Corrections handled via GDP/e-signature?

39) Are development specifications aligned with methods?

39.1 Interim acceptance criteria defined?
39.2 Linked to method performance (LOQ)?
39.3 Updated as product matures?
39.4 DQA review present?

40) Are method lifecycle documents archived?

40.1 Protocols, reports, raw data retained?
40.2 Retrieval demonstrated during audit?
40.3 Retention period defined?
40.4 Obsolete versions archived and access controlled?

41) Do you trend method performance?

41.1 SST failures tracked?
41.2 Analyst/instrument bias trends?
41.3 Drift or recurring issues trigger CAPA?
41.4 Trending reviewed and signed?

42) Are near-misses captured (wrong method version, wrong integration)?

42.1 Near-miss log exists?
42.2 Root cause and lessons learned?
42.3 SOP/training updates done?
42.4 Recurrence monitoring?

43) Are security and confidentiality maintained for development data?

43.1 Access control for project data?
43.2 Controlled sharing with partners?
43.3 Audit logs maintained?
43.4 Data export restrictions?

44) Are sterile product analytical needs addressed?

44.1 Particulate/clarity methods readiness (if applicable)?
44.2 Preservative assay method suitability?
44.3 Leachables screening strategy (as stage appropriate)?
44.4 Micro interface clearly defined?

45) Are transfer packages prepared properly?

45.1 Method description + critical parameters included?
45.2 Sample prep and stability instructions included?
45.3 Troubleshooting guidance included?
45.4 AD sign-off and DQA review?

46) Are ad hoc tests controlled (non-standard experiments)?

46.1 Documented objective and approval?
46.2 Raw data captured properly?
46.3 Results not used for release decisions improperly?
46.4 Archived and reviewed?

47) Are instrument software settings controlled?

47.1 Processing methods locked?
47.2 Time/date settings controlled?
47.3 User privileges reviewed?
47.4 Audit trail review evidence?

48) Is lab safety adequate (solvents, potent)?

48.1 MSDS access and training?
48.2 Fume hood use and maintenance?
48.3 Waste segregation?
48.4 Incident reporting?

49) Is management review done for AD metrics?

49.1 KPIs defined (cycle time, OOS rate, overdue reports)?
49.2 Management review minutes available?
49.3 Action items tracked?
49.4 Improvements documented?

50) Is AD output ready for registration/commercialization?

50.1 Stability-indicating evidence complete?
50.2 Validation/transfer readiness confirmed?
50.3 Data integrity and traceability assured?
50.4 Final method package approved by DQA?


Auditor 3 — Development Quality Assurance (DQA) — 50 Points

1) Is phase-appropriate GMP defined for development?

1.1 Stage definitions exist (research vs development vs pilot vs clinical)?
1.2 Controls proportionate to risk and intended use?
1.3 Clear guidance for what must be documented?
1.4 Staff trained on development GMP expectations?

2) Is there a DQA governance model for projects?

2.1 DQA role in reviews/approvals defined?
2.2 Project quality plan exists?
2.3 Quality gate reviews held (go/no-go)?
2.4 Minutes and actions tracked?

3) Document control system for development

3.1 SOPs/protocols/reports controlled with versions?
3.2 Obsolete documents prevented from use?
3.3 Distribution control (who has access)?
3.4 Archival and retention rules?

4) Control of development SOPs

4.1 SOP list covers key activities (batch records, sampling, data integrity)?
4.2 SOP training completion tracked?
4.3 Deviations to SOP handled formally?
4.4 Periodic SOP review schedule?

5) Review and approval of protocols

5.1 Stability/validation/DoE protocols reviewed by DQA?
5.2 Acceptance criteria justified?
5.3 Risk assessments included?
5.4 Protocol deviations captured and approved?

6) Review and approval of reports

6.1 Development reports reviewed with checklist?
6.2 Raw data traceability verified?
6.3 Conclusions supported by results?
6.4 Report version control maintained?

7) Data integrity program (ALCOA+)

7.1 Data integrity SOPs exist for development?
7.2 Unique logins enforced for systems?
7.3 Audit trail review requirements defined?
7.4 Data integrity incidents managed with CAPA?

8) Computerized system governance (CSV where applicable)

8.1 System inventory exists (ELN, LIMS, chromatography software)?
8.2 Validation status defined for intended use?
8.3 Access control and periodic review?
8.4 Backup/restore evidence?

9) Change control system for development

9.1 Change control applies to formulation, method, equipment, supplier changes?
9.2 Impact assessment required (CQA/CPP/stability/transfer)?
9.3 Approvals required before implementation?
9.4 Change effectiveness reviewed?

10) Deviation management in development

10.1 Clear triggers for deviations?
10.2 Investigation quality (root cause, impact assessment)?
10.3 Overdue deviation tracking?
10.4 QA approval for closure?

11) CAPA system effectiveness

11.1 CAPA initiated based on deviation/OOS/trends?
11.2 CAPA actions are specific and owned?
11.3 Effectiveness checks defined with evidence?
11.4 Recurrence monitored?

12) OOS/OOT governance in development testing

12.1 OOS procedure applied in dev labs?
12.2 Retesting rules prevent testing into compliance?
12.3 OOT trending program exists?
12.4 QA oversight documented?

13) Supplier and vendor qualification oversight (CRO/CMO)

13.1 Vendor qualification procedure exists?
13.2 Quality agreements define responsibilities and data access?
13.3 Audit program for key vendors?
13.4 Vendor performance trending?

14) Material control expectations in development

14.1 Raw materials labeled with status/expiry?
14.2 Use of non-GMP material risk assessed?
14.3 Traceability to lots maintained?
14.4 Storage conditions monitored?

15) Batch record / lab record templates governance

15.1 Standard templates exist and controlled?
15.2 GDP requirements included?
15.3 Review and approval workflow?
15.4 Template changes controlled?

16) Training and competency system

16.1 Training matrix exists for FD/AD staff?
16.2 Qualification before independent work?
16.3 Refresher training schedule?
16.4 Training effectiveness monitoring?

17) Management of potent/Women's Hormone risks

17.1 HBEL/PDE risk management included in quality planning?
17.2 Segregation and cleaning verification requirements defined?
17.3 Waste disposal controls and EHS interface?
17.4 Incident reporting and escalation?

18) Cross-contamination prevention governance

18.1 Facility and workflow segregation assessed?
18.2 Cleaning validation/verification strategy defined for dev?
18.3 Dedicated tools/consumables rules?
18.4 Effectiveness checks and audits?

19) Sterile development quality governance

19.1 Sterile development activities have defined controls?
19.2 Micro interface (bioburden, sterility, endotoxin) clear?
19.3 Filter integrity and hold-time expectations defined?
19.4 Deviations escalated appropriately?

20) Stability program QA oversight

20.1 Protocol approval and change control?
20.2 Chamber qualification status reviewed?
20.3 Excursions handled with impact assessment?
20.4 Stability data trending and reporting?

21) Sample retention and traceability governance

21.1 Retention policy for dev samples defined?
21.2 Storage condition controls?
21.3 Access logs?
21.4 Destruction authorization?

22) Tech Transfer (TT) quality oversight

22.1 TT checklist and deliverables defined?
22.2 Cross-functional review of TT package?
22.3 Deviations during TT managed?
22.4 Post-transfer feedback loop exists?

23) Control strategy development oversight

23.1 Link QTPP → CQA → CPP → controls documented?
23.2 Strategy updated with learning?
23.3 Risks and mitigations documented?
23.4 QA approval of control strategy milestones?
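The linkage asked about in 23.1 can be audited as a traceability map: every CQA should have at least one controlling CPP, and every CPP should trace back to a CQA. A minimal sketch, with all names hypothetical:

```python
# Sketch of a QTPP -> CQA -> CPP traceability map used to spot orphan
# items (a CQA with no controlling CPP mapped yet). All names are
# hypothetical examples.
cqa_to_cpps = {
    "dissolution":        ["granulation_endpoint", "lubrication_time"],
    "content_uniformity": ["blend_time"],
    "sterility":          [],   # gap: no process control mapped yet
}

orphan_cqas = sorted(cqa for cqa, cpps in cqa_to_cpps.items() if not cpps)
print(orphan_cqas)  # ['sterility']
```

Presenting the control strategy in this tabular form makes gaps visible at a glance during the quality gate review.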

24) Design of Experiments (DoE) governance

24.1 DoE protocol approval required?
24.2 Data integrity controls on DoE data?
24.3 Statistical review competence available?
24.4 Conclusions appropriately used (no over-claiming)?

25) Packaging/CCIT oversight for sterile products

25.1 Packaging component changes assessed for impact?
25.2 CCIT strategy considered and documented?
25.3 Supplier qualification for stoppers/vials?
25.4 Complaint/leaker trend readiness?

26) Data review checklists and review discipline

26.1 Reviewer checklists exist for lab records and analytical packages?
26.2 Review independence ensured?
26.3 Backdating controls?
26.4 Findings tracked to CAPA?

27) Audit program for development areas

27.1 Internal audit schedule exists for FD/AD?
27.2 Audit findings tracked to closure?
27.3 Repeat findings analyzed for systemic issues?
27.4 Management review of audit outcomes?

28) Metrics/KPI governance

28.1 KPIs defined (deviation aging, OOS rate, cycle time)?
28.2 KPI review meetings documented?
28.3 Actions assigned and tracked?
28.4 Effectiveness of improvements verified?
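As an illustration of how the KPIs named in 28.1 might be computed from raw logs, a minimal sketch with hypothetical data (the dates and counts are invented for demonstration):

```python
from datetime import date

def deviation_aging(open_dates: list[date], as_of: date) -> list[int]:
    """Age in days of each still-open deviation as of a review date."""
    return [(as_of - opened).days for opened in open_dates]

def oos_rate(n_oos: int, n_tests: int) -> float:
    """Out-of-specification results as a fraction of total tests run."""
    return n_oos / n_tests

# Hypothetical quarterly review data
ages = deviation_aging([date(2024, 1, 5), date(2024, 2, 1)], date(2024, 3, 1))
print(max(ages))                 # oldest open deviation, in days
print(f"{oos_rate(3, 150):.1%}") # OOS rate for the period
```

KPI review meetings (28.2) can then track these figures against agreed thresholds, with breaches feeding the action tracker in 28.3.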

29) Control of outsourced data and raw data availability

29.1 Contracts require raw data access?
29.2 Data review performed before acceptance?
29.3 Data integrity expectations defined?
29.4 Audit rights included?

30) Laboratory safety & compliance oversight (QA interface)

30.1 EHS training tracked?
30.2 Incident reporting and investigation system?
30.3 Chemical/solvent waste compliance checks?
30.4 Potent exposure control oversight?

31) Computer access management

31.1 User provisioning/deprovisioning controlled?
31.2 Periodic access review performed?
31.3 Shared accounts prohibited?
31.4 Password policies enforced?

32) Archival and record retention

32.1 Retention periods defined for protocols/raw data/reports?
32.2 Archival storage secure and retrievable?
32.3 Electronic record integrity preserved?
32.4 Retrieval test evidence?

33) Handling of errors/near-misses

33.1 Near-miss log exists?
33.2 Root cause and actions documented?
33.3 Learning shared across teams?
33.4 Trend analysis performed?

34) Labeling and identification control (development samples)

34.1 Sample labels standardized?
34.2 Mix-up prevention controls?
34.3 Relabeling rules GDP-compliant?
34.4 Reconciliation rules for samples?

35) Control of interim specs and acceptance criteria

35.1 Stage-appropriate specs exist?
35.2 Specs linked to method capability (LOQ)?
35.3 Spec changes controlled?
35.4 Transition plan to commercial specs?
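For checking that specs respect method capability (35.2), the ICH Q2(R1) formulas based on residual standard deviation and calibration slope are a common reference point. A minimal sketch — sigma and slope here are placeholders for values from an actual method validation:

```python
def lod(sigma: float, slope: float) -> float:
    """ICH Q2(R1) detection limit: LOD = 3.3 * sigma / S."""
    return 3.3 * sigma / slope

def loq(sigma: float, slope: float) -> float:
    """ICH Q2(R1) quantitation limit: LOQ = 10 * sigma / S."""
    return 10 * sigma / slope
```

An interim spec limit set below the method's LOQ is unenforceable in practice; the audit question is whether anyone verified this linkage before approving the spec.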

36) Method lifecycle QA oversight

36.1 Method development deliverables defined?
36.2 Validation/verification readiness review?
36.3 Transfer protocols reviewed?
36.4 Post-transfer performance monitoring?

37) Deviations for stability/TT activities

37.1 Missed pulls handled via deviation?
37.2 Late testing impact assessed?
37.3 TT trial failures investigated?
37.4 Effectiveness checks?

38) Handling of excursions (storage, chambers, transport)

38.1 Excursion logs maintained?
38.2 Impact assessments documented?
38.3 QA approvals recorded?
38.4 Corrective actions tracked?

39) Quality review of development batch records

39.1 Batch record completeness verified?
39.2 Traceability of materials/equipment?
39.3 Deviations documented and assessed?
39.4 Approval prior to using results for decisions?

40) Integration of RA/Regulatory requirements

40.1 Regulatory expectations communicated into development controls?
40.2 Document readiness for submission?
40.3 Change impact assessed for registration strategy?
40.4 Approval workflows include RA when needed?

41) Supplier CoA reliance oversight (development stage)

41.1 Reduced testing risk assessment?
41.2 Periodic verification testing?
41.3 Trend review of CoA vs internal?
41.4 Controls for counterfeit prevention?

42) Quality oversight of potent cleaning verification

42.1 Cleaning acceptance criteria defined?
42.2 Records reviewed?
42.3 Failures trigger CAPA?
42.4 Effectiveness verified?

43) Governance of method integration/data processing

43.1 Integration guidelines approved?
43.2 Audit trail review required?
43.3 Role permissions controlled?
43.4 Deviations for data processing issues?

44) Governance of dissolution equipment and method controls

44.1 Calibration/verification oversight?
44.2 Method discriminatory evidence reviewed?
44.3 OOS investigations quality reviewed?
44.4 Trend monitoring for drift?

45) Governance of Karl Fischer (KF) and GC methods

45.1 Drift/leak controls reviewed?
45.2 Reagent/standard controls reviewed?
45.3 OOT trending reviewed?
45.4 Failures investigated with CAPA?

46) Quality agreement coverage for development partners

46.1 Agreement includes data integrity and record access?
46.2 Change notification required?
46.3 Deviation/OOS communication timelines?
46.4 Audit rights and expectations?

47) Review of final project conclusions

47.1 Final development report reviewed for completeness?
47.2 Decision rationale traceable?
47.3 Risks documented for TT/commercial?
47.4 Approval/sign-off recorded?

48) Quality oversight of sample storage and retention

48.1 Storage monitoring (temp/RH) verified?
48.2 Excursions handled?
48.3 Sample access controlled?
48.4 Destruction authorization?

49) Readiness for inspection (audit readiness)

49.1 Records are retrievable quickly?
49.2 Staff can explain procedures consistently?
49.3 Evidence of review and approvals exists?
49.4 Open issues tracked and visible?

50) Management review of development quality system

50.1 Management review meetings documented?
50.2 Quality risks and trends reviewed?
50.3 Actions assigned and tracked?
50.4 Effectiveness of improvements verified?
