FDA and Industry Collaboration On Computer Software Assurance
Panelists:
➢ Khaled Moussally, Compliance Group – Global Head of QMS
➢ Jason Spiegler, Siemens Digital Industries Software – Senior Director,
Strategic Initiatives & Customers, Life Sciences Practice
➢ Kurt Weber, Vericel Corporation - Director of Information Technology
➢ Harsha Chulki, ICU Medical - Head of Global IT Quality & CSV
➢ Ken Shitamoto, Gilead Sciences – Sr Director, IT
Material Contributors:
➢ Cisco Vicenty, Office of Compliance, FDA (CDRH)
Objectives:
➢ Create awareness to accelerate innovation
➢ Inspire action so you can begin to realize value
Agenda:
➢ Quick recap on FDA CSV Team’s journey and CSA Draft Guidance
➢ Panel discussion on:
o Recent recommendations to the Draft Guidance
o Success Stories: Vericel, Gilead Sciences and ICU Medical
o “Test More; Document Less”
o “Computer Software Assurance for Manufacturing, Operations, and Quality System Software”
➢ Q&A and Open Discussion
Computer System Validation (CSV)
CSV identified as a barrier for the FDA…
• Takes forever
The Industry CSV Team
Key Takeaways – Why Now?
Create a Paradigm Shift…
• Streamline with value-driven, patient-focused approaches
• Critical thinking & risk-based agile approaches
• Improve manufacturers’ capabilities with automation
[Chart: % time spent – 80% test, 20% document]

Defining Risk
• Clearly define “intended use”.
• Focus on the “direct impact on device safety and device quality”, and does it result in “patient/user safety risk”? See examples.
  ➢ LMS vs. Manufacturing Equipment Software
• For PMA products, CDRH is exploring using risk determination to make implementation of systems an annually reportable change, not a 30-Day Notice.
• Risk Assessment: The intended use of the spreadsheet is for analyzing process quality outcomes and is identified as a high-risk function. The manufacturing process includes additional checks and inspections that assure non-conformances do not escape; therefore, the patient risk is low.
• Testing activities: Created, updated, and deleted analyses and observed that all calculated fields were correctly updated (a minimal sketch of this check follows).
Patient/Quality System Risk Definitions and Examples

High – Errors in the spreadsheet outputs will cause high patient harm and/or fail the direct implementation of a quality system activity defined in a regulation.
  Examples: QC laboratory test calculations; adverse event tracking; clinical trial results; product complaint tracking; product quality status management; product complaints trending.

Medium – Errors in the spreadsheet outputs will cause significant but temporary harm or reversible damage to patient and/or fail to support the indirect implementation of a quality system activity defined in a regulation.
  Example: Spreadsheet supporting managers’ review of the training assigned to their employees (the system that directly implements the quality system activity is the Learning Management System).

Low – Errors in the spreadsheet outputs will cause minor harm to a patient and/or fail to implement a quality system activity not defined in a regulation.
  Example: Company-specific quality spreadsheets not required by regulation.

None – No direct or indirect impact on patient.
  Examples: Manufacturing cost reports; turnaround time reports; project management tools; task tracking and schedules; Activity Due Date Calculator (non-product); Vendor Assessment Checklist.

Assurance Activity by Risk Level and Implementation Method

High risk: Out of the Box – Unscripted Testing; Configured – Unscripted Testing; Custom – Robust Scripted Testing
Medium risk: Out of the Box – Ad-Hoc Testing (With Record); Configured – Unscripted Testing; Custom – Limited Scripted Testing
Low risk: Out of the Box – Ad-Hoc Testing (With Record); Configured – Ad-Hoc Testing (With Record); Custom – Unscripted Testing

Spreadsheet Examples (Description / Intended Use | Patient/Quality Risk Level | Implementation Method | Assurance Activity)

• Product Recall Decision Matrix – assessment form used to decide the “recall” status of a product based on predefined trending criteria | High | Custom | Robust scripted testing
• Complaint Trending Tool – extract complaint data from various sources and identify complaint trends | Medium | Custom (VBA Macros) | Limited scripted testing
• Non-conformance Data Analysis – analyze and graph non-conformance data stored in a controlled system | Low | Configured | Ad-hoc testing

Implementation Method Examples for Spreadsheets

• Out of the Box: printing functions, formatting functions, simple arithmetic operations (e.g. sum, average, std. dev.), simple cell relationships and range operations
• Configured: formulae (including statistical functions), single-level Boolean functions (IF, AND, OR), charts, conditional statements, nested functions, security (cell protection), pivot tables
• Custom: macros, VBA code, extensive nested logic functions (e.g. IF(condition1, value_if_true1, IF(condition2, value_if_true2, IF(condition3, value_if_true3, value_if_false3))))
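To make the assurance-activity matrix above easy to apply, here is a minimal Python sketch of the lookup, assuming the row/column reconstruction shown; it is illustrative only, not part of the FDA material:

```python
# Illustrative lookup of the assurance-activity matrix above.
# Keys are (patient/quality-system risk, implementation method).
ASSURANCE_MATRIX = {
    ("high", "out of the box"): "Unscripted Testing",
    ("high", "configured"): "Unscripted Testing",
    ("high", "custom"): "Robust Scripted Testing",
    ("medium", "out of the box"): "Ad-Hoc Testing (With Record)",
    ("medium", "configured"): "Unscripted Testing",
    ("medium", "custom"): "Limited Scripted Testing",
    ("low", "out of the box"): "Ad-Hoc Testing (With Record)",
    ("low", "configured"): "Ad-Hoc Testing (With Record)",
    ("low", "custom"): "Unscripted Testing",
}

def assurance_activity(risk: str, implementation: str) -> str:
    """Return the assurance activity for a spreadsheet's risk level
    and implementation method, per the matrix above."""
    key = (risk.lower(), implementation.lower())
    if key not in ASSURANCE_MATRIX:
        raise ValueError(f"unknown combination: {risk} / {implementation}")
    return ASSURANCE_MATRIX[key]

# Example: the Complaint Trending Tool (medium risk, custom VBA macros)
print(assurance_activity("Medium", "Custom"))  # Limited Scripted Testing
```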
Qualification of Automated CSV Tools
The intended uses of these tools are not part of production or the quality system; therefore, the tools should not be considered within the requirements of 21 CFR 820.70(i).
Examples include:
• Code Debugger (for CAPA Automation use case)
• Loadrunner (for simulating anticipated peak load of ERP production system use case)
• Defect management and ALM tools
Assurance Process
Step 1: Identify where and how the tool will be used within your organization
Step 2: Determine if the off-the-shelf tool is part of, integrated with, or used for automating production or quality management
Step 3: Assure the use within your organization
Step 4: Capture evidence (see the record sketch after this list)
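The four steps lend themselves to a simple structured record. Below is a minimal sketch of such a record; the field names are hypothetical, not prescribed by the FDA material:

```python
# Hypothetical evidence record for the four-step assurance process above.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ToolAssuranceRecord:
    tool_name: str
    intended_use: str             # Step 1: where and how the tool is used
    in_scope_of_820_70i: bool     # Step 2: part of production/quality system?
    scope_rationale: str          # Step 2: rationale for the determination
    assurance_activities: list = field(default_factory=list)  # Step 3
    acceptance_conclusion: str = ""                            # Step 4
    decided_on: date = field(default_factory=date.today)

record = ToolAssuranceRecord(
    tool_name="Code debugger",
    intended_use="Debug CAPA automation code prior to promotion",
    in_scope_of_820_70i=False,
    scope_rationale="Not part of or integrated with production/quality system",
    assurance_activities=["Verified known error types manifest in the tool"],
    acceptance_conclusion="Acceptable for intended use",
)
print(record)
```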
Can you provide examples of applying these
recommendations and the resulting value?
Vericel Industry Case Study – Unscripted Testing
[Charts: % time spent, testing vs. documentation – legacy systems and ERP & EBR systems each shown at 80% focus on testing, 20% focus on documentation]

Risk-based test selection (excerpt):
• Spec ID #2 – Medium risk (3) – Unscripted Testing: Error Guessing – Required Mitigation 2
• Spec ID #3 – Low risk (2) – Unscripted Testing: Exploratory Testing – N/A
ICU Industry Case Study – Unscripted Testing
• 50% fewer system issues found
• 60% reduction in validation testing spend
A Comparison: FDA CSV Team Practice vs. Legacy Practice
Gilead Empirical Retrospective Analysis – Validation Effort Comparison of Traditional vs. New Model
[Chart: validation effort by system – Process Control System* (May 2016): 80% vs. 30%; Environmental Monitoring System (Jul 2016): 20% vs. 10%]
“It was awesome finally going through a validation and being assured that things were good from the EU perspective as well...”
– Frank M.
What are your next steps?
How are you planning to automate?
Next Steps
• Continue developing Use Cases and new recommendations
• Encourage manufacturers to start using recommendations
❑ Capture value – measure better, faster, and less expensive CSV activity, and…
❑ Provide FDA with input on what is working vs. not working, barriers, etc.
For Questions
Contact:
Khaled Moussally ([email protected])
Jason Spiegler ([email protected])
Cisco Vicenty ([email protected])
BONUS MATERIAL
FDA’s View of Automation
The FDA supports and encourages the use of automation, information technology, and data solutions throughout the product lifecycle in the design, manufacturing, service, and support of medical devices. Automated systems provide manufacturers with advantages for reducing or eliminating errors, increasing business value, optimizing resources, and reducing patient risk. This position is based on learning from other industries where automation has already shown significant benefits in enhancing product quality and safety, which in turn reduces risk compared with non-automated approaches.
FDA CDRH Fiscal Year 2019 (FY 2019)
Proposed Guidance Development
Draft Guidance Topics
❑ Content of Premarket Submissions for Cybersecurity of Medical Devices of Moderate and Major Level of Concern
❑ Surgical Staplers and Staples – Labeling Recommendations
❑ Non-binding Feedback After Certain FDA Inspections of Device Establishments
❑ Select Updates for Recommendations for Clinical Laboratory Improvement Amendments of 1988 (CLIA) Waiver Applications for Manufacturers of In Vitro Diagnostic Devices
❑ Recommendations for Dual 510(k) and Clinical Laboratory Improvement Amendments Waiver by Application Studies
❑ Computer Software Assurance for Manufacturing, Operations, and Quality System Software
❑ Patient Engagement in Clinical Trials
❑ Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices
❑ Lifecycle Regulatory Requirements of Medical Device Servicing (Device Servicer vs Remanufacturer)
❑ Guidance on an Accreditation Scheme for Conformity Assessment of Medical Devices to FDA-Recognized Consensus Standards (ASCA).
Scope
• Manufacturers, including combination product manufacturers, who have established a 21 CFR 820 quality system.
Out of Scope
• Automation of Clinical Trial and associated Electronic Records
• Automation regulated outside of 21 CFR 820
• Software validation for devices under the requirements of 21 CFR 820.30(g).
• Information Technology Infrastructure assurance activities such as planning, contracting, or continuity of operations
efforts that are used to maintain server operations.
Focus on Assurance – Shift the Discussion
(Unique) Clarifications and Recommendations
Appropriate methods and activities for
software assurance
• Take a least-burdensome approach – focus on value for the Manufacturer, not
the Investigator.
• Leverage existing activities and supplier data. Do not reinvent the wheel; take
credit for work already done
• Leverage use of process controls to mitigate risk
• Use Computer System Validation tools to automate assurance activities
➢ The scope of 21 CFR 820.70(i) applies when computers or automated data processing systems are used as part of production or the quality system.
➢ FDA does not intend to review validation of support tools. Manufacturer
determines assurance activity of these tools for their intended use.
➢ Part 11 is narrowly scoped and is under enforcement discretion; apply it appropriately
• Use Agile testing methods and unscripted testing as appropriate
• Use electronic data capture and record creation, as opposed to paper
documentation, screen shots, etc
• Leverage continuous data and information for monitoring and assurance
Examples

Embrace Automation
• Automation needs to be seen as an enabler.
• Automation doesn’t need to involve complex tools.
• Focus on testing, not on scripting.

Implement Consistent Risk Frameworks
• Adopt a consistent risk framework across the SDLC, Change Management & Periodic Review cycle.

Adopt Patient-centric Approaches
• Put the patient front and center of all decisions.
• The FDA CSV Team recommends a patient-centric approach.

Scale the Level of Assurance & Documentation
• Scale the level of testing and documentation based on risk rating.

Follow Risk-based Assurance
• Leverage vendor documentation and validation.
Scope:
• Camstar MES
• Epicor ERP
FDA CSV Team Recommendation:
• Use electronic data capture and record creation, vs. paper documentation, screen shots, etc.
• Leverage continuous data and information for monitoring and assurance

Before:
• Manual screen shots of evidence of server’s hardware and software specifications.
After:
• Automated reports of server’s hardware and software specifications by installing monitoring tools on servers.
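As a minimal sketch of the “after” approach, the following Python captures a server's hardware and software specification as an electronic record using only the standard library; an actual deployment would rely on dedicated monitoring tooling rather than this illustration:

```python
# Minimal sketch: capture a server's specification as an electronic
# record instead of manual screen shots. Illustrative only.
import json
import platform
import socket
from datetime import datetime, timezone

spec_record = {
    "captured_at": datetime.now(timezone.utc).isoformat(),
    "hostname": socket.gethostname(),
    "os": platform.platform(),
    "machine": platform.machine(),
    "processor": platform.processor(),
    "python": platform.python_version(),
}

# Persist or forward as an electronic record for monitoring/assurance
print(json.dumps(spec_record, indent=2))
```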
FDA CSV Team Recommendation:
• Use electronic data capture and record creation, as opposed to paper documentation
• Use Computer System Validation tools to automate assurance activities
• FDA does not intend to review validation of support tools. Manufacturer determines assurance activity of these tools for their intended use.

Success Story:
• Replaced travel-intensive, hands-on training with remote, hands-free training using Smart Glasses (a wearable, voice-recognition & AI-based technology)
Brief Description:
• Automatic, hands-free, safe evidence capture & voice-enabled real-time, online documentation

Before:
• In-person training (with expensive travel) required per procedures in order to perform certain manufacturing tasks.
• Hands-on picture capture with external camera, printed out and attached to documentation offline. Error prone.
After:
• Remote training using wearable, hands-free, AI-powered Smart Glasses technology.
• Hands-free evidence capture with voice-powered real-time documentation. Error free.
FDA CSV Team Recommendation:
• FDA is interested in the situations when a failure to fulfill the intended use of the system, software, or feature directly impacting device safety and device quality results in direct patient safety risk.

Success Story:
• Deployed a patient-centric risk framework across the software life cycle – i.e. Validation, Change Management & Periodic Reviews.
Brief Description:
• Leveraged FDA CSV Team’s risk assurance framework.

Before:
• Siloed risk frameworks across processes – frameworks that don’t talk to each other
• Confusion among implementing teams with risk definitions that don’t align with each other
• Redundant work efforts due to misalignment across processes
After:
• Consistent, simplified risk framework across processes that drives a common risk-based assurance approach
• Consistent implementation of harmonized risk assurance framework
• Reduced cycle times from consistent interpretations
Case Study Examples – Risk-based Assurance – Deliverable Scalability

FDA CSV Team Recommendation:
• FDA is interested in the situations when a failure to fulfill the intended use of the system, software, or feature directly impacting device safety and device quality results in direct patient safety risk.

Success Story:
• Deployed a software validation framework in which deliverables are scaled based on the risk level of the software.
Brief Description:
• Leveraged FDA CSV Team’s risk assurance framework.

Before:
• One-size-fits-all list of validation documentation for all types of software
After:
• Deliverables scaled (both quantity & quality) by using a risk assurance framework included in FDA CSV Team’s recommendations
Example: Annual Filing vs 30 Day PMA Notification
Scenario 3: COTS MES Upgrade
• Intended use: Refer to Scenario 1. As part of the upgrade, adding new features such as process timers (automating a manual function, e.g. timers for ovens). The physical manufacturing process is not changing.
• No customizations. Scripted testing for high risk; unscripted testing for new medium & low risk features.
• Downstream controls to inspect/check product remain.
• Based on downstream controls and the risk-based assurance approach, device quality, device safety, and patient/user safety risk have been mitigated. CDRH may accept filing in an annual report. Note – this is a developing policy evaluation.
Why: FDA’s objective for these efforts is to enable safer and better medical devices for patients
and providers, by making it easier, faster, and more value-added for manufacturers to deploy
the necessary automation.
Automated Computer System Validation Tools
• Function: Software testing tool measuring system behavior and performance under load. Intended use: testing the performance of new manufacturing automations under load. Examples: *Loadrunner, Apache JMeter
• Function: Automated functional graphical user interface (GUI) testing tool that allows a user to record and play back user interface (UI) interactions as test scripts. Intended use: developing a test script based on user interactions to automate future testing of UI modifications. Examples: *Winrunner, Ranorex
• Function: Bug tracking, issue tracking, and project management systems. Intended use: rapidly capturing issues and bugs found during assurance testing. Examples: *Jira, Confluence
• Function: Manage and track the application lifecycle development process, including risk, test, and the respective change control/approval of applications. Intended use: tracking and monitoring all stages of new IT system implementations throughout the lifecycle. Examples: *Polarion ALM, PTC Integrity
• Function: Dynamic web performance evaluation tool. Intended use: testing the performance of web-based user interfaces. Examples: *Dynatrace AJAX Edition, New Relic APM
*All product trademarks, registered trademarks or service marks belong to their respective holders.
Manufacturer is using these tools to automate and supplement tracking and assurance testing for non-product
systems. These intended uses of these tools do not have a direct impact on device quality and device safety.
Qualification of Automated CSV Tools
Example 1 – Code Debugger
A CAPA automation system is being written in JavaScript, and a developer tool is used to set breakpoints and step through the code. Once the code is debugged, all the debugger content is removed prior to the promotion of the code to the production system. The debugger tool is not part of production or the quality systems.
Step 1: Identify where and how the Debugger will be used within your organization
Step 2: Determine if the intended use is for automating part of production or the quality system
Consider the following in your decision, then capture the decision with rationale
• The off-the-shelf tool is not part of or integrated with the production or the quality system
Step 3: Assure the use within your organization
Assurance of the code debugger tool includes testing the tool for use within your organization.
1. Identify or create code to be debugged with known error types
2. Execute the debugger and verify that all expected error types manifested
3. Correct the errors in the code and execute the code to verify that the debugger produced executable code to meet
your use.
Step 4: Evidence
Record the intended use decision and rationale, as well as the acceptance conclusion for the tool.
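To ground Step 3, here is a minimal Python sketch of the “seed known error types, confirm they manifest, then confirm corrected code executes” idea. It uses plain assertions rather than an interactive debugger session, and the CAPA-flavored function names and error types are hypothetical:

```python
# Hypothetical Step 3 sketch: seed code with known error types, confirm
# each manifests, then verify the corrected code executes as expected.
def seeded_divide(total, count):
    return total / count  # known defect path: count == 0

def seeded_lookup(record, key):
    return record[key]    # known defect path: missing key

KNOWN_ERROR_CASES = [
    (lambda: seeded_divide(10, 0), ZeroDivisionError),
    (lambda: seeded_lookup({}, "capa_id"), KeyError),
]

# Step 3.2: execute and verify that all expected error types manifest
for trigger, expected in KNOWN_ERROR_CASES:
    try:
        trigger()
    except expected:
        continue
    raise AssertionError(f"{expected.__name__} did not manifest")

# Step 3.3: after correcting the errors, verify the code is executable
# and produces correct results for your use
assert seeded_divide(10, 4) == 2.5
assert seeded_lookup({"capa_id": 42}, "capa_id") == 42
print("Debugger qualification checks passed")
```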
Qualification of Automated CSV Tools
Example 2 – Automated Testing Tool
An ERP system has a load requirement, and HP Loadrunner is used to simulate the anticipated peak load of the production system. The load testing results assure that the system can absorb the required user load. The automated testing tool used to load-test the production system is therefore not part of production or the quality systems.
Step 1: Identify where and how the testing tool will be used within your organization.
Step 2: Determine if the intended use is for automating part of production or the quality system
Consider the following in your decision, then capture the decision with rationale:
• The testing tool is not the system of record of the product testing results
• The test tool does not alter the code within the production system
• The testing does not add any data to the production system
• The tool is not used for verification of the medical device
Step 3: Assure the use within your organization
Assurance of the automated testing tool includes testing the tool for use within your organization.
1. Identify the type of testing results that will be achieved with the testing tool
2. Execute known test cases that represent a solid sampling of test types and conditions that will be encountered during use.
3. Ensure that the testing tool produced the expected testing results, meeting your organization’s testing requirement and minimizing defects introduced into your production environment.
Step 4: Evidence
Record the intended use decision and rationale, as well as the acceptance conclusion for the tool.
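For Step 3, a load-simulation check can be as simple as replaying a known peak profile and comparing against acceptance thresholds. The sketch below uses only the Python standard library; the endpoint URL, user count, and latency threshold are hypothetical placeholders, not values from the source:

```python
# Minimal load-test qualification sketch (illustrative, stdlib only).
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://erp-test.example.internal/health"  # hypothetical
PEAK_USERS = 50        # assumed anticipated peak concurrent users
MAX_LATENCY_S = 2.0    # assumed acceptance threshold

def one_request(_):
    start = time.monotonic()
    with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
        ok = resp.status == 200
    return ok, time.monotonic() - start

def run_peak_load():
    with ThreadPoolExecutor(max_workers=PEAK_USERS) as pool:
        results = list(pool.map(one_request, range(PEAK_USERS)))
    assert all(ok for ok, _ in results), "request failures under load"
    worst = max(latency for _, latency in results)
    assert worst <= MAX_LATENCY_S, f"peak latency {worst:.2f}s exceeds limit"
    print(f"{PEAK_USERS} concurrent requests OK; worst latency {worst:.2f}s")

if __name__ == "__main__":
    run_peak_load()
```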
Qualification of Automated CSV Tools
Example 3 – Defect management and ALM tools
A medical device company uses Polarion to automate the company’s CSV process, including testing, defect management, and other software life cycle functionality in support of the implementation of a quality production system. The activities performed and records maintained in Polarion support the execution of the company’s CSV procedure and are part of the quality systems.
Step 1: Identify where and how a Defect Management and ALM tool will be used within your organization
Step 2: Determine if the intended use is for automating part of production or the quality system
Consider the following in your decision, then capture the decision with rationale that validation is not applicable:
• The ALM tool is used to execute the company’s CSV process and does not alter production system data, but automates part of the quality system
• The ALM tool is configured to automate the company’s CSV process and does not impact nor interface with validated production quality systems.
• The testing does not add any data to the production system
Step 3: Assure the use within your organization
FDA considers that impact to the quality system does not present a direct patient or user safety risk. Assurance of the automated testing tool
includes testing the tool for use within your organization.
1. Identify specific functionality or process that the ALM or Defect management tool will automate by creating specific functional level requirements.
2. Execute known test cases that represent functional requirements under a variety of conditions that represent organizational use.
3. Ensure that the testing results produce the desired outcome, to a level that provides full confidence that the tool’s functionality meets the intended use of an ALM or defect management tool within your organization.
Step 4: Evidence
Record the intended use decision and rationale, testing results, as well as the acceptance conclusion for the tool
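As one way to exercise a functional-level requirement under Step 3, the sketch below checks a defect-workflow rule with known good and bad test cases. The states and transitions are hypothetical illustrations, not Polarion's actual data model:

```python
# Hypothetical functional check: a defect workflow must only allow
# defined state transitions.
ALLOWED_TRANSITIONS = {
    "open": {"in_progress", "rejected"},
    "in_progress": {"resolved"},
    "resolved": {"closed", "in_progress"},
}

def transition(state, new_state):
    if new_state not in ALLOWED_TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {new_state}")
    return new_state

# Known-good path
state = "open"
for step in ("in_progress", "resolved", "closed"):
    state = transition(state, step)
assert state == "closed"

# Known-bad case: a defect cannot be closed without being worked
try:
    transition("open", "closed")
except ValueError:
    pass
else:
    raise AssertionError("illegal transition was not blocked")
print("Defect workflow checks passed")
```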
Complaint Handling Spreadsheet example
Intended Use: Extract complaint data from plant local systems and identify complaint trends across regions and injectables sources
Example Requirement: Global QM shall be able to open read-only complaint data extracts from “xyz” secure storage location.
Risk Level: LOW (patient); Implementation Method: Configured
Spreadsheet Tested: Spreadsheet X, Version 1.2
Test Type: Unscripted Testing
Goals:
• Ensure Global QMs are able to open complaint data extracts from the “xyz” secure storage location (data retrievability)
• Ensure complaint data extracts are read-only (data protection testing) and the “xyz” storage location is accessible only to authorized individuals (data security testing)
Assurance activities:
• Performed data retrievability, data security and data protection testing as outlined in the testing goals. Included positive as
well as negative test conditions for testing if unauthorized individuals can access the data.
Conclusion: No errors observed.
The intended use was identified to be low patient risk, and rapid exploratory testing of specific functions was performed. The resulting record quickly identifies the intended use, what was tested, how it was tested, the test objective, who performed the testing, and the conclusion on validation activities.
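A minimal Python sketch of the retrievability and read-only checks described above follows; the file path is a hypothetical placeholder, and the extract is assumed here to be a CSV export. The negative security condition (unauthorized individuals cannot reach the storage location) would be exercised under a separate, unauthorized account rather than in-process:

```python
# Hypothetical sketch of the unscripted assurance activities above:
# data retrievability and read-only protection of a complaint extract.
import csv
import os

EXTRACT = "/secure/xyz/complaint_extract.csv"  # hypothetical location

def check_retrievability():
    # Positive condition: an authorized user can open and read the data.
    with open(EXTRACT, newline="") as f:
        rows = list(csv.reader(f))
    assert rows, "extract is empty or unreadable"

def check_read_only():
    # Data protection: the extract must not be writable by this user.
    assert not os.access(EXTRACT, os.W_OK), "extract is writable"

if __name__ == "__main__":
    check_retrievability()
    check_read_only()
    print("Retrievability and read-only checks passed")
```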
Risk Based CSV Example:
Learning Management System (LMS)
A medical device firm applies risk-based validation to an off-the-shelf LMS. Qualifying the vendor and then applying risk at the feature level allows for far fewer documented verification activities.
• Basic Assurance / Low Risk Features (80%): Ad Hoc Testing. Ex: usability features – training notifications, overdue training lists, curricula assignments.
• Medium Risk Features (20%): Unscripted Testing. Ex: capture evidence of training completion by entering username & password.
• High Risk Features (0%): Scripted Testing. No high-risk features.
Risk Based CSV Example:
Non-Conformance & CAPA Process
A medical device firm applies risk-based validation to an off-the-shelf CAPA system. Qualifying the vendor and then applying risk at the feature level allows for far fewer documented verification activities.
• Medium Risk Features (50%): Unscripted Testing. Ex: electronic signature features – audit trail, meaning of signature (review, approval).