Smartlogic

GAMP – Test Incidents – Analysis, Logging & Classification

Incident Analysis and Logging

This article was written by Ilan Shaya, CEO, a control and automation specialist, designer, and expert in validation.

When a test incident occurs during a particular step, the overall test should not be continued if the failure produces an output that prevents entry into a subsequent step. When a test continues after a failure, the failed step should be clearly identified on the test results sheet.

It is important to fully record the details of all new test incidents and maintain an index of these incidents

Test incidents may be fed either into an existing change control system or into a separate process for resolving test incidents. An example of an incident report (summarizing details of the incident, proposed solution and retest requirements, review, implementation and closure) is given in GAMP 4, Appendix D6
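As a rough illustration only (the field names below are hypothetical and are not taken from GAMP 4, Appendix D6 itself), such an incident record, with its life cycle through review, implementation and closure, might be modeled as:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class IncidentStatus(Enum):
    """Stages of the incident resolution process."""
    OPEN = "open"
    SOLUTION_PROPOSED = "solution proposed"
    IMPLEMENTED = "implemented"
    RETESTED = "retested"
    CLOSED = "closed"

@dataclass
class TestIncident:
    """One entry in the test incident index (illustrative fields only)."""
    incident_id: str            # unique key in the incident index
    raised_on: date
    test_step: str              # failed step, as identified on the results sheet
    details: str                # full description of the observed failure
    proposed_solution: str = ""
    retest_required: bool = True
    status: IncidentStatus = IncidentStatus.OPEN

incident = TestIncident(
    incident_id="TI-042",
    raised_on=date(2024, 5, 1),
    test_step="OQ-07 step 3",
    details="Actual output did not match the expected value",
)
```

A record of this shape can be fed either into an existing change control system or into a separate incident-resolution process, as the text describes.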

                    Test Incident Classification

In addition to correcting an identified fault, it is important to evaluate test incidents in order to determine their most likely cause; addressing this is an important part of any Corrective and Preventive Action (CAPA) process. Metrics on the causes of avoidable test incidents provide a useful indicator of areas within the overall SW development life cycle that may benefit from improvement activities to reduce the likelihood of recurrence.

Typical test incident types that occur in SW testing include, but are not limited to, those described below

              Incorrect SW Installation

Errors such as program dumps, abnormal terminations, or inability to access applications often result from a failure in the installation or configuration process, or from the installation of an incorrect SW version

When any of these errors is determined to be the cause of the incident, it is usually necessary to postpone any further testing until the test environment is correctly set up

              Incorrect Programming/Coding

Incidents may arise when actual system outputs fail to agree with the required system outputs. These incidents should be noted and, unless the defects are considered sufficiently important to invalidate the rest of the test steps, the execution of the test case can continue

Once the cause of a defect is identified, the defect should be corrected and the corrected SW included in a subsequent SW build for retesting

               Incorrect Test Data

Testing failures may occur as a result of failure to create correct data in the test database, in advance of the test case being run

               Inadequate Specification – Incorrect Understanding of Program Functionality

Testing failures may occur because the controlling Design Specification does not state clearly enough what is required of a particular piece of functionality. This may be particularly evident when a custom system is developed to satisfy a new business process that is not yet fully established

               Poorly Specified Test Case/Script

Tests can fail if the Test Case or Test Script (or other relevant document) produced is incorrect and indicates an outcome different from that documented in the corresponding requirements

When a Test Case or Test Script has been modified during execution, a test incident should be raised to manage changes to the Test Case or Test Script, and to confirm the pass/fail status of the test

The incidence of this type of error should be minimized by ensuring independent review of the test case before approval, including a cross-check of the expected output against the controlling requirement.

                Incorrect Design Solution

Test errors can arise where the SW works correctly against the design, but the implemented design does not satisfy the originally stated requirements, or fails to reflect subsequently agreed change requests

               Inconsistent Controlling Design Specification

Test incidents can occur where the content of the relevant controlling Design Specification contains inconsistencies. It is therefore essential that this specification be corrected to prevent further confusion

This incident type should not be confused with Incorrect Programming/Coding, where the SW coded does not match a particular requirement of the controlling Design Specification, and the code needs to be corrected

The controlling Design Specification inconsistencies should be logged in the incident management system so the specification can be corrected following the appropriate document management system

           Unexpected Test Events

During execution of a Test Case or Test Script, the tester may notice an anomaly in the SW that, although not affecting the success of the overall test objective, nevertheless requires further investigation. This event should be recorded in the incident management system so that the controlling Design Specification and the corresponding Test Case or Test Script can be updated to reflect the presence of the anomaly

             Test Execution Errors

Tests can be classified as failures if the tester fails to follow the steps outlined in the Test Script, or in the overall Test Protocol or Test Specification governing that activity

Missing signatures, dates, or other important cross-reference information can also cause a test to be considered a failure

               Force Majeure

Test incidents of this nature reflect an unexpected event over which the test or project team has no control, and which brings testing to a premature halt. These events can be raised as issues by the project team, but are generally resolved outside of the project
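As an illustration, the incident types above can be expressed as a small classification enumeration, with a tally function of the kind the CAPA metrics described earlier would rely on (all names below are hypothetical, not part of GAMP):

```python
from collections import Counter
from enum import Enum

class IncidentCause(Enum):
    """Classification values mirroring the incident types described above."""
    INCORRECT_INSTALLATION = "incorrect SW installation"
    INCORRECT_CODING = "incorrect programming/coding"
    INCORRECT_TEST_DATA = "incorrect test data"
    INADEQUATE_SPECIFICATION = "inadequate specification"
    POOR_TEST_CASE = "poorly specified test case/script"
    INCORRECT_DESIGN = "incorrect design solution"
    INCONSISTENT_SPECIFICATION = "inconsistent controlling design specification"
    UNEXPECTED_EVENT = "unexpected test event"
    EXECUTION_ERROR = "test execution error"
    FORCE_MAJEURE = "force majeure"

def cause_metrics(causes):
    """Tally incident causes; the most frequent ones point at the
    life-cycle areas that may benefit from CAPA improvement activities."""
    return Counter(causes).most_common()

# Hypothetical incident log for illustration
logged = [
    IncidentCause.INCORRECT_CODING,
    IncidentCause.INCORRECT_TEST_DATA,
    IncidentCause.INCORRECT_CODING,
]
metrics = cause_metrics(logged)
```

The dominant cause appears first in the tally, giving a simple starting point for the metrics discussed in the classification section.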

Validation case study – part 2

This article was written by Ilan Shaya, an expert in validation, automation and control.

Validation Requirements

Documentation for Initial Tender

Project schedule and milestones; design and construction details

Project quality plan

Supplier’s local subsidiary/agent

Supplier’s documentation that the system / configurable software versions are released to the market and are FDA/EMEA compliant

Compliance with the 21 CFR Part 11 operational requirements – the contractor should ensure coverage of all the requirements described below

The contractor should state the system/product status and the implementation planned for each requirement, subject to the user's approval before delivery

Documentation for Design Review

P&ID for the system

Electrical and pneumatic schematics

Installation data:

General arrangement drawings

Floor loading

Utility requirements

Details of electronic records and approvals that may be subject to regulatory controls under CFR Title 21 part 11

Instrumentation documentation

Main components specification:

Equipment

Instrumentation

Valves

Piping

Control system

Pipe welding documentation

General installation book

Procedures

User guide

Security

Preventative maintenance + spare parts list

Operation procedure

Pressure test procedure

Leak test procedure

Passivation procedure

Calibration

Functional Design Specifications for:

Complete manufacturing system

Software (SW) and hardware (HW)

Functional Design Specifications

System Detail Design Specifications for HW and SW.

SW source code, with comments, for customized SW

SW complete version history

HMI alarm list, message list and graphical displays

I/O list

List of materials:

Product contact materials

Potential product contact lubricants

Welding procedures for product direct and indirect contact parts

Pre-delivery Inspection and Factory Acceptance Test (FAT) protocols

Steel certificates and gasket certificates

Documentation Prior to FAT or Pre-Delivery Inspection – PDI

Progress visit report, which will include:

Mechanical and technical development

Automation and SW development

Supplier’s factory test results for:

Unit tests – the test protocols shall be traceable to the low-level design documents and shall be approved by the user prior to execution. The report shall be approved by representatives of the user's validation team and IT QA.

Automation and SW development

Integration Tests – Simulation Testing

Approved PDI and FAT protocols

Commissioning

MCCR

Purpose

Scope

Responsibility

Execution instructions

System scope

System description

Drawing verification

Equipment verification

Valve verification

Instrument installation and calibration

Utility verification

Documentation verification

Piping Verification

Sample Point Verification

Safety, health and environment verification

Slopes verification (if relevant)

Electrical and communication activities

Pump installation verification, if relevant

Heat exchanger installation verification, if relevant

Air break verification, if relevant

Dead leg verification, if relevant

CE

Purpose

Scope

Responsibility

Execution instructions

System scope

System description

System startup

Equipment verification

Main equipment operation checks

System FDS (SSO) verification

System performance testing


Validation – Function Design Specification – FDS

This article was written by Ilan Shaya, an expert in validation, automation and control.

The Function Design Specification (FDS) is part of the validation documentation that details the solution to be provided to meet the user's requirements. It should be approved by the user and should form the basis of both the hardware (HW) and software (SW) designs.

The FDS provides the basis of the design of the system and is used to verify and validate the system during testing, ensuring all the required functions are present and that they operate correctly. It details all the functions, operator interactions, control and sequencing associated with the system, thus allowing the user to confirm, before the system is developed, that the proposed solution fully meets its requirements.

FDS Contents

The FDS is structured in a relatively standard fashion, with predetermined chapters and sections, where the final contents are tailored according to the type and size of the system under validation. The FDS presented here includes only the technical contents; it does not include commercial and contractual requirements, which are also generally included.

The main chapters and sections of an FDS protocol are:

Relationship to Other Documents – lists all documentation used in the production of the FDS, including suppliers' documents (such as the URS) and drawings. Each document listed should include the document/drawing number and version number. This allows any impact on the FDS to be traced as documents are updated throughout the project life cycle.

System Overview

Process Overview – includes a description of the process being controlled; this may be taken from the URS and enhanced to detail the interaction with the control system.

Control System Overview – includes a detailed control system description, covering all the components and the interactions between the systems; block and network diagrams can be used to show the system architecture in detail

Scope and Limits of Supply

Scope of Supply – includes a list of deliverables, panels, computers, software, etc

Limits of Supply – lists all items required by the project that are outside the scope of supply; where the system interfaces to third-party systems, constraints and assumptions should be included

System Functions and Facilities

Operation Modes – includes all modes of operation for the system

Functional Operation – divides each of the sequences/functions into logical areas (determined by the process) and provides a complete description of each area

Operator Requirements – describes the interface between the operator and the detailed function

Human/Machine Interface (HMI) – details all points of operation, local terminals, remote terminals, message displays, push button stations, etc.

Report Outputs – the format of all reports generated by the system should be detailed, and an explanation of the report contents should be included

System Data – all data gathered, generated or calculated by the system should be detailed

System Interfaces – provide complete details of all inputs and outputs from the control system

System Attributes

Availability – defines expected "working" time of the system between failures

Maintainability – details issues related to maintainability of the plant, in particular for systems that require regular maintenance to ensure reliable operation

Transport and off loading

Power and services required

Connections to existing/3rd party systems

Changes to existing plant or hardware (HW) equipment

Changes to existing software (SW) systems

Training – details the formal and informal training to be supplied under the contract

Design Factors – details special factors relating to the design of the system, standards and methodologies to be followed for both the HW and SW development

Development Factors

Project Control – includes or makes reference to project plans and timescales, along with details of quality requirements, standards, test and integration and configuration management

Resource Requirements – includes the basic project team provided by the supplier, the access required to the customer's premises, and input and timing required by the customer into the project

Test procedures – including details of all test documentation and responsibilities for testing both offline and online

Module and Integration Testing

Factory Acceptance Testing (FAT) – performed at the supplier's premises

Site Acceptance Testing (SAT) – performed on completion of commissioning to demonstrate pre-handover system operation

Note:

As the final contents of the FDS are tailored according to the type and size of the system under validation, and this document is generic, it covers test procedures that may not be necessary in small or simple systems. The following sections cover the FDS issues that require further detail.
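As a small illustration of the traceability idea behind the Relationship to Other Documents section, the referenced documents and the versions the FDS was written against can be kept as a simple registry (all document numbers, titles and versions below are invented):

```python
# Hypothetical registry: document number -> (title, version listed in the FDS)
referenced_docs = {
    "URS-001": ("User Requirements Specification", "2.0"),
    "DRW-104": ("P&ID drawing", "C"),
}

def fds_review_needed(doc_number: str, current_version: str) -> bool:
    """True when a referenced document has moved past the version the FDS
    was written against, i.e. the FDS may need review for impact."""
    listed = referenced_docs.get(doc_number)
    return listed is not None and listed[1] != current_version

# A URS update to a newer version flags the FDS for impact review
urs_updated = fds_review_needed("URS-001", "2.1")
```

The same check run against an unchanged document (or one not referenced at all) raises no flag, which mirrors how document updates are assessed for FDS impact throughout the project life cycle.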

More about the system functions and facilities in our next article, FDS System Functions & Facilities.


FRS Contents

This article was written by Ilan Shaya, an expert in validation, automation and control.

The FRS presents functional requirements for installing and operating a monitoring and control system, in response to and compliance with the user's requirements

For example, the FRS may propose to fulfill the URS requirements using a system that includes a PC with control capabilities using HMI screens, PLC, and varied environmental conditions sensors and control devices. The FRS may also propose a color-code display for ongoing environmental conditions, including indications of alarm conditions. An SMS or e-mail notification may be sent to specified personnel in case of specified alarm conditions.
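A minimal sketch of such an alarm notification rule, assuming a hypothetical recipient table and a stand-in `send` gateway (none of these names or contacts come from a real FRS; a real system would call an actual e-mail/SMS service):

```python
# Hypothetical routing table: alarm priority -> configured contacts
ALARM_RECIPIENTS = {
    "high": ["qa-oncall@example.com", "+0000000000"],  # placeholder contacts
    "low": ["shift-log@example.com"],
}

def notify(alarm_priority: str, message: str, send=print) -> None:
    """Deliver `message` to every contact configured for the priority.
    `send` is a stand-in for a real e-mail/SMS gateway call."""
    for contact in ALARM_RECIPIENTS.get(alarm_priority, []):
        send(f"to {contact}: {message}")

# A high-priority alarm condition reaches both configured recipients
notify("high", "Room 3 humidity above limit")
```

Keeping the routing in a configuration table, rather than hard-coded, matches the FRS idea of notifying specified personnel for specified alarm conditions.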

The FRS requirements are organized with the same order and numbering of sections as the URS, for clear correspondence. Like the user's requirements, these requirements are divided into four categories:

Installation Requirements

Operation Requirements

Regulation Requirements

HMI Requirements

Installation Requirements

These requirements cover all the issues regarding system installation, to ensure its proper functionality and reliability. Examples of this type of requirement are:

List and characteristics of specified hardware (HW) components capable of meeting the system functional requirements

Labeling and identification method for each HW component

List and characteristics of specified software (SW) programs installed on the system PC and the PLC, capable of performing the required operations

Definition of equipment to meet the storage capacity requirements

Definition of equipment and method for achieving the required connections to various types of sensors, communication units, temperature, humidity and pressure transmitters, illumination devices, etc

Definition of equipment and method for achieving the communication compatibility with equipment already installed at the user's facility without extra sensors

Operation Requirements

These requirements cover all the operations that the system must be capable of performing. Examples of this type of requirement are:

Environmental conditions (such as pressure, temperature and humidity) to be monitored and controlled

Type of systems to be monitored and controlled, such as Heating, Ventilation and Air Conditioning (HVAC) system, types of sensors, etc

Definition of computerized system capabilities and starting conditions

Definition of system capabilities to recover from failures

List of internal tests to be performed regularly, and alarm indications to be issued in case of failure

Definition of current and historical alarms to be provided regarding all parameters in any case of deviation from the limits specified in the system

Definition of system real-time screens display capabilities

Provision of the following data and HMI displays

Synoptic screens for displaying online values and status

Data logging and storage of historical trends, events and alarms

Tabular screens for displaying events and alarms

Graphical screens for displaying trends

Display of the following information for each alarm

Status – new/acknowledged alarm

Time at which the alarm was activated

Parameter/Tag/Name of the module that activated the alarm

Alarm Description

Alarm Priority

Display of alarms to warn the user, collect alarm history, and enable the user to view current and historical alarms. The system alarms shall include

Component malfunction/failure

Irregularity in parameter reading – such as disconnection of communication lines

Parameter values exceeding the high/low parameter limits

Deviations of system operation from predefined parameters/operations

Method for providing the capability to configure the graph parameters according to

Date and time

Measured parameters

Predefined number of displayed parameters

Definition of trend graphs with maximum and minimum allowed limits of the monitored parameters

Definition of logging interval defined by the user and configured by the supplier

Method for providing the capability for authorized user personnel to define low and high limits and a delay time for each alarm parameter
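The limit-and-delay behaviour described above can be sketched as follows (class and field names are hypothetical; a real system would take its limits and delay from the supplier-configured parameters). An alarm fires only when a reading stays outside its configured low/high limits for longer than the user-defined delay:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AlarmLimit:
    low: float
    high: float
    delay_s: float                           # user-defined delay before the alarm is raised
    _breach_started: Optional[float] = None  # time the current deviation began

    def update(self, value: float, now_s: float) -> bool:
        """Return True when the deviation has persisted past the delay."""
        if self.low <= value <= self.high:
            self._breach_started = None      # back in range: reset the timer
            return False
        if self._breach_started is None:
            self._breach_started = now_s     # first out-of-range sample
        return now_s - self._breach_started >= self.delay_s

# Example: temperature limits 18–25 °C with a 30 s alarm delay
limit = AlarmLimit(low=18.0, high=25.0, delay_s=30.0)
limit.update(26.1, now_s=0.0)    # out of range, delay not yet elapsed
limit.update(26.3, now_s=31.0)   # still out of range past the delay: alarm fires
```

The delay suppresses transient spikes, so only deviations that persist are logged as current alarms and added to the alarm history.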

You can find out about FRS regulatory and HMI requirements in our next article.


Validation – URS and FRS Preparation Overview

This article was written by Ilan Shaya, an expert in validation, automation and control.

The User Requirements Specification (URS) and the Functional Requirements Specification (FRS) are the starting points of a validation process and of the validation documentation file.

The validation process must comply with regulations issued by the United States Food and Drug Administration (FDA).

The FDA regulations that are most relevant to the validation process are:

Good Manufacturing Practice – GMP

Current Good Manufacturing Practice – cGMP

Good Automated Manufacturing Practice – GAMP

The validation process includes design, installation and operation of a monitoring and control system for a production facility, as well as planning and execution of test procedures, to verify that a monitoring and control system meets the FDA standards

Validation documentation is part of the validation process that includes written and/or electronic records regarding the installation and operation of the monitoring and control system, and the corresponding test procedures for this system

Electronic records are often required to fulfill regulations set by the FDA. These regulations regard the scope and application of Part 11 of Title 21 of the Code of Federal Regulations; Electronic Records; Electronic Signatures (21 CFR Part 11). Electronic Records may contain any combination of text, graphics, audio, pictures, or other information represented in electronic form, which are created, modified, maintained, archived, retrieved or distributed by a computer system

Electronic Signatures are computer data compilations of any symbol or series of symbols executed, adopted or authorized by an individual to be the legally binding equivalent of the individual's handwritten signature

Electronic records and signatures are generally used in Closed Systems, in which the system access is controlled by personnel responsible for the contents of the system electronic records

The responsibility for writing and approving the URS and FRS is shared in practice by the user, who operates the production facility, and the supplier or vendor, who provides the monitoring and control system for ensuring the proper operation of the production facility. Usually, the URS is written by the user and the FRS by the supplier

:Note

The final contents of the URS and FRS are tailored according to the type and size of the system under validation. Since the URS and FRS regarded herein are generic, they include requirements that may not be necessary in small or simple systems
