
GAMP – Test Incidents – Analysis, Logging & Classification

Incident Analysis and Logging

Ilan Shaya, CEO, control and automation specialist and designer

This article was written by Ilan Shaya, a specialist in validation, automation, and control.

When a test incident occurs during a particular step, the overall test should not be continued if the failure produces an output that prevents entry into a subsequent step. When a test continues following a failure, the failed step should be clearly identified on the test results sheet.
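
As a rough illustration only, here is a minimal sketch in Python of the halt-or-continue rule just described; the Step and ResultsSheet types are hypothetical and not part of any GAMP artifact. A failed step is always flagged on the results sheet, and the run stops only when the failure blocks entry into the next step.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Step:
    name: str
    action: Callable[[], bool]       # hypothetical: returns True when the step passes
    blocks_next_step: bool = False   # True if failure output prevents entry into the next step

@dataclass
class ResultsSheet:
    entries: List[str] = field(default_factory=list)

    def record(self, name: str, passed: bool) -> None:
        # Each step outcome, pass or fail, is identified on the results sheet.
        self.entries.append(f"{name}: {'PASS' if passed else 'FAIL'}")

def run_test(steps: List[Step], sheet: ResultsSheet) -> None:
    for step in steps:
        passed = step.action()
        sheet.record(step.name, passed)
        if not passed and step.blocks_next_step:
            sheet.entries.append(f"Test halted: failure in '{step.name}' blocks the next step")
            break  # do not continue the overall test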

It is important to fully record the details of all new test incidents and to maintain an index of these incidents.

Test incidents may be fed either into an existing change control system or into a separate process for resolving test incidents. An example of an incident report (summarizing details of the incident, the proposed solution and retest requirements, review, implementation, and closure) is given in GAMP 4, Appendix D6.
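
A minimal sketch of such an incident record and index, again in Python; the field names paraphrase the report contents summarized above and are hypothetical, not the actual GAMP 4 Appendix D6 format.

from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class TestIncident:
    incident_id: str                  # key under which the incident is indexed
    details: str                      # full description of the incident
    proposed_solution: str = ""
    retest_requirements: str = ""
    reviewed_by: Optional[str] = None
    implemented: bool = False
    closed: bool = False              # closed after review, implementation, and retest

# The maintained index of incidents, keyed by incident ID.
incident_index: Dict[str, TestIncident] = {}

def log_incident(incident: TestIncident) -> None:
    incident_index[incident.incident_id] = incident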

                    Test Incident Classification

In addition to correcting an identified fault, it is important to evaluate test incidents in order to determine their most likely cause. Addressing this issue is an important part of any Corrective and Preventive Action (CAPA) process. Metrics on the causes of avoidable test incidents provide a useful indicator of the areas within the overall SW development life cycle that may benefit from improvement activities, reducing the likelihood of recurrence.
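
As one possible way to gather such metrics, the sketch below tallies incidents by cause; the cause labels are hypothetical and simply mirror the incident types described in the remainder of this article.

from collections import Counter
from typing import Dict, List

# Hypothetical cause labels mirroring the incident types described below.
AVOIDABLE_CAUSES = [
    "incorrect_installation", "incorrect_coding", "incorrect_test_data",
    "inadequate_specification", "poor_test_case", "incorrect_design",
    "inconsistent_specification", "test_execution_error",
]

def cause_metrics(incident_causes: List[str]) -> Dict[str, int]:
    """Count incidents per cause; large counts flag life-cycle areas to improve."""
    tally = Counter(incident_causes)
    return {cause: tally.get(cause, 0) for cause in AVOIDABLE_CAUSES}

# Example: cause_metrics(["incorrect_coding", "incorrect_coding", "poor_test_case"])
# -> incorrect_coding: 2, poor_test_case: 1, all other causes: 0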

Typical test incident types that occur in SW testing include, but are not limited to, those described below.

              Incorrect SW Installation

Errors such as program dumps, abnormal terminations, or inability to access applications often result from a failure in the installation or configuration process, or from the installation of a wrong SW version.

When any of these errors is determined to be the cause of the incident, it is usually necessary to postpone any further testing until the test environment is correctly set up.

              Incorrect Programming/Coding

Incidents of this type manifest as actual system outputs failing to agree with the required system outputs. These incidents should be noted, and, unless the defects are considered sufficiently important to invalidate the rest of the test steps, execution of the test case can continue.

Once the cause of a defect is identified, the defect should be corrected and the corrected SW included in a subsequent SW build for retesting.

               Incorrect Test Data

Testing failures may occur as a result of a failure to create correct data in the test database in advance of the test case being run.

               Inadequate Specification – Incorrect Understanding of Program Functionality

Testing failures may occur as a direct result of the controlling Design Specification not stating clearly enough what is required from a particular piece of functionality. This may be particularly evident when a custom system is developed to satisfy a new business process that may not yet be fully established.

               Poorly Specified Test Case/Script

Tests can fail if the Test Case or Test Script (or other relevant document) is incorrect and indicates an outcome different from that documented in the corresponding requirements.

When a Test Case or Test Script has been modified during execution, a test incident should be raised to manage the changes to the Test Case or Test Script, and to confirm the pass/fail status of the test.

The incidence of this type of error should be minimized by ensuring independent review of the test case before approval, including a cross-check of the expected output as specified in the controlling requirement.

                Incorrect Design Solution

Test errors can arise where the SW works correctly against the design, but the design as implemented does not satisfy the originally stated requirements, or fails to reflect subsequently agreed change requests.

               Inconsistent Controlling Design Specification

Test incidents can occur where the relevant controlling Design Specification contains inconsistencies. It is therefore essential that this specification is corrected to prevent further confusion.

This incident type should not be confused with Incorrect Programming/Coding, where the SW as coded does not match a particular requirement of the controlling Design Specification and the code needs to be corrected.

Controlling Design Specification inconsistencies should be logged in the incident management system so that the specification can be corrected through the appropriate document management process.

           Unexpected Test Events

During execution of a Test Case or Test Script, the tester may notice an anomaly in the SW that, although not affecting the success of the overall test objective, nevertheless requires further investigation. Such an event should be recorded in the incident management system so that the controlling Design Specification and the corresponding Test Case or Test Script can be updated to reflect the presence of the anomaly.

             Test Execution Errors

Tests can be classified as failures if the tester fails to follow the steps outlined in the Test Script, or in the overall Test Protocol or Test Specification governing that activity.

Missing signatures, dates, and other important cross-reference information are another area that can cause a test to be considered a failure.

               Force Majeure

Test incidents of this nature reflect an unexpected event over which the test or project team has no control, and which brings testing to a premature halt. These events can be raised as issues by the project team, but are generally resolved outside the project.

Validation – GAMP – Definition of Terms

Good Automated Manufacturing Practice (GAMP) Definition of Terms

Definition of Terms Used in Testing Environments

This document provides definitions for a set of testing terms used within the pharmaceutical and other life sciences industries (consistent with those used in GAMP 4), the Information Technology (IT) industry, and the control and automation industries, in order to facilitate a common understanding of testing environments.

It is recommended that a set of consistent testing term definitions be prepared on an organizational or project basis where members of User and Supplier organizations work together. It is helpful to agree on these definitions prior to contract signing, to ensure that contractual issues are based on a common understanding of activities and milestones.

The definitions listed below are based on three sources:

GAMP 4 – definitions from the GAMP Guide for the Validation of Automated Systems

IEEE – definitions from IEEE 100, The Authoritative Dictionary of IEEE Standards Terms

BCS – definitions from the Working Draft Glossary of Terms Used in Software Testing, version 6.2, produced by the British Computer Society Specialist Interest Group in Software Testing (BCS SIGIST)

Terms and Definitions

Acceptance Criteria – Criteria that a system or component must satisfy in order to be accepted by the User, customer, or other authorized entity. (GAMP 4, IEEE)

Acceptance Test – Formal testing conducted to determine whether or not a system satisfies its acceptance criteria, and to enable the customer to determine whether or not to accept the system. See also Factory Acceptance Test (FAT) and Site Acceptance Test (SAT). (GAMP 4, IEEE)

Black Box Testing – See Functional Testing. (IEEE)

Boundary Condition Testing – Testing for correct operation when one or more variables are at a limiting value or a value at the edge of the domain of interest. (IEEE)

Calibration – Set of operations that establish, under specified conditions, the relationship between values indicated by a measuring instrument or system, or values represented by a material measure or a reference material, and the corresponding values of a quantity realized by a reference standard. (GAMP 4, ISO 10012)

Challenge Testing – Testing to check system behavior under abnormal conditions. Can include stress testing and deliberate challenges, e.g., to the security access system, data formatting rules, possible combinations of operator actions, etc.

Commissioning – Process of providing to the appropriate components the information necessary for the designed communication between them. (IEEE)

Emulation – A model that accepts the same inputs and produces the same outputs as the given system. (IEEE)

Environmental Testing – Testing that evaluates system or component performance up to the specified limits of environmental parameters, e.g., temperature, humidity, or pressure.

Firmware (FW) – Combination of a hardware (HW) device, computer instructions, and data that reside as read-only software (SW) on that device. (IEEE)

Factory Acceptance Test (FAT) – Acceptance Test at the Supplier's factory, usually involving the customer. Contrast with Site Acceptance Test (SAT). (GAMP 4, IEEE)

Functional Testing – Testing that ignores the internal mechanism of a system or component, and focuses solely on the outputs generated in response to inputs and execution conditions. Also known as Black Box Testing. (GAMP 4, IEEE)

Hardware (HW) – (1) Physical equipment used to process, store, or transmit computer programs or data. (2) Physical equipment used in data processing, as opposed to programs, procedures, rules, and associated documentation. (IEEE)

HW Testing – Testing carried out to verify the correct operation of system HW, independent of any custom application SW. (IEEE)

Installation Qualification (IQ) – Documented verification that a system is installed according to written and pre-approved specifications. (GAMP 4, IEEE)

Integration – Process of combining SW components, HW components, or both, into an overall system. Sometimes described as SW Integration and System Integration, respectively. (IEEE)

Integration Testing – (1) Testing in which SW components, HW components, or both, are combined and tested to evaluate the integration between them. (2) Orderly progression of testing of incremental pieces of the SW program, in which SW elements, HW elements, or both, are combined and tested until the entire system has been integrated to show compliance with the program design and the system capabilities and requirements. (IEEE)

Instance – Single installation of a SW application (plus associated databases, tools, and utilities). Usually applied to configurable IT systems.

Load Testing – Stress testing conducted to evaluate a system or component up to the limits of its specified requirements.

Loop Testing – Testing in which control system inputs and outputs are exercised and their functionality verified.

Market Requirements Specification – Statement of generic industry requirements used by the Supplier as an input to its product development life cycle. (IEEE)

Middleware – Combination of HW, computer instructions, and data that provides infrastructure used by other system modules. (IEEE)

Module Testing – Testing of individual HW or SW components, or groups of related components. (IEEE)

Negative Testing – Testing aimed at showing that the SW does not work. (BCS)

Operational and Support Testing – (1) Testing conducted to evaluate a system or component in an operational environment. (2) All testing required to verify system operation in accordance with design after the major component is energized or operated. (IEEE)

Operational Qualification (OQ) – Documented verification that a system operates according to written and pre-approved specifications throughout all specified operating ranges. (GAMP 4, IEEE)

Performance Qualification (PQ) – Documented verification that a system is capable of performing or controlling the activities of the processes it is required to perform or control, according to written and pre-approved specifications, whilst operating in its specified environment. (GAMP 4, IEEE)

Positive Testing – Testing aimed at showing that the SW does meet the defined requirements. (BCS)

Qualification – Process to demonstrate the ability to fulfill specified requirements. (GAMP 4, ISO)

Simulation – A model that behaves or operates like a given system when provided a set of given inputs. (IEEE)

Site Acceptance Test (SAT) – Acceptance Test at the customer's site, usually involving the Supplier. Contrast with Factory Acceptance Test (FAT). (GAMP 4, IEEE)

Software (SW) – Computer programs, procedures, and associated documentation and data pertaining to the operation of a computer system. (IEEE)

Stress Testing – Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements. (IEEE)

Structural Testing – Examining the internal structure of the source code. Includes low-level and high-level code review, path analysis, auditing of programming procedures and standards actually used, inspection for extraneous "dead code", boundary analysis, and other techniques. Requires specific computer science and programming expertise. Also known as White Box Testing. (GAMP 4)

Supplier – Organization or person that provides a product. (GAMP 4, ISO)

System Testing – Testing conducted on a complete, integrated system to evaluate its compliance with its specified requirements. (GAMP 4)

Test – (1) Procedure in which an activity is executed under specified conditions, the results are observed and recorded, and an evaluation is made of some aspect of the system or component. (2) Determination of one or more characteristics according to a procedure. (GAMP 4, IEEE, ISO)

Test Case – Set of test inputs, execution conditions, and expected results developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement. (GAMP 4, IEEE)

Test Plan – Document describing the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, the personnel in charge of each task, and risks requiring contingency planning. (GAMP 4)

Test Procedure – Detailed instructions for the setup, execution, and evaluation of results for a given test case. (GAMP 4, IEEE)

Test Protocol – See Test Specification. (IEEE)

Test Script – Documentation that specifies a sequence of actions for executing a given test. (GAMP 4, IEEE)

Test Specification – Document describing the scope, management, use of procedures, sequencing, test environment, and prerequisites for a specific phase of testing.

Test Strategy – See Test Plan.

Unit Testing – Testing of individual HW or SW units, or groups of related units. (IEEE)

Usability Testing – Testing the ease with which Users can learn and use a product. (BCS)

User – Person or persons who operate or interact directly with the system. (GAMP 4)

Validation – Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes. (GAMP 4, FDA)

Verification – Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled. (GAMP 4, ISO)

White Box Testing – See Structural Testing. (IEEE)
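
To make a few of these definitions concrete, here is a small Python illustration using a hypothetical clamp function as the unit under test; it contrasts positive, negative, and boundary condition testing as defined above.

def clamp(value: float, low: float, high: float) -> float:
    """Hypothetical unit under test: restrict value to the range [low, high]."""
    return max(low, min(high, value))

# Positive testing: show the SW does meet the defined requirement.
assert clamp(5, 0, 10) == 5

# Boundary condition testing: variables at the limiting values of the domain.
assert clamp(0, 0, 10) == 0
assert clamp(10, 0, 10) == 10

# Negative testing: attempt to show the SW does not work (out-of-range inputs).
assert clamp(-1, 0, 10) == 0
assert clamp(11, 0, 10) == 10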

Validation – Validation Case Study – Part 2

This article was written by Ilan Shaya, a validation, automation, and control expert.

Validation Requirements

Documentation for Initial Tender

Project schedule and milestones, design and construction details

Project quality plan

Supplier’s local subsidiary/agent

Supplier’s documentation that the system / configurable software versions are released to the market and are FDA/EMEA compliant

Compliance with the 21 CFR Part 11 operational requirements – the contractor should ensure coverage of all the requirements described below

Contractor to state the system/product status and implementation planned for each requirement – User's approval pre-delivery

Documentation for Design Review

P&ID for the system

Electrical and pneumatic schematics

Installation data:

General arrangement drawings

Floor loading

Utility requirements

Details of electronic records and approvals that may be subject to regulatory controls under CFR Title 21 part 11

Instrumentation documentation

Main components specification:

Equipment

Instrumentation

Valves

Piping

Control system

Pipe welding documentation

General installation book

Procedures

User guide

Security

Preventative maintenance + spare parts list

Operation procedure

Pressure test procedure

Leak test procedure

Passivation procedure

Calibration

Functional Design Specifications for:

Complete manufacturing system

Software – SW and hardware – HW

System Detail Design Specifications for HW and SW.

SW source code, with comments, for customized SW

SW complete version history

HMI alarm list, message list and graphical displays

I/O list

List of materials:

Product contact materials

Potential product contact lubricants

Welding procedures for product direct and indirect contact parts

Pre-delivery Inspection and Factory Acceptance Test (FAT) protocols

Steel certificates and gasket certificates

Documentation Prior to FAT or Pre-Delivery Inspection – PDI

Progress visit report, which will include:

Mechanical and technical development

Automation and SW development

Supplier’s factory test results for:

Unit tests – the test protocols shall be traced to the low-level design documents and approved by the user prior to execution. The resulting reports shall be approved by representatives of the user's validation team and IT QA (a traceability check of this kind is sketched after this list).

Automation and SW development

Integration Tests – Simulation Testing

Approved PDI and FAT protocols
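
As referenced above, a unit test traceability requirement of this kind can be checked mechanically before protocols are submitted for user approval; the protocol and design-document identifiers below are hypothetical.

# Hypothetical mapping from unit test protocol IDs to the low-level design
# (LLD) sections they verify; None marks a protocol with no trace.
protocol_trace = {
    "UT-001": "LLD-4.1",
    "UT-002": "LLD-4.2",
    "UT-003": None,
}

untraced = [pid for pid, lld in protocol_trace.items() if lld is None]
if untraced:
    # Traceability gaps must be resolved before the user approves execution.
    print("Protocols lacking design traceability:", ", ".join(untraced))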

Commissioning

Mechanical Completion Check Report (MCCR)

Purpose

Scope

Responsibility

Execution instructions

System scope

System description

Drawing verification

Equipment verification

Valve verification

Instrument installation and calibration

Utility verification

Documentation verification

Piping Verification

Sample Point Verification

Safety, health and environment verification

Slopes verification (if relevant)

Electrical and communication activities

Pump installation verification, if relevant

Heat exchanger installation verification, if relevant

Air break verification, if relevant

Dead leg verification, if relevant

Commissioning Execution (CE)

Purpose

Scope

Responsibility

Execution instructions

System scope

System description

System startup

Equipment verification

Main equipment operation checks

System FDS (SSO) verification

System performance testing

This article was written by Ilan Shaya, a validation, automation, and control expert.

Validation – Validation Requirements – Case Study – Part 1

This article was written by Ilan Shaya, a validation, automation, and control expert.

Validation Requirements is a document, which may be part of the validation documentation, that describes the validation strategy for a system or subsystem. This document is generic; the system or subsystem may include a PC with a Human/Machine Interface (HMI), a Programmable Logic Controller (PLC), virtual hardware (HW), software (SW), and other components designed to maintain the user's facility in the proper conditions specified by the user.

Validation Requirements Document Contents

This document is structured in a relatively standard fashion, with predetermined chapters and sections, where the final contents are tailored according to the type and size of the system under validation.

The main chapters and sections of a Validation Requirements document are:

Responsibility

Validation Requirements

Documentation for Initial Tender

Documentation for Design Review

Documentation Prior to Factory Acceptance Test (FAT) or Pre-Delivery Inspection – PDI

Commissioning

Documentation For Installation and Operational Qualification (IQ and OQ) – to be checked at PDI/FAT

Design Qualification (DQ) Protocols (Including PC/PLC Architecture)

Installation Qualification (IQ) Protocols (Including PC/PLC)

Operational Qualification (OQ) Protocols (Including PC/PLC)

Performance Qualification (PQ) Protocols

Computerized System Validation

Responsibility

This section lists the responsibilities of the contractor and the user, and the required contents of the documents composing the validation file.

The contractor is responsible for creating and performing the DQ, Design Review, Commissioning, IQ, and OQ validation protocols.

The user is responsible for creating and performing the PQ validation protocols.

Design Qualification (DQ) – The design of the system will be documented and checked in the Design Specification. This specification will include details of the system and must be traceable to the URS and BOD documents.

The Mechanical Completion Check Report (MCCR) of the system will be documented and checked only by the contractor. This document will verify system readiness for the IQ.

The Commissioning Execution (CE) of the system will be documented and checked only by the contractor. This document will verify system readiness for the OQ.

Installation Qualification – IQ

IQ will establish documented evidence that the system is installed according to the manufacturers' specifications and user requirements, and will assure that the environment is appropriate for its intended purpose.

Each IQ protocol will include a deviation report appendix, which describes the deviations (if any exist) of the specific system; the contractor will be responsible for correcting them.
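
A minimal sketch of one entry in such a deviation report appendix, in Python; the fields are hypothetical, since the actual appendix format is protocol-specific.

from dataclasses import dataclass

@dataclass
class Deviation:
    deviation_id: str
    protocol: str             # e.g., the IQ protocol in which the deviation was found
    description: str          # how the installed system deviates from specification
    corrective_action: str = ""
    corrected: bool = False   # the contractor is responsible for correction

# Example: Deviation("DEV-01", "IQ-001", "Installed pump model differs from specification")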

Operational Qualification – OQ

OQ will establish documented evidence that the system operates according to the manufacturers' specifications and user requirements, and will assure that the environment is appropriate for its intended purpose.

Each OQ protocol will include a deviation report appendix, which describes the deviations (if any exist) of the specific system; the contractor will be responsible for correcting them.

Performance Qualification – PQ

PQ will establish documented evidence that the system performs according to the manufacturers' specifications and user requirements, and will assure that the environment is appropriate for its intended purpose.

The PQ protocols are the user's responsibility.

This article was written by Ilan Shaya, a validation, automation, and control expert.