Smartlogic

Validation – GAMP – Test Example – part 3

Good Automated Manufacturing Practice (GAMP) – Test Example

Testing Process Automation Systems

This article covers the third part of our Good Automated Manufacturing Practice (GAMP) test example. It covers the typical test phases: Factory Acceptance Test (FAT), Site Acceptance Test (SAT) / Installation Qualification (IQ), Operation Qualification (OQ), and Performance Qualification (PQ).

Typical Test Phases – GAMP – Test Example

This section describes typical test phases for a complex process automation system. The example assumes that the system is configured by a Supplier and delivered to site after a FAT. The system can also be configured by a system integrator or by the User; in that case, the same coverage is required, but the test phasing and location may differ.

The User and Supplier should work together to develop an overall approach to testing that reflects the risk assessment output and ensures adequate test coverage of the functionality, whilst avoiding unnecessary repeated tests.

Factory Acceptance Test – FAT

Done at the Supplier's premises after Supplier's integration testing, and before the system is released for delivery to the User's site

The required coverage should reflect the relative risk priority associated with the system element under test. This coverage can be increased if problems are found within the initial sample.

In determining the required coverage, the User needs to base decisions on the risk assessment output taking into account both the potential effect on product quality and safety resulting from the process, and the intrinsic risk likelihood associated with the method of implementation.

Before performing the risk assessment to decide on the required coverage, the User should review the Supplier's internal tests to confirm that they are adequately documented.
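As an illustration (not part of GAMP itself), the combination of potential impact and intrinsic likelihood into a risk priority can be sketched as a simple matrix. The level names, thresholds, and suggested coverage strings below are assumptions for this sketch, not GAMP definitions:

```python
# Illustrative risk-priority matrix: combines the potential effect on
# product quality/safety with the likelihood that the implementation
# method introduces a fault. Level names and thresholds are assumptions.
IMPACT = {"low": 1, "medium": 2, "high": 3}
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}

def risk_priority(impact: str, likelihood: str) -> str:
    """Map an (impact, likelihood) pair to a risk priority class."""
    score = IMPACT[impact] * LIKELIHOOD[likelihood]
    if score >= 6:
        return "high"      # e.g. custom code controlling a critical parameter
    if score >= 3:
        return "medium"
    return "low"           # e.g. standard instrument with low process impact

def required_coverage(priority: str) -> str:
    """Suggest a test-coverage approach per risk priority (illustrative)."""
    return {
        "high": "full challenge testing of the element",
        "medium": "sample-based testing, extended if problems are found",
        "low": "review of Supplier's internal test records",
    }[priority]
```

For example, a high-impact element implemented with a medium-likelihood method would receive `risk_priority("high", "medium") == "high"` and thus the fullest coverage.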

Site Acceptance Test – SAT / Installation Qualification – IQ

Done at the User's premises after installation of the system on site

The required coverage should include:

Checking that the full system, including HW, SW backups, and documentation, has been delivered to site in a condition suitable for its intended purpose

Checking that the site environment suits the specification of the installed equipment: temperature, humidity, pressure, dust, vibration, interference, etc.

Checking that the system has been correctly installed.

Demonstrating that the system still operates as it did when accepted at the FAT, typically by re-recording system components and versions, and by repeating a small sample of FAT tests on site

Testing any elements which could not be adequately tested in the FAT environment, typically interfaces to other equipment, networks, etc.

Re-testing, following remedial action, of any element subject to conditional release at the end of the FAT
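The re-recording of system components and versions during the SAT can be sketched as a comparison against the baseline recorded at the FAT. The component names and version strings below are hypothetical:

```python
# Illustrative SAT check: compare component versions recorded on site
# against the baseline recorded at the FAT. An empty discrepancy list
# supports the claim that the system is in the state accepted at the FAT.
# All component names and versions are hypothetical.
fat_baseline = {
    "controller_firmware": "4.2.1",
    "scada_server": "10.5",
    "application_config": "rev C",
}

def verify_against_baseline(site_survey: dict, baseline: dict) -> list:
    """Return a list of discrepancies between the on-site survey and the
    FAT baseline; an empty list means no discrepancies were found."""
    issues = []
    for component, expected in baseline.items():
        found = site_survey.get(component)
        if found is None:
            issues.append(f"{component}: not found on site")
        elif found != expected:
            issues.append(f"{component}: expected {expected}, found {found}")
    return issues
```

Any non-empty result would trigger investigation before the SAT can be closed.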

Operation Qualification – OQ and Performance Qualification – PQ

Done at the User's premises after SAT/IQ

If a system has been fully tested at the FAT, then after successful completion of the IQ (along with any additional field functional tests and calibrations) it is treated as an integrated part of the process equipment qualification. This should ensure that the full system, procedures, and trained personnel are ready for production.

End of GAMP – Test Example – part 3.


Validation – GAMP – Test Example – part 2

Good Automated Manufacturing Practice (GAMP) – Test Example

Testing Process Automation Systems

This article covers the second part of our Good Automated Manufacturing Practice (GAMP) test example. It covers the typical test phases for the Supplier's application SW Module Testing, the Supplier's Module Integration Testing, and the Supplier's Integration Testing.

Typical Test Phases

This section describes typical test phases for a complex process automation system. The example assumes that the system is configured by a Supplier and delivered to site after a FAT. The system can also be configured by a system integrator or by the User; in that case, the same coverage is required, but the test phasing and location may differ.

The User and Supplier should work together to develop an overall approach to testing that reflects the risk assessment output and ensures adequate test coverage of the functionality, whilst avoiding unnecessary repeated tests.

Typical Test Phases for Process Automation Systems

Supplier's Application SW Module Testing

Done at the Supplier's premises, after the module has been placed under configuration control and its code has been reviewed, and before system integration. Module testing generally covers:

Module data handling

Interfaces to other modules

Operator interfaces

Module functionality

Failure paths and responses to fault conditions should be included within the tests

Supplier's Module Integration Testing

Done at the Supplier's premises, after individual modules have been tested and integrated into a single unit, and before full system integration.

Module integration testing generally covers:

Correct operation of interfaces between modules

Failure paths and responses to fault conditions should be included within the tests

Supplier's Integration Testing

Done at the Supplier's premises, after module testing and before the User is invited to witness the FAT.

Integration testing generally covers:

HW

I/O interfaces

Operator interfaces

Interfaces to other equipment

System functionality

Data handling functions

Failure paths and response to fault conditions should be included within the tests

HW tests typically include:

·      Checking system build against approved HW specifications and drawings

·      Recording system components, version numbers (including SW versions) and capacities

·      Checking electrical supplies and grounding

Supplier's Integration Testing (cont'd)

Checking correct power up of system components

Checking any self test/diagnostic information

Checking correct communication on standard interfaces

I/O interface tests typically include:

Exercising inputs and outputs to check correct configuration of ranges, alarms, etc.
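Exercising an input against its configured range and alarm limits can be sketched as follows. The tag name, range, and alarm limit are assumptions for the example:

```python
# Illustrative I/O test: inject simulated values into an analog input and
# check the configured range clamping and high-alarm activation.
# Tag name and limit values are assumptions for this sketch.
class AnalogInput:
    def __init__(self, tag, lo, hi, alarm_hi):
        self.tag, self.lo, self.hi, self.alarm_hi = tag, lo, hi, alarm_hi

    def read(self, raw):
        """Clamp a raw value to the configured range and flag a high alarm."""
        value = max(self.lo, min(self.hi, raw))
        return value, value >= self.alarm_hi

def exercise_input(channel, test_values):
    """Return the (value, alarm) pair produced for each injected value."""
    return [channel.read(v) for v in test_values]

# Hypothetical temperature input: range 0..150, high alarm at 120
ti101 = AnalogInput("TI-101", lo=0.0, hi=150.0, alarm_hi=120.0)
```

Injecting values below, inside, and above the range (e.g. `exercise_input(ti101, [-10, 50, 130, 200])`) confirms both the range configuration and the alarm setpoint in one pass.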

Operator interface tests typically include:

System displays and navigation

Security and access controls

Tests for interfaces to other equipment typically include:

Checks of communications protocol setup

Checks that the required data can be transferred

Checks of actions in case of communications failure
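A failure-path check for a communications interface can be sketched as a timeout with a defined fallback action. The function names, the hold-last-good-value behavior, and the timeout value are all assumptions for this sketch:

```python
# Illustrative failure-path check: if the remote system does not respond
# within the timeout, the interface should take a defined safe action
# (here assumed to be: hold the last good value and raise a comms alarm).
# All names and behaviors are hypothetical.
def poll_remote(read_fn, last_good, timeout_s=1.0):
    """Attempt a read from the remote system; on timeout, hold the
    last good value and report a communications alarm."""
    try:
        value = read_fn(timeout_s)
        return value, False          # fresh value, no alarm
    except TimeoutError:
        return last_good, True       # hold last good value, comms alarm

def healthy_link(timeout_s):
    return 42.0                      # simulated normal response

def dead_link(timeout_s):
    raise TimeoutError("no response from remote system")
```

The test would exercise both paths: a normal poll returns fresh data with no alarm, and a simulated link failure returns the held value with the alarm raised.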

Tests for system functionality typically include:

Monitoring functions

Alarm strategies

Control functions (control modules, equipment modules, procedural control)

Power failure and recovery

Component failure and redundancy

Performance checks

Tests for system data handling typically include:

Operator data entry

Data formatting and quality checks

Checks of calculated values

Checks of recipes

Checks of access to current process data, alarms and events (displays, alarm summaries, etc.)

Checks of access to historical process data, alarms and events (trends, reports, alarm histories, etc.)

Checks of audit trail functionality

Checks of data capacity and retention times

Checks of archive and restore

Checks of provisions for electronic signatures

Checks of disaster recovery procedures
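The audit-trail check from the list above can be sketched as verifying that every change to a controlled value records who made it, when, and the old and new values. The field names and example data are assumptions for this sketch:

```python
# Illustrative audit-trail check: every change to a controlled value must
# record the user, timestamp, item changed, and the old and new values.
# Field names and example data are assumptions for this sketch.
import datetime

audit_trail = []

def set_value(store, key, new_value, user):
    """Change a controlled value and append an audit-trail entry."""
    entry = {
        "user": user,
        "timestamp": datetime.datetime.now(datetime.timezone.utc),
        "item": key,
        "old": store.get(key),
        "new": new_value,
    }
    store[key] = new_value
    audit_trail.append(entry)

def audit_complete(trail):
    """Check that every entry carries the required fields."""
    required = {"user", "timestamp", "item", "old", "new"}
    return all(required <= set(entry) for entry in trail)
```

A test run would make a few changes under different users and then confirm that each entry in the trail is complete and that the old/new values chain correctly.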

End of Validation – GAMP – Test Example – part 2


Validation – GAMP – Test Example – part 1

Good Automated Manufacturing Practice (GAMP) – Test Example

Testing Process Automation Systems

This article covers the first part of our Good Automated Manufacturing Practice (GAMP) test example.

Definitions

This section provides brief descriptions of three different types of process control systems.

Configurable Equipment

Configurable Equipment is the collective name given to simple configurable instruments/devices, such as 3-term controllers, check scales, bar code readers, etc. Their correct operation depends on their configuration setup meeting the process requirements. The software (SW) components of these systems are typically defined as GAMP SW Category 2.

Embedded Systems

Embedded Systems is the collective name for systems with a greater degree of configuration and programmability. These include devices such as Integrated Circuits (ICs) with configuration setups and Programmable Logic Controllers (PLCs) supplied as an integral part of an item of process equipment, e.g., PLCs controlling a centrifuge or packaging machine, or ICs embedded in High Performance Liquid Chromatography (HPLC) systems. Embedded Systems typically contain SW components belonging to multiple GAMP categories.

Stand-Alone Systems

Stand-Alone Systems is the collective name for large programmable control systems having distributed functionality across a network, e.g., Distributed Control Systems (DCSs), and Supervisory Control and Data Acquisition (SCADA). They are engineered as an entity to control a complete plant. Stand-Alone Systems typically contain SW components belonging to multiple GAMP categories.

Testing and the GAMP Life Cycle

Stand-Alone Systems

A process automation system developed for a new application typically requires some or all of the following test phases:

Supplier's Module Testing

Supplier's Module Integration Testing

Supplier's Integration Testing

Factory Acceptance Test (FAT)

Site Acceptance Test (SAT)

Installation Qualification (IQ)

Operation Qualification (OQ)

Performance Qualification (PQ)

The exact combination of testing required for a particular system should reflect its complexity, the maturity of its underlying SW and hardware (HW) elements, and the risk impact on product quality, patient safety and data integrity. Collectively these will determine the risk priority. The phrase 'low risk' should be understood as 'having a low risk priority, as determined by a formal risk assessment'.

Testing of modifications, patches or upgrades should be related to the risk priority of the change. For example, it may be appropriate for parameter changes to be applied directly to the production environment, assuming that the system has been range-checked for such parameters.
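The range check mentioned above can be sketched as a gate applied before a parameter change is made directly in the production environment. The parameter names and validated limits are assumptions for this sketch:

```python
# Illustrative range check: a parameter change may be applied directly to
# production only if the new value lies within limits for which the system
# has already been validated. Names and limits are assumptions.
VALIDATED_RANGES = {
    "setpoint_temp_c": (20.0, 80.0),
    "batch_size_kg": (100.0, 500.0),
}

def change_allowed(parameter: str, new_value: float) -> bool:
    """Allow a direct production change only within the validated range;
    anything else goes through full change control and testing."""
    if parameter not in VALIDATED_RANGES:
        return False
    lo, hi = VALIDATED_RANGES[parameter]
    return lo <= new_value <= hi
```

A value outside the validated range, or an unknown parameter, would be rejected and routed through the normal change-control and testing process.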

End of Validation – GAMP – Test Example – part 1

Validation – GAMP – Hardware & Software Test Environments

Good Automated Manufacturing Practice (GAMP) Hardware (HW) Test Environment

HW can be categorized according to both its GAMP 4 HW category (standard or custom) and its function within the test environment. It can be one of three things:

 Part of the system under test, i.e., part of the production HW

 Test HW representing part of the production environment, which may be needed because it is not feasible to include a certain element of the production system in the test environment

A separate test system, which may be used to represent an external system

Representative Test Environment

As stated previously, the HW environment should represent the production environment as closely as possible.

For example, if the test environment uses a standard network hub of the same type as the production environment, then the substitution introduces a low probability of invalid test results in the production environment. If, however, the network cabling in the test environment uses short patch cables, whilst the real environment has cable runs close to the maximum recommended length, there is clearly a possibility of different network behavior, and additional tests on site may be needed to prove proper network performance.

Control of Test Environment

For standard HW (per GAMP 4 HW Category 1), the manufacturer's reference and serial numbers should be recorded.

For custom HW (per GAMP 4 HW Category 2), the version of the item and its controlling specification should also be recorded.

For all test HW, any applicable calibration status should be recorded in the context of the specific tests performed.
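The recording requirements above can be sketched as a small inventory record, with the fields required depending on the HW category. The example items and field names are assumptions for this sketch:

```python
# Illustrative test-environment record: standard HW (GAMP 4 HW Category 1)
# needs the manufacturer's reference and serial number; custom HW
# (Category 2) additionally needs its version and controlling
# specification. Example items and field names are assumptions.
def hw_record(category, reference, serial, version=None, spec=None):
    """Build an inventory entry and check the fields required per category."""
    entry = {"category": category, "reference": reference, "serial": serial,
             "version": version, "spec": spec}
    if category == "standard":            # GAMP 4 HW Category 1
        complete = reference and serial
    elif category == "custom":            # GAMP 4 HW Category 2
        complete = all([reference, serial, version, spec])
    else:
        raise ValueError(f"unknown HW category: {category}")
    entry["record_complete"] = bool(complete)
    return entry
```

An incomplete record (for example, custom HW without its controlling specification) would be flagged before testing proceeds.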

Removal from Production Environment

If test HW is added in such a way that it may appear in the production environment, this should be documented as a temporary modification to the production system. Removal of the temporary modification should be documented as well.

Good Automated Manufacturing Practice (GAMP) Software (SW) Test Environment

Test SW can also be categorized according to both GAMP 4 SW category and its function within the test environment:

Part of the system under test, i.e., part of the production SW

Test SW representing part of the production environment

A separate test system

Representative Test Environment

The SW environment should represent the production environment as closely as possible.

For example, if the test environment uses a process controller of the same type as the production environment, then the substitution introduces a low probability of invalid test results in the production environment. If, however, a particular interface is simulated in the test environment, a possibility remains that different timing factors or process dynamics could affect the operation in the production environment, and additional tests on the production interface may be appropriate.

Control of Test Environment

For standard SW (per GAMP 4 SW Category 1, 2 or 3), the manufacturer's reference and version numbers (including installed patches) should be recorded. Any configuration or setup parameters should be controlled.

For configured or custom SW (per GAMP 4 SW Category 4 or 5), the item should be placed under configuration management and the version in use recorded.

Removal from Production Environment

If test SW is added in such a way that it may appear in the production environment, this should be documented as a temporary modification to the production system. Removal of the temporary modification should be documented as well.

End of Validation – GAMP – Hardware & Software Test Environments

Validation – Introduction to Good Automated Manufacturing Practice (GAMP) Test Environments

The environment in which testing is conducted should be determined based on the life cycle output of the risk assessment. The following general principles apply:

  • The proposed test environment should represent the production environment as closely as possible. The differences between the two should be detailed in the Test Specification or Protocol, and should be subject to impact assessment.
  • The test environment should be controlled and recorded to a level of detail that would allow it to be reconstructed if necessary. Such control includes:
  • System hardware (HW) and software (SW) components
  • Test HW (versions, serial numbers, as appropriate)
  • Test SW (version control of any simulations)
  • Test data (version control of any test data sets, test recipes, etc.)
  • Test user accounts
  • Where test HW/SW/data/user accounts are applied as they may appear in the production environment, controls should be available to ensure that they can either be removed cleanly or isolated from use (either logically or in time).
  • Where the test environment is required to undergo a temporary change to facilitate the execution of specific tests, both the change and its removal must be formally documented.

GAMP Test Environments

In many circumstances it is undesirable to conduct all testing on the final production environment. Common examples include:

  • Non-availability of infrastructure at the point in the project life cycle when testing is carried out.
  • Non-availability of certain interfaces.
  • Requirement to test changes outside of the production environment prior to installation.
  • Requirement to carry out tests which may be destructive to the production environment.

 

The progression of a SW build from a development environment through to the production environment depends on the risk priority associated with the system being installed or the change being made, and on factors such as how easily the modification could be removed from the system.

A change to a custom data processing module in a large business system may require progression from a development environment to a test environment, to a validation environment, and then to the production environment. This may be required because the change has a high risk priority and, even if the original module could be restored easily, the test data may remain in the production environment.

Some tests, e.g., Performance Qualification (PQ), or part of it, may need to be conducted in the production environment.