ReliaSoft often receives questions and requests from customers and third parties regarding software validation, usually in relation to FDA requirements. This document was created to address such requests.
All of ReliaSoft's shrink-wrap products are thoroughly tested and completely validated before commercial release. Our strict validation process, along with our extensive documentation of both the use of the software and the underlying mathematics, ensures (with a very high probability) that all results provided by ReliaSoft applications are valid and correct.
ReliaSoft's validation and quality assurance (QA) procedures were developed independently of the Food and Drug Administration (FDA) validation requirements. Although we believe that ReliaSoft's validation testing is much broader and more intensive than the FDA requirements, it is of course the responsibility of each organization to determine whether its use of ReliaSoft's software complies with any regulatory guidelines to which it is subject. We expect that this document provides the majority of the information about ReliaSoft's procedures that you will need to conduct your evaluation.
With all standard shrink-wrap products, ReliaSoft provides extensive documentation in the form of user's guides, help files and theoretical textbooks. The purpose of the documentation is to present the methodologies used (equations and formulations), provide examples and familiarize the end user with the underlying theory.
An end user can do the following:
Additionally, comparisons between ReliaSoft's numerical results and results published by other authors are provided with the printed documentation.
Some of these textbooks are posted on ReliaSoft's public wiki. Specifically, the following references are available:
The following sections present an overview of the requirements for software development at ReliaSoft.
All theory and formulations created or implemented by ReliaSoft must be theoretically sound and correct, accepted by academia and industry experts, and thoroughly validated before becoming part of a standard software product (SSP).
Theory and formulations that are not created by ReliaSoft must meet the following standards:
Theory and formulations that are developed by a ReliaSoft scientist must meet the following standards:
All products and product components developed by ReliaSoft must adhere to the industry-wide accepted standards for developing Windows-based object-oriented applications, including industry standards for graphical user interfaces (GUIs). Numerous sources, including Microsoft and Microsoft Press publications, offer a wide range of articles and books that detail accepted practices. Additionally, such code development must adhere to the guidelines for software development per ReliaSoft's internal document "Coding Practices and Procedures for SSP Software," and to current practices, methods and documents posted in the Development section of ReliaSoft's intranet.
ReliaSoft adheres to Microsoft guidelines for developing Windows applications (as detailed in several documents, such as the "Application Specification for Microsoft® Windows® 2000"). Any deviation from these guidelines must be approved by ReliaSoft's technical review board, and the reasons for the deviation must be documented.
Periodic design reviews are conducted with members of the code development team, technical writers and quality assurance. In addition, theoretical and technical review board meetings are held.
The guidelines implemented by ReliaSoft for source code management are as follows:
ReliaSoft has always adhered to the highest quality and reliability standards for all of its software products and services. The quality assurance (QA) and testing procedures for all ReliaSoft products, including custom software, are based on a scaled agile framework that is facilitated using CA Agile Central. The scaled agile process involves detailed and comprehensive testing efforts by multiple individuals and teams over specific, time-boxed iterations; thorough documentation of all issues identified during each iteration; and independent validation of all methods, theory and calculated results.
These tests include low-level testing at the unit and integration levels. They generally exercise libraries and other internal components and are therefore performed by developers. Their purpose is to ensure that the software produces correct results (e.g., 2 + 2 = 4). They must be completed before system tests can be concluded. Unit testing ensures that components and models function as intended and are robust. This testing is the responsibility of the individual developers and follows a test-analyze-and-fix (TAAF) process.
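As a minimal illustration of this kind of low-level check, the following sketch (in Python, purely for illustration; the function and test names are hypothetical and do not come from ReliaSoft source code) shows a component being exercised against an independently hand-computed result:

```python
# Hypothetical example for illustration only (not ReliaSoft source code).
# A low-level unit test exercises a single component with known inputs and
# compares its output against an independently hand-computed value.
import unittest


def exponential_mttf(failure_times):
    """Estimate mean time to failure for a complete exponential data set."""
    if not failure_times:
        raise ValueError("at least one failure time is required")
    return sum(failure_times) / len(failure_times)


class TestExponentialMttf(unittest.TestCase):
    def test_known_result(self):
        # Hand calculation: (100 + 200 + 300) / 3 = 200
        self.assertAlmostEqual(exponential_mttf([100.0, 200.0, 300.0]), 200.0)

    def test_rejects_empty_input(self):
        # Robustness check: invalid input must fail loudly, not silently.
        with self.assertRaises(ValueError):
            exponential_mttf([])


if __name__ == "__main__":
    unittest.main()
```

In the TAAF cycle, a failing test of this kind triggers analysis and a fix, after which the test is run again.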
Integration testing involves testers looking for bugs in the relationships and interfaces between pairs or groups of components. (An example is how Weibull++ integrates with ALTA.) Integration testing is not required if the application is made up of individual utilities that do not share data or invoke one another. However, if the software uses APIs, shares data or passes control from one component to another, then integration testing becomes an important way to verify that the components work together properly. It requires that the tester have a firm understanding of how the components are intended to work together.
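The sketch below illustrates the idea in the same hypothetical style; the two components stand in for applications that share data (analogous to the Weibull++/ALTA example above), and the test verifies the handoff between them rather than the internals of either one:

```python
# Hypothetical example for illustration only (not ReliaSoft source code).
# Two stand-in components share data: the first produces a fitted model and
# the second consumes it. The test checks the handoff between them.
import math
import unittest


def fit_component(failure_times):
    """Component A: fits a model and returns parameters for other tools."""
    return {"distribution": "exponential",
            "mttf": sum(failure_times) / len(failure_times)}


def reliability_component(fit_result, mission_time):
    """Component B: consumes Component A's output to compute reliability."""
    if fit_result["distribution"] != "exponential":
        raise ValueError("unsupported distribution")
    return math.exp(-mission_time / fit_result["mttf"])


class TestComponentIntegration(unittest.TestCase):
    def test_fit_output_is_usable_downstream(self):
        fit = fit_component([100.0, 200.0, 300.0])  # MTTF = 200
        reliability = reliability_component(fit, mission_time=200.0)
        self.assertAlmostEqual(reliability, math.exp(-1.0), places=9)


if __name__ == "__main__":
    unittest.main()
```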
Once all modules/components have been tested, compiled and linked without errors or warnings, wider testing is performed on the complete application or system. Formal logs of system testing activities are kept in our bug logging and tracking system, CA Agile Central. In general, all issues found are corrected as they are identified, and testing resumes immediately with the corrected version to ensure that the fix did not introduce any regression issues. Special emphasis is given to the component or issue that was corrected. This is repeated throughout each iteration of the development phase.
Installation and Environmental Testing
The last phase of testing is installation and configuration testing. This is done in-house on multiple test systems, including most versions of Windows® (e.g., Windows 7, Windows 8.1 and Windows 10). When appropriate, feedback is also obtained from outside testers. For software that will reach foreign markets, foreign-language operating system (OS) testing is also performed (e.g., Chinese Windows, Korean Windows).
In parallel with the software functionality testing, multiple engineers, scientists, and statisticians working independently perform testing and validation of all calculated results. Validated results are thoroughly documented and many of the examples are then illustrated in textbooks that are released with the software. Any issues found are corrected, re-tested and re-validated using multiple data scenarios.
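A minimal sketch of this comparison approach is shown below; the data and routines are hypothetical and are not ReliaSoft's internal validation harness, but they illustrate computing the same quantity by two independent routes and requiring agreement within a tolerance:

```python
# Hypothetical example for illustration only (not ReliaSoft's validation harness).
# The same quantity is computed by two independent routes and the results must
# agree within a tight tolerance before the calculation is considered validated.
import math
import statistics

failure_times = [105.0, 240.0, 312.0, 478.0, 651.0]

# Route 1: library implementation of the sample standard deviation.
library_result = statistics.stdev(failure_times)

# Route 2: independent hand-coded formula, sqrt(sum((x - mean)^2) / (n - 1)).
mean = sum(failure_times) / len(failure_times)
independent_result = math.sqrt(
    sum((x - mean) ** 2 for x in failure_times) / (len(failure_times) - 1)
)

# Validation passes only if the two independent results agree.
assert math.isclose(library_result, independent_result, rel_tol=1e-9)
print(f"independent results agree: {library_result:.6f} vs {independent_result:.6f}")
```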
When ReliaSoft releases a product, it has been thoroughly tested and ReliaSoft is highly confident of its quality and reliability. After all issues have been resolved and validated, a release decision is made based on the number and frequency of issues found. Before the software release can proceed, the Release Manager must ensure that all test documentation has been signed off (and approval for release has been given) by the Test Manager.
For custom systems, we expect the client to test the product to confirm that the system functions as intended in the client's environment. For systems that incorporate a non-ReliaSoft database to support functionality and calculations, the client is responsible for taking steps to maintain the integrity of the data stored in the database. An overview of the testing procedures employed by ReliaSoft is presented in the next table.
| Testing Phase | Testing Type | Responsibility | Testing Details |
|---|---|---|---|
| Development Phase | Unit Testing, Integration Testing | Development, QA, Theoretical | Perform ongoing unit testing during development. This will involve testing individual components as they are developed. At this stage, individual components will be tested for correctness of calculations as well as interaction with other components. |
| General Testing Phase | Process Testing, Integration Testing, System Testing | QA, Development, Theoretical, Documentation | Validate process results for correctness. Track the occurrence and resolution of all anomalous issues. Test the performance of components when they are fully integrated. |
| Installation Testing Phase | Installation and Environmental Testing | QA, Development | Test the system internally/externally under various installation environments. For custom systems, work with the client to test the system at the deployment location. |
| Calculation Validation Phase | Calculation Validation | Theoretical | Validate a representative sample of system calculations by performing identical calculations independently and comparing the results. |
| Release Phase | Final Testing | QA, Development, Theoretical, Documentation | Unit, process, integration, functionality, usability and deployment testing. |
Testing in each phase is cyclical, utilizing a test-analyze-and-fix (TAAF) process. All issues found after the initial development phase are communicated, documented and resolved using CA Agile Central.
CA Agile Central tracks all issues encountered for each product, from the development phase and throughout its life cycle. This integrated system forms the basis of our QA procedures. All issues are logged in CA Agile Central as incidents and are tracked from creation to resolution. Each defect is logged for the appropriate team to address; defects can include suggestions from customers or employees, bugs found during testing, and so on. Once a defect has been addressed by development, it must be tested by the QA group to ensure that the resolution correctly addresses the issue and does not introduce any regression issues. CA Agile Central facilitates teamwork between developers, theoretical experts, technical writers, QA and management to create a reliable and robust application.
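The sketch below is not the CA Agile Central API; it simply models, under assumed state names, the defect lifecycle described above, in which a fix cannot be closed until QA has verified it:

```python
# Hypothetical example for illustration only; this is not the CA Agile Central API.
# It models the defect lifecycle described above: a defect is logged, addressed
# by development and then verified by QA before it can be closed (or reopened).
from dataclasses import dataclass, field

ALLOWED_TRANSITIONS = {
    "Submitted": {"In Development"},
    "In Development": {"Ready for QA"},
    "Ready for QA": {"Closed", "In Development"},  # QA may reopen if the fix regresses
}


@dataclass
class Defect:
    summary: str
    state: str = "Submitted"
    history: list = field(default_factory=list)

    def transition(self, new_state):
        """Move the defect to a new state, enforcing the allowed workflow."""
        if new_state not in ALLOWED_TRANSITIONS.get(self.state, set()):
            raise ValueError(f"cannot move from {self.state!r} to {new_state!r}")
        self.history.append((self.state, new_state))
        self.state = new_state


# Usage: the defect cannot reach "Closed" without passing through QA verification.
defect = Defect("Plot legend truncated on high-DPI displays")
for step in ("In Development", "Ready for QA", "Closed"):
    defect.transition(step)
print(defect.state, defect.history)
```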
The next figure shows one of the defect reporting interfaces within CA Agile Central.
A software application is not deemed ready for deployment until the release candidate has been accepted by ReliaSoft's QA group. The QA group must be satisfied with the current state of the application based on the results of the testing.
Once a product is ready for deployment: