
Software Testing Best Practices

Need information management software testing best practices and want practical suggestions to ensure rapid project delivery?


Software testing best practices checklist

Test plan software testing best practices

Test objectives should clearly summarize project objectives, test objectives, scope, and roles and responsibilities.
Test strategy should refer to the requirements acceptance criteria that were developed as part of the requirements definition phase, and to the test environments that will be used.
Test plan should list the scope of each test, including entrance criteria, how test cases will be prepared, what test data will be used, what test scripts are required, who will execute test scripts, how defects will be managed, how test results will be managed, and test exit criteria.
Test data strategy should list the overall approach for creating test data.
Large volumes of data should not be used for unit, system, integration, regression, or quality assurance testing; small, representative data sets are easier to load and verify (a generator sketch follows this list).
Test deliverables should specify what will be produced.
Resource plan should list all project roles and responsibilities, level of effort, and other test resource requirements.
Training considerations should list any training that might be required so the test team can complete testing.
Test schedule should identify a test schedule that clearly defines when all testing is expected to occur. This schedule may be included in the project schedule.
A standard test plan template should be used for each project.
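
On the test data point above, here is a minimal Python sketch of generating a small but representative test data set rather than copying production volumes; the customer_id, region, and balance fields are hypothetical, not part of any standard template.

    import csv
    import random

    random.seed(42)  # fixed seed keeps the generated test data reproducible

    REGIONS = ["NORTH", "SOUTH", "EAST", "WEST"]

    def make_test_rows(count=50):
        """Build a deliberately small data set that still covers edge cases."""
        rows = [{"customer_id": i + 1,
                 "region": random.choice(REGIONS),
                 "balance": round(random.uniform(-500.0, 10000.0), 2)}
                for i in range(count)]
        # Force known boundary values so they are always exercised.
        rows.append({"customer_id": 0, "region": "NORTH", "balance": 0.0})
        rows.append({"customer_id": 999999, "region": "WEST", "balance": -0.01})
        return rows

    with open("test_customers.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["customer_id", "region", "balance"])
        writer.writeheader()
        writer.writerows(make_test_rows())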

Requirements traceability software testing best practices
A requirements traceability matrix should be used to outline all testing required for each requirement.
Test requirements should identify which of the following apply:
  • Inspection: a careful, critical review of documentation, code, and/or data to verify that a requirement has been addressed;
  • Analysis: examines alternative solutions in detail so that the nature and function of each alternative may be judged for best fit against the requirements. One example is analyzing third-party products to determine which should be purchased; another is reviewing business process metrics to determine whether the business process improved as hoped after implementing a new solution;
  • Demonstration: a practical demonstration showing how something works, providing evidence via examples or experiments that a requirement has been achieved; and
  • Test: a pre-defined examination or trial consisting of one or more steps intended to ensure that a requirement has been properly implemented.
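
As a minimal illustration of such a matrix, a Python sketch; the requirement IDs, verification methods, and test identifiers are hypothetical:

    # Hypothetical traceability matrix: each requirement maps to its
    # verification method and the tests or reviews that exercise it.
    TRACEABILITY_MATRIX = {
        "REQ-001": {"method": "Test",          "evidence": ["TC-101", "TC-102"]},
        "REQ-002": {"method": "Inspection",    "evidence": ["CR-201"]},
        "REQ-003": {"method": "Analysis",      "evidence": ["AN-301"]},
        "REQ-004": {"method": "Demonstration", "evidence": []},  # gap to close
    }

    def uncovered(matrix):
        """List requirements that have no test, inspection, or analysis yet."""
        return [req for req, entry in matrix.items() if not entry["evidence"]]

    print(uncovered(TRACEABILITY_MATRIX))  # -> ['REQ-004']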
Unit testing software testing best practices

Should focus on testing individual modules to ensure that they perform to specification, handle all exceptions as expected, and produce the appropriate alerts to satisfy error handling.
Should be performed in the development environment.
Should be conducted by the software developer who develops the code.
Should validate the module's logic, adherence to functional requirements and adherence to technical standards.
Should ensure that all module source code has been executed and each conditional logic branch followed.
Test data and test results should be recorded and form part of the release package when the code moves to production.
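
For example, a minimal unittest sketch against a hypothetical validate_balance function, written so that every conditional branch and the error-handling path are executed:

    import unittest

    def validate_balance(balance):
        """Hypothetical module under test: classify an account balance."""
        if balance is None:
            raise ValueError("balance is required")   # error-handling path
        if balance < 0:
            return "OVERDRAWN"                        # branch 1
        return "OK"                                   # branch 2

    class ValidateBalanceTest(unittest.TestCase):
        # One test per conditional branch so every path is executed.
        def test_negative_balance_is_overdrawn(self):
            self.assertEqual(validate_balance(-0.01), "OVERDRAWN")

        def test_non_negative_balance_is_ok(self):
            self.assertEqual(validate_balance(0.0), "OK")

        def test_missing_balance_raises_alert(self):
            with self.assertRaises(ValueError):
                validate_balance(None)

    if __name__ == "__main__":
        unittest.main()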

Code review

Should focus on reviewing code and unit test results to provide additional verification that the code conforms to data movement best practices and security requirements.
Should verify that test results confirm that all conditional logic paths were followed and that all error messages were tested properly.

System testing software testing best practices

Should focus on testing a set of components to ensure that they perform to specification.
Should be performed by a senior developer or lead developer in a system test or integration test environment.
Errors, or defects, should be documented, classified by severity, and monitored by the defect management process to ensure timely resolution.

Integration testing

Should focus on testing a software release, i.e., a set of components intended to move to production as a planned release.
Should be managed by the data architect or software designer in an integration test environment that "mirrors" the intended production environment.
Errors, or defects, should be documented, classified by severity, and monitored by the defect management process to ensure timely resolution.
Should include:
  • System and volume testing to confirm that the system is operating correctly and can handle the required data volumes;
  • Reconciliation tests to manually confirm the validity of data (a small sketch follows this list);
  • Regression testing to ensure that the new software does not cause problems with existing software;
  • Performance testing to ensure that data can be loaded in the available "load window";
  • Load manager testing;
  • Warehouse manager testing; and
  • Infrastructure and operations component testing.
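
A minimal Python sketch of the reconciliation idea, assuming hypothetical in-memory source and target row sets; a real check would query the source system and the warehouse:

    # Hypothetical reconciliation check: compare record counts and a simple
    # control total between a source extract and the loaded target.
    def reconcile(source_rows, target_rows, amount_field="balance"):
        return {
            "row_count_matches": len(source_rows) == len(target_rows),
            "control_total_matches": (
                round(sum(r[amount_field] for r in source_rows), 2)
                == round(sum(r[amount_field] for r in target_rows), 2)
            ),
        }

    source = [{"id": 1, "balance": 100.00}, {"id": 2, "balance": -25.50}]
    target = [{"id": 1, "balance": 100.00}, {"id": 2, "balance": -25.50}]
    assert all(reconcile(source, target).values()), "reconciliation failed"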
Security testing

Should be performed in the production environment under the direction of the security officer.
Should certify compliance with system security and integrity requirements.
Should address:
  • System back-up;
  • Recovery; and
  • Security audit trails and tracking.
Quality assurance testing

Should focus on testing a set of components to ensure that they meet requirements.
Should be performed by an independent test team in a quality assurance environment that "mirrors" the production environment.
Should test every requirement and all system, user and production support documentation.
Test cases, test data and test results should be defined as configuration items and handed over to the production support team as part of the release package.

User acceptance testing (UAT)

Should focus on testing a set of requirements to ensure that they meet user expectations.
Should be performed in the quality assurance test environment or in a separate user acceptance test environment.

Test documentation

Objectives should state the test objectives and what the test is expected to achieve.
Methodology should describe the general methodology or strategy for each test.
Conditions should specify the type of input that shall be used.
Test progression should describe the manner in which progression is made from one test to the next.
Test results should specify how test results will be documented and where they will be stored.
Constraints should indicate any limitations on the test.
Criteria should describe the rules used to evaluate test results, e.g. range of data values, combinations of input types used, maximum number of errors allowed.
Data reduction should describe the techniques used to manipulate or prepare the test data.
Test cases should specify:
  • Test name;
  • Test description;
  • Test control;
  • Inputs;
  • Outputs; and
  • Test scripts (or procedures).
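
A minimal sketch of that structure as a Python dataclass; the field names simply mirror the list above, and the sample values are hypothetical:

    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        """One documented test case, mirroring the elements listed above."""
        name: str
        description: str
        control: str            # e.g. manual or automated
        inputs: list = field(default_factory=list)
        expected_outputs: list = field(default_factory=list)
        script: str = ""        # test script or procedure reference

    tc = TestCase(
        name="TC-101",
        description="Reject a load file with a missing mandatory column",
        control="automated",
        inputs=["missing_column.csv"],
        expected_outputs=["load rejected", "alert raised"],
        script="scripts/tc_101.py",   # hypothetical path
    )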
Defects

Should be documented in a defect management repository.
Should be analyzed to determine if they are development issues or if they are requirement issues.
Should be assigned to the appropriate team for remedy.
Should be monitored to ensure that appropriate tests are conducted after software is modified.
Should be managed to ensure that changed code is migrated into the appropriate environment.
Should be monitored daily.
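
A minimal sketch of a defect record consistent with the points above; the severity levels, statuses, and field names are illustrative assumptions, not a prescribed schema:

    from dataclasses import dataclass
    from enum import Enum

    class Severity(Enum):
        CRITICAL = 1
        HIGH = 2
        MEDIUM = 3
        LOW = 4

    class Status(Enum):          # lifecycle to be monitored daily
        OPEN = "open"
        ASSIGNED = "assigned"    # routed to development or requirements team
        FIXED = "fixed"          # awaiting retest after the code change
        CLOSED = "closed"        # retested and migrated to the right environment

    @dataclass
    class Defect:
        defect_id: str
        summary: str
        severity: Severity
        root_cause: str          # "development" or "requirements"
        assigned_to: str
        status: Status = Status.OPEN

    d = Defect("DEF-042", "Duplicate rows after reload", Severity.HIGH,
               root_cause="development", assigned_to="ETL team")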
Software testing best practices summary...

Software testing best practices should be specified in the test plan and should clearly establish test expectations for all team members. It is important to complete the plan early in the project and manage it carefully to ensure on-time, within-budget project delivery.


