
Software Testing Strategies

Improve quality with David Bowman’s information management guidelines for software testing strategies

This site is designed for Information Technology professionals who need to improve quality and require guidance and direction to help teams consistently produce error-free results.

It provides information management guidelines for software testing strategies.

Test Plan Strategy
  • Should clearly summarize project objectives, test objectives, scope, and roles and responsibilities;
  • Should refer to the requirements acceptance criteria that were developed as part of the requirements definition phase;
  • Should define the test environments that will be used;
  • Should list the scope of each test, including entrance criteria, how test cases will be prepared, what test data will be used, what test scripts are required, who will execute test scripts, how defects will be managed, how test results will be managed, and test exit criteria;
  • Should list the overall approach for creating test data;
  • Should ensure that large volumes of data are not used for unit, system, integration, regression and quality assurance testing;
  • Should specify what deliverables will be produced;
  • Should list all project roles and responsibilities, level of effort, and other test resource requirements;
  • Should list any training that might be required so the test team can complete testing;
  • Should identify a test schedule that clearly defines when all testing is expected to occur (this schedule may be included in the project schedule); and
  • Should ensure that a standard test plan template is used for each project (a minimal template sketch follows this list).
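For illustration, the sections above can be captured in a reusable template so every project records the same information. The sketch below is one assumed way to hold them as Python dataclasses; the field names are illustrative, not a prescribed standard:

```python
# A minimal sketch of a standard test plan template, captured as Python
# dataclasses so every project records the same sections. All field names
# here are illustrative assumptions, not a prescribed standard.
from dataclasses import dataclass, field

@dataclass
class TestPhase:
    name: str                       # e.g. "unit", "integration", "quality assurance"
    entrance_criteria: list[str]
    exit_criteria: list[str]
    test_data_approach: str         # how test data will be prepared
    executed_by: str                # who will execute the test scripts

@dataclass
class TestPlan:
    project_objectives: str
    test_objectives: str
    scope: str
    acceptance_criteria_ref: str    # pointer to the base-lined acceptance criteria
    environments: list[str]         # test environments that will be used
    phases: list[TestPhase]
    deliverables: list[str]
    roles_and_responsibilities: dict[str, str]
    training_required: list[str] = field(default_factory=list)
    schedule_ref: str = ""          # may simply point at the project schedule
```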
Software Testing Strategies For Requirements
  • Should be rejected if they fail the "SMART" test, i.e., each requirement must be: specific (the objective states exactly what will be achieved, without ambiguity); measurable (the objective is quantifiable, so you will know when the requirement has been met); achievable (there is a clear link between the objective and the business case or business plan); realistic (the requirement can actually be delivered); and time-targeted (the objective specifies a completion date and milestones), as sketched in the checklist after this list;
  • Should have requirements acceptance criteria clearly defined at the end of the requirements analysis phase;
  • Should have approved requirements acceptance criteria as part of the requirements approval process;
  • Should have requirements acceptance criteria base-lined as part of the requirements base-lining process.
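As an illustration of the SMART gate described above, the sketch below shows a hypothetical checklist in Python; the field names and pass/fail rules are assumptions, since real acceptance reviews involve judgement:

```python
# Illustrative sketch only: a hypothetical SMART gate that rejects a
# requirement when any of the five attributes is missing. Real acceptance
# reviews are judgement calls; this merely shows the checklist shape.
def smart_test(requirement: dict) -> list[str]:
    """Return the list of failed SMART checks (empty means accepted)."""
    checks = {
        "specific":      bool(requirement.get("objective")),         # states exactly what is achieved
        "measurable":    bool(requirement.get("measure")),           # quantifiable success criterion
        "achievable":    bool(requirement.get("business_case_ref")), # linked to the business case
        "realistic":     requirement.get("feasible", False),         # judged deliverable by the team
        "time_targeted": bool(requirement.get("completion_date")),   # completion date and milestones
    }
    return [name for name, passed in checks.items() if not passed]

failures = smart_test({"objective": "Load daily sales", "measure": "100% of rows loaded"})
print(failures)  # ['achievable', 'realistic', 'time_targeted'] -> reject
```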
Test Data Acquisition

  • Should have a test data acquisition plan completed at the end of the architecture and design phase;
  • Should not use large volumes of data for unit, system, integration, regression and quality assurance testing (see the sampling sketch after this list);
  • Should be considered a configuration item and should be base-lined;
  • Should ensure that changes to base-lined test data are only made upon approval of a change request;
  • Should ensure that sufficient on-line storage is planned for test data so that back-up/restore can occur without waiting for access to time-consuming tape archive backups.
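To illustrate the "small but representative" principle above, the sketch below keeps a few rows per distinct case instead of copying large volumes; the function and field names are hypothetical:

```python
# A minimal sketch (all names hypothetical) of representative test data:
# sample a handful of rows per distinct case rather than copying large
# production volumes into unit/system/integration environments.
from itertools import groupby
from operator import itemgetter

def representative_sample(rows: list[dict], case_key: str, per_case: int = 2) -> list[dict]:
    """Keep at most `per_case` rows for each distinct value of `case_key`."""
    rows = sorted(rows, key=itemgetter(case_key))
    sample = []
    for _, group in groupby(rows, key=itemgetter(case_key)):
        sample.extend(list(group)[:per_case])
    return sample

production_like = [{"status": s, "id": i} for i, s in
                   enumerate(["open", "open", "open", "closed", "error", "error"])]
print(representative_sample(production_like, "status", per_case=1))
# -> one 'closed', one 'error' and one 'open' row: every case, tiny volume
```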
Requirement Traceability Strategy
  • Should ensure that a requirements traceability matrix is used to outline all testing required for each requirement;
  • Should ensure that the author of each architecture and design document records how the design satisfies the requirements; and
  • Should ensure that the requirements traceability matrix is reviewed prior to commencing coding to ensure that all requirements have been satisfied and that no additional "nice to have" requirements have been added (a simple matrix check is sketched after this list).
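A requirements traceability matrix can be as simple as a mapping from requirement IDs to the test cases that cover them. The sketch below (IDs and structure are illustrative assumptions) flags both untested requirements and unrequested "nice to have" additions:

```python
# A minimal sketch of a requirements traceability matrix check, assuming a
# simple mapping of requirement IDs to covering test cases. The IDs and
# structure are illustrative, not a prescribed format.
rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": [],                  # no coverage yet: must be flagged
}
implemented = {"REQ-001", "REQ-002", "REQ-999"}  # what the design actually covers

untested = [req for req, cases in rtm.items() if not cases]
unrequested = implemented - rtm.keys()           # "nice to have" additions

print("Requirements without test coverage:", untested)   # ['REQ-002']
print("Design items with no requirement:", unrequested)  # {'REQ-999'}
```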
Test Case Strategy

  • Should ensure that quality assurance test cases are created after sign-off and approval of architecture and design documents.

Software Testing Strategies For Unit Testing
  • Should focus on testing individual modules to ensure that they perform to specification, handle all exceptions as expected, and produce the appropriate alerts to satisfy error handling;
  • Should be performed in the development environment;
  • Should be conducted by the software developer who develops the code;
  • Should validate the module's logic, adherence to functional requirements and adherence to technical standards;
  • Should ensure that all module source code has been executed and each conditional logic branch followed (see the unit test sketch after this list); and
  • Should ensure that test data and test results are recorded and form part of the release package when the code moves to production.
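The branch-coverage expectation above can be made concrete with a small example. The sketch below uses Python's unittest to exercise every conditional branch of a toy module, including its error-handling path; the module and names are hypothetical:

```python
# A minimal sketch (module and names are hypothetical) of a unit test that
# exercises every conditional branch of a small function, including the
# error-handling path, as the strategy above requires.
import unittest

def classify_amount(amount):
    """Toy module under test: one branch per outcome, one exception path."""
    if amount < 0:
        raise ValueError("amount must be non-negative")
    if amount == 0:
        return "empty"
    return "loaded"

class ClassifyAmountTest(unittest.TestCase):
    def test_positive_branch(self):
        self.assertEqual(classify_amount(10), "loaded")

    def test_zero_branch(self):
        self.assertEqual(classify_amount(0), "empty")

    def test_error_handling_branch(self):
        with self.assertRaises(ValueError):
            classify_amount(-1)

if __name__ == "__main__":
    unittest.main()
```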
Code Review Strategy
  • Should focus on reviewing code and unit test results to provide additional verification that the code conforms to data integration best practices and security requirements; and
  • Should verify that test results confirm that all conditional logic paths were followed and that all error messages were tested properly.
System Testing
  • Should focus on testing a set of components to ensure that the set performs to specification;
  • Should be performed by a senior developer or lead developer in a system test or integration test environment;
  • Should document errors, or defects, classified by severity (a classification sketch follows this list); and
  • Should ensure that defects are monitored by the defect management process to ensure timely resolution.
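As a sketch of severity classification and monitoring, the example below records defects with a severity level and lists unresolved ones in priority order; the three-level scale is an assumption, not a mandated standard:

```python
# Illustrative sketch of classifying defects by severity and monitoring
# open ones for timely resolution; the severity scale is an assumption.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    CRITICAL = 1
    MAJOR = 2
    MINOR = 3

@dataclass
class Defect:
    id: str
    severity: Severity
    resolved: bool = False

defects = [
    Defect("DEF-1", Severity.CRITICAL),
    Defect("DEF-2", Severity.MINOR, resolved=True),
]

# The defect management process would review this list on a fixed cadence.
open_by_severity = sorted((d for d in defects if not d.resolved),
                          key=lambda d: d.severity.value)
for d in open_by_severity:
    print(d.id, d.severity.name)   # DEF-1 CRITICAL
```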
Integration Testing
  • Should focus on testing a software release, i.e., a set of components intended to move to production as a planned release;
  • Should be managed by the data integration architect in an integration test environment which mirrors the intended production environment;
  • Should ensure that errors or defects are classified by severity and monitored by the defect management process to ensure timely resolution;
  • Should include system and volume testing to confirm that the system is operating correctly and can handle the required data volumes; 
  • Should include reconciliation tests to manually confirm the validity of data (a reconciliation sketch follows this list);
  • Should include regression testing to ensure that the new software does not cause problems with existing software; and
  • Should include performance testing to ensure that data can be loaded in the available load window.
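A reconciliation test, as mentioned above, can be sketched as a comparison of control measures between source and target after a load. The measure names and figures below are illustrative:

```python
# A minimal sketch of a reconciliation test: confirm row counts and control
# totals match between source and target after a load. Measure names and
# figures are illustrative assumptions.
source = {"row_count": 1_000, "amount_total": 52_340.75}
target = {"row_count": 1_000, "amount_total": 52_340.75}

def reconcile(source: dict, target: dict) -> list[str]:
    """Return a list of mismatched control measures (empty means reconciled)."""
    return [key for key in source if source[key] != target[key]]

mismatches = reconcile(source, target)
assert not mismatches, f"Reconciliation failed on: {mismatches}"
print("Source and target reconcile.")
```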
Security Testing
  • Should be performed in the production environment under the direction of the security officer;
  • Should certify compliance with system security and integrity; and
  • Should address system back-up, recovery, and security audit trails and tracking.
Quality Assurance Testing
  • Should focus on testing a set of components to ensure that they meet requirements;
  • Should be performed by an independent test team in a quality assurance environment that mirrors the production environment;
  • Should test every requirement and all system, user and production support documentation; and
  • Should ensure that test cases, test data and test results are defined as configuration items and handed over to the production support team as part of the release package (a manifest sketch follows this list).
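The hand-over described above can be sketched as a simple release-package manifest listing each configuration item; the layout and paths below are assumptions, not a mandated format:

```python
# Illustrative sketch of the QA hand-over: test cases, test data and test
# results recorded as configuration items in the release package. The
# manifest layout and paths are assumptions, not a mandated format.
release_package = {
    "release": "R2024.1",
    "configuration_items": {
        "test_cases":   ["qa/test_cases/TC-101.doc", "qa/test_cases/TC-102.doc"],
        "test_data":    ["qa/data/customers_sample.csv"],
        "test_results": ["qa/results/run_2024-03-01.log"],
    },
}

# Production support receives the whole package; nothing is handed over loose.
for item_type, items in release_package["configuration_items"].items():
    print(f"{item_type}: {len(items)} item(s)")
```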
User Acceptance Testing
  • Should focus on testing a set of requirements to ensure that they meet user expectations; and
  • Should be performed in the quality assurance test environment or in a separate user acceptance test environment.
Software Testing Strategies For Release Testing
  • Should focus on testing a set of requirements (that is expected to move to production as a production release) to ensure that it meets user expectations; and
  • Should certify that the solution meets production support requirements.
Summary...

Software testing strategies should be specified in the test plan and clearly establish test expectations for all team members.

It is important to complete the plan early in the project and manage it carefully to ensure on-time, within-budget project delivery.

This site provides information management guidelines for software testing strategies.