
Integration Testing

Improve quality with David Bowman’s information management guidelines for integration testing

This site is designed for Information Technology professionals who need to improve quality and require guidance and direction to help teams consistently produce error free results.

It provides information management guidelines for integration testing.

What is quality assurance testing?


The objective of quality assurance testing is to ensure that all software components meet requirements and are ready to transition to production.

Scope
  • Should focus on testing a set of components that is expected to move to production as an integrated build to ensure that it performs to specification; and
  • Should certify that the solution meets specification.
Entrance Criteria

Configuration Management Area Responsibilities
  • Should ensure that a configuration management process has been established and documented to handle requested changes to project documentation;
  • Should ensure that a code migration process has been documented and arrangements made to accommodate testing;
  • Should ensure that a build schedule has been established and documented;
  • Should ensure that the build schedule includes periodic planned builds for defect fixes while in test;
  • Should ensure that a turnover package is documented and that all code changes are identified and made available to the release management and test team;
  • Should ensure that the turnover package includes database changes, install scripts/instructions, structure changes, etc;
  • Should ensure that code to be tested is locked-down and ready to be moved into the test area; and
  • Should ensure that the testing environment is base-lined and updated with the release baseline, i.e., the code and program versions in the test environment should match those in the production environment.
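The final point above, verifying that the test environment matches the release baseline, can be automated. The sketch below is illustrative only: the component names and versions are hypothetical, and a real check would pull versions from the configuration management tool rather than hard-coded dictionaries.

```python
# Illustrative sketch only: component names and versions are hypothetical.
# Compares component versions in a test environment against the release
# baseline and reports any drift, per the guideline that code and program
# versions in test should equal those in production.

def baseline_drift(baseline: dict, test_env: dict) -> dict:
    """Return {component: (baseline_version, test_version)} for mismatches."""
    drift = {}
    for component, version in baseline.items():
        test_version = test_env.get(component)
        if test_version != version:
            drift[component] = (version, test_version)
    # Components present in test but absent from the baseline are also drift.
    for component in test_env.keys() - baseline.keys():
        drift[component] = (None, test_env[component])
    return drift

baseline = {"order-service": "2.4.1", "billing-db": "9.3"}
test_env = {"order-service": "2.4.1", "billing-db": "9.2", "debug-proxy": "0.1"}
print(baseline_drift(baseline, test_env))
# {'billing-db': ('9.3', '9.2'), 'debug-proxy': (None, '0.1')}
```

An empty result from such a check is one concrete way to evidence that the entrance criterion has been met.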
Project and Business Area Responsibilities
  • Should ensure that test team members have been involved in project meetings since analysis;
  • Should ensure that at least one test team member is on all applicable project communications lists;
  • Should ensure that the requirements specification has been base-lined, is under configuration control, and has been made available to the test team for review;
  • Should ensure that all design documentation has been base-lined, is under configuration control, and has been made available to the test team for review.
Development Area Responsibilities
  • Should ensure that independent code reviews and review checkpoints have been completed and documented;
  • Should ensure that unit test results have been documented and reviewed by the test team;
  • Should ensure that unit test defects have been fixed, validated or otherwise properly finalized, and documented;
  • Should ensure that system test results have been documented and reviewed by the test team;
  • Should ensure that system test defects have been fixed, validated or otherwise properly finalized, and documented;
  • Should ensure that a development team representative from each subject area has been identified and assigned to support the integration testing effort, and is listed in the test plan;
  • Should ensure that all needed support files (e.g., account lists, flat file inputs, ini files) have been identified, provided where applicable, and included in the turnover package.
Defect Management Responsibilities
  • Should ensure that the defect management process has been documented and implemented;
  • Should ensure that all project team members have been trained in, and have appropriate access to, the defect tracking tool; and
  • Should ensure that a defect manager has been identified, allocated, and trained.
Test Area Responsibilities
  • Should ensure that test plans are completed and approved;
  • Should ensure that test cases and test scripts have been completed and approved; and
  • Should ensure that integration test cases have been traced to business and system requirements through a requirements traceability matrix.
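Tracing test cases to requirements, as the last point requires, is a coverage check that can be expressed mechanically. This sketch is illustrative: the requirement and test-case IDs are hypothetical, and a real matrix would be exported from the test management tool.

```python
# Illustrative sketch: requirement and test-case IDs are hypothetical.
# A requirements traceability matrix maps each test case to the
# requirements it covers; any requirement with no covering test case
# is a coverage gap that should be resolved before testing begins.

def uncovered_requirements(requirements: set, matrix: dict) -> set:
    """matrix: {test_case_id: [requirement_ids covered by that case]}"""
    covered = {req for reqs in matrix.values() for req in reqs}
    return requirements - covered

requirements = {"REQ-001", "REQ-002", "REQ-003"}
matrix = {"TC-01": ["REQ-001"], "TC-02": ["REQ-001", "REQ-003"]}
print(sorted(uncovered_requirements(requirements, matrix)))  # ['REQ-002']
```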
Support Area Responsibilities
  • Should ensure that a unique integration test environment has been reserved and is under configuration management control;
  • Should ensure that the test environment has been verified as accessible and functioning, including connectivity to all applicable systems;
  • Should ensure that all applicable test tools have been installed, verified as functional, and that user guides and documentation are up-to-date;
  • Should ensure that proper user id's and permissions needed for testing have been obtained and verified;
  • Should ensure that test data has been received and pre-tested for validity;
  • Should ensure that deployment and installation instructions have been successfully executed and validated;
  • Should ensure that points of contact have been assigned for their various component responsibilities and are listed in the project test plan; and
  • Should ensure that all applicable service agreements are in place and updated if needed, including clear ownership of incident resolution.
General Management Responsibilities
  • Should ensure that any training required by the test team has been completed;
  • Should ensure that any waivers or exceptions to the above have been documented and approved; and
  • Should certify that integration testing may proceed based on a recommendation by the integration test lead.
Test Cases
  • Should be created by the integration test lead;
  • Should be recorded in the project test management tool e.g. HP Quality Center;
  • Should ensure that one or more test cases are created for each component included in the release;
  • Should specify the steps needed to execute the test and the expected results of each test.
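A test case per the points above pairs each execution step with an expected result. The structure below is a plain-data stand-in for illustration; HP Quality Center is named in the guideline, but this is not its API, and the IDs and step text are hypothetical.

```python
# Illustrative sketch: a minimal test-case record pairing each step with
# its expected result, as the guideline requires. Field names, IDs, and
# step text are hypothetical, not a test management tool's schema.
from dataclasses import dataclass, field

@dataclass
class TestStep:
    action: str
    expected_result: str

@dataclass
class TestCase:
    case_id: str
    component: str          # the release component this case covers
    steps: list = field(default_factory=list)

tc = TestCase("ITC-001", "order-service")
tc.steps.append(TestStep("Submit order with valid account",
                         "Order accepted, status CREATED"))
tc.steps.append(TestStep("Submit order with closed account",
                         "Order rejected with error E403"))
print(tc.case_id, len(tc.steps))  # ITC-001 2
```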
Test Data
  • Should be created by a production support development team;
  • Should be extracted from production data;
  • Should be a sub-set of production data;
  • Should be augmented by the test team as required to test module interaction, unique and/or negative conditions;
  • Should be retained in the staging environment so that the same data can be re-used as new components are developed; and
  • Should be based entirely on pre-approved test data i.e. testing should be based on augmented production data but should not involve testing with live production data.
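The points above describe drawing a subset of production data and augmenting it, while keeping live production values out of the test environment. One common way to satisfy both is to mask sensitive fields in the extracted subset; the sketch below assumes hypothetical field names and a simple hashing rule.

```python
# Illustrative sketch: field names and the masking rule are hypothetical.
# A production subset is masked so no live values reach the test
# environment, then augmented with a hand-built negative case.
import hashlib

def mask_record(record: dict) -> dict:
    """Replace sensitive fields with stable, non-reversible surrogates."""
    masked = dict(record)
    masked["account_id"] = "ACCT-" + hashlib.sha256(
        record["account_id"].encode()).hexdigest()[:8]
    masked["name"] = "Test User"
    return masked

production_sample = [{"account_id": "A1001", "name": "Jane Doe", "balance": 250.0}]
test_data = [mask_record(r) for r in production_sample]

# Augment with a negative condition the production subset cannot supply.
test_data.append({"account_id": "ACCT-NEG1", "name": "Test User", "balance": -1.0})
print(len(test_data), test_data[0]["name"])  # 2 Test User
```

Because the hash is stable, the same masked data can be retained in staging and re-used across builds, as the guideline recommends.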
Test Scripts
  • Should be created by a test script developer;
  • Should be stored in the project test management tool;
  • Should be used by the integration tester; and
  • Should be retained for regression test purposes.
Test Execution

Should be performed by integration testers in the integration test environment.

Test Data
  • Should be created as part of the test data acquisition plan;
  • Should be based on a sub-set of production data; and
  • Should be augmented as appropriate to ensure adequate testing of all specifications.
Defects
  • Should be identified and recorded in the project defect management tracking tool, e.g., ClearQuest;
  • Should be assigned to the Project Manager, who should determine the impact on project deliverables, approve or reject the change, and create and manage change requests as appropriate to ensure that requirements are changed, reviewed, and approved before technical analysis, design, build, test, and deployment commence;
  • Should follow the project solution delivery methodology, e.g., analysis, design, build, and review checkpoint, before they are certified ready for re-testing in the integration test environment; and
  • Should be monitored by the Project Manager to ensure defect resolution.
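The third point above implies a defect fix cannot be certified for re-testing until it has passed through each methodology stage in order. A minimal sketch of that gate, assuming the stage names listed in the guideline and a hypothetical tracking field:

```python
# Illustrative sketch: stage names follow the delivery methodology named
# in the guideline (analysis, design, build, review checkpoint); the
# completed-stages list is a hypothetical tracking field.

REQUIRED_STAGES = ["analysis", "design", "build", "review checkpoint"]

def ready_for_retest(completed_stages: list) -> bool:
    """A defect fix is re-test ready only after all stages, in order."""
    return completed_stages == REQUIRED_STAGES

print(ready_for_retest(["analysis", "design", "build", "review checkpoint"]))  # True
print(ready_for_retest(["analysis", "build"]))  # False
```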
Test Results

Should be stored in the test management tool as an attachment to the test case.

Exit Criteria
  • Should ensure that any updates to test documentation have been completed and are under configuration management control;
  • Should ensure that test cases have been executed according to the test plan and any deviations have been documented and approved;
  • Should ensure that all required test types have been completed;
  • Should ensure that all defects have received a final disposition, i.e., open, closed, postponed, invalid, or duplicate;
  • Should ensure that select test results have been reviewed by the quality assurance test team;
  • Should ensure that all closed defects have an assigned defect root cause;
  • Should ensure that sign-off has been obtained from designated stakeholders indicating test completion; and
  • Should ensure that other test completion and validation conditions, as specified in the project test plan, have been met.
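The defect-disposition exit criterion above can be verified mechanically: every defect in the tracking tool must carry one of the recognized dispositions. The sketch below is illustrative; the defect IDs and status strings are hypothetical stand-ins for the tracking tool's fields.

```python
# Illustrative sketch: defect IDs and status values are hypothetical.
# Checks the exit criterion that every defect has received one of the
# recognized dispositions listed in the guideline.

DISPOSITIONS = {"open", "closed", "postponed", "invalid", "duplicate"}

def undispositioned(defects: dict) -> list:
    """defects: {defect_id: status}; returns IDs lacking a recognized disposition."""
    return sorted(d for d, status in defects.items()
                  if status not in DISPOSITIONS)

defects = {"DEF-101": "closed", "DEF-102": "in triage", "DEF-103": "open"}
print(undispositioned(defects))  # ['DEF-102']
```

An empty result is one piece of evidence to attach to the stakeholder sign-off.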
Summary...

The objective of quality assurance testing is to ensure that all software components meet requirements and are ready to transition to production.

This site provides information management guidelines for integration testing.