This article is a continuation of Testing Life Cycle, Part One.
The following activities should be performed at each phase of the life cycle:
1. Analyze the structures produced at this phase for internal testability and adequacy.
2. Generate test sets based on the structure at this phase.
In addition, the following should be performed during design and programming:
1. Determine that the structures are consistent with structures produced during previous phases.
2. Refine or redefine test sets generated earlier.
Throughout the entire life cycle, neither development nor verification is a straight-line activity. Modifications or corrections to a structure at one phase will require modifications or re-verification of structures produced during previous phases.
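As a concrete sketch of what "test sets based on the structure at this phase" and their later refinement can look like, consider the Python example below. It is built around a hypothetical compute_discount function with invented rules and figures: the first tests come straight from a stated requirement, and the last is added once the detailed design introduces a new rule.

```python
# Minimal sketch: a test set begun at the requirements phase and refined
# during design. The function, rules, and figures are hypothetical.

def compute_discount(order_total, is_member):
    """Hypothetical design: members get 10% off, orders over 100 get 5% off,
    and the two discounts do not stack (the larger one applies)."""
    member_rate = 0.10 if is_member else 0.0
    volume_rate = 0.05 if order_total > 100 else 0.0
    return round(order_total * (1 - max(member_rate, volume_rate)), 2)

# Requirements-phase tests: derived from the stated requirement only.
def test_member_gets_ten_percent_off():
    assert compute_discount(50, is_member=True) == 45.00

def test_non_member_pays_full_price_on_small_order():
    assert compute_discount(50, is_member=False) == 50.00

# Design-phase refinement: the detailed design introduced a "discounts do
# not stack" rule, so the earlier test set is extended to cover it.
def test_discounts_do_not_stack():
    assert compute_discount(200, is_member=True) == 180.00

if __name__ == "__main__":
    test_member_gets_ten_percent_off()
    test_non_member_pays_full_price_on_small_order()
    test_discounts_do_not_stack()
    print("all phase-derived tests pass")
```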
Requirements
The verification activities that accompany the problem definition and requirements analysis phase of software development are extremely significant. The adequacy of the requirements must be thoroughly analyzed and initial test cases generated with the expected (correct) responses.
Developing scenarios of expected system use helps to determine the test data and anticipated results. These tests form the core of the final test set. Generating these tests and the expected behavior of the system clarifies the requirements and helps guarantee that they are testable.
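A minimal sketch of this practice, assuming a hypothetical withdraw operation and invented usage scenarios: each row pairs the inputs of one expected-use scenario with its anticipated result, and the pairs become executable tests that later form the core of the final test set.

```python
# Sketch: scenarios of expected system use captured as (inputs, expected
# result) pairs at the requirements stage. withdraw() is hypothetical.

def withdraw(balance, amount):
    """Hypothetical requirement: reject non-positive or over-balance requests."""
    if amount <= 0:
        return balance, "rejected: invalid amount"
    if amount > balance:
        return balance, "rejected: insufficient funds"
    return balance - amount, "dispensed"

# Each row is one usage scenario with its anticipated result.
SCENARIOS = [
    ((100, 40), (60, "dispensed")),
    ((100, 150), (100, "rejected: insufficient funds")),
    ((100, 0), (100, "rejected: invalid amount")),
]

def test_requirements_scenarios():
    for (balance, amount), expected in SCENARIOS:
        assert withdraw(balance, amount) == expected

if __name__ == "__main__":
    test_requirements_scenarios()
    print("all requirements scenarios behave as expected")
```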
Vague requirements or requirements that are not testable leave the validity of the delivered product in doubt. Late discovery of requirements inadequacy can be very costly. A determination of the criticality of software quality attributes and the importance of validation should be made at this stage. Both product requirements and validation requirements should be established.
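As an illustration of the difference (the figures are invented): "the search must be fast" is not testable, while "a search over 10,000 records must complete within 2 seconds" can be checked mechanically.

```python
import time

# Hypothetical search used only to illustrate a testable performance requirement.
def search(records, target):
    return [r for r in records if target in r]

def test_search_meets_stated_response_time():
    records = [f"record-{i}" for i in range(10_000)]
    start = time.perf_counter()
    search(records, "record-9999")
    elapsed = time.perf_counter() - start
    # Testable form of the requirement: an explicit, measurable bound.
    assert elapsed < 2.0

if __name__ == "__main__":
    test_search_meets_stated_response_time()
    print("testable performance requirement verified")
```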
Design
Organization of the verification effort and test management activities should be closely integrated with preliminary design. The general testing strategy – including test methods and test evaluation criteria – is formulated, and a test plan is produced. If the project size or criticality warrants, an independent test team is organized. In addition, a test schedule with observable milestones is constructed. At this same time, the framework for quality assurance and test documentation should be established.
During detailed design, validation support tools should be acquired or developed and the test procedures themselves should be produced. Test data to exercise the functions introduced during the design process, as well as test cases based upon the structure of the system, should be generated. Thus, as the software development proceeds, a more effective set of test cases is built.
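The sketch below, built around a hypothetical classify_triangle routine, shows both kinds of cases side by side: tests that exercise the functions described by the design, plus one structure-based case added purely to cover a branch found by inspecting the code.

```python
# Sketch: function-based tests plus a structure-based test added during
# detailed design. classify_triangle() and its rules are hypothetical.

def classify_triangle(a, b, c):
    if a <= 0 or b <= 0 or c <= 0 or a + b <= c or a + c <= b or b + c <= a:
        return "invalid"            # structural branch: degenerate input
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Function-based test data: exercises the behaviour the design describes.
def test_equilateral():
    assert classify_triangle(3, 3, 3) == "equilateral"

def test_scalene():
    assert classify_triangle(3, 4, 5) == "scalene"

# Structure-based test case: added after inspecting the code's branches,
# to cover the "invalid" path the functional cases above miss.
def test_degenerate_sides_rejected():
    assert classify_triangle(1, 2, 3) == "invalid"

if __name__ == "__main__":
    test_equilateral()
    test_scalene()
    test_degenerate_sides_rejected()
    print("function-based and structure-based cases pass")
```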
In addition to test organization and the generation of test cases, the design itself should be analyzed and examined for errors. Simulation can be used to verify properties of the system structures and subsystem interaction; the developers should verify the flow and logical structure of the system, while the test team should perform design inspections and walkthroughs.
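One lightweight way to simulate subsystem interaction before any real modules exist is to stub both sides of a designed interface and check that their assumptions agree. The sketch below is purely illustrative; the sensor and controller subsystems, field names, and thresholds are assumptions.

```python
# Sketch: a simulation of two designed subsystems interacting through a
# shared interface. Both stubs are hypothetical stand-ins for real modules.

def sensor_stub():
    """Producer subsystem: the design says it reports temperature in Celsius."""
    return {"temperature_c": 21.5, "status": "ok"}

def controller_stub(reading):
    """Consumer subsystem: the design says it expects the same field names."""
    if reading["status"] != "ok":
        return "fault"
    return "heat_on" if reading["temperature_c"] < 18.0 else "idle"

def simulate_interaction():
    # A mismatch in field names or units between the two designs would
    # surface here, long before implementation begins.
    reading = sensor_stub()
    return controller_stub(reading)

if __name__ == "__main__":
    assert simulate_interaction() == "idle"
    print("subsystem interface assumptions are consistent in simulation")
```

A unit or field-name disagreement between the two stubs would surface immediately in such a simulation, which is exactly the kind of module interface mismatch listed as a concern below.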
Missing cases, faulty logic, module interface mismatches, data structure inconsistencies, erroneous I/O assumptions, and user interface inadequacies are items of concern. The detailed design must prove to be internally coherent, complete, and consistent with the preliminary design and requirements.
The previous post covers C# programming on Microsoft .NET; you can have a look at it here.