INTEGRATION TESTING PART TWO

Integration test designs, test cases, test procedures and test reports are documented in the Integration Test section of the Software Verification and Validation Plan.

Problem areas:

Internal:

· interfaces between components

Invocation:

· Call / message passing

Parameters:

· type, number, order, value (see the test sketch after this list)

Invocation return:

· identity (who?), type, sequence

External:

· Interrupts (wrong handler?)

· I/O timing

Interaction:

· between modules/units/subsystems
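
A minimal sketch of such a test in Python using unittest; the calculate_invoice and compute_tax components and their expected values are illustrative assumptions, not part of any particular system:

    import unittest

    # Two illustrative components; in a real project these would live in
    # separate modules and be integrated here.
    def compute_tax(rate, amount):
        """Lower-level component: parameter order is (rate, amount)."""
        return rate * amount

    def calculate_invoice(amount, tax_rate):
        """Upper-level component: calls the lower component across the interface."""
        return amount + compute_tax(tax_rate, amount)

    class InvoiceTaxIntegrationTest(unittest.TestCase):
        def test_call_across_the_interface(self):
            # Exercises the call path: parameter order, parameter types,
            # and the type and value of the returned total.
            total = calculate_invoice(amount=100.0, tax_rate=0.2)
            self.assertIsInstance(total, float)    # return type
            self.assertAlmostEqual(total, 120.0)   # return value

        def test_lower_component_contract(self):
            # Guards against a swapped-argument defect at the interface.
            self.assertAlmostEqual(compute_tax(0.2, 100.0), 20.0)

    if __name__ == "__main__":
        unittest.main()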

Integration Test Procedure
Test Plan

The Test Plan is a document that instructs the tester which tests to perform in order to integrate and test the existing individual code modules.

Inputs for Test Plan:

· Detailed Design & Analysis document

· Software Requirement Specification

· Quality Plan

Test Items:

· List each of the items/programs/units to be tested

Features to be tested:

· List each of the features/functions/requirements which will be tested

· Order of the modules to be tested

Features not to be tested:

· Explicitly list each of the features/functions/requirements which will not be tested, and explain why

Approach:

· Should specify the approach to be used to test the designated groups of features; common approaches, illustrated by the stub/driver sketch after this list, are:

· Top-Down Approach

· Bottom-Up Approach

· Sandwich Approach

· Big-Bang Approach

· Should specify the tools to be used to test the designated groups of features.
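
A minimal sketch of the overhead code the first two approaches require, assuming hypothetical Inventory (lower-level) and place_order (upper-level) modules: in top-down integration a stub stands in for the missing lower module, and in bottom-up integration a driver exercises the lower module before its callers exist.

    # Hypothetical lower-level module (already implemented).
    class Inventory:
        def __init__(self):
            self._stock = {"SKU-1": 5}

        def reserve(self, item_id, quantity):
            available = self._stock.get(item_id, 0)
            if 0 < quantity <= available:
                self._stock[item_id] = available - quantity
                return True
            return False

    # Top-down: the upper module is integrated first, so the missing lower
    # module is replaced by a stub that returns canned answers.
    class InventoryStub:
        def reserve(self, item_id, quantity):
            return True

    def place_order(inventory, item_id, quantity):
        """Hypothetical upper-level module; calls across the interface."""
        return "CONFIRMED" if inventory.reserve(item_id, quantity) else "REJECTED"

    # Bottom-up: the lower module is exercised first by a driver that plays
    # the role of the still-missing upper module.
    def inventory_driver():
        inv = Inventory()
        assert inv.reserve("SKU-1", 2) is True
        assert inv.reserve("SKU-1", 10) is False   # more than available

    if __name__ == "__main__":
        assert place_order(InventoryStub(), "SKU-1", 2) == "CONFIRMED"  # top-down test
        inventory_driver()                                              # bottom-up test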

Item pass/fail criteria:

· Should specify the criteria to be used to decide whether each test item has passed or failed (see the sketch below)
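
One way to keep such criteria unambiguous is to state them as explicit assertions with expected values and tolerances; a minimal sketch in Python (the item, expected values, and tolerance are illustrative assumptions):

    # Pass/fail criteria for one test item, expressed as assertions so the
    # verdict is unambiguous: exact match for counts, tolerance for totals.
    def check_invoice_item(actual_total, actual_line_count):
        assert actual_line_count == 3               # exact-match criterion
        assert abs(actual_total - 120.0) < 0.01     # numeric tolerance criterion

    if __name__ == "__main__":
        check_invoice_item(actual_total=120.0, actual_line_count=3)
        print("item passed")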

Suspension/Resumption criteria:

· Specify the circumstances under which all or a portion of the testing activity on the items associated with this plan will be suspended

· Specify the testing activities to be repeated when testing resumes

· Define checkpoints for long-running tests

Test Deliverables:

· Test Software

· Test Cases

· Test Data

· Test Reports

Environmental needs:

· Hardware (client/server platforms, memory, backup systems etc)

· Software (OS, database, language, 3rd party software (if any))

· Test tool

Training:

· Describe the plan for providing training in the use of the software being tested

· Specify the types of training

· Personnel to be trained

· The training staff.

Schedule:

· Description of overhead software (e.g., stubs and drivers), concentrating on any that may require special effort

· Time for writing the integration test procedures

· Time for performing the tests

· Time for documenting the test results

Note: Start and end dates should be given for each phase

Resources:

· Who and how many are needed

· What skills are needed

· How much time needs to be allocated

· What kinds of tools are needed (hardware, software, platforms, etc.)

Exit criteria:

· Bug rate falls below a certain level

· Deadlines (release deadlines, testing deadlines, etc.)

· Test goals have been reached

· Code coverage targets have been met

· Defect density targets have been met (a sketch that checks these thresholds follows this list)
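
Some of these criteria can be evaluated mechanically; a minimal sketch in Python that checks defect density and code coverage against exit thresholds (the threshold values are illustrative assumptions, not prescribed limits):

    def exit_criteria_met(open_defects, kloc, coverage_percent,
                          max_defect_density=0.5, min_coverage=80.0):
        """True when the illustrative exit thresholds are all satisfied."""
        defect_density = open_defects / kloc   # defects per KLOC
        return defect_density <= max_defect_density and coverage_percent >= min_coverage

    if __name__ == "__main__":
        # Example: 12 open defects in 30 KLOC with 85% statement coverage.
        print(exit_criteria_met(open_defects=12, kloc=30, coverage_percent=85.0))  # True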

Approvals:

· By Team Leader

· By Test Manager

· By Project Manager

· By Client

