Integrated Test Design & Automation; Hans Buwalda, Dennis Janssen, Iris Pinkster; 2001

Integrated Test Design & Automation, 1st Edition

by Hans Buwalda, Dennis Janssen, Iris Pinkster
Foreword  ix

Acknowledgements  xi

1 Introduction  1

1.1  What is testing?  1

1.1.1  Why do we test?  2

1.1.2  Facts of life in testing  2

1.1.3  Testing and development  3

1.1.4  Quality of testing  5

1.2  An introduction to TestFrame  6

1.3  The TestFrame model  7

1.3.1  Reusable test products  8

1.3.2  Fitting  8

1.3.3  Structuring  9

1.3.4  Tooling  9

1.4  Phasing with TestFrame  10

1.4.1  TestFrame products  11

1.5  Preparation  12

1.5.1  Risk analysis  13

1.5.2  Test strategy  13

1.5.3  Test plan  14

1.5.4  Test planning  14

1.6  Analysis  14

1.6.1  Clusters  14

1.6.2  Test conditions  15

1.6.3  Action words  15

1.7  Navigation  16

1.7.1  Test tools  16

1.7.2  Navigation script  16

1.7.3  Separating analysis and navigation  16

1.8  Execution  17

1.8.1  Test report  18

1.8.2  Error management  18

1.9  Summary  20

 

2 Preparation  23

2.1  Introduction  23

2.2  Preliminary study  24

2.2.1  General  25

2.2.2  Organization  26

2.2.3  Test effort  27

2.2.4  Physical test environment  29

2.2.5  Documentation / experts  31

2.3  Risk analysis  32

2.4  Test strategy  51

2.4.1  Organization structure  52

2.4.2  Quality attributes and their relative importance  53

2.4.3  Test types  56

2.4.4  Cluster matrix  57

2.4.5  Cluster cards  58

2.4.6  Tuning  60

2.5  The test plan  60

2.5.1  Test plan versus plan of approach  61

2.5.2  Test plan structure  62

2.5.2.1  Description of the assignment  62

2.5.2.2  Defining the scope of the assignment  63

2.5.2.3  Filling the TestFrame test model  64

2.5.2.4  Specifying the time schedule  65

2.5.2.5  Defining quality assurance  65

2.5.2.6  Describing the test organization  65

2.5.2.7  Defining standards and procedures  66

2.5.2.8  Miscellaneous  67

2.6  Structuring the test environment  67

2.6.1  Step 1: Determine the effect of the test’s scope on the test environment  68

2.6.2  Step 2: Draw up an inventory of the future production environment  69

2.6.3  Step 3: Draw up an overview of the required test environment  69

2.6.4  Step 4: Describe the differences between the test environment and the (future) production environment  71

2.6.5  Step 5: Describe the responsibilities for structuring the test environment  71

2.6.6  Step 6: Describe the responsibilities in maintaining the test environment during the project  72

2.6.7  Step 7: Structure the test environment and maintain it during the project  72

2.6.8  Step 8: Describe the responsibilities of test environment maintenance after the test project  73

2.6.9  Step 9: Maintaining the test environment after completion of the test project  73

2.6.10 Work area  73

2.7  Project file  74

2.7.1  Planning  75

2.7.2  Monitoring and control  76

2.7.3  Standards and procedures within the project  80

2.8  Summary  80

 

3 Analysis  83

3.1  Introduction  83

3.2  Test set structure  84

3.2.1  Initial database  85

3.2.2  Division into clusters  85

3.2.3  General cluster documentation  86

3.3  Scope  86

3.3.1  Determining the basic information  87

3.3.2  Determining the test’s depth of testing  87

3.3.3  Example of defining the scope for a test object  88

3.4  Clusters  95

3.4.1  Division into clusters  96

3.4.2  Recording clusters  97

3.4.3  Cluster overview  97

3.5  Test conditions  98

3.5.1  Creating the right test conditions  99

3.5.2  Recording test conditions  100

3.5.3  Example of test conditions for a test object  101

3.5.4  Another way of drawing up test conditions  104

3.6  Test cases  106

3.6.1  Naming action words  106

3.6.2  Naming arguments for action words  109

3.6.3  Documenting the action words  111

3.6.4  Example of action word documentation for a test object  111

3.6.5  Recording test cases  114

3.6.6  Example of test cases for a test object  115

3.6.7  Drawing up test cases  116

3.6.8  Documentation lines  122

3.6.9  Making optimal use of spreadsheet functionality  122

3.6.10 Argument commands  123

3.7  Test conditions and test techniques  126

3.7.1  Decision table technique  127

3.7.2  The decision table technique’s working method  128

3.7.3  Example of a decision table for a test object  131

3.7.4  Entity lifecycle test  132

3.7.5  Working method for the entity lifecycle test  132

3.8  Test cases and test techniques  134

3.8.1  Syntactic testing  136

3.8.2  Syntactic testing working method  136

3.8.3  Semantic testing  138

3.8.4  Semantic test’s working method  138

3.8.5  Joint testware development  141

3.8.6  Joint testware development working method  141

3.9  Data dependency  143

3.9.1  How to counter data dependency  144

3.9.2  Database structure  145

3.9.3  Contents of the initial database  146

3.9.4  Date synchronization  147

3.9.5  Loading the database via a spreadsheet  148

3.10  Summary  149

 

4 Navigation  151

4.1  Introduction  151

4.2  Opting for manual or automated test execution  152

4.2.1  Advantages of the traditional automated testing method compared to manual testing  153

4.2.2  Advantages of automated testing using TestFrame compared to traditional automated testing methods  154

4.2.3  Reasons for opting for a manual test with TestFrame  155

4.3  Technical test using record & playback tools  156

4.4  Navigation structure  157

4.4.1  The functions  158

4.4.2  Using libraries  160

4.4.3  Physical structure  161

4.4.4  The starter motor  162

4.4.5  The test tool  162

4.4.6  External tools  162

4.4.7  The test environment  163

4.4.8  Documentation  163

4.5  The engine  166

4.5.1  Routines which can be carried out by the engine  166

4.5.2  Recognizing action word functions  168

4.5.3  Check function  168

4.5.4  Commands of arguments  168

4.6  Developing an action word function  168

4.6.1  Feasibility  169

4.6.2  Preparation  170

4.6.3  Specifying action word functions  171

4.6.4  Testing the action word function  173

4.6.5  GUI-based systems  174

4.6.6  Character-based systems  178

4.7  Navigation standards  183

4.7.1  Variables  183

4.7.2  Constants  185

4.7.3  Action word names  186

4.7.4  Function layout  186

4.7.5  Agreements about programming  188

4.8  Alternative scripts for navigation  188

4.9  Summary  190

 

5 Execution  193

5.1  Introduction  193

5.2  Start position of the test run  194

5.3  Planning the test run  194

5.4  Test run strategies  196

5.4.1  Type A — full test set each time  197

5.4.2  Type B — run to first error; resume from there  197

5.4.3  Type C — run to first error; resume from start  199

5.4.4  Testing under pressure of time  200

5.5  Analysis of test results and test report  200

5.6  The transfer phase  202

5.7  Issue management  203

5.7.1  Consultation arrangement  204

5.7.2  Issue management procedure  205

5.7.3  What has to be recorded?  206

5.8  Test product management after the test process  209

5.8.1  Preconditions  209

5.8.2  Procedure  210

5.9  Summary  211

6 Test management  213

6.1  Introduction  213

6.2  Resistance  213

6.3  Commitment  215

6.4  Lack of clarity with regard to responsibilities  216

6.5  Conflicts outside the test process  217

6.6  Motivation  218

6.7  Dependencies  220

6.8  Summary  221

 

Appendix  223

A.1  Basics  223

A.2  Comments  223

A.3  Subroutines and functions  224

A.4  Variables and constants  224

A.5  Arrays  226

A.6  Flow control  227

A.7  Text manipulation  230

A.8  GUI functions in classes  231

A.8.1  Window class  231

A.8.2  Button class  232

A.8.3  Edit class  233

A.8.4  List class  233

A.8.5  Menu class  234

A.8.6  General functions  235

 

References  236

 

Index  237

 
Edition: 1st edition
Published: 2001
ISBN: 9780201737257
Publisher: Addison Wesley
Format: Paperback
Language: English
Pages: 256