Systematic Software Testing

Gain an in-depth understanding of the software testing management and process issues that are critical to delivering high-quality software on time and within budget. Written by leading experts in the field, this book offers those involved in building and maintaining complex, mission-critical software systems a flexible, risk-based process for improving their software testing capabilities. Whether your organization has a well-defined testing process or almost none, Systematic Software Testing provides unique insights into better ways to test your software.

This book describes a preventive method of testing that parallels the software development lifecycle, and it explains how to create and subsequently use test plans, test designs, and test metrics. Detailed instructions help you decide what to test, how to prioritize tests, and how to determine when testing is complete. Learn how to conduct risk analysis and measure test effectiveness to maximize the efficiency of your testing efforts. Because organizational structure, the right people, and management are keys to better software testing, Systematic Software Testing addresses these issues with the insight of the authors' more than 25 years of experience.
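The risk-based prioritization the book describes, assigning each feature a likelihood and an impact and then ranking features by their product (see Chapter 2's Steps 1 to 8 in the contents below), can be sketched roughly as follows. The feature names and the 1-to-10 scale are illustrative assumptions, not material from the book:

```python
# Hypothetical sketch of a likelihood-times-impact risk ranking.
# Each feature gets (likelihood of failure, impact of failure) scores.
features = {
    "login": (7, 9),
    "report export": (4, 3),
    "payment": (6, 10),
}

# Compute a risk priority for each feature as likelihood x impact.
priorities = {name: l * i for name, (l, i) in features.items()}

# Rank features by descending risk priority; test the top ones first.
ranked = sorted(priorities.items(), key=lambda kv: kv[1], reverse=True)
for name, score in ranked:
    print(f"{name}: {score}")
```

In this sketch "login" (7 x 9 = 63) outranks "payment" (60) and "report export" (12), so it would be tested first; a cut line drawn through the ranked list then separates features that will be tested from those that will not.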
Contents
Software Risk Analysis | 28 |
When Should It Be Done? | 29 |
Step 1 Form a Brainstorming Team | 31 |
Step 2 Compile a List of Features | 32 |
Step 3 Determine the Likelihood | 33 |
Step 4 Determine the Impact | 34 |
Step 5 Assign Numerical Values | 36 |
Step 6 Compute the Risk Priority | 37 |
Step 7 Review/Modify the Values | 38 |
Step 8 Prioritize the Features | 43 |
Step 9 Determine the Cut Line | 44 |
Step 10 Consider Mitigation | 45 |
Planning Risks and Contingencies | 47 |
Project Assumptions | 51 |
Chapter 3 Master Test Planning | 53 |
Levels (Stages) of Test Planning | 54 |
Audience Analysis | 57 |
Activity Timing | 58 |
Standard Templates | 60 |
Sections of a Test Plan | 61 |
1.0 Test Plan Identifier | 62 |
2.0 Table of Contents | 63 |
4.0 Glossary | 64 |
6.0 Test Items | 65 |
7.0 Software Risk Issues | 66 |
8.0 Features to Be Tested | 67 |
9.0 Features Not to Be Tested | 68 |
10.0 Approach (Strategy) | 70 |
Methodology Decisions | 72 |
Resources | 74 |
Test Coverage Decisions | 75 |
Walkthroughs and Inspections | 76 |
Configuration Management | 79 |
Collection and Validation of Metrics | 81 |
Changes to the Test Plan | 82 |
Other Strategy Issues | 83 |
12.0 Suspension Criteria and Resumption Requirements | 84 |
13.0 Test Deliverables | 85 |
14.0 Testing Tasks | 86 |
16.0 Responsibilities | 88 |
17.0 Staffing and Training Needs | 89 |
18.0 Schedule | 90 |
19.0 Planning Risks and Contingencies | 91 |
20.0 Approvals | 92 |
Chapter 4 Detailed Test Planning | 97 |
Acceptance Testing | 102 |
Audience Analysis | 103 |
Sources of Information | 107 |
User Responsibilities | 108 |
Usability Labs | 110 |
Alpha and Beta Testing | 112 |
Requirements Traceability | 113 |
Configuration Management | 114 |
Exit Criteria | 115 |
Release Strategies | 116 |
Gradual Implementation | 117 |
Phased Implementation | 118 |
Test Environment | 119 |
System Testing | 121 |
Sources of Information | 122 |
Software Configuration Management | 123 |
Exit/Entrance Criteria | 127 |
Smoke Test | 129 |
Integration Testing | 130 |
Audience Analysis | 132 |
Sources of Information | 133 |
Integration Test Planning Issues | 134 |
Configuration Management | 135 |
Unit Testing | 136 |
Common Obstacles in Unit Testing | 137 |
Education and Buy-In | 138 |
Standards and Requirements | 139 |
Configuration Management | 140 |
Reusing Unit Testware | 142 |
Buddy Testing | 143 |
Chapter 5 Analysis and Design | 145 |
Creating Inventories | 146 |
Gather Reference Materials | 149 |
Determine Test Objectives | 150 |
Prioritize Objectives | 151 |
Parse Objectives into Lists | 152 |
Create an Inventory Tracking Matrix | 153 |
Identify Tests for Unaddressed Conditions | 155 |
Evaluate Each Inventory Item | 157 |
Maintain the Testing Matrix | 158 |
Black-Box vs. White-Box | 159 |
Black-Box Science | 161 |
Equivalence Partitioning | 162 |
Boundary Value Analysis | 165 |
Decision Tables | 166 |
State-Transition Diagrams | 168 |
Orthogonal Arrays | 170 |
Black-Box Art | 173 |
Random Testing | 175 |
Semi-Random Testing | 176 |
Exploratory Testing | 177 |
White-Box Science | 179 |
Cyclomatic Complexity | 180 |
Scalability of White-Box Testing | 181 |
Test Design Documentation | 182 |
Test Design Specification Identifier | 184 |
Approach Refinement | 185 |
Feature Pass/Fail Criteria | 186 |
IEEE Test Case Specification | 187 |
Test Case Specification Identifier | 189 |
Output Specifications | 190 |
Using a Spreadsheet | 191 |
IEEE Test Procedure Specification | 192 |
Example Test Procedure | 194 |
Chapter 6 Test Implementation | 199 |
Test Environment | 200 |
People | 202 |
Hardware Configuration | 204 |
Cohabiting Software | 205 |
Interfaces | 207 |
Source of Test Data | 208 |
Volume of Test Data | 211 |
Strategy | 212 |
Model Office Concept | 213 |
What Should Be Automated? | 214 |
Repetitive Tasks | 217 |
Tedious Tasks | 218 |
Avoiding Testing Tool Traps | 219 |
Great Expectations | 220 |
Automating the Wrong Thing | 221 |
Ease of Use | 222 |
Choosing the Wrong Vendor | 223 |
Unstable Software | 224 |
Doing Too Much Too Soon | 225 |
Underestimating TimeResources | 226 |
Poor Timing | 227 |
Evaluating Testware | 228 |
Quality Assurance Group | 229 |
Traceability | 230 |
Defect Seeding | 232 |
Mutation Analysis | 235 |
Testing Automated Procedures | 237 |
Chapter 7 Test Execution | 239 |
Before Beginning Test Execution | 240 |
Deciding Who Should Execute the Tests | 241 |
Deciding What to Execute First | 242 |
Writing Test Cases During Execution | 243 |
Test Log | 244 |
Test Incident Reports | 245 |
IEEE Template for Test Incident Report | 248 |
Incident Summary Report Identifier | 249 |
Impact | 250 |
Investigation | 252 |
Disposition Status | 254 |
Attributes of a Defect Tracking Tool | 255 |
Using Multiple Defect Tracking Systems | 257 |
Testing Status and Results | 258 |
Test Summary Report | 259 |
Test Summary Report Identifier | 261 |
Variances | 262 |
Recommendations | 263 |
When Are We Done Testing? | 264 |
Defect Discovery Rate | 265 |
Remaining Defects Estimation Criteria | 266 |
Running Out of Resources | 267 |
Measuring Test Effectiveness | 268 |
Customer Satisfaction Measures | 269 |
Help Desk Calls | 272 |
Number of Defects Found in Testing | 273 |
Defect Removal Efficiency (DRE) | 276 |
DRE Example | 278 |
System Test DRE Example | 279 |
Unit Testing PRE | 280 |
Defect Spoilage | 282 |
Defect Density and Pareto Analysis | 284 |
Coverage Measures | 285 |
Requirements and Design Coverage | 286 |
Code Coverage | 287 |
Code Coverage Weaknesses | 288 |
Code Coverage Strengths | 290 |
Global Code Coverage | 291 |
Chapter 8 The Test Organization | 293 |
Test Organizations | 294 |
Sample Test Organizations | 295 |
Independent Test Teams | 296 |
Integrated Test Teams | 298 |
Developers | 299 |
Test Coordinator | 300 |
Quality Assurance (QA) | 301 |
Outsourcing | 302 |
Independent Verification and Validation (IV&V) | 303 |
Office Environment | 304 |
Location Relative to Other Participants | 305 |
Immersion Time | 306 |
Quiet Time | 308 |
Meetings | 309 |
Chapter 9 The Software Tester | 311 |
Characteristics of Good Testers | 312 |
Finding Good Testers | 313 |
Users | 314 |
QA Personnel | 315 |
Hiring Testers | 316 |
Read the Candidate's Resume | 317 |
Learn How to Ask Questions | 318 |
Ask Candidates to Demonstrate Skills | 319 |
Answer the Candidate's Questions | 320 |
How Many Testers Do You Need? | 321 |
Retaining Staff | 323 |
Working Overtime | 324 |
Software Tester Certifications | 326 |
Value of Certification to Testers | 329 |
Help in Getting a Job | 330 |
Value of Certification to Test Managers | 331 |
Motivation | 332 |
Value of Certification to the Company | 333 |
Recertification | 335 |
Chapter 10 The Test Manager | 339 |
What Is Management? | 340 |
Interpersonal Roles | 341 |
Informational Roles | 342 |
Management vs Leadership | 343 |
Leadership Styles | 344 |
Marine Corps Principles of Leadership | 345 |
The Test Manager As a Leader | 346 |
Authority | 347 |
Responsibility | 348 |
Span of Control | 349 |
Effective Communication | 350 |
The Test Manager's Role in the Team | 351 |
Career Path Development | 352 |
Desktop Procedures | 353 |
Staying Late | 354 |
Motivation | 355 |
Building Morale | 356 |
Selling Testing | 357 |
Manager's Role in Test Automation | 358 |
Obtain Adequate Resources | 359 |
Ensure Proper Integration of Tool | 360 |
Measure the Return on Investment | 361 |
Topics That Require Training | 362 |
Tools | 363 |
Communication Skills | 364 |
On-Site Commercial Training | 365 |
Training in a Public Forum | 366 |
Specialty Training | 367 |
Metrics Primer for Test Managers | 369 |
Benefits of Using Metrics | 370 |
Identify Risky Areas That Require More Testing | 371 |
Identify Training Needs | 372 |
Provide a Basis for Estimating | 373 |
Rules of Thumb for Using Metrics | 374 |
Normalize the Values of the Metric | 375 |
Periodically Revalidate the Need for Each Metric | 376 |
Respect Confidentiality of Data | 377 |
Present the Raw Data | 378 |
Format the Data for the Audience | 379 |
Provide Adequate Training | 381 |
What Metrics Should You Collect? | 382 |
Metrics Used by the Best Projects | 383 |
Measurement Engineering Vision | 385 |
Chapter 11 Improving the Testing Process | 387 |
Improving the Testing Process | 388 |
Baseline Current Practices | 390 |
Develop Vision and Goals | 392 |
Formulate/Prioritize Requirements | 393 |
Establish a Project | 394 |
Develop a Plan | 395 |
Introduce Change Incrementally | 397 |
Pilots | 399 |
Return to Step 1 | 400 |
Ask the Practitioners | 402 |
Metrics | 403 |
Provide Training | 404 |
ISO Certification | 407 |
ISO Checklists | 408 |
Pros and Cons of ISO Certification | 410 |
Capability Maturity Model (CMM) | 411 |
CMM Level 1 Initial | 413 |
CMM Level 3 Defined | 414 |
CMM Level 4 Managed | 415 |
Test Process Improvement (TPI) Model | 416 |
Key Areas of the TPI | 417 |
Levels of Maturity | 418 |
Improvement Suggestions | 420 |
Description for Test Spec Techniques (5A) | 424 |
Improvement Suggestions for Test Spec Techniques (5A) | 425 |
Chapter 12 Some Final Thoughts | 427 |
Use Preventive Testing Techniques | 429 |
Develop a Testing Strategy | 430 |
Analyze Defect Trends and Patterns | 431 |
Conduct Training Continually | 432 |
Appendix A Glossary of Terms | 433 |
Glossary of Terms | 434 |
Appendix B Testing Survey | 451 |
Test and Evaluation Practices Survey | 452 |
Appendix C IEEE Templates | 459 |
IEEE Templates | 460 |
Test Documents | 461 |
Test Plan | 462 |
Unit Testing | 463 |
Test Design Specification | 464 |
Test Procedure | 465 |
Test Log | 466 |
Test Incident Report | 467 |
Test Summary Report | 468 |
Appendix D Sample Master Test Plan | 471 |
Sample Master Test Plan | 472 |
4 Test Items | 475 |
5 Software Risk Issues | 476 |
7 Features Not to Test | 477 |
Defect Tracking | 478 |
Test Cycles | 479 |
10 Suspension Criteria and Resumption Requirements | 480 |
13 Test Environment | 481 |
15 Responsibilities | 482 |
17 Planning Risks and Contingencies | 483 |
18 Approvals | 484 |
Appendix B for STQE.net MTP | 485 |
Appendix C for STQE.net MTP | 486 |
Appendix D for STQE.net MTP | 488 |
Appendix E Simplified Unit Test Plan | 489 |
Simplified Unit Test Plan | 490 |
4 Functions/Features, Attributes, Sub-Functions | 491 |
5 Shared Elements | 492 |
7 Non-Modified or Other Functions and Attributes | 493 |
9 Approach | 494 |
Appendix F Process Diagrams | 495 |
Appendix G Bibliography | 499 |
Index | 513 |
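Among the test-effectiveness metrics listed in the contents above is Defect Removal Efficiency (DRE). The standard DRE calculation, the percentage of all defects that were removed before release, can be sketched as follows; the defect counts used here are made-up illustrative numbers, not data from the book:

```python
def defect_removal_efficiency(found_in_testing: int, found_after_release: int) -> float:
    """DRE = defects removed before release / total defects, as a percentage."""
    total = found_in_testing + found_after_release
    if total == 0:
        return 100.0  # no defects anywhere: nothing escaped
    return 100.0 * found_in_testing / total

# Example: 90 defects caught in testing and 10 found in the field
# gives a DRE of 90%.
print(defect_removal_efficiency(90, 10))
```

A rising DRE across releases suggests the testing process is catching a larger share of defects before customers do, which is the sense in which the book treats it as a measure of test effectiveness.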