Tool support for testing (CAST)
Software Testing ISEB Foundation Certificate Course
Contents
- Types of CAST tool
- Why capture/replay is not test automation
- Automating and testing are separate skills
Testing tool classification
- Requirements testing tools
- Static analysis tools
- Test design tools
- Test data preparation tools
- Test running tools (character-based, GUI)
- Comparison tools
- Test harnesses and drivers
- Performance test tools
- Dynamic analysis tools
- Debugging tools
- Test management tools
- Coverage measurement tools
[Diagram: where the tool types fit across the life cycle — static analysis, test design, test data preparation, coverage measurement, test running, dynamic analysis, debugging, performance measurement, test harnesses & drivers, comparison]
Requirements testing tools
- Automated support for verification and validation of requirements models
  - consistency checking
  - animation
- Tool information available from: Ovum Evaluates Software Testing Tools (subscription service)
Static analysis tools
- Provide information about the quality of software
- Code is examined, not executed
- Objective measures
  - cyclomatic complexity (sketch below)
  - others: nesting levels, size
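As an illustration of the kind of objective measure a static analysis tool computes, the minimal sketch below approximates McCabe's cyclomatic complexity for one Python function by parsing (not executing) the code and counting decision points. The node set and the example function are illustrative only, not how any particular tool works.

```python
import ast

# Decision-point node types counted for this rough approximation
# (illustrative, not an exhaustive list).
DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)

def cyclomatic_complexity(source):
    """Approximate McCabe complexity: decision points + 1, from the parse tree."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return decisions + 1

EXAMPLE = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""

print(cyclomatic_complexity(EXAMPLE))  # 3: two decision points plus one
```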
Test design tools
- Generate test inputs (sketch below)
  - from a formal specification or CASE repository
  - from code (e.g. code not covered yet)
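A minimal sketch of specification-driven input generation: given a formal statement that a field must lie in a numeric range (a hypothetical spec fragment), boundary values either side of the limits are derived automatically.

```python
def boundary_values(lower, upper):
    """Derive boundary-value test inputs for a field specified as lower..upper."""
    return [lower - 1, lower, lower + 1, upper - 1, upper, upper + 1]

# e.g. a specification rule "age must be between 18 and 65 inclusive"
print(boundary_values(18, 65))  # [17, 18, 19, 64, 65, 66]
```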
Test data preparation tools
- Data manipulation (sketch below)
  - selected from existing databases or files
  - created according to some rules
  - edited from other sources
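The sketch below shows rule-based test data creation: rows satisfying simple rules are generated and written to a CSV file for the test running tools to pick up. The field names and rules are invented for illustration.

```python
import csv
import random
import string

def random_name(length=8):
    return "".join(random.choices(string.ascii_lowercase, k=length)).title()

# Create 100 customer records according to simple rules: sequential id,
# random name, age between 18 and 90 (all illustrative).
rows = [{"id": i, "name": random_name(), "age": random.randint(18, 90)}
        for i in range(1, 101)]

with open("test_customers.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name", "age"])
    writer.writeheader()
    writer.writerows(rows)
```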
Test running tools 1
- Interface to the software being tested
- Run tests as though run by a human tester
- Test scripts in a programmable language (sketch below)
- Data, test inputs and expected results held in test repositories
- Most often used to automate regression testing
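A minimal sketch of a test script in a programmable language: test inputs and expected results live in a repository (a CSV file here), and the script drives the software under test once per case. The command-line program name `calc` and the column names are assumptions, not any particular tool's interface.

```python
import csv
import subprocess

def run_case(test_input):
    """Drive the software under test (hypothetical 'calc' CLI) with one input."""
    result = subprocess.run(["calc", test_input], capture_output=True, text=True)
    return result.stdout.strip()

# Repository of regression cases with columns: input, expected
with open("regression_cases.csv", newline="") as f:
    for case in csv.DictReader(f):
        actual = run_case(case["input"])
        verdict = "PASS" if actual == case["expected"] else "FAIL"
        print(f'{case["input"]}: {verdict}')
```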
Test running tools 2
- Character-based
  - simulate user interaction from dumb terminals
  - capture keystrokes and screen responses
- GUI (Graphical User Interface)
  - simulate user interaction for WIMP applications (Windows, Icons, Mouse, Pointer)
  - capture mouse movement, button clicks, and keyboard inputs
  - capture screens, bitmaps, characters, object states
Comparison tools
- Detect differences between actual test results and expected results
  - screens, characters, bitmaps
  - masking and filtering (sketch below)
- Test running tools normally include comparison capability
- Stand-alone comparison tools for files or databases
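To make "masking and filtering" concrete, here is a minimal sketch assuming the volatile fields are timestamps and session ids: both actual and expected output are masked before comparison, so only genuine differences are reported.

```python
import re

# Patterns for fields that legitimately differ between runs (illustrative).
MASKS = [
    (re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}"), "<TIMESTAMP>"),
    (re.compile(r"session=[0-9a-f]+"), "session=<ID>"),
]

def mask(text):
    for pattern, replacement in MASKS:
        text = pattern.sub(replacement, text)
    return text

def compare(actual, expected):
    """Post-execution comparison after masking volatile fields."""
    return mask(actual) == mask(expected)

print(compare("login ok 2024-05-01 10:22:31 session=ab12f",
              "login ok 2023-11-09 08:00:00 session=99c0d"))  # True
```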
Test harnesses and drivers
- Used to exercise software which does not have a user interface (yet) — see the sketch below
- Used to run groups of automated tests or comparisons
- Often custom-built
- Simulators (where testing in the real environment would be too costly or dangerous)
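A minimal sketch of a custom-built driver: it exercises a component that has no user interface yet by calling its API directly and checking the results. The pricing function here is a hypothetical stand-in for the real component under test.

```python
import unittest

def price_with_discount(amount, customer_type):
    # Stand-in for the real component under test (no UI yet).
    return amount * (0.9 if customer_type == "gold" else 1.0)

class PricingDriver(unittest.TestCase):
    """Driver that exercises the component's API directly."""

    def test_gold_customer_gets_discount(self):
        self.assertAlmostEqual(price_with_discount(100.0, "gold"), 90.0)

    def test_standard_customer_pays_full_price(self):
        self.assertAlmostEqual(price_with_discount(100.0, "standard"), 100.0)

if __name__ == "__main__":
    unittest.main()
```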
Performance testing tools
Dynamic analysis tools
- Provide run-time information on software (while tests are run) — see the sketch below
  - allocation, use and de-allocation of resources, e.g. memory leaks
  - flag unassigned pointers or pointer arithmetic faults
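A minimal sketch of run-time (dynamic) analysis using Python's standard tracemalloc module: allocations are recorded while the code runs, and growth between snapshots points at possible leaks. The "leak" is simulated for illustration.

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

leaky_cache = []
for i in range(10_000):
    leaky_cache.append("payload " * 50)   # simulated leak: never released

after = tracemalloc.take_snapshot()
# Report the three source lines with the largest allocation growth.
for stat in after.compare_to(before, "lineno")[:3]:
    print(stat)
```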
Debugging tools
- Used by programmers when investigating, fixing and testing faults
- Used to reproduce faults and examine program execution in detail (sketch below)
  - single stepping
  - breakpoints or watchpoints at any statement
  - examine contents of variables and other data
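A minimal sketch of debugger use while reproducing a fault, using Python's built-in breakpoint() and pdb: execution pauses at the breakpoint, where the programmer can single-step (n), set further breakpoints (b), and inspect variables (p total, p values).

```python
def average(values):
    total = sum(values)
    breakpoint()                   # execution stops here under the pdb debugger
    return total / len(values)     # ZeroDivisionError when values is empty

if __name__ == "__main__":
    average([])                    # reproduce the fault under the debugger
```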
Test management tools
- Management of testware: test plans, specifications, results
- Project management of the test process, e.g. estimation, scheduling tests, logging results
- Incident management tools (may include workflow facilities to track allocation, correction and retesting)
- Traceability (of tests to requirements, designs)
Coverage measurement tools
- Objective measure of which parts of the software structure were executed by tests
- Code is instrumented in a static analysis pass
- Tests are run through the instrumented code
- Tool reports what has and has not been covered by those tests, line by line, plus summary statistics (sketch below)
- Different types of coverage: statement, branch, condition, LCSAJ, and others
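As a concrete example of statement coverage measurement, here is a hedged sketch using the API of the third-party coverage.py package; the test module name and its run_all() function are hypothetical placeholders for whatever actually runs the tests.

```python
import coverage   # third-party coverage.py package

cov = coverage.Coverage()
cov.start()                         # record which lines run from here on

import my_tests                     # hypothetical module containing the tests
my_tests.run_all()                  # run the tests while coverage is recording

cov.stop()
cov.save()
cov.report(show_missing=True)       # per-file summary plus the uncovered line numbers
```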
Advantages of recording manual tests
- documents what the tester actually did
  - useful for capturing ad hoc tests (e.g. by end users)
  - may enable software failures to be reproduced
- produces a detailed "script"
  - records actual inputs
  - can be used by a technical person to implement a more maintainable automated test
- ideal for one-off tasks
  - such as long or complicated data entry
Captured test scripts
- will not be very understandable
  - it is a programming language, after all!
  - during maintenance you will need to know more than can ever be 'automatically commented'
- will not be resilient to many software changes
  - a simple interface change can impact many scripts
- do not include verification
  - it may be easy to add a few simple screen-based comparisons
Compare seldom vs compare often
[Diagram: failure analysis effort compared for the two approaches]
Too much sensitivity = redundancy
- If all tests are robust, the unexpected change is missed
- If all tests are sensitive, they all show the same unexpected change
Automated verification
- there are many choices to be made
  - dynamic / post-execution, compare lots / compare little, resilience to change / bug-finding effectiveness
- scripts can soon become very complex
  - more susceptible to change, harder to maintain
- there is a lot of work involved
  - speed and accuracy of tool use is very important
- usually there is more verification that can (and perhaps should) be done
  - automation can lead to better testing (not just faster testing)
Effort to automate
- The effort required to automate any one test varies greatly
  - typically between 2 and 10 times the manual test effort
- and depends on:
  - tool, skills, environment and software under test
  - the existing manual test process, which may be:
    - unscripted manual testing
    - scripted (vague) manual testing
    - scripted (detailed) manual testing
Unscripted manual testing
Scripted (vague) manual testing

Step | Input | Expected Result | Pass
1 | Run up Scribble | Document menu displayed |
2 | Open file with sorted list | Menus displayed |
3 | Select Add items to List | Item box displayed |
4 | Add some items to List | Items added in order |
5 | Move an item | Item moved, list is unsorted |
6 | Add an item | Item added at end of List |
7 | Delete item from List | Item deleted |
8 | Delete item not in List | Error message displayed |
9 | Save changes in new file | New file created |
A vague manual test script

Step | Input | Expected Result | Pass
1 | Run up Scribble | Document menu displayed |
2 | Open file with sorted list | Menus displayed |
3 | Select Add items to List | Item box displayed |
4 | Add two items to List | Items added in order |
5 | Move an item | Item moved, list is unsorted |
6 | Add an item | Item added at end of List |
7 | Delete item from List | Item deleted |
8 | Delete item not in List | Error message displayed |
9 | Save changes in new file | New file created |
Scripted (detailed) manual testing

Step | Input | Expected Result | Pass
1 | Double click “Scribble” icon | Scribble opened, List menu disabled |
2 | Click on “File” menu | File menu appears, options enabled: Open, New, Exit |
3 | Click on “Open” option | “Open” dialogue box lists document "countries.dcm" in current folder |
4 | Select “countries.dcm” and click OK button | “Open” dialogue box removed, file “countries.dcm” opened and displayed, List menu enabled |
Don’t automate too much long term
- as the test suite grows ever larger, so do the maintenance costs
  - maintenance effort is cumulative, benefits are not
- the test suite takes on a life of its own
  - testers depart, others arrive, the test suite grows larger
  - nobody knows exactly what all the tests do, so no one dares throw any away in case they’re important
- inappropriate tests are automated
  - automation becomes an end in itself
Maintain control
- keep pruning
  - remove dead wood: redundant, superseded, duplicated, worn-out tests
  - challenge new additions (what’s the benefit?)
- measure costs & benefits
  - maintenance costs
  - time or effort saved, faults found?
- commit and maintain resources
  - a “champion” to promote automation
  - technical support
  - consultancy/advice
- scripting
  - develop and maintain a script library
  - data-driven approach, lots of re-use (sketch below)
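A minimal sketch of the data-driven approach with a maintained script library: a small set of reusable, named actions lives in one place, and each test is just a row of data, so many tests share the same code. The application object and its open/add/save_as methods are hypothetical stand-ins.

```python
# Library of reusable actions, maintained in one place (names are illustrative).
ACTIONS = {
    "open_file": lambda app, arg: app.open(arg),
    "add_item":  lambda app, arg: app.add(arg),
    "save_as":   lambda app, arg: app.save_as(arg),
}

def run_table(app, table):
    """Interpret one test expressed purely as data rows."""
    for action, argument, expected in table:
        result = ACTIONS[action](app, argument)
        assert result == expected, f"{action}({argument!r}) returned {result!r}"

# One regression test as a data table (illustrative values).
TEST_TABLE = [
    ("open_file", "countries.dcm",  "opened"),
    ("add_item",  "Denmark",        "added"),
    ("save_as",   "countries2.dcm", "saved"),
]
```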
Tests to automate
- run many times
  - regression tests
  - mundane tests
- expensive to perform manually
  - time-consuming and necessary
  - multi-user tests, endurance/reliability tests
- difficult to perform manually
  - timing-critical tests
Tests not to automate
- not run often
  - if there is no need (rather than because they are expensive to run manually)
  - one-off tests (unless several iterations are likely and the build cost can be minimised)
- not important
  - will not find serious problems
- usability tests
  - do the colours look nice?
  - some aspects of multi-media applications
Summary: Key Points
- There are many different types of tool support for testing, covering all areas of the life cycle.
- Automation requires planning and up-front effort.
- Identify and adopt best practice.