2002
DOI: 10.1002/spe.491
Lessons learned from automating tests for an operations support system

Abstract: We present experience gained in automating tests for an operations support system. A major portion of the effort was devoted to extending a commercial test tool so that testers could easily manipulate graphical user interface (GUI) objects on two implementations of the application. For this purpose, we developed a test automation library as support infrastructure for writing tests. The challenges and tradeoffs are discussed, such as simplicity/complexity for a tester versus a library developer, hiding/exposing …
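The abstract describes a test automation library that sits between testers' scripts and the commercial test tool, so the same test can drive GUI objects on two implementations of the application. A minimal sketch of that kind of abstraction layer is shown below; all class, method, and object names here are illustrative assumptions, not the paper's actual API.

```python
# Hypothetical sketch (not the paper's actual library): an abstraction
# layer that lets test scripts manipulate GUI objects by logical name,
# hiding which of two application implementations is under test.

class GuiDriver:
    """Stand-in for a commercial test tool's API for one implementation."""
    def __init__(self, object_map):
        # logical name -> implementation-specific widget identifier
        self.object_map = object_map
        self.log = []

    def click(self, widget_id):
        self.log.append(("click", widget_id))

    def type_text(self, widget_id, text):
        self.log.append(("type", widget_id, text))


class TestLibrary:
    """The support infrastructure testers write against: simple verbs and
    logical object names, with implementation-specific detail hidden."""
    def __init__(self, driver):
        self.driver = driver

    def press(self, logical_name):
        self.driver.click(self.driver.object_map[logical_name])

    def fill(self, logical_name, text):
        self.driver.type_text(self.driver.object_map[logical_name], text)


# The same test script runs against either implementation by swapping maps.
java_map = {"login": "btn_login_1", "user": "txt_user_1"}
web_map = {"login": "#login-button", "user": "#username"}

lib = TestLibrary(GuiDriver(web_map))
lib.fill("user", "alice")
lib.press("login")
print(lib.driver.log)
# [('type', '#username', 'alice'), ('click', '#login-button')]
```

This illustrates the simplicity/complexity tradeoff the abstract mentions: the tester's script stays trivial, while the library developer absorbs the per-implementation mapping.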

Cited by 5 publications (5 citation statements)
References 4 publications
“…The high complexity of a test automation system is a risk to the perceived value of the automation. If manual testing is seen as easier to perform than writing the automated test scripts, the consequence may be that testers choose to work manually instead of automating tests, in particular if there is time pressure.…”
Section: Results
confidence: 99%
“…Inadequate development practices: A critical insight is that test automation is software development and needs to be treated as such. This means that there is a clear need for staff skilled in programming and with an understanding of how to do software development, to be able to do effective and efficient development of the test automation.…”
Section: Results
confidence: 99%
“…Gut check: Also "data gut check." Quick, broad, and shallow testing (44) before and during data analysis. Synonymous words: smoke test, sanity check (45), consistency check, sniff test, soundness check.…”
Section: Boxes, Box 1: Terminology
confidence: 99%
“…Gut check: Also “data gut check.” Quick, broad, and shallow testing [48] before and during data analysis. Although this is usually described in the context of software development, the concept of a data-specific gut check can include checking the dimensions of data structures after merging, or assessing null values/missing values, zero values, negative values, and ranges of values to see if they make sense (synonymous words: smoke test, sanity check [49], consistency check, sniff test, soundness check).…”
Section: Introduction
confidence: 99%
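The citation statement above spells out what a data "gut check" inspects: row counts after a merge, missing values, negative values, and value ranges. A hedged sketch of such a check is given below; the function and field names are purely illustrative, not from any cited work.

```python
# Illustrative "data gut check": quick, broad, shallow assertions on
# merged records. Names (gut_check, "amount") are hypothetical examples.

def gut_check(rows, expected_min_rows, numeric_field):
    """Fail fast on obviously broken data; otherwise return summary stats."""
    assert len(rows) >= expected_min_rows, "merge lost rows?"
    values = [r.get(numeric_field) for r in rows]
    missing = sum(1 for v in values if v is None)       # null/missing values
    present = [v for v in values if v is not None]
    negatives = sum(1 for v in present if v < 0)        # suspicious negatives
    return {
        "rows": len(rows),
        "missing": missing,
        "negatives": negatives,
        "min": min(present),   # range of values: do they make sense?
        "max": max(present),
    }

merged = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # a missing value is counted, not fatal
    {"id": 3, "amount": 7.5},
]
print(gut_check(merged, expected_min_rows=3, numeric_field="amount"))
# {'rows': 3, 'missing': 1, 'negatives': 0, 'min': 7.5, 'max': 10.0}
```

As the statement notes, the point is breadth and speed rather than depth: a few cheap assertions before analysis catch gross merge or encoding errors early.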