Analysis helps software testers shape not only tooling but also how it is used and the strategy behind it, suggests James Bach, Tricentis Fellow, Satisfice CEO and consulting software tester.
In a blog post for Tricentis, Bach says he believes many developers find analysis in software testing unpleasant, if not downright intimidating.
“This is why I split out five domains of work and list analytical work as just one of those domains. Still, it’s an important domain, and my personal favourite,” he says.
Bach says even sitting down and working out exactly how you recognise bugs can pay dividends. How might a bug be present yet invisible to your team? How might it be visible and go unnoticed?
“If the product just changed, what is the best testing you can do in the next hour to catch problems that might have been introduced?” he says. “What is the difference between an important problem and an unimportant one?”
Asking more analytical questions can help develop clever answers to interesting problems, and deep testing grounded in that analysis can find elusive bugs, Bach suggests.
“Of all the things you could test, what are you not testing?” he says. “What data are you testing with? What data should you be testing with?”
Tricentis, he notes, offers tools that provide some assistance in answering these questions.
“I have been particularly impressed with the analytical features of Tricentis LiveCompare.”
Bach says a test impact analysis (TIA) tool that helps determine which subset of tests to execute for a given set of changes, fitting a certain risk profile, is unfortunately not the norm.
Most other test tools focus more on administrative or technological aspects of the test process, he explains.
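The idea behind test impact analysis can be sketched in a few lines: given a record of which tests exercise which parts of the codebase, a change set is mapped to the subset of tests worth re-running. This is a minimal illustration only; the function and data names are hypothetical and do not reflect how LiveCompare or any particular tool works internally.

```python
def select_impacted_tests(coverage_map, changed_modules):
    """Return the set of tests that cover any changed module.

    coverage_map maps each source module to the tests exercising it;
    changed_modules is the set of modules touched by the change.
    """
    impacted = set()
    for module in changed_modules:
        impacted.update(coverage_map.get(module, []))
    return impacted


# Illustrative coverage data (invented for this sketch).
coverage_map = {
    "billing.py": ["test_invoice_totals", "test_tax_rounding"],
    "auth.py": ["test_login", "test_password_reset"],
    "reports.py": ["test_monthly_summary"],
}

# If only billing.py changed, only its two tests need to run.
print(sorted(select_impacted_tests(coverage_map, {"billing.py"})))
# → ['test_invoice_totals', 'test_tax_rounding']
```

A real TIA tool would also weigh risk, so that a small change in a critical module could still trigger a broader run than this one-to-one mapping implies.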
Bach says that as part of his role as Tricentis Fellow, he is working to shape tooling that helps testers do analysis and design tests, “not just push buttons”.
Read the full article for further examples of analysis and reasoning in software testing.