Strictly by the textbook, we collected our technical requirements for a test automation tool in countless workshops. All requirements are neatly structured in Excel (because everyone can use Excel somehow, from the tester to the managing director). Thanks to platforms such as Testtool Review, the candidate tools are then easy to find. Well, and what do we do now with the 10 remaining tools?
How about a counting rhyme?
"10 small testing tools can rejoice now,
one can't be tried, there were only nine"
We work with standard technology, program in a standard programming language, and use a standard GUI without a lot of bells and whistles. (You can do exciting projects with it too, honestly!) So it's hard to understand why we can't evaluate a tool ourselves, but instead have to have it "set up" and "customized" by the manufacturer, possibly even at a charge.
"9 small testing tools ran tests overnight.
One crashed a lot, there were only eight."
One tool almost dropped out of the final during installation, because it simply could not be installed (on, again, a plain, freshly set up standard PC). Well, support was friendly, so it worked in the end anyway. But it didn't run stably on any of the three workstations we used. Even friendly support is no consolation for that.
"8 small testing tools, we don't want to bend,
one doesn't fit into the process, there were only seven."
We are happy to adapt our processes, if there is a reason to. A tool can also be such a reason, because every manufacturer has presumably put real thought into how the workflow and the functions are implemented, and you can assume a great deal of manufacturer experience behind that. But it has to be justifiable. If it isn't, then it doesn't fit.
"7 small testing tools, what do we check first?
support, sales and supply, now there are only six of them."
When you purchase a tool, you always enter into a partnership. Such a commitment is usually long-term, and the relationship has to fit as well. A Christmas card from sales (quite analog, maybe even handwritten) is a nice gesture. But if, after you download the evaluation version, the sales department calls you every day ("Do you have any questions? Can we help?") and sends you e-mails on top of that, it becomes intrusive, and even when you do have a question, you no longer want to ask it. Not even the joy of the Christmas card helps then. Likewise, when it comes to support and the range of training courses on offer, you want a fair "partnership". Unfortunately, not everyone delivers that...
"6 small testing tools, the field is down to few.
Four we did not understand, there were only two."
Test automation tools can be complex and can sometimes only be mastered with in-depth training. But they don't have to be. If seasoned software developers and testers are not able to operate a tool using the documentation and their own experience, it is questionable whether it can prove itself in daily use. And there is another way, as the two remaining tools prove: both can be operated so intuitively that it is a real pleasure to work with them. We even hear off the record that team members genuinely enjoy implementing tests with them.
"2 small testing tools now remain to choose from,
'one will make it, the other won't' is clear either way."
The two finalists are neck and neck, and now both have to show what they can do. The final decision will be made in early 2014. Until then, I'm looking forward to the Christmas cards.