Continuing my lessons learned from thrilling interviews and years of test experience, I bring you the saga of automation, and why I am not unhappy about not getting a job.

During a recent interview I was asked about automation and how I would use it to make testing better for this specific company. I thought a few moments and answered truthfully: "I don't know… I don't know what needs to be automated yet." This answer made sense to me, but apparently was NOT the answer the manager I was sitting across from was looking for or expecting. He gave me a moment to continue, and I, being wise, decided I hadn't really put enough foot in my mouth and should go for broke. "I would automate the boring things," I continued, "and the things that would be easy to validate, such as checking output or log files, and anything that is monotonous or repetitive, so as to free up myself or other testers for real testing." There we go… that did it. I could tell by the look on his face that he and I vastly differed in our appreciation and valuation of automation. I could also tell I was about to get the polite handshake, which was rather quick in coming, and then I was shuffled to the door.

I don't dislike automation, in spite of not wanting to use it everywhere. I think it has a place. It is a tool, and like any other tool, it should be used when appropriate. What I hate… and when I say hate, I mean with the furious intensity of a thousand burning suns… is when someone wants to automate everything, without considering what automating everything would actually mean. I think automation is excellent when used in a smart manner, applied to specific tasks rather than treated as the end-all solution to all testing.

One of my favorite bloggers and testers, Lanette Creamer, recently said: "To summarize… the boring part IS your job as a tester. Any kind of tester. However, it is also your job to try to automate the boring part, for the simple fact that humans aren't especially good at repetitive tasks, and brain-engaged testing is the best sort of testing. It may not be possible to eliminate tedious checks, but try to reduce them where you can." I completely agree, and this is an excellent way to approach automation, in my opinion. Use automation to reduce the wear and tear on testers, removing the grind and allowing them to move forward to additional testing. Of course, as the focus of Lanette's article covered, sometimes the boring stuff can't be avoided, or even automated, so trying to automate boring tasks isn't always the smart play.

So, my being unwilling to try to shoehorn all of the testing into automation cost me an opportunity, but I really don’t feel like I lost out in this case. I’d have been miserable there, as I really need my random Exploratory testing to feel like I am testing stuff…  And I still don’t hate automation.  😉

As I noted in my previous post, I recently found myself going through interviews, as I am between contracts, and thought I would continue to discuss some of the questions I have been asked. Today I want to tackle a question I have been asked no fewer than three times so far:

What makes a great tester? This question is generally one that I dread, as the answer is likely to vary depending on the manager asking it. Some will want someone who has attention to detail, or someone who is meticulous about documenting the data collected from testing. Some may want someone who can follow instructions, or possibly someone who can code… and each of these is a valid quality to look for in a tester.

However, all that said, I still answer the question based on what I feel is important, and that quality is curiosity. The desire to dig into something, the compelling need to poke at something, to learn how it behaves and how it misbehaves. Curiosity is key, in my opinion, for a successful tester, since most of the other qualities can be learned; but if someone isn't curious, if someone doesn't enjoy digging into products to see what makes them tick, they simply won't be a long-term tester.

Each company and team will have its own processes and procedures, and people can learn those, just as they can learn to perform tasks related to testing. They can follow instructions and perform automation without having any investment in the test. They can, to a point, learn to be more attentive to detail and can learn processes to document data. On the other hand, I haven't found a way to teach or train curiosity.

Why curiosity? Because I believe it leads to better Exploratory Testing, or Ad Hoc testing. It leads to discovering new bugs, or discovering bad behavior, intended or otherwise. It leads a tester to try new things, to try things that were likely not intended, but not prevented. For example, on a project I recently worked on, one of the features was the inclusion of data compiled from a friends list, allowing a person to compare their game stats against their friends' game stats. This function relied on a series of server calls, each possibly taking up to 4 seconds, with an overall timeout of 1 minute before an error was returned. Using the baseline info provided, it was determined that the expected number of friends on an average person's list would be between 10-15, with a max of 30-35. Unfortunately, this data was not part of our product spec or any documentation. It was information I heard a pair of developers discussing in the hall, and while I didn't pay attention to it at that exact moment, it was logged and cataloged somewhere in my head, as evidenced by the fact that on my drive home from work, I realized we needed to develop tests to flex that area. When I got home, I mailed myself a message to look into it, asked the wife to remind me to test that area, and then scratched out some ideas on how to test… all because I was curious to see if we could break the system or cause data loss. I was curious to see if we could cause the API to choke or otherwise behave badly!

With those parameters, I designed basic testing to validate the core functionality, and then thought 'What if you had more friends?' and expanded the testing to 40, then 50 accounts. As we added more friends, we noted more and more timeouts, with up to 10% of the accounts failing to return data within that 1-minute window. So we flagged the area, filed a bug, and moved on due to time constraints. Of course, as luck would have it, had we had the time to continue investigating, we would have found that at 60 accounts in the friends list, it's possible to create a hard lock, but that would be discovered later. Even though there was not a person on the team with experience in this specific area, we were able to test it well enough to discover the root bug because we were curious and tried unexpected or unusual tests.
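That friend-count sweep could be sketched as a tiny harness like the one below. To be clear, this is a hypothetical reconstruction, not the project's actual code: `compare_stats` is a stub, and it uses a deliberately pessimistic sequential-call model (the 4-second worst case per call against the 60-second budget). The real service presumably overlapped its calls, which is why the expected 30-35 friend maximum worked in practice; the stub is only meant to show the shape of a boundary sweep that starts at the documented sizes and keeps going past them.

```python
# Hypothetical sketch of the friend-count boundary tests described above.
# compare_stats() stands in for the real stats-comparison API; here it is
# stubbed with a worst-case, strictly sequential timing model so the
# structure is runnable. The constants come from the post: up to 4 s per
# server call, 60 s overall before an error is returned.

SECONDS_PER_CALL = 4.0    # worst-case time for one server call
OVERALL_TIMEOUT_S = 60.0  # overall budget before an error is returned


def compare_stats(friend_count: int) -> float:
    """Stub: return the worst-case sequential fetch time for a friends list,
    raising TimeoutError when that exceeds the overall budget."""
    elapsed = friend_count * SECONDS_PER_CALL
    if elapsed > OVERALL_TIMEOUT_S:
        raise TimeoutError(f"{friend_count} friends: {elapsed:.0f}s over budget")
    return elapsed


def run_boundary_sweep(counts):
    """Record which friend counts complete and which time out."""
    results = {}
    for n in counts:
        try:
            compare_stats(n)
            results[n] = "ok"
        except TimeoutError:
            results[n] = "timeout"
    return results


# Baseline sizes (10-15), the stated max (30-35), then the curious cases.
print(run_boundary_sweep([10, 15, 30, 40, 50, 60]))
```

Even this crude model makes the interesting point visible: under sequential calls the budget is exhausted just past 15 friends, so the documented 30-35 maximum only holds if the calls overlap, and that unstated assumption is exactly the kind of thing a curiosity-driven sweep past the expected range will flush out.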

Curiosity.  It always gets my vote as the most important quality for a tester.

Recently I spent the afternoon going through an interview process for a Senior STE role that, unfortunately, I was unable to secure. However, the exchange was most excellent, as were the folks I was interviewing with, and throughout the interview there were many neat discussions to be had. At one point the manager I was chatting with placed a soda can on the table and asked "How many test cases can you create for this can?" I thought for a moment, and then asked "How many do you want? And how many do you wish to be meaningful?"

This response garnered a smile, and it led to an exchange that was rather engaging. At first I was given no restrictions or limits, so I discussed the possible tests on the physical can itself… the height, the weight, the tab on the top, the opening, and so forth. I created a table to show all the possible intersections of those tests. Then I asked again for more clarification… and again received nothing, so I continued to dig deeper, going into the stresses the can could withstand, the volume it could contain, the effects of pressure and heat or even cold on the can, and so on. At some point in the exchange I turned and said "I can continue and drive the test count up to at least 100 if you wish, but to be honest, at some point, without knowing the scope, the tests lose meaning."

Far too often we see this in a complex system, where there is test after test that, if we truly looked at the scope and intent, is completely unnecessary. I ran into this recently while testing a console-based game, where I discovered we had a series of tests covering a function we were not even responsible for. Not only were we testing outside of the scope of the tests, but we were running less comprehensive tests than the group that actually managed that area, so our data was inferior. The time invested in running these tests was not insignificant, and as soon as we realized it, we were able to increase coverage elsewhere that was more relevant. So how did those tests get into our plan, and why had we run them in the past? At some point, when the project had an entirely different team, someone felt it was important to have more tests, so more tests were added. It was a quantity issue, rather than quality. More tests created a greater need for people and for time to dedicate to the task, and gave the impression that there was lots of activity to be done, in spite of the fact that about 8% of the total tests were later discovered to be outside of scope or simply unnecessary.

So how do you migrate from quantity to quality? How do you ensure your tests are relevant and appropriate? Scope. Determining the intent, the scope, the purpose of the test is absolutely essential. In the case of the soda can above, my continued requests for more information, for a definition of scope, eventually led to "We want to know if it works correctly." This limited direction, this meager scoping of the tests, at least points to a few areas we can test. In a few moments I had a list of 31 tests that would be essential, such as verifying the can can be sealed correctly after being filled, whether it can withstand pressure in a normal environment, whether it can withstand a reasonable amount of heat or cold, and whether the tab works. It also included exceptions, such as what happens if the tab breaks: is it still possible to open the can?

With that little direction I moved the tests into basic functionality as well as usability, with a focus on how the end user will expect the can to work and how they will interact with it. This focus, this direction, allowed for a much shorter, much better focused list of tasks to test. Rather than plunging into an overly bloated test cycle, with tests that had no bearing on the scope, we instead moved forward to test only the relevant issues that would be important to the user.

For most of the last 20 years I have spent my time learning, growing, seeking, sharing and embracing the world of Software Test. And as rewarding as that journey has been, it is still not over, nor is it a journey I wish to take alone, so I offer this blog as a place to join me as I continue on my path to learn, to mentor and share, and most importantly, to test.

And as we journey together, I hope to learn as much from you, my visitors, as I can offer in return, allowing us both to experience growth and success.