Monday, January 21, 2013

Is a Technical Test Team a Good Thing?


During a recent exchange with another testing blogger, he made the following comment:
“It's fabulous to have at least one testing toolsmith on a project, but a project that's dominated by them will often tend to focus on the tools, rather than the testing.”
This is an interesting comment. And it goes against just about everything we have been working toward over the past few years. We built a technical test team where everyone is an automated tester, and everyone is a programmer/tool builder.

In software testing, there is a risk of spending too much time on the wrong things. I imagine it is common for testers who are automators to prefer spending more time on automation rather than on manual testing and other non-technical tasks. It is like a child wanting to skip the vegetables and go straight to dessert. Delicious but not very healthy.

The other blogger also suggested that a team of technical testers may share a single point of view with the application programmers.
"Usually, there are more than enough programmers on a project, and therefore more than enough people who think like programmers."
His concern is that many programmers think alike and have similar biases. The role of testers is to bring a fresh perspective to the project, and a technically oriented team may not be able to do that.

Interesting points.  I can only respond to these arguments from my own experience.

  • We have a large technical test team. Everyone is a programmer or at least has growing programming skills. Non-technical testers are not considered during the recruiting process.
  • We are an agile shop, heavily influenced by the principles of agile. I make this distinction because there are many "agile" shops and yet relatively few agile shops. Many places dress traditional command-and-control project management in agile clothes and then think they are agile.
  • In our shop, development teams are largely self-organizing – they own most of their own decisions and figure out how to work most efficiently. One key to making this work is direct feedback. If things are not going well, it is easy to tell (solving problems is another story).
  • Success of our testing team is measured by how well it enables the development process.  If we are doing testing right, we speed programming and application development.  Testers enable development to start effectively very early in the sprint, provide immediate feedback throughout the sprint, and give the team confidence to program and to commit code late in the sprint.  We deliver our code to production at the end of each sprint, so testers have to do things right to keep this process moving.  
So, back to the comments.

Technical testers will “focus on tools, rather than the testing.”  This cannot happen (at least for long) in an agile shop like ours.  A tester who does not support the development process will slow it down and the direct-feedback machine will roar.  In fact, the only way we are able to develop and to test efficiently and deliver to production in short cycles is because we have technical testers who can leverage themselves.

The development team will fall into 'group think' when application programmers and testers share the same skill set.  As I look around the development team floor, I see a diverse cast of characters, and group think does not seem a big concern to me.  Instead of seeing a limitation when programmers and testers share a skill, I see it as a bonus.  If the programmer and the tester don’t speak the same language, they probably won’t talk as much or share as many ideas.  It is much easier for programmers to explain their code to technical testers than it is to those who have no idea what the programmer is saying.

The programmers working on our test team want to be on the test team.  They are not here as a punishment or to bide their time.  They are testers who are also programmers.  Not all programmers have this interest in testing, and that difference may be enough to vaccinate us against group think.



2 comments:

  1. I think you have a great setup. But in general I do see automation testers losing track of testing and focusing on automation and tools rather than testing.

    Am curious to find out what your test design process is. How do you decide what tests you are going to run, and is this affected by any technical limitations?

    Also if there are tests which are tedious to automate what do you do with these?

    1. Hi Richard,

      I agree with your comment about automation testers losing track of testing and focusing on automation and tools. That is why everyone on our team has general testing responsibilities. One of our guys asked if he could focus solely on automation. I told him that if he works only on automation and is not involved with direct application testing, he will soon become irrelevant. Working on project sprint work keeps everyone focused and honest.

      Our test design process matches our development process. We work in two-week sprints, and development tasks are broken into one-day units (where we can). Testing happens early and often. In this case, the test-design process varies depending on the need. It ranges from simple, business-focused checks to broad exploratory testing to formal analysis and formal test design methods. Execution of these tests can also vary from manual to hybrid to fully automated, depending on what is most efficient.

      Thanks for asking…Bob
