Monday, January 21, 2013

Is a Technical Test Team a Good Thing?


During a recent exchange with another testing blogger, he made the following comment:
“It's fabulous to have at least one testing toolsmith on a project, but a project that's dominated by them will often tend to focus on the tools, rather than the testing.”
This is an interesting comment. And it goes against just about everything we have been working toward over the past few years. We built a technical test team where everyone is an automated tester, and everyone is a programmer/tool builder.

In software testing, there is a risk of spending too much time on the wrong things.  I imagine it is common for testers who are automators to prefer working on automation over manual testing and other non-technical tasks.  It is like a child wanting to skip the vegetables and go straight to dessert. Delicious, but not very healthy.

The other blogger also suggested that a team of technical testers may share a single point of view with the application programmers.
"Usually, there are more than enough programmers on a project, and therefore more than enough people who think like programmers."
His concern is that many programmers think alike and have similar biases. The role of testers is to bring a fresh perspective to the project, and a technically oriented team may not be able to do that.

Interesting points.  I can only respond to these arguments from my own experience.

  • We have a large technical test team. Everyone is a programmer or at least has growing programming skills. Non-technical testers are not considered during the recruiting process.
  • We are an agile shop where we are heavily influenced by the principles of agile. I make this distinction because there are many "agile" shops and yet relatively few agile shops. Many places dress traditional command-and-control project management in agile clothes and then think they are agile.
  • In our shop, development teams are largely self organizing – they own most of their own decisions and figure out how to work most efficiently. One key to making this work is direct feedback. If things are not going well, it is easy to tell (solving problems is another story).
  • Success of our testing team is measured by how well it enables the development process.  If we are doing testing right, we speed programming and application development.  Testers enable development to start effectively very early in the sprint, provide immediate feedback throughout the sprint, and give the team confidence to program and to commit code late in the sprint.  We deliver our code to production at the end of each sprint, so testers have to do things right to keep this process moving.  
So, back to the comments.

Technical testers will “focus on tools, rather than the testing.”  This cannot happen (at least for long) in an agile shop like ours.  A tester who does not support the development process will slow it down and the direct-feedback machine will roar.  In fact, the only way we are able to develop and to test efficiently and deliver to production in short cycles is because we have technical testers who can leverage themselves.

The development team will fall into 'group think' when application programmers and testers share the same skill set.  As I look around the development team floor, I see a diverse cast of characters, and group think does not seem a big concern to me.  Instead of seeing a limitation when programmers and testers share a skill, I see it as a bonus.  If the programmer and the tester don’t speak the same language, they probably won’t talk as much or share as many ideas.  It is much easier for programmers to explain their code to technical testers than it is to those who have no idea what the programmer is saying.

The programmers working on our test team want to be on the test team.  They are not here as a punishment or to bide their time.  They are testers who are also programmers.  Not all programmers have this interest in testing, and that difference may be enough to vaccinate us against group think.



Thursday, January 17, 2013

View Into the Automation Lab


The fun of having technical skills on a test team is that you can fix the problems that bother you. One thing that has irked us is babysitting automated test execution.  Starting the execution is easy (the build system does it), but managing the run to completion (rerunning failed tests) eats time.

Here are details about our situation:
  • Our automation environment has a limited number of test machines (11-ish, to be exact). These machines are locked away in a room because their desktops cannot be locked while the automated tests run, and they are shared by teams across three continents (Europe, North America, and South America).
  • Today a team of testers monitors the automated regression run and makes sure that all of the tests have run.  When tests are identified for rerun, this team takes a group of them and manually kicks them off.
  • Once the tests are running, someone on the team monitors that set to make sure the tests complete, then kicks off another batch of reruns as they are identified.
  • If anyone from the other offices wants to kick off a test on those machines, he has to coordinate it with the regression team.  There is no good visibility into what the machines are doing.
  • The end result is a lot of time spent watching tests run and, potentially, large gaps in machine usage.

Cool Tech to Solve Problems

“Why not build a web-based application that anybody in the company can use to check the current status of their tests?”  Right away the ideas started to fly, and the solution, while relatively easy to accomplish, suddenly reached further than just automatically rerunning tests.  It would also be nice to have some sort of dashboard that shows the status of the test machines.  Oh, and how about the ability to change which machine a test runs on?  While we’re at it, since certain projects may have a higher priority, let’s add the ability to reorder tests in the queue so those can run first... the ideas were now flowing.
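
To make the queue idea concrete, here is a minimal sketch of the behavior we are imagining.  TestQueue and its method names are ours for illustration only, not code from the finished app:

# A sketch of the prioritized rerun queue described above.
class TestQueue
  def initialize
    @entries = []
    @counter = 0
  end

  # Higher priority runs first; ties keep their enqueue order.
  def enqueue(test_name, priority = 0)
    @counter += 1
    @entries << { :name => test_name, :priority => priority, :order => @counter }
  end

  # Bump a test up or down when a project's priority changes.
  def reprioritize(test_name, priority)
    entry = @entries.find { |e| e[:name] == test_name }
    entry[:priority] = priority if entry
  end

  # Hand the next test to whichever machine frees up.
  def next_test
    @entries.sort_by! { |e| [-e[:priority], e[:order]] }
    @entries.shift
  end
end

With something like this behind the web app, reordering becomes a priority change instead of a person shuffling reruns by hand.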

The first step of this solution was to build a function to monitor the test machines; the rerun app needs to know which machines are free.  It would be cool for people to know this too, so we built a web-based dashboard that shows the current status of each machine.  It’s very simple, but it provides a huge amount of information that is helpful when evaluating test runs.


This part is handled with Ruby and some SQL queries against the QualityCenter database.  Here’s how the QualityCenter database tells us whether a machine is available:

-- A machine is free when its most recent run (MAX(RN_RUN_ID) per host)
-- is no longer in 'Not Completed' status.
SELECT RN_HOST
  FROM td.RUN
 WHERE RN_STATUS <> 'Not Completed'
   AND RN_HOST IN ('<your machine name here>')
   AND RN_RUN_ID IN
      (SELECT MAX(RN_RUN_ID)
         FROM td.RUN
        GROUP BY RN_HOST)
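
On the Ruby side, the dashboard just has to run that query for each machine and render the answer.  Here is a minimal sketch using the Sequel gem; the connection string and the machine_available? helper are placeholders for illustration, not our production code:

require 'sequel'

# The connection string below is a placeholder -- point it at your own
# QualityCenter database (the adapter and credentials will differ per shop).
DB = Sequel.connect('oracle://qc_user:qc_password@qc-db-host/qcdb')

# A machine is free when its most recent run is no longer 'Not Completed'.
def machine_available?(host)
  sql = <<-SQL
    SELECT RN_HOST
      FROM td.RUN
     WHERE RN_STATUS <> 'Not Completed'
       AND RN_HOST = ?
       AND RN_RUN_ID IN
          (SELECT MAX(RN_RUN_ID)
             FROM td.RUN
            GROUP BY RN_HOST)
  SQL
  !DB.fetch(sql, host).all.empty?
end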


With the dashboard in place, the lion’s share of the work is left to backend code. The rest is display, database updates, and other code-type stuff to give it some appeal so it doesn’t look like a site a bunch of QA folks came up with. After all, we are a technical test team!

Wednesday, January 9, 2013

The Technical On-Deck List


There have been times when I have questioned our approach to staffing the test team with only technical people.  What will we do when all the test frameworks and testing-related tools and utilities are built? How do we keep all these programmers in testing happy and productive?

Happily that day never seems to come.  The applications and business needs keep changing and, better yet, the more cool (and productive) stuff we do, the more cool (and productive) stuff we find to do. I call this the escalating awesomeness factor.

This struck me today as I was sitting with a few members of my team troubleshooting some automation code.  As we worked, my mind raced to a list of other projects not getting attention.  Here is a sampling from my technical on-deck list:

Automation of Test Maintenance for Existing Tests
We have a large test suite that we run every day.  Running the tests each day enables us to keep up with test maintenance (things shouldn’t pile up until the end of a sprint before production delivery).  Even with proper staffing (the right people, and the right number of them), some parts of test maintenance are tedious.  On the short list of things to do is to look at all the tasks we perform for test maintenance and then program our way out of them.

Automated management for rerunning tests
Our build system kicks off our test suite, and most of the processes are automated.  But each day, we still spend too much valuable time babysitting tests.  One of the biggest time sucks is rerunning tests.  With our current application suite, we cannot send all our tests to the grid or cloud, and we have to work with a limited set of resources – licenses and workstations.  Using smart people to babysit machines is so wrong.  We must make tests and machines take care of themselves. We know what to do (in fact, we have a pretty cool plan for what to build), but the challenge is to find time and do it.
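
For the curious, the heart of the plan is no more exotic than a retry loop with a budget.  A rough sketch, where run_test stands in for our framework's real execution hook:

# Collect the failures, rerun them, and stop when everything passes
# or the retry budget runs out.
MAX_RETRIES = 2

def rerun_failures(failed_tests)
  MAX_RETRIES.times do
    break if failed_tests.empty?
    failed_tests = failed_tests.reject { |test| run_test(test) == :passed }
  end
  failed_tests  # whatever is left here needs a human
end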

Create custom test portals for each project
Our current test repository is geared for the test team.  It does not present information in a way that a specific development team (product owners, application programmers, and testers) cares about.  Further, we are between test frameworks.  Most tests are in our old GUI framework, some are in our new services framework, and soon tests will be in our new GUI framework – and there are unit and programmer-written integration tests on top of those.  Project teams don’t care which frameworks we use; they just want to know whether their code and features are working each day.  We have a prototype that pulls all this together and presents it in a useful way, but it is one more side project that we have to find the time and energy to complete.  The product teams will love it, though.
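
The prototype's core idea is simple: map each framework's results into one common shape so the portal can present a single pass/fail view per project.  A rough sketch, with hypothetical field names:

# One common result shape, regardless of which framework produced it.
Result = Struct.new(:project, :suite, :name, :status)

def from_gui_framework(raw)
  Result.new(raw[:project], 'gui', raw[:test_name], raw[:status])
end

def from_services_framework(raw)
  Result.new(raw[:project], 'services', raw[:name], raw[:outcome])
end

# The portal then just filters by project.
def results_for(project, all_results)
  all_results.select { |r| r.project == project }
end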

Being better with TDD and unit tests on our test team projects
This is a case of the shoemaker’s son having no shoes.  We on the test team have developed applications with no unit tests and no regression tests.  Eek.  We have some work to do to clean up our development projects.  Even though this work doesn’t add any cool new features to our frameworks, I am looking forward to it as an excuse to work with some new testing tools.  We have been looking at Cucumber and RSpec for a while.  This will be a good opportunity for all of us.
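
For a flavor of what we owe ourselves, here is a toy RSpec example.  The class under test (RerunBudget) is hypothetical; the point is the habit of spec-ing our own tools:

require 'rspec'

# A small class a test tool might contain: tracks failures per test and
# reports when the retry budget is spent.
class RerunBudget
  def initialize(max_retries)
    @max_retries = max_retries
    @attempts = Hash.new(0)
  end

  def record_failure(test_name)
    @attempts[test_name] += 1
  end

  def exhausted?(test_name)
    @attempts[test_name] >= @max_retries
  end
end

describe RerunBudget do
  it "gives up on a test once the retry budget is spent" do
    budget = RerunBudget.new(2)
    2.times { budget.record_failure('flaky_login_test') }
    budget.exhausted?('flaky_login_test').should be_true
  end

  it "keeps trying while retries remain" do
    budget = RerunBudget.new(2)
    budget.record_failure('flaky_login_test')
    budget.exhausted?('flaky_login_test').should be_false
  end
end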

Next generation tools
We are still using commercial tools, and I am anxious to move away from them (and to save the company the licensing fees).  There are things I like about the current tools that we will have to replicate before we leave.  Taking a relatively raw tool like WATIR (webdriver) and building all the features needed for abstraction, readability, reusability, and maintainability is a big job.  Even though I feel confident that we can use tools that are not complete and still get value, it is a daunting task.  It hurts so good.
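
To show the kind of abstraction I mean, here is a sketch of a page class wrapping raw watir-webdriver calls so tests read in domain language.  The page, element ids, and URL are made up for illustration:

require 'watir-webdriver'

# Raw Watir calls stay inside the page class; tests speak in intent.
class LoginPage
  URL = 'http://example.com/login'

  def initialize(browser)
    @browser = browser
  end

  def open
    @browser.goto URL
    self
  end

  def login_as(user, password)
    @browser.text_field(:id => 'username').set user
    @browser.text_field(:id => 'password').set password
    @browser.button(:id => 'login').click
  end
end

browser = Watir::Browser.new :firefox
LoginPage.new(browser).open.login_as('bob', 'secret')

The test reads as intent (open, login_as) while the Watir plumbing lives in one place we can maintain.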

I love having a full queue of meaty technical work, and I love even more knowing that when this is done, the next work will be even more challenging (and fun)!

Thursday, January 3, 2013

Hello World


Welcome to our new blog.  Our goal is to build a technical testing community to share experiences as we work to improve our existing automated testing tools, develop new frameworks using open-source libraries, and build new tools to solve other problems as we see them.

We will share technical information and points of view that promote the technicalization (not a real word, but I like it) of testing. We will share code, descriptions of our projects, failures, and hacking stories.  Expect to see references to agile testing -- agile development is the best way we have seen to develop (program and test) applications.  Agile testing (which is fundamentally different from traditional QA) cannot work without technical testing.

What you will not see here is anything that resembles traditional software testing or QA.  If you are working in that environment, I am sorry.  I have been there, and I have no intention of going back.

The core group of this new community works together in the same development team.  As we grow, we hope to gather kindred spirits and hackers who share our interests and technical goals.

In addition to the blog, we will actively tweet while we hack and otherwise goof off together.  We use the hashtag #rubytest.  Watch for it.

Death to software testing.  Long live software testing!

-- Bob Jones