Web Cams and User Testing

CoFactors has been quiet for a few days because I’ve been travelling and participating in the other aspect of our business besides design work: usability testing + analysis. Specifically, in this case, we were working with a consumer e-commerce client at a testing lab outside Philadelphia. Yours truly is more typically involved in the analysis and strategy side of things, but this time it made sense to have me taking notes. This also meant that I had control of the web cam.

There are a variety of posts that come to mind relating to the nuances of testing, of course. Catalyst uses a portable digital lab based around the Morae a/v recording software, for instance (you can even see a Catalyst case study on their site). There are also the realities of recruiting the right participants; the behind-the-glass client dynamics and etiquette; and even testing facility food. But the one I wanted to start with concerns the web cam.

Specifically: part of our portable lab setup involves two high-powered, networked laptops and a tripod-mounted camera. The user and moderator sit in the testing room, looking at a flatscreen connected to one laptop. Behind that flatscreen, facing the participant, is a webcam that records their facial expressions and gestures while they react to the interface. (Of course, we record what their mouse is doing too.) That camera is controlled by one of our note-takers, sitting in a different room. In this case, that was me. And it occurred to me to make a mental note of three best practices:

1. Have a mobile camera

Catalyst has invested a great deal in devising a system that captures real-time information and can also produce high-quality DVDs within minutes of a test ending. Some of the discussions that occurred while this system was evolving concerned the type of camera and how mobile it should be. The partner in charge of the testing business, Peter, insisted that we shouldn’t have a static camera in case the participant moved around. This meant that some of our staff had to construct a rig that would keep the picture totally stable while still being able to pan back and forth and tilt up and down in real time.

2. Make sure the drive mechanism and camera are unobtrusive

Peter also rejected, much to staff disappointment, a number of prototype setups because they were “too noisy or flashy.” Our team had found a light, portable tripod with gears that could be operated remotely and to which a camera could be attached. But when you operated it, it sounded like a very loud electric car, and it also had a flashing red light. At the time, although I agreed that was cheesy and ridiculous, I didn’t quite appreciate how absolutely wrong it would have been for actual testing. The current setup has no light and barely makes a whirr.

3. Make sure the controls are truly usable

The second note-taker in our setup is the one who tunes the client out and doesn’t necessarily worry about helping the moderator adjust his or her questioning. Instead, this person puts on earphones and makes sure that conversation and data are being captured correctly. Usually, this means just monitoring the recording software and doing some smart typing.

But sometimes, as with one of the participants yesterday, it means behaving more like a cameraman at a live sports event. One of our participants could not stop fidgeting: there was slouching; there was half standing; there was leaning forwards, backwards and sideways. Much waving of hands. But also an occasional high degree of self-consciousness whenever the participant remembered there was a camera present. The upshot: for an hour, to keep the participant solidly in the camera frame, I had to constantly track back and forth as well as up and down. Had the camera been immobile, or simply set to a wide shot, we would have lost valuable reactions. If it had been loud or otherwise obtrusive, it would have made the participant additionally uncomfortable. And had I not been able to operate the device easily with one hand, I could not have kept up with taking notes while also trying to keep the participant in frame.

Not rocket science, for sure. But it is testimony to the fact that an experienced anticipation of potential glitches is absolutely critical to polished usability testing. The idea is to present no distractions to the participant while capturing an enormous amount of behavioral data, and to be able to distribute some of that data in real time with readily portable equipment. Actually achieving these goals, however, is harder than it seems…and often the result of an appreciation for the importance of small details.

JF
