Presenting Test Plans
In a sense, the usability test itself is a presentation of the test plan. The test plan is the agenda for the facilitator's meeting with the participant. But this is not the kind of presentation we're talking about here. Instead, this section provides strategies for preparing and giving presentations where the test plan itself is the deliverable, the work product under review. Like any documentation review, the structure and agenda for the meeting should be driven by the meeting's purpose.
There are a few main reasons to review a test plan with the project team and stakeholders. In a high-level meeting, your aim is to get buy-in for the overall approach, while at the other end of the spectrum, you may want to do a dry run of the usability test, digging into every detail.
If your stakeholders do not have much experience with usability testing, you may need to stage a buy-in meeting to review the overall process with them. You could hold a session that provides an introduction to usability testing in general, but this won't be as meaningful to stakeholders as a test plan that deals with their particular product. In this kind of meeting, touch on every aspect of the plan, but don't dig into the details of the script. Instead, spend roughly equal time on the objectives, the methodology, and the script.
Once your stakeholders are comfortable with the idea of usability testing and the approach you're taking, you can solicit input on the script. There are two kinds of input you need. First, you need to know whether your scenarios cover the full range of the system's uses. Second, you need to know what you want to get out of each scenario. If you're already familiar with the system, you should be able to do most of this yourself, but the stakeholders may have some input.
Testing the Test
The best way to get input on a test plan is to do a mock test, running through each step of your plan with a stand-in participant—perhaps a member of the team not directly involved with the design. Doing a dry run of the test is a good way to ensure that the scenarios aren't too goofy and that you can do everything you need to in the given amount of time. A dry run can uncover potential logistical issues—for example, transitioning from one scenario to the next—or identify major holes in the script where scenarios or facilitator instructions are missing.
A test plan isn't a hard story to tell, especially if the project team has bought into the idea of usability testing. While other documents in this book lend themselves to different meeting structures, the only decision you'll need to make for a usability test plan is how much time to spend on each section.
If you have only one meeting to review a test plan, make sure you cover each part—don't leave anything out—but you don't need to go into great detail on the logistics and methodology. If your stakeholders aren't big "detail people," you can also just hit the high points of the script, describing what scenarios you're testing and how those correspond to the different functions or areas of the web site.
Effective Presentation of the Test Script
If you've provided a lot of information in your test script, as in Figure 3.4, you need to decide how you will go through all the details during your review meeting. For the majority of reviews, describing the scenario and stating what will be read to the user may be sufficient. In these cases, you can gloss over the other information. If the scenario is especially complex, you may want to go through every detail. Additionally, if you're meeting with developers who will be building the product to test, you may want to spend more time on the expected product behavior—what the web site does in response to user actions.
Regardless of which method you choose to present the script, you may find that the meeting starts to get out of hand. Specifically, you may find your pristine usability test suddenly soiled by competing agendas—stakeholders or team members may want to add pointless and out-of-scope questions and tasks to the test. You may also find the review of the test derailed by questions about usability testing in general, or your methods.
Scope Creep: Losing Sight of Objectives
When presented with the opportunity to talk to actual end users, some stakeholders and team members are like kids in a candy store. They may start suggesting questions and tasks—reasonable though they might be—that are outside the scope of the test. Some stakeholders may use the post-test questionnaire, for example, to solicit feedback about other areas of their business. When this happens, the stakeholders need a reminder about the purpose of the test.
The flip side of this coin is that the suggested questions and modifications may fall within the scope of your script, but the script has become so long that there's no way to get through it all in the time allotted. This is a good indication that your objectives are not specific enough—they're not serving as an effective filter to keep the test focused.
Inevitably, you'll be presenting your test plan to a roomful of stakeholders, one or two of whom aren't familiar with usability testing. They may start questioning the purpose of the exercise, or worse, your methodology (see below). If your discussion is derailed by these kinds of questions, you should whip out your usability testing elevator pitch—the three-sentence description that gives an overview of usability testing while politely closing off this line of questioning. Don't have a usability elevator pitch? This is your reminder to come up with one, or just use this one, free of charge:
Usability testing is a means for us to gather feedback on the design of the system in the context of specific real-world tasks. By asking users to use the system (or a reasonable facsimile) we can observe opportunities to improve the design, catching them at this stage of the design process rather than later when changes would be more costly. Usability testing has been built into the project plan since day one. We need to get through the plan because we have users scheduled to come in next week, but if you want to talk further about usability testing, we can discuss it after the meeting.
Feel free to modify this to suit your needs. The message is clear: I want to help you understand usability testing, but now's not the time.
Usability testing has been a popular technique for web designers for years, and yet we're still called upon to justify our methods. You may get questions about the kinds of data you're gathering, the number of participants, the length of each session, or the format for follow-up questions. Here, too, a prepared three- or four-sentence response on methodology helps, though you may need several responses ready, anticipating different kinds of questions. You know your stakeholders best, so confer with your team before speaking to the client and brainstorm possible objections to the methodology.