Q&A with Jakob Nielsen and Kara Pernice: An Interview by Jason Cranford Teague

Beauty may be in the eye of the beholder, but how can you truly know what they are beholding? In their new book, Eyetracking Web Usability, Jakob Nielsen and Kara Pernice explore usability through examining the gaze of the viewer. Jason Cranford Teague had a chance to talk to them about Web usability studies, navigation, misdirecting your audience, and the frustration of ad overload.

Jason Cranford Teague: From reading Eyetracking Web Usability, it’s clear that the designer’s primary goal should be to guide the viewer’s eye around the page, creating a hierarchy of information that helps the viewer find what they are looking for and accomplish a task. At its best, eyetracking should give designers a direct view as to whether they have achieved their goal. What are some reasons you find designers might not be receptive to considering eyetracking studies, and what do you tell them to overcome any doubts they might have about the tests’ validity?

Jakob Nielsen: The problem with eyetracking is that the particulars are so driven by the visual design of the specific Web pages we are testing. Whenever we show an example, it’s easy for designers to dismiss it as being caused by a usability problem on that site that they surely wouldn’t be stupid enough to make themselves.

That’s why it’s important to realize that the examples are just that: they are pictures to show a few typical cases of user behaviors, but the real findings come from our observation of those same user behaviors on hundreds of other pages that we can’t show in the book.

Ultimately, if people don’t want to believe in usability, they can always find some reason to dismiss any study. Too few test subjects, the wrong test participants, the wrong study stimuli, the wrong tasks, and so on. Or, when that fails, there’s always the old chestnut that “Jakob Nielsen used to be right, but he’s wrong now.” Of course, the people who dismiss the new findings are the same ones who dismissed the old findings 5 years ago, 10 years ago, and 15 years ago, even if they now claim that those old findings were “obvious.”

There’s no real way to prove that research is done right other than to wait 5 or 10 years to see whether the findings become generally accepted. The one thing we can say is to ask readers to think about who was right about usability 10 years ago. Chances are that the people who knew how to do valid research then also know how to do it now.

Kara Pernice: I think designers actually welcome eyetracking results too easily. Unlike basic usability studies, eyetracking studies give you the impression that you are in the user’s head as he works. So to me, a bigger challenge than getting people to believe the results is getting them to understand that they have to look at a whole body of research and not just at one user’s tempting gazeplot on a page.

Jason: Eyetracking can be a powerful tool in the usability tester’s arsenal, but it can feel as if eyetracking results are treated like a Rorschach test, saying more about the person reading the results than the person being recorded. What advice do you give that can help ensure more accurate and unbiased analysis?

Kara: One of the main shortcomings of some eyetracking studies is that the facilitator sets it up like a Rorschach test. The problem is that a Web user is never handed one Web page and asked to dissect it. Of course designers examine individual pages and do critiques. But users do not. ET [Eye Tracking] technology makes it easy to run studies on a stand-alone Web page, but it is dangerous to test usability this way because it is not the way users think or work.

Usability people should test the Web as the fluid medium that it is. That is: Give the participants real tasks, very open-ended ones and very specific ones. Let them do their own tasks. Allow them to visit the sites and pages that they want.

People do not read a Web page as they would an inkblot that a scientist puts before them. The facilitator may show the user one element at a time, asking him to give it his full attention. But this is simply not even close to the way people work on the Web today, and it will give misinformed results.

People are usually doing a task, or they have already seen a part of the page on a previous page of the site, so they ignore it, either because they don’t like it or because they simply don’t need it at the time. You should test and evaluate pages based not just on the one page or part you are interested in, but on all of the pages that the user visited.

The last point is that the subject of an eyetracking test should never be the user. The subject is the designs he is using.

Jakob: In any study, one of the first things we tell participants is that “we’re not testing you, we are testing the design.” We partly say this to put people at ease, but we also say so because it’s true. It doesn’t really matter how any one person uses a particular Web site. The users are simply debugging tools to exercise the functionality in ways that expose whether the design is easy or difficult to use.

If we’re testing a real technofreak, then that person may proceed through all the study tasks without making any errors. But there will still be places where even a geek is slowed down or sidetracked by a miscue. On the other hand, say we test somebody who’s really afraid of computers and doesn’t know anything about technology. Chances are that this person will have many more difficulties. But we’re not measuring the user’s speed; we’re looking for those elements of the site that are particularly misleading or awkward to use.

Jason: One worry I have about usability testing—and eyetracking testing specifically—is that a new user will act and react to a Web site very differently from a habitual user. How do you account for learned behavior in your studies?

Jakob: Very true: new users and experienced users do behave differently. But it’s important to remember that nobody becomes an expert without having been a novice first. And on the Web there’s no such thing as a training course. If a site is not immediately approachable, users simply turn away and go visit the next site on their list of search engine hits.

A second reason to focus on new users is that most Web pages are only visited once by any given person. It’s pretty rare for people to return repeatedly to the same page, even if they are loyal users of a particular site. So on the page level, you often have to cater mainly to first-time visitors to that page.

All that said, it’s obviously also important to study expert user performance, especially for applications such as intranets where employees may use mission-critical applications every day of the year. Doing so just requires a different set of user research methodologies than the ones we employed in this project. So skilled users will have to remain the topic of another book.

Kara: We consider various user groups in all of our larger-scale studies. In our eyetracking research we did evaluate new users and experienced Web users separately, and we found very few differences. Probably the biggest difference is that newer Web users are slightly less sure of themselves and less familiar with UI conventions. So basically newer Web users will sometimes simply fixate more, or for longer, than more experienced Web users do.

That said, it is incredibly hard these days to even find new Web users. Short of tapping into the Seniors market or running studies in third-world countries, we are hard-pressed to find people who aren’t familiar with links, buttons, applications, and forms. Regarding Seniors, we do many Web usability studies with people over 65, but they are not an ideal group to include in eyetracking studies, purely for logistical reasons: Progressive lenses, bifocals, and the various eye diseases that come with age interfere with the eyetracker and affect the calibration needed to capture the user’s gaze.

Jason: How important was the testing environment for ensuring consistent results in your study?

Kara: It was important to the degree that we needed to keep the eyetracker functioning well. This meant that the lighting in the room needed to be good but not glaring, the computers had to be powerful and meet the eyetracking manufacturer’s specs, the user had to sit a certain distance from the monitor, and the user’s chair couldn’t swivel or roll.

Jason: Web sites are constantly being redesigned. It seems like Facebook has a facelift every other month. Are the older screen shots presented in your book still valid?

Jakob: Definitely. This is not a book on how to redesign any of the particular Web sites we tested, so it doesn’t matter whether they have changed during the time we spent on data analysis and writing. The screenshots are purely examples to illustrate the research findings, and we keep seeing the same usability issues during studies we have conducted after the book was published.

Usability findings don’t change very fast because they are based on human behavior, not on technology. And users are pretty much the same this year as they were last year; certainly when it comes to issues like how they scan navigation menus or look around big, frivolous pictures on Web sites.

Jason: It often feels like Internet Explorer 6 will never go away. If the tests were done using IE 6, before tabbed browsing let each page open in its own tab, how does this affect the results in the book?

Kara: In behavioral and ET studies on IE 6 or IE 8, people often tune out the browser’s chrome, including tabs, once the page has fully loaded. They use the toolbars, search, and menus. But once focused within the sites they practice “selective disregard”: They know the tabs are there but put them out of their mind until they need them. They do the same thing with menus within the Web page, shopping carts, and all kinds of things. The few situations where the tabs may actually have a little impact are mainly compare-and-contrast scenarios, search engine results pages, and other directory-type pages. If people open multiple tabs (or windows, for that matter) and switch back and forth between them, it mostly affects their initial looks on a page. Some of these (we determined about 300 milliseconds’ worth) can often be discounted, as they are essentially the same as “residual looks” from the previous page.

Regardless, people still use the Back button very frequently, tabs or no tabs. Typical users rarely customize their defaults to have links auto-open new windows, and even experienced Web users do not always think to open a link in a new tab or window. It’s really a moot point in many ways because the eyetracking technology we use, even today, will not show the tabs in the user’s browser. This is hard-coded in the eyetracking software, so when we test with IE8 the tabs simply do not show for the users in the eyetracking framework.
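To make the “residual looks” discount Kara mentions concrete, here is a minimal sketch of how such early fixations might be filtered out of page-level gaze data. It is an illustration only, not the authors’ actual analysis pipeline: the fixation record format, the field names, and applying a flat 300-millisecond cutoff in this way are all assumptions.

    # Hypothetical sketch: discounting "residual looks" at the start of a page visit.
    # Assumes fixation records carry an onset time (ms, relative to page load) and a
    # duration (ms); the 300 ms window follows the figure mentioned in the interview.

    from dataclasses import dataclass

    @dataclass
    class Fixation:
        onset_ms: int      # time since the page finished loading
        duration_ms: int   # how long the gaze rested on this point
        x: int             # screen coordinates of the fixation
        y: int

    RESIDUAL_WINDOW_MS = 300  # looks this early may carry over from the previous page

    def discount_residual_looks(fixations):
        """Drop fixations that begin within the residual window after page load."""
        return [f for f in fixations if f.onset_ms >= RESIDUAL_WINDOW_MS]

    # Example: the first fixation lands almost immediately and is likely residual.
    page_fixations = [
        Fixation(onset_ms=40, duration_ms=180, x=512, y=300),   # likely residual
        Fixation(onset_ms=420, duration_ms=250, x=180, y=120),  # genuine look
        Fixation(onset_ms=900, duration_ms=310, x=200, y=480),  # genuine look
    ]
    print(discount_residual_looks(page_fixations))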

Jakob: It actually matters surprisingly little what browser people use. Most of the basic usability findings have been the same from Mosaic 1 to IE 8, with regard to things like users’ preference for getting the information fast and their dislike of glossy, uninformative pictures.

Of course, the browser version matters a lot to designers, because it dictates what technical tricks they can employ. (Here, the advice remains to stay two versions behind, so that people can use your site even if they are slow to upgrade.) But users don’t care whether something is implemented in tables or CSS, or whether a feature is done in Flash, Silverlight, or AJAX. In all our books, we focus on user experience as experienced by the user, and not on the implementation.

Jason: How do you deal with dynamic elements such as drop downs and AJAX-based content when performing eyetracking tests?

Jakob: While we’re watching the user, a dynamic screen element works exactly the same as anything else: we can observe whether the user’s gaze is directed at whatever information is currently displayed in a certain spot on the screen. The same goes for watching slow-motion replays after the study.

The problem comes when we want to create visualizations like the ones used for the illustrations in the book. Here, we depend on having a screenshot as the background for the picture. And static screenshots are not representative of dynamic screens. So if you make a heatmap of a page with, say, dropdown or flyout menus, it will appear that users spent significant time staring at blank areas of the screen. That’s because they looked at menu items that were temporarily displayed in those areas, even though they don’t appear on the screenshot.

This is only really a problem for writing a book, where we have to rely on printed screenshots. In our seminars, we show video clips with slow-motion replays of the users’ gaze paths, and there you can clearly see how people look at the dynamic elements, as they appear and disappear.

It’s easy enough to analyze the usability of dynamic screen elements. We just can’t show good static visualizations, so we have to do without the examples.
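As a rough illustration of why those static visualizations mislead, here is a minimal sketch of binning fixations into a heatmap grid. The coordinates and grid size are made up, and this is not the visualization method used in the book; it simply shows how gaze that landed on a transient flyout menu ends up counted over areas the underlying screenshot shows as blank.

    # Hypothetical sketch: aggregating fixation durations into a coarse heatmap grid.
    # Fixations recorded while a flyout menu was open still land in those grid cells,
    # even though the static screenshot used as the background shows blank page there.

    from collections import Counter

    CELL = 100  # grid cell size in pixels

    def heatmap(fixations):
        """Sum fixation durations per grid cell: {(col, row): total_ms}."""
        grid = Counter()
        for x, y, duration_ms in fixations:
            grid[(x // CELL, y // CELL)] += duration_ms
        return grid

    # (x, y, duration_ms); the middle two fixations fell on a flyout menu that is
    # no longer visible in the static screenshot behind the heatmap.
    fixations = [
        (120, 80, 300),    # site logo / top navigation
        (450, 350, 400),   # flyout menu item (transient)
        (470, 420, 350),   # another flyout item (transient)
        (150, 500, 500),   # article text
    ]
    for cell, total_ms in sorted(heatmap(fixations).items()):
        print(f"cell {cell}: {total_ms} ms of gaze")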

Jason: I’ve been reading a lot of debate about whether horizontal or vertical navigation is more user friendly. Have you encountered any strong evidence for either case while performing your eyetracking studies?

Kara: What we did learn is that the entire site aesthetic and IA [Information Architecture] dictate whether the navigation works well. We saw many instances of both horizontal and vertical nav working well or miserably. It’s really not a “one or the other” kind of decision. But one thing vertical nav has over horizontal is that there is less likelihood of more than one vertical column of items, whereas there are often several horizontal rows of items, such as banners, various types of nav, login, shopping cart, and search. The more UI elements people have to scan, the harder it is to find the menu.

Jason: In Eyetracking Web Usability, you bring up the concept of “miscues.” How do miscues erode a customer’s trust in a Web site, and how is that detected when performing usability testing?

Jakob: A miscue occurs when the user starts out doing something wrong, but then recovers and gets back on the right track. This is one of the usability issues that eyetracking is particularly good at revealing because we can observe users staring at something they don’t need and wasting time trying to decipher it.

You may think that miscues are no big deal because users eventually get to the correct place and accomplish their task. If people can buy on the site, isn’t it just as good if they look at a few more products beyond the ones they want?

The problem is that the miscue is caused by something: the user spends time looking at the wrong thing because it seemed like the right thing. In other words, the design was misleading. If this happens too often, users start thinking that the site is trying to trap them. Credibility can be lost very fast on the Web, where people have been trained to maintain a high level of vigilance due to the proliferation of scams.

Kara: You can almost think of miscues as simply distracting elements, something that draws the user’s attention when it shouldn’t. Too many of these can decrease a person’s feeling of control on a site.

Jason: What is the most important thing a designer can do to make navigation obvious to the viewer?

Kara: The menu should have some visual border and color that differentiate it from the rest of the page. It should remain persistent and never change position, so people can find it when they need it and tune it out when they do not.

Jakob: Let me add that sometimes we find that users don’t fixate on the navigation at all. That’s actually good, because it means that the navigation is so consistently designed that people know what it is without having to inspect it closely. If users are satisfied with the content on a page and don’t want to move elsewhere, it’s great if the design allows them to spend their time on the good stuff and ignore the nav.

Jason: A growing trend I see in recent Web design is the emphasis of larger images—especially photography—as an integral part of the page’s message. What can eyetracking tell us about the best balance between imagery and text when communicating a compelling message?

Jakob: In our study, we found that big images often function as an obstacle course and that users deliberately look all the way around such images. This is because of the prevalence of big images that are purely decorative (or advertising) and don’t help users get any closer to their goals.

If there’s one thing we have learned from years of Web research, it’s that users are very impatient and don’t like to be slowed down. So these obstacle-course images are highly annoying.

On the other hand, we also find cases where users look closely at big images and really study the details. In fact, people often ask for even bigger images and enjoy features for zooming or getting close-up views.

What’s the difference between big images that are ignored and big images that attract attention? It’s whether the picture is content or fluff. In other words, does the picture contain something that users are interested in at that moment in time? If yes, people will enjoy the image and study it. If no, they will ignore it and be annoyed at having to look around it.

Kara: Stock images of smiling people are the biggest misallocation of screen real estate on the Web today. Stock art images are fine if they completely relate to the message the designer is trying to convey. But many that designers choose are so generic that they could appear inoffensively on just about any Web site.

Jason: One constant frustration Web designers face with clients or managers is when they tell us we have to fit more ads on a page to meet short term business goals. How can Web designers use eyetracking to show them that simply shoveling more advertisements on the page diminishes the effectiveness of the advertising and, indeed, of the site itself?

Kara: This is frustrating, but we have to empathize. Advertising is big business and feeds many people. Ads themselves are not the issue. Most users accept the model: suffer some ads and get free content. The problem is what you suggest—too many conspicuous ads that obliterate the content that users want.

Try pointing them to information about “Exhaustive Review.” The general rule is that more things on a page make it harder for people to find what they want. Cleaner pages inspire exploration and help the user feel in control.

If ads look like ads and don’t purport to be navigation then people can sometimes tune them out. So the real problem is not extra fixations or even a lot of extra effort. The real problem is how these ads affect the credibility and overall aesthetic of the site. Do you really want your brand to feel like Times Square? If the site doesn’t offer some major value to balance the inconvenience of all the ads, people will stop going there.

Jakob: The question is really long-term vs. short-term business goals. If you stuff a page with ads, you may be able to make money today from clueless marketers who don’t recognize that they’re paying to be displayed but not to be seen. In the long run, a page with too many ads means two things:

  • Poor performance of each ad, as they visually compete with too many other ads. Sooner or later, even the most clueless advertising manager will realize that the money is wasted, and they will stop doing business with you.
  • Poor user experience because the clutter makes it harder for users to locate the content they care about. Initially, page views won’t drop, because visitors only get annoyed after loading your pages. But sooner or later they will stop visiting, because each visit is so dissatisfying.

If you think the site won’t be around next year, then sure, go ahead and smother it under too many ads. But beware that this easily turns into a self-fulfilling prophecy.
