• Analytics – The Usability Lab of the new decade

    Brandon started a discussion on an internal mailing list, asking, “The usability lab is now the ________ ?”, and explaining:

    10 to 15 years ago, the usability laboratory was the must-have for vetting and testing your design ideas. But more nimble development processes and new tools seem to have superseded the usability lab. Some of these are:

    • remote screen sharing and screen recording tools and services
    • voice of the customer feedback systems, like Get Satisfaction and other SaaS tools
    • A/B testing and multivariate testing
    • remote co-design tools like online card sorting
    • survey tools for straight-up surveys and concept evaluations
    • public betas, previews, and opt-ins

      By picking a smart set of these tools, I think an organization can make smarter decisions throughout the design and development process in a live dialog with more users than a traditional usability lab ever could.

    The gist of the internal response was, “Analytics!” As Jesse put it, “Studying actual user behavior will beat the lab every time.” Analytics alone isn’t sufficient, so it’s best augmented with live intercepts of actual users engaging in real tasks.

    As I see it, though, this is not news. We’ve known about analytics, multivariate and A/B testing, and the like for over a decade. Companies like Amazon have demonstrated just how powerful it is to instrument your site and use that data to drive key decisions. Heck, 5 years ago, Adaptive Path began work on what became Measure Map, an analytics tool designed specifically for bloggers and our reaction to a marketplace full of bloated tools.

    However, even today, when we ask our clients for analytics data, they commonly have none. How is it that in 2010, there are still many organizations not taking advantage of this rich vein of information?

    The first reason (and, really, it’s my standard answer for almost anything to do with product development) is mindset. Most organizations simply don’t have the mindset to measure, analyze, and iterate to improve, particularly product development teams. This is one place that marketing seems to have product development beat.

    For decades, marketing has tracked the impact of its efforts, and that practice has carried over to the Web. Product development often does not — before ubiquitous internet, there was no value in instrumenting software interfaces, because there was no way to get that data. So product development learned to make decisions using other methods, such as usability engineering. And once something ships, product groups tend to move on to the next thing, and aren’t particularly interested in how well what they shipped is performing — that’s now in the past. Product dev teams need to incorporate analytics feedback into their processes.

    Another issue that came up in our company dialogue was that many companies that do use analytics tools don’t know what to do with the data. It’s not clear how to turn that data into information, and that information into valuable insights about what action to take. They don’t know what questions to ask of the data; or they ask too many questions and get so many answers that they don’t know how to prioritize the feedback; or they have a focused set of answers, but they don’t know what to change in their efforts to desirably “move the needle”.

    So what to do? Well, if you’re not embracing analytics, do so, now. Google Analytics is free. Use it. But don’t go crazy. Identify a few key measures. Allow a little while for the data to be dependable. Develop some hypotheses as to what’s happening, and make some small changes to test those hypotheses. Lather, rinse, and repeat. You’ll grow more confident, and as you do, you’ll get more sophisticated. Always remember that analytics are not “the answer” — but they are an input into a continually improving and iterating development process.
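    To make that hypothesis-testing step concrete, here is a minimal sketch of my own (not something from the post) of how you might check whether a small change actually moved a key measure. It assumes a hypothetical A/B test on a signup page; the visitor and conversion counts are invented, and the conversion_significance helper is simply an illustration of a standard two-proportion z-test written in Python.

      # Sketch only: compare a key measure (a hypothetical signup conversion rate)
      # between the current design (A) and a small variant (B).
      # All counts below are made-up numbers for illustration.
      from math import sqrt
      from statistics import NormalDist

      def conversion_significance(visitors_a, conversions_a, visitors_b, conversions_b):
          """Return both conversion rates and a two-sided p-value for their difference."""
          rate_a = conversions_a / visitors_a
          rate_b = conversions_b / visitors_b
          # Pool the two samples to estimate the standard error under "no real difference".
          pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
          se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
          z = (rate_b - rate_a) / se
          p_value = 2 * (1 - NormalDist().cdf(abs(z)))
          return rate_a, rate_b, p_value

      # Hypothetical week of traffic: current page vs. a reworded call-to-action.
      rate_a, rate_b, p = conversion_significance(4820, 241, 4795, 287)
      print(f"control: {rate_a:.2%}  variant: {rate_b:.2%}  p-value: {p:.3f}")

    If the p-value is small (commonly below 0.05), the change probably moved the needle; if not, gather more data or revisit the hypothesis before making the change permanent. The point is the loop itself, not this particular test.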

    There are 5 thoughts on this idea

    1. Glinskiii

      Hi Peter,

      Enjoyed the read. As a former Analytics practitioner (turned strategist), I can say from experience that our WA discipline is a complement to Usability and not a replacement. When you rely on web analytics data as your only source of input, you end up relying on your analyst’s heuristic skills to make assumptions about design. Data can only take you so far, and when this happens, mistakes are inevitable.

      My recommended approach has always been to use analytics for macro observations as a way of isolating problems (optimize x page, improve x form area). Then, take it to the lab to get specific. This process not only dramatically cuts down on testing time, but also gives you the qualitative user feedback to build a compelling case to create change. No matter how compelling the data is, it’s hard to get someone to empathize with a report. Analytics + Lab is a one-two punch.

      With all that being said, a UX practitioner well versed in both WA and Lab methods offers a killer combination and a great option for low-cost experience optimization. If a client isn’t willing to pay for both WA skills and real usability, it’s going to be a tough sell to get them to develop a testing discipline that’ll give them any real level of insight.

      Keep the great content coming.

      ~patrick

    2. Gino Z.

      The bullet point I see missing here is building in tools for conversation directly with your audience. This can’t always happen, but it is one of the most powerful tools I’ve personally experienced (a la Flickr’s forums or Basecamp Answers).

      It’s a slippery slope because you open yourself up to direct communication (and there are crazy people on the internet!), but this method provides the most insight about people’s concerns and the things they love. It also fosters the most trust, because your users get to talk directly with the folks that design and build the stuff they’re using. That’s rare.

    3. Brandon Schauer

      I think that saying this is all about analytics misses the mark. The nature of the internet and the availability of new and cheap tools make it possible to do research, vet concepts, and try out alternative designs before they ever hit prime-time usage, where analytics is one of the preferred tools.

      If you’re only using analytics, you’re missing many new ways of integrating your customer into the design process before you invest in development and launch.

    4. Samantha Starmer

      I’m in agreement with Brandon. In my opinion, the key is that using only analytics, usability, contextual inquiry, or any research and insights technique alone means you miss out on a huge amount of valuable information. Just this week I posted a brand new position on our UX team at REI as a UX Sr. Analyst. The role is expected to synthesize quantitative data (WA, MVT, surveys, etc.) and qualitative data (customer feedback, call center info, etc.) to come up with customer experience recommendations and help with prioritization.

    5. Sally Carson

      Building on what Samantha’s saying, I think analytics provide the “what” and qualitative analysis can provide the “why.”

      For example — and pardon me if you’ve heard me use this example before:

      If using Google Analytics, you see that time spent on a particular page is unusually high, that could mean that (1) the page is providing something of value that’s causing the user to linger, OR (2) the user is stalling there, unable to locate what they’re looking for.

      These are two very different scenarios, each showing the same data, and only qualitative analysis could help explain whether the data is cause for concern or celebration.

