SXSW 2008 Panel – A/B Testing: Design Friend or Foe?

A/B Testing: Design Friend or Foe?

– Corey Chandler, Lead Interaction Designer, eBay Inc
– Jake Cressman, Producer, Electronic Arts
– Chris Maliwat, Sr Dir of Prod Mgmt, Vuze Inc
– Micah Alpern, Design Dir, Social Search, Yahoo! Inc
– Elliot Shmukler, Principal Prod Mgr, LinkedIn

A very valuable panel on testing design changes to determine their business outcome before unleashing them on the public at large. Check out my notes below.

#### What is A/B testing? ####

– Divert small amounts of traffic to different versions of site design to get metrics on what works.
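The panel stayed conceptual, but the mechanics are straightforward. Below is a minimal sketch of how a site might deterministically divert a slice of traffic to a new design; the experiment name, user ID format, and "control"/"treatment" labels are my own illustrative assumptions, not anything the panel described.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into 'control' or 'treatment'.

    Hashing the user ID together with the experiment name keeps a user
    in the same bucket across visits, and keeps buckets independent
    across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 2**32  # map to a number in [0, 1)
    return "treatment" if bucket < split else "control"

# Divert 10% of traffic to the new design; serve the old one to the rest.
print(assign_variant("user-12345", "homepage-redesign", split=0.10))
```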

#### Is it the job of Interaction Designers to predict the outcome of our designs? ####

– Shmukler: It’s our job, but it’s an impossible one. It’s impossible to predict even with significant expertise in your domain area.
– Maliwat: You can predict, but you can’t know to what degree you are correct. Testing can verify and validate the decision/opinion.
– Alpern: It is our job and is important, but we need to set up heuristics to verify what works and why.

> Netflix: “Predictions color our thinking. We need to test.”

– Maliwat: But that’s not to say they didn’t make choices and predictions.
– Shmukler: You can’t test every choice you make on a page, but testing gives you a way to refine.
– Cressman: Every designer brings a toolbox of heuristics to a design. Where those heuristics come from and how they are validated is the question; they seem to just be passed down from book to book and blog to blog. Testing gives us a way to validate them.

> Article: Design choices can cripple a web site. “None of us know what the BEST page design or copy is until we test. It is time to put away the design/copy expertise of gurus and test.” Concept: there are NO real experts. Levels the playing field.

– Maliwat: You can’t crowdsource a design; at least, the application of that is minimal. The iPod would not have been designed by a crowd. The contribution of A/B testing is to help optimize, not to revolutionize, e.g. “Are two buttons better than one? Is red better than orange?”
– Cressman: Sometimes the absurd ideas are effective… but you won’t find that out until you test.
– Alpern: A novice might use testing as their north star, but it won’t solve fundamental underlying problems.
– Shmukler: You don’t have the resources to test every possible option under the sun; you will always have to make choices, and expertise will always be required as a starting point. Small design changes won’t show you any difference in the testing numbers; changes have to be fairly major for A/B testing to actually show you whether a design is better or worse. (See the rough calculation after this list.)
– Chandler: Small changes CAN effect major differences. Leading can make all the difference between something showing up above the fold vs. below the fold.
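Neither panelist put numbers on this, but a standard back-of-the-envelope power calculation (my addition, with invented baseline and lift figures) shows why both points can be true: what matters is the size of the lift in the metric and the traffic you can push through the test, not how many pixels moved.

```python
from math import ceil

def visitors_per_variant(p_base: float, relative_lift: float,
                         z_alpha: float = 1.96,  # 95% confidence, two-sided
                         z_beta: float = 0.84    # 80% power
                         ) -> int:
    """Rough visitors needed per variant to detect a relative lift in a
    conversion rate, using the usual normal approximation."""
    p_new = p_base * (1 + relative_lift)
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_new - p_base) ** 2)

# A 2% relative lift on a 5% conversion rate takes roughly 750,000 visitors
# per variant to detect; a 20% lift takes roughly 8,100.
print(visitors_per_variant(0.05, 0.02))
print(visitors_per_variant(0.05, 0.20))
```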

> Bryan Eisenberg – ClickZ Network: “How to decrease sales by 90%.”
> “You should focus on one change at a time. See if it effects a change. Then make another change.”

– Chandler: There is a lot of overhead in testing. How do you keep the big picture in mind? Is it worth doing the testing?
– Shmukler: If the risk is high, extensive testing should be done.
– Maliwat: Testing sounds dull, doesn’t it? It doesn’t feel very creative. You have to balance incremental steps towards a solution with keeping the bigger picture in mind. If you are JUST making incremental changes and not looking towards a larger vision, you are not effecting a very large change.
– Alpern: You have to be willing to take risks and push the envelope and take larger leaps than just small incremental changes.

> Jakob Nielsen: “A/B testing can only be done if there is one clear, measurable action.”

– Chandler: There are things beyond what you are measuring (number of clicks, conversions, etc.) that are important. There is a larger issue at play: are people enjoying the experience, and will they recommend this to their friends? How much attention should go to the quantitative vs. the qualitative elements?

– Shmukler: You always have to pay attention to the quantitative. There is ALWAYS a goal.
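Nielsen’s point maps directly onto how results get read: with one clear conversion event per variant, the comparison is a textbook two-proportion z-test. The sketch below uses invented counts purely for illustration.

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing the conversion rates of variants A and B
    (normal approximation with a pooled rate)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical numbers: 10,000 visitors per variant, 500 vs. 560 conversions.
print(two_proportion_z(conv_a=500, n_a=10_000, conv_b=560, n_b=10_000))
```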

#### When is it worth it to compromise a core metric to improve customer experience? ####

E.g. sacrificing 25% of revenue to build towards a larger vision.

– Shmukler: So define the larger vision as the metric you are measuring!
– Alpern: Longer-term goals are harder to measure than instantaneous metrics (page views, etc.). If you are obsessed with the short-term metric, your user experience can become perverted.

#### How might the role of designers change in the future as a result of this kind of measuring? ####

– Maliwat: Testing shifts our responsibility, as our decisions require more validation. Ultimately this will be good for design; it’s good to test and prove before releasing something to a billion page views. But it’s making us feel less important. It doesn’t feel creative.
– Alpern: It’s exciting that the barriers to prototyping and creating products are getting lower. We can develop and share test examples with an appropriately sized focus group, and that is empowering.

#### Chandler: How do we decide what level of quality needs to go out for the tests? ####

(E.g. seeing whether citrus fruits affect scurvy in sailors, and feeling bad for the sailors who didn’t get the citrus…) Are we willing to harm the experience of users for a period of time in order to test a possibility? Where is the threshold?

– Maliwat: When death is not involved, the thresholds are different. :) It’s okay to be radical.
– Alpern: If companies don’t treat their customers with respect, they will lose.
– Shmukler: A bad experience is acceptable for a short period of time.
– Chandler: Sometimes you need an acclimation period, though! How long do we wait before we freak out about the bad results of a test?

#### Q&A Session ####

**Q: What do you think about sales funnels, i.e. isolated environments for an interaction? (This is a specific UI design question.)**

– Alpern: They’re pretty common in registration flows, etc. The question is when you enter one. It should never be a barrier to exploration; you should always allow a clear path back out. Isolated environments are usually a good thing. You also have to be careful about which metric you want to keep in mind, because metrics might be in conflict (completion rate vs. sales)!

**Q: What is your relationship with in-house analysts?**

– Maliwat: I am those other people :). In general you have to be in agreement about what is important, and how to measure what is important.
– Shmukler: Many times in-house analysts have expertise in how to read the metrics and are less emotionally attached to the design, so they can point out things you are blind to.
– Chandler: That interaction between the two groups is important. We provide each other insight. Designers can help to interpret the results better than analysts can. Analysts can provide better data than designers can.



2 responses to “SXSW 2008 Panel – A/B Testing: Design Friend or Foe?”

  1. Matthew Pennell:

    Excellent notes. :) This was one of the panels that I really should have gone to (the topic being very relevant to my day job), but I was seduced by one of the more designer-y sessions instead. Now I can pretend I went to both!

  2. […] For a more detailed Summary of the session check Matt Heerema’s Web Design Blog […]
