CrazyEgg To The Rescue Again

CrazyEgg is really one of my favorite secrets of being a metrics junkie, simply because it makes problems look so freaking obvious, when they could stay buried if you relied on quantitative analysis alone.  For example, there is nothing I can drill into in Google Analytics, A/Bingo, or my homegrown stats tracking which would have told me what this picture does:

That is my AdWords landing page.  I paid good money to get folks to see it, and I want them clicking the photo or the big purple button, not clicking the non-active text in the sidebar!  Thank you, CrazyEgg, you again earned your monthly keep and then some.

Sure, now that I know the problem is actually happening, fixing it is a matter of adding one line in a Rails template (to cause those bullet points to be hyperlinks to the conversion).  OK, two lines: one line to make the “obvious” fix… and another line for the A/B test.
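For the curious, those two lines look roughly like the following in an ERB template. This is an illustrative sketch, not production code: the `ab_test(name, alternatives)` signature matches A/Bingo's documented view helper, but I stub it out here so the snippet runs standalone, and the test name, alternatives, and signup path are all invented for the example.

```ruby
require "erb"

# Stand-in for A/Bingo's ab_test view helper so this snippet runs outside
# Rails; the real helper also records which alternative the visitor saw.
def ab_test(_test_name, alternatives)
  alternatives.first # always renders the control here
end

# One line for the A/B test, one line for the linked version of the bullet.
sidebar_item = ERB.new(<<~HTML)
  <% if ab_test("sidebar_links", ["plain", "linked"]) == "linked" %>
  <li><a href="/signup"><%= card_name %></a></li>
  <% else %>
  <li><%= card_name %></li>
  <% end %>
HTML

card_name = "Halloween Bingo"
puts sidebar_item.result(binding)
```

The conversion side is a separate call (`bingo!("sidebar_links")`, per the A/Bingo docs) on the action that completes the free trial registration.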

14 Responses to “CrazyEgg To The Rescue Again”

  1. Shadow14l March 21, 2010 at 10:50 am #

    Sorry but common sense would have been more apt in this situation. You don’t need a service to tell you that every item in a sidebar list should be linked.

  2. Jon March 21, 2010 at 11:37 am #

    Shadow14l, you don’t get it. The point is that CrazyEgg will show you the points where your common sense failed. No one has 100% failproof common sense, knowledge and forecasting ability. You might not need a reminder about sidebar links. You might need a reminder on other things. That’s it.

  3. Paul March 21, 2010 at 12:16 pm #

    Wow, that really is useful information! What interests me though is your response to such a picture – personally I would simply ‘make the “obvious” fix’. You propose also performing an A/B test – what exactly would you test in this scenario?

  4. David B. March 21, 2010 at 12:39 pm #

    It really fascinates me what these little tools can do to help improve conversion. Thanks for sharing!

  5. Patrick March 21, 2010 at 7:34 pm #

    >>
    You propose also performing an A/B test – what exactly would you test in this scenario?
    >>

    Alternative A is the site as I had it previously, Alternative B is the site with the five bingo cards on the left hand side linked to the appropriate registration screen, and the conversion of interest is whether someone successfully registers for the free trial or not. If adding the links decreases registration in a statistically significant way, for whatever reason (say, by confusing customers with too much visual clutter), it gets yanked no matter how “obvious” of an improvement it is.

  6. Paul March 22, 2010 at 12:41 am #

    >>
    If adding the links decreases registration in a statistically significant way […] it gets yanked
    >>

    Interesting – the image implies that the links are desperately needed. I suppose I never considered that they could possibly _decrease_ registration rates; that seems so counter-intuitive.

    But I suppose that it is possible, and thus will probably happen sooner or later. In order to prevent it, would you say the best approach is simply “test everything”?

  7. Sylvain March 27, 2010 at 11:25 pm #

    Very interesting. Thanks for sharing.

    One thing that’s getting my attention is the vertical “hit” line on the right hand side of the sidebar links. I wouldn’t have guessed that so many people would try and click away from the supposed link text. Probably worth considering extending the link area, or even putting a small clickable icon at the right of every item.

    Cheers,
    Sylvain

  8. Stuart March 28, 2010 at 1:34 pm #

    Have you tested the LP without a sidebar at all?

  9. Patrick March 31, 2010 at 6:58 pm #

    Now there is an idea…

  10. Anne Gunn April 1, 2010 at 6:52 am #

    You’ve probably seen this but I think it is a great post. It relates to your point that you can’t just rely on numbers/stats, you have to actually think about what is going on.

    http://blog.asmartbear.com/data-interpretation-mistakes.html

    p.s. Happy first day of going full time.

  11. Jeff April 7, 2010 at 4:25 am #

    In your A/B approach, are you randomly serving one or the other to different users, or did you go a period of time with A and then B (hopefully for equal periods)?

    Did you do any usability tests first with a few users?

    And how many more clicks does it take to justify making any changes?

  12. Patrick McKenzie April 7, 2010 at 5:39 am #

    Jeff: Users are randomly assigned to one of the two treatments and consistently see that treatment afterwards. For details, see the A/Bingo docs.

    I don’t do usability tests with users prior to deploying things live.

    Statistical significance determines that. I use, if I recall correctly, one-tailed t-tests.
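The sticky assignment described above can be sketched in plain Ruby. This is an illustration of the idea, not A/Bingo's actual implementation (which stores its own per-visitor identity): hashing a stable visitor identity together with the test name sends the same visitor to the same alternative on every request. The function name and visitor id here are made up.

```ruby
require "digest"

# Sketch of sticky A/B assignment: hash a stable visitor identity with the
# test name, then use the hash to pick an alternative. The same inputs
# always hash to the same value, so a visitor never flips treatments.
def assign(visitor_id, test_name, alternatives)
  digest = Digest::MD5.hexdigest("#{test_name}:#{visitor_id}").to_i(16)
  alternatives[digest % alternatives.size]
end

alternatives = ["plain_sidebar", "linked_sidebar"]
first_view  = assign("visitor-42", "sidebar_links", alternatives)
second_view = assign("visitor-42", "sidebar_links", alternatives)
# first_view and second_view are the same alternative on every request
```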

  13. zackattack April 18, 2010 at 7:52 pm #

    Hi Patrick,

    Can you link us to the formula for your one-tailed T tests? Do you have an app for that, or what?

    Thanks,
    Zack

  14. Patrick April 18, 2010 at 7:59 pm #

    Zack: see the A/Bingo source code. The math behind A/B testing is also covered here: http://elem.com/~btilly/effective-ab-testing/
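For readers who want the gist without digging through source: the standard significance calculation for comparing two conversion rates is a one-tailed test on proportions using the normal approximation, which the writeup linked above walks through. Here is a sketch of that math (my own illustration, not A/Bingo's code; the function name and sample counts are invented):

```ruby
# One-tailed test that treatment B converts better than treatment A,
# using the normal approximation for two proportions. Returns the
# probability of seeing a difference this large under the null
# hypothesis of no real difference between A and B.
def one_tailed_p_value(conv_a, total_a, conv_b, total_b)
  p_a = conv_a.to_f / total_a
  p_b = conv_b.to_f / total_b
  pooled = (conv_a + conv_b).to_f / (total_a + total_b)
  se = Math.sqrt(pooled * (1 - pooled) * (1.0 / total_a + 1.0 / total_b))
  z = (p_b - p_a) / se
  0.5 * Math.erfc(z / Math.sqrt(2)) # P(Z >= z) for a standard normal
end

# 500 visitors in each treatment; B converts 60/500 versus A's 40/500.
p_value = one_tailed_p_value(40, 500, 60, 500)
```

With those invented numbers the p-value comes out around 0.02, below the conventional 0.05 cutoff, so that improvement would count as statistically significant; identical conversion rates give exactly 0.5, i.e. a coin flip.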

Trackbacks/Pingbacks

  1. Listen to CrazyEgg To The Rescue Again - MicroISV on a Shoestring - Hear a Blog - March 22, 2010
