CrazyEgg is really one of my favorite secrets of being a metrics junkie, simply because it makes problems look so freaking obvious, when they could stay buried if you relied on quantitative analysis alone. For example, there is nothing I can drill into in Google Analytics, A/Bingo, or my homegrown stats tracking which would have told me what this picture does:
That is my AdWords landing page. I paid good money to get folks to see it, and I want them clicking the photo or the big purple button, not clicking the non-active text in the sidebar! Thank you, CrazyEgg, you again earned your monthly keep and then some.
Sure, now that I know the problem is actually happening, fixing it is a matter of adding one line in a Rails template (to cause those bullet points to be hyperlinks to the conversion). OK, two lines: one line to make the “obvious” fix… and another line for the A/B test.

Sorry, but common sense would have been more apt in this situation. You don’t need a service to tell you that any list in a sidebar should be fully linked.
Shadow14, you don’t get it. The point is that CrazyEgg will show you the points where your common sense failed. No one has 100% foolproof common sense, knowledge, and foresight. You might not need a reminder about sidebar links. You might need a reminder on other things. That’s it.
Wow, that really is useful information! What interests me though is your response to such a picture – personally I would simply ‘make the “obvious” fix’. You propose also performing an A/B test – what exactly would you test in this scenario?
It really fascinates me what these little tools can do to help improve conversion. Thanks for sharing!
>>
You propose also performing an A/B test – what exactly would you test in this scenario?
>>
Alternative A is the site as I had it previously, Alternative B is the site with the five bingo cards on the left hand side linked to the appropriate registration screen, the conversion of interest is whether someone successfully registers for the free trial or not. If adding the links decreases registration in a statistically significant way, for whatever reason (say, by confusing customers with too much visual clutter) it gets yanked no matter how “obvious” of an improvement it is.
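For readers curious about the mechanics, here is a toy sketch in plain Ruby of what a two-alternative test like that tallies. This is illustrative only, not A/Bingo's actual code; the class and test names are made up.

```ruby
# A toy two-alternative split test tally -- a sketch, not A/Bingo's real code.
# Each visitor is shown one alternative; a "conversion" here stands for a
# successful free-trial registration.
class SplitTest
  def initialize(name, alternatives)
    @name = name
    @counts = alternatives.map { |alt| [alt, { shown: 0, converted: 0 }] }.to_h
  end

  # Record that a visitor saw the given alternative.
  def show(alternative)
    @counts[alternative][:shown] += 1
  end

  # Record that a visitor on the given alternative registered.
  def convert!(alternative)
    @counts[alternative][:converted] += 1
  end

  def conversion_rate(alternative)
    c = @counts[alternative]
    c[:shown].zero? ? 0.0 : c[:converted].to_f / c[:shown]
  end
end

# Hypothetical numbers, for illustration only.
test = SplitTest.new("sidebar_links", ["no_links", "links"])
100.times { test.show("no_links") }
100.times { test.show("links") }
8.times  { test.convert!("no_links") }
12.times { test.convert!("links") }
test.conversion_rate("links")  # => 0.12
```

The decision rule then runs over those tallies: if the "links" rate is lower than "no_links" to a statistically significant degree, the change gets yanked.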
>>
If adding the links decreases registration in a statistically significant way […] it gets yanked
>>
Interesting – the image implies that the links are desperately needed. I suppose I never considered that they could possibly _decrease_ registration rates; that seems so counter-intuitive.
But I suppose that it is possible, and thus will probably happen sooner or later. In order to prevent it, would you say the best approach is simply “test everything”?
Very interesting. Thanks for sharing.
One thing that’s getting my attention is the vertical “hit” line on the right hand side of the side links. I wouldn’t have guessed that so many people would try to click away from the supposed link text. Probably worth considering extending the link, or even putting a small clickable icon at the right of every item.
Cheers,
Sylvain
Have you tested the LP without a sidebar at all?
Now there is an idea…
You’ve probably seen this but I think it is a great post. It relates to your point that you can’t just rely on numbers/stats, you have to actually think about what is going on.
http://blog.asmartbear.com/data-interpretation-mistakes.html
p.s. Happy first day of going full time.
In your A/B approach, are you randomly using one or the other for different users or did you go a period of time with A and then B (Hopefully for an equal period.)?
Did you do any usability tests first with a few users?
And how many more clicks does it take to justify making any changes?
Jeff: Users are randomly assigned to one of the two treatments and consistently see that treatment thereafter. For details, see the A/Bingo docs.
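One common way to get that "random but sticky" assignment is to hash the visitor's identity together with the test name and take it modulo the number of alternatives, so the same visitor always lands in the same bucket without storing per-visitor state. This is a sketch of the idea, not A/Bingo's actual implementation; the function name is made up.

```ruby
require "digest/md5"

# Deterministic "random" bucketing: hash visitor identity plus test name,
# then pick an alternative by the hash modulo the alternative count.
# A sketch of the general technique, not A/Bingo's real code.
def alternative_for(visitor_id, test_name, alternatives)
  digest = Digest::MD5.hexdigest("#{test_name}:#{visitor_id}")
  alternatives[digest.to_i(16) % alternatives.size]
end

choice = alternative_for("visitor-42", "sidebar_links", ["no_links", "links"])
# Repeated calls with the same inputs always return the same bucket.
```

The nice property is that assignment is both effectively random across visitors and perfectly consistent for any single visitor.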
I don’t do usability tests with users prior to deploying things live.
Statistical significance. I use, if I recall correctly, one-tailed t-tests.
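For a concrete sense of what that check looks like: with large samples, a standard way to compare two conversion rates is a pooled two-proportion z-test, where the one-tailed 5% critical value is about 1.645. The sketch below uses that approximation and made-up numbers; see the A/Bingo source for the actual computation.

```ruby
# A sketch of a one-tailed significance check on two conversion rates,
# using the pooled two-proportion z-test (a large-sample approximation).
# Illustrative only -- not A/Bingo's actual code.
def one_tailed_significant?(conv_a, n_a, conv_b, n_b, z_crit = 1.645)
  p_a = conv_a.to_f / n_a
  p_b = conv_b.to_f / n_b
  pooled = (conv_a + conv_b).to_f / (n_a + n_b)
  standard_error = Math.sqrt(pooled * (1 - pooled) * (1.0 / n_a + 1.0 / n_b))
  z = (p_b - p_a) / standard_error
  z > z_crit  # true when B beats A at the one-tailed 5% level
end

one_tailed_significant?(80, 1000, 120, 1000)  # => true  (8% vs 12%)
one_tailed_significant?(100, 1000, 101, 1000) # => false (10.0% vs 10.1%)
```

In other words, it isn't a fixed click count that justifies a change; it's whether the observed difference clears the significance bar given the sample sizes.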
Hi Patrick,
Can you link us to the formula for your one-tailed T tests? Do you have an app for that, or what?
Thanks,
Zack
Zack: see the A/Bingo source code. The math behind A/B testing is also covered here: http://elem.com/~btilly/effective-ab-testing/