May 28, 2013

Three Dimensions of Analytics and Optimization

I’m multiple years into an on-going quest to properly capture and articulate the different “roles” of an analytics and optimization organization. I’m not talking about “people roles” here so much as “types of deliverable value” by an organization. Through multiple jobs – both client-side and agency-side – I’ve grown and learned and refined that definition, so I feel like I’m getting close!

We’ve been noodling on that topic pretty actively of late, and the result seemed worth sharing:
[Diagram: the three dimensions — Performance Measurement, Hypothesis Validation, and Support]


That’s all there is to it. Taking them in order:

  • Performance Measurement – there is tremendous value in doing this well. It means capturing the right key performance indicators (KPIs), establishing targets for them, putting them onto a dashboard (with a few contextual/supporting metrics – emphasis on “few”), and automating the bejeezus out of it. Low-latency data, cleanly structured, with big, blaring alert indicators for any KPI that is going awry. And very few ongoing human tasks to maintain it.
  • Hypothesis Validation – this is where “analysis” occurs. It requires having a clearly articulated idea and qualification of that idea up front to confirm that, if the idea is validated (through historical data analysis, A/B testing, primary or secondary research, or some combination), action will be taken.
  • Support – this covers a smorgasbord of activity: end-user training, site tagging, quick data pulls, data governance, and so on. It’s all of the tasks that support the first two items (we have versions of the diagram that break these functions out into separate boxes…but they’re all lumped together here for a reason).

With this view of the world of analytics and optimization, a (somewhat) obvious question is: “How are you dividing your investment across these three areas?” I’ve seen plenty of companies that invest almost exclusively in Support, which is a problem, because there is no direct business value in that area.

In an ideal world, over 50% of your total analytics and optimization investment goes to Hypothesis Validation. That’s where the work that actually drives a business forward happens:

  • Performance Measurement is important, but it’s an inherently backward-looking measurement and (reactive) alerting function. Effective Performance Measurement is both an alignment mechanism for the organization and a hypothesis trigger (“Something is wrong! It could be because of <hypothesis>.”).
  • Support is an enablement mechanism that ensures data is available, correct, and accessible (through both technology and training).

Hypothesis Validation, mind you, is not some sort of lofty realm where only skilled analysts can operate. Everyone in the organization can come up with hypotheses, and, with proper Support, many people in the organization can be empowered to do some level of validation on their own. The more people in an organization who understand that “being data-driven” means operating with a hypothesis-oriented mindset, the more value the organization will get from its data.

