We provide daring, entrepreneurial digital executives with the extra brains and brawn needed to fully utilize digital analytics and A/B testing practices and to drive disruptive results, change and learning.
We provide our partners the full breadth of services required to build and sustain a data-driven organization.
Before companies begin to significantly resource analytics and testing in their organization, they often ask themselves “What does it mean for us to be data-driven? How would it even work? What would it look like?” Because it’s difficult to rebuild the ship while you’re already at sea, we help our clients re-imagine a “digital optimization” organization from the 50,000-foot view to the ground floor.
Based on a thorough assessment and synthesis of your current approach to analytics and testing, we design a new blueprint for organizing team members around the utilization of data.
The roles and org chart are simply the first step. An optimization program requires definitions, deliverables, workflows, service levels and plans to get things running. We help bring the blueprint to life.
We know the tools, strategies and trends that are working (and not) in the market. We can help you avoid common pitfalls and adopt better tools and methods. Just ask us.
Digital optimization includes new concepts and practices that need to be described through a simple vernacular and workshops that are accessible and (yes) fun. We not only fish for our clients; we teach their teams why fishing is fun, how to fish and, then, we take them out fishing with us.
Once the digital optimization model is designed and ready for action, our clients often find that they need extra muscle and brains to augment their current team or scale the operation. We offer a team of nimble, expert, Clearhead-trained digital optimization all-stars to dive right in and bring the promise of Clearhead to life.
Once upon a time (not very long ago) we were the client – leading our own ecommerce and product development teams. We carried the responsibility of delivering on P&L commitments, developing our organization and driving growth. For years we wrestled with, and continuously refined, a process for managing teams and business decisions in a simple, data-driven, transparent way. Clearhead is the culmination of that experience.
We are obsessed with how lean, data-driven practices drive smarter and more profitable businesses. If you fear that, in your company, “analytics” has become a euphemism for over-reporting and that your big decisions are still mostly driven by gut, we can help you change that (quickly). A clearheaded digital business is one that has a merit-based, sustainable approach to inspiring, validating and pursuing the ideas that are most likely to improve business performance and satisfy customer demands.
The “Clearhead Process” is a simple but meticulous, step-by-step approach to developing, prioritizing, validating and learning from well-articulated digital business hypotheses. We have designed our process so that it integrates well with almost any set of stakeholders and roles and can be easily understood, adopted and replicated by even the most complex, legacy organizations.
Matt Wishnow is the founder and CEO of Clearhead. Previously, Matt was the founder of Insound.com and Drillteam Marketing. Launched in 1998, Insound.com is the oldest and most respected of music web-stores, catering to vinyl enthusiasts and other music obsessives. Drillteam was an early social marketing agency, servicing Toyota, Target, Nike and other elite brands. Following the sale of Insound to Warner Music Group, Matt was hired in 2007 to design and lead WMG's Direct to Consumer business. He holds degrees in Fine Art and Semiotics from Brown University and recently moved to Austin, Texas.
Ryan Garner is the co-founder and EVP of Clearhead. Previously, Ryan was Vice President of Direct-to-Consumer for the Warner Music Group, responsible for delivering and supporting the web businesses of WMG’s biggest artists including Paramore, Bruno Mars and Wiz Khalifa. Before joining Warner in 2007, Ryan spent 5 years at JetBlue Airways as a product manager and solution architect. Ryan was an integral part of the early team that helped develop JetBlue.com into the airline's most important sales channel, responsible for 80% of sales at the time of his departure.
Melissa Cunningham is the Director of Client Engagement at Clearhead. Prior to Clearhead, Melissa held leadership roles at Bazaarvoice, Powered, and Coremetrics-IBM. Each of these roles focused on digital marketing measurement and analytics. Her focus on client engagement and training has allowed her to partner with innumerable Fortune 500 clients to enable them to track, understand, and optimize ROI within their eCommerce channel. Melissa received her Bachelors degree at Hofstra University and Masters degree at Boston University.
Jared Bauer is the Sr. Manager of Analytics at Clearhead. Jared has spent over six years working in the digital analytics space. During his previous work at the digital agencies Resource Interactive and Possible, Jared led the measurement and analysis efforts for a number of Fortune 500 companies including Procter & Gamble, Nestle, Sherwin-Williams, Bush Brothers, Hewlett-Packard, ConAgra Foods, and Abbott Laboratories. Working across multiple clients and campaigns has given him broad knowledge of measurement tools and of measurement across various marketing channels. Jared holds a master’s degree from American University.
Brian Cahak is the VP of Business Development & Marketing. Previously, Brian was the co-founder of a SaaS start-up in the auto technology space, the COO of an art eCommerce company, and the VP of eCommerce Operations for Callaway Golf. Brian leverages his experience at both Fortune 100 companies and tech start-ups to lead Clearhead's go-to-market strategy. Brian holds an MBA from the University of Texas and a BS from West Point.
Laura Stude is a Testing and Optimization Manager at Clearhead. She spent her early career in client services and copywriting at various ad agencies and later found herself in marketing and social media roles at Furniture Brands and Famous Footwear. After attending a Startup Weekend and learning more about Lean Startup principles, she left her job (and hometown) to pursue software development and entrepreneurship at The Starter League in Chicago. This eventually led her to Austin, where she has gained experience at various local startups in back-end and front-end programming, lean methodologies, UX and breakfast tacos.
When not learning to program or directing movers where to put boxes, Laura enjoys traveling, spending time with friends and family, the outdoors, football, tracking packages and any drink in a mason jar. She's a big believer in the Golden Rule and loves meeting interesting people.
This marks the second post in our series on “things we hear while selling testing and optimization services”.
This edition finds us in a pitch with a senior digital business director at a large travel-oriented business. She is a very bright digital businesswoman with a year full of plans, including re-platforming their email system and overhauling their mobile experience.
The conversation continues as we explain how our services can help her make data-driven decisions on the features and designs she may want to use in her new mobile experience. “That sounds great,” she says.
At this point we believe we are on the verge of a new and very exciting partnership with a company we know we can help, when the conversation suddenly turns to “timing”.
“You know,” she says, “I am excited about the possibilities here, but I want to be honest about the timing. Like I mentioned, I have a very full roadmap planned for this year. Testing is on it, but not until the back half. Let’s talk again in 6 months.”
“Well, we can certainly appreciate that,” we say. “But have you considered that testing and optimization should perhaps be the driving force that refines and prioritizes your roadmap, rather than an item on it?”
“Okay, thank you very much for your time. Please keep us in mind when you get to the testing portion of your roadmap.”
And since that moment, we’ve heard similar comments from no fewer than three other senior digital leaders at equally large businesses.
So what’s going on here?
I suppose this is not surprising. In a world where every day there’s a hot new tech start-up you should meet, a fancy new feature being launched by a competitor, and an endless supply of “table stakes” that are “no brainers” to work on, the idea of taking a minute to validate your assumptions before moving forward can feel, well, slow.
With that said, how many times have you done the deal with the hot new start-up or finally rolled out that “table stakes”, “no brainer” of a feature, only to find it didn’t materially move your business forward?
And what if I told you there’s a way that you could get a data-driven read on these decisions before you fully invest in them? That’s what a data-driven “hypothesis validation” program can do for you.
Here’s a real-life example. One of our fantastic software-as-a-service, subscription-based clients had the idea that giving away a free, tangible gift in exchange for signing up for their 30-day free trial would lead to explosive growth in trial sign-ups.
They were so confident in the concept that they were ready to move forward with a partnership with a shiny new start-up that actually helps SaaS companies manage free giveaway programs. “This is a no-brainer,” they said.
The great news for us was that they are an existing client with whom we had already debunked the value of a previous “no brainer” via the hypothesis validation program we run for them. They are now believers in using testing to validate their roadmap. So when we suggested a very simple, minimum viable test, wherein we’d hardcode an offer for a free giveaway to 50% of their leads, manually fulfill it via Amazon, and see whether, over the course of 3 days, there was any sign the giveaway offer would lead to the explosive trial growth they expected, they jumped at it.
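For illustration only, a 50% split like the one described above can be done with a few lines of deterministic bucketing, so the same lead always sees the same experience on every visit. This is a sketch under our own assumptions, not Clearhead’s actual implementation; the lead IDs and experiment name are hypothetical:

```python
import hashlib

def in_giveaway_group(lead_id: str, experiment: str = "free-giveaway") -> bool:
    """Deterministically assign roughly 50% of leads to the giveaway offer.

    Hashing (experiment, lead_id) keeps each lead's assignment stable
    across visits and independent across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{lead_id}".encode()).hexdigest()
    return int(digest, 16) % 2 == 0

# The same (hypothetical) lead always lands in the same bucket:
print(in_giveaway_group("lead-12345"))
print(in_giveaway_group("lead-12345"))
```

Because assignment is a pure function of the lead ID, no per-lead state needs to be stored to keep the experience consistent.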
You can probably guess what’s coming next. After 3 days of offering the free giveaway, the results were completely flat.
Now, the learning here was not that their leads don’t like free stuff, but rather that they weren’t primed enough by the landing pages and value prop presentation to be receptive to the free offer. Kind of like when you walk by a booth for a company you’ve never heard of and the first thing you are greeted with is “free T-shirt.”
And the best part — landing page and value prop updates were on the roadmap for later in the year, after the giveaway program rollout. So this very simple experiment drove a data-driven change to their roadmap priorities and increased the odds that their roadmap will actually drive real value. That’s what roadmaps should do, right?
How can I make data-driven product roadmap decisions? Don’t worry - here are three steps to get you started.
1. Identify the assumptions in your roadmap.
Whether you have a formal roadmapping process or just a list of stuff you plan to work on this year, you have a roadmap. In it lie your assumptions about the things you think will improve your user experience and help your digital business. Identify them and write them down. Even if you think something is “table stakes,” write down why you think so.
A 2013 table stake example — “A responsive mobile site”.
A likely assumption — I believe if we build a single website that automatically adjusts its layout and features based on the device it runs on, we will sell more products on phones and tablets.
2. Test those assumptions as simply as possible.
Before you greenlight any major project, go through each assumption you’ve written down and identify the simplest way you can come up with to confirm whether there’s any merit to it.
An example of a minimum viable way to test the responsive site assumption above:
Take one page type, say a product detail page if you run an ecommerce store, and make that responsive to the device you believe is most important, say the iPad. Now test this one page on iPads. Does it convert materially better than what you offered before?
If so, great. You know you are headed in the right direction. If not, you need to adjust your assumptions…and the approach to your roadmap.
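The “does it convert materially better” question above can be answered with a standard two-proportion z-test. Here is a minimal sketch; the visitor and conversion counts are invented for illustration, and a real program would also set the sample size in advance:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_*: number of conversions; n_*: number of visitors per variant.
    Uses the pooled-rate normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: control = original PDP on iPads,
# variant = responsive PDP on iPads.
z, p = two_proportion_z(conv_a=180, n_a=6000, conv_b=240, n_b=6000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 5% level if p < 0.05
```

If the p-value clears your significance bar and the lift is large enough to matter commercially, the assumption has merit; otherwise, adjust it before greenlighting the full build.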
3. Evolve your roadmap.
Each test you run exposes a card in the proverbial poker game of your business, and, in turn, provides valuable information enabling you to make bets with better odds.
Your roadmap should continuously evolve as you obtain data through minimally viable tests and make better informed bets.
We recommend that Testing & Optimization be the core process - the “table stake” - that defines and continually refines your roadmap. The key benefit is that your major decisions are more likely to deliver material revenue growth for the business than those based on gut, a vendor’s fancy sales pitch, or an educated guess.
Don’t have a testing & optimization program? Well, we can, of course, help you. Please drop us a line. ContactUs@Clearhead.me
In meeting with many (OK…several) amazing brands each week that are smart, well-staffed and committed to A/B testing, I observe a common trend in how these businesses talk about and organize around the practice.
Commonly, in these businesses, the design, engineering and product management resources are consumed entirely by major initiatives, driven by executive strategy or a perceived need to chase “table stakes.” Because there are scant resources inside these organizations to do much more than has been asked of them, the idea of A/B testing, a pursuit that the whole company agrees is important, gets pushed out to either the Marketing or Analytics teams. These teams, not traditionally staffed with dedicated engineering resources, then rely on their own skills, the testing software itself, or a small “shadow development” team to build and QA tests.
This approach to A/B testing, compensating for a lack of bandwidth in the legacy product development and design process, ensures that A/B testing is a parallel pursuit, one measured entirely by the ability to improve conversion rate or top-line KPIs through experimentation. However, this pursuit does not get the benefit of defining product development and design hypotheses. As such, the product roadmap continues to be littered with unvalidated assumptions, waste, opportunity cost and risk. Sure, the business is testing (check) and conversion rate is hopefully inching in the right direction (check). But the larger organization and the major digital product investments are no more data-driven, leaner or likely to succeed as a result of the testing competency. In these companies, there is the “testing group” and then there is the “product team or council,” and the two do not get the full benefit of each other. This is the best description I can offer of what is commonly called “conversion optimization” in e-commerce or digital product organizations.
DATA-DRIVEN PRODUCT DEVELOPMENT
“Hypothesis Validation,” on the other hand, is a broader, organizational approach to product development and design. A/B testing is a method commonly applied to validate hypotheses, but it isn’t the only one (there is also user testing, historical data analysis, voice-of-customer data/survey analysis, etc.). Additionally, conversion optimization is a key benefit of hypothesis validation, but it is not the only one. There are three (3) primary goals to this practice:
1. A smarter organization
2. Reduced waste and opportunity cost
3. Improved performance of business KPIs
Numbers one (1) and two (2) above are only possible by re-thinking the “who” and “why” of A/B testing inside the organization. If relegated to a parallel or “shadow” team, the learning benefit remains fragmented and the waste in the major product investments is unchanged. To harness the full benefit of this amazingly powerful capability, the business should ensure that (a) tests are driven by hypotheses that genuinely reflect both the greatest opportunity and the greatest contemplated resource investment and (b) that the product roadmap and major marketing campaigns are increasingly informed by validated learnings derived from testing.
Clearly, I am making a key distinction and prescribing a preference and solution in the simplest possible terms. In reality, these changes take time and entail disruption. I would argue, though, that the risks of not tackling this head on are far greater. Investing in a new A/B testing program without fully connecting it to a hypothesis-driven product and marketing philosophy ensures more cost (tools and people), occasional KPI improvement (nice) and limited or no impact on the efficiency of product development or marketing (true).
A thought experiment I often pose in conversations with entrepreneurial executives who want to be more data driven involves asking the following questions:
1. “When you define and approve features or work to be designed and engineered as part of a product roadmap, (how) are the underlying hypotheses captured and validated? How do you know if your roadmap is generally optimized?”
2. “How would your business change if all product roadmaps were evaluated by the extent to which they are informed by validated learnings derived through data and/or testing?”
3. “What are the key moments (meetings, definitions, requirements, approvals) in the formation of product roadmaps and digital marketing plans where you can ask the organization: (a) What is success for this initiative? (b) Are we able to capture the data necessary to measure success? (c) What are the hypotheses driving this initiative? (d) How can we validate or test those hypotheses?”
Clearhead exists, in part, to help our clients take advantage of the amazing opportunities enabled through the benefit of digital business data. Sometimes that help is simply the extra muscle required to work with their existing tools and team for greater output. And, other times, that help is assisting senior executives in re-thinking their organization, workflow and deliverables to task their data with the most important job we can imagine — the validation of core value hypotheses and exciting growth hypotheses.
We are often asked by clients whether they should run tests during the holiday buying season. Sometimes they are hesitant due to the perceived financial risk from a “losing” test or the operational risk of introducing tests during a seasonal code freeze. Sometimes the concern is about their team’s organizational agility - their ability to organize effectively around a complex activity in a siloed environment.
At Clearhead we believe the opportunities and potential gain far outweigh the risks and that each of the aforementioned concerns can be readily addressed. Here are four reasons you should seriously consider launching a testing program this holiday season….even in mid-November.
1. The seasonal traffic spike provides a once-a-year opportunity to drive revenue.
Many eCommerce retailers will earn 20% - 40% of their annual revenue during the holidays and the unique visitors for most sites reach all-time highs. The pressure to achieve revenue targets this holiday season is especially high since Thanksgiving is the latest it’s been since 2002. Shorter season = greater performance pressure each day.
Don’t fret - you can easily ride the wave!
This is possible for three reasons. First, retailers today have powerful optimization tools at their fingertips to fine-tune site conversion. Second, because of the seasonal traffic spike, many retailers have a once-a-year opportunity to determine statistically significant winners on multiple tests in a condensed timeframe. Not only will the winners enhance the conversion of your site during the holidays, but they’ll also provide the lessons needed to help you get a quick start going into 2014. Lastly, conversion optimization provides the best financial leverage of any investment you’ll make on your site. Period. When done right, the ROI for conversion optimization is many multiples of the investment required. The increasing cost and decreasing ROAS of Google AdWords is squeezing most mid-size retailers. Investing more into traffic without investing in conversion optimization simply won’t scale. Start now.
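To make the “condensed timeframe” point concrete, here is a back-of-the-envelope sample-size estimate using the standard normal-approximation formula (two-sided alpha of 0.05, 80% power). The traffic and conversion numbers below are illustrative assumptions, not client data:

```python
import math

def visitors_needed(base_rate, lift, alpha_z=1.96, power_z=0.84):
    """Approximate visitors per variant to detect a relative conversion
    lift, using the two-proportion normal approximation
    (two-sided alpha = 0.05, 80% power)."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    p_bar = (p1 + p2) / 2
    n = ((alpha_z * math.sqrt(2 * p_bar * (1 - p_bar))
          + power_z * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# Detect a 10% relative lift on a 3% baseline conversion rate.
n = visitors_needed(base_rate=0.03, lift=0.10)
for label, daily_visitors in [("normal traffic", 20_000), ("holiday traffic", 60_000)]:
    days = math.ceil(2 * n / daily_visitors)  # two variants split the traffic
    print(f"{label}: {n} visitors per variant, roughly {days} days to finish")
```

With these made-up numbers the required sample is on the order of fifty thousand visitors per variant, so tripling daily traffic cuts the test duration from about a week to a couple of days — which is exactly why the seasonal spike is such a rare testing opportunity.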
2. Test seasonal merchandising ideas or seasonal product offerings.
Many eCommerce executives we speak with have been contemplating holiday merchandising tweaks including countdown timers, expedited shipping options, wishlists, or stock-out notifications. Others also contemplate holiday product offers involving gift cards (ecards and physical), gift wrapping, and tweaks to the search results. A well-defined optimization strategy provides a means to iteratively determine which plays win and then exploit the winners by directing 100% of site traffic to them with minimal IT involvement.
3. In many instances, a code freeze shouldn’t hinder you.
4. It’s not too late…even though it’s mid-November.
Don’t worry - the speed to testing can be incredibly fast. We can help you develop and prioritize test-worthy hypotheses, implement certain testing tools, and configure tests within hours or days. We’d encourage you not to let this once-a-year window pass you by.
Clearly there is a financial opportunity cost to missing conversion gains during seasonal traffic spikes. In addition, our clients regularly convey to us the softer benefits of optimization programs, including increased decision-making agility and customer intimacy, team cohesion, and broader clarity about KPIs and goals. Missing out on those lessons now would be like finding a lump of coal in your stocking!
If you’d like to learn more about rapid, iterative testing over the holidays then please contact us. We’d be happy to brighten your holidays!
-The Clearhead Team
I’m not sure if this metaphor is especially current (is the term “Moneyball” a clearly understood and appreciated concept in 2013?). And I’m certain that my metaphor is not fully baked. That being said, I’ll posit the following as a simple, accessible way to think about the value of data-driven digital optimization. I’m attached to the “Moneyball” analogy because (1) I have been enamored of Bill James since the age of eight when my parents bought me my first annual “Bill James Abstract” as a veiled attempt to get me to read more, (2) I still hear perspectives on A/B testing and optimization that I liken to outdated, poorly understood baseball folklore and (3) I think the metaphor helps my team understand and explain why we do what we do.
So, how is our approach to digital optimization like Moneyball? Here’s how:
Our singular goal is to help clients maximize the number of quality “at bats” and preserve “outs.”
There are a number of other analogies to be made here — around the risk/reward of falling in love with home runs and around the value of data and underappreciated metrics. But I think the above serves as an effective starting point to frame the conversation.
In the simplest of terms, most businesses are driven by their ability to deliver and improve a value proposition in the interest of acquiring and maintaining customers. For digital businesses, this means that all design, UX, product and engineering decisions need to serve a customer desire. The more accurately the company can super-serve this desire, the more likely it is to succeed. In pursuit of this endeavor, the company is constantly making marketing, content, design, experience and product development decisions that I’d liken to “at bats.” They are a chance to step up to the plate and connect with customer desire. The more informed and efficient these decisions are, the more likely they are to deliver value and yield returns.
Every new design, every promotional email, every new site feature, every CTA copy line: these are all at bats. When they miss wildly — when they do not connect, hurt performance and don’t lead to rapid learning — they are effectively “outs.” Outs are big swings at the plate that do not lead to any customer or intrinsic learning value. Outs drive up operational cost, marketing cost and opportunity cost. Ultimately, there is a finite number of outs that any company can survive, and outs are very costly, both financially and to brand perception.
Big, costly site redesigns for the sake of vanity; chasing “features of the day” because they are bright and shiny and the cool kids are implementing them; major experience or feature additions driven by an executive’s whims. Those are likely outs. Those are Dave Kingmans. Those are pre-PEDs, White Sox-era Sammy Sosas. They may occasionally connect for home runs, but they are ultimately expensive ways to waste a lot of outs very quickly. They will bury a team.
Bill James developed the “Dave Kingman Award” (an award you don’t want to win) for hitters whose predominant contribution to runs created came through home runs, which largely masked the cost of a low on-base percentage and WAR. Players like Kingman, Tony Armas and Ron Kittle were all decorated all-stars with gaudy home run totals whom most sabermetricians would argue cost their teams far more than they delivered. Companies stuck in waterfall development methodologies, driving design and engineering decisions based on the opinions of a select few, are the Dave Kingmans of digital businesses.
Companies that are using analytics as a reporting device and dabbling with A/B and user testing are further along the spectrum. They are not Dave Kingmans. However, unless that data and the validated learnings developed through analytics and testing are driving all key design and development decisions, these companies are still wasting a huge number of precious at bats. These are more evolved businesses. They exhibit superior top-line stats, but they still strike out far too often. Think of Reggie Jackson or Jim Thome as a point of comparison — extremely talented hitters who were far too willing to eschew data and situational hitting for big swings (and misses).
Those companies that have developed cultures wherein data, experimentation and validated learnings drive all key product design and development decisions are the ones maximizing their at bats. They are scoring through a lot of walks, singles and doubles. They rarely strike out and they occasionally hit one out of the park when the count is in their favor. They understand their customers, their product and their market through the benefit of many validated (and invalidated) hypotheses. These companies are the Ted Williamses (full career), Joe Morgans (mid-1970s) and Rickey Hendersons (1985, 1990) of baseball.
So, how do you maximize at bats? You employ data for situational hitting. You understand the pitcher’s strengths and weaknesses. You take pitches. You choke up with two strikes. You understand that singles and doubles lead to runs. You understand that a walk is an opportunity to get more data and postpone future outs. You swing for the fences only when you are most likely to “guess right.” OK — I am mixing metaphors now. But you get the point. The idea is that data-driven continuous experimentation, iterative changes, targeting and testing allow for more “at bats” — attempts at connecting with unique customer segments and delivering incremental value. These at bats don’t offer the undeniable appeal of the monstrous three-run homer (sorry, Earl Weaver) but they lead to more data, smarter swings and a greater likelihood of solid contact.
I’ve likely belabored this metaphor enough and should bring the big point home. Digital optimization is about data-driven at bats. Digital optimization isn’t about home runs. People are justifiably seduced by tales of 90% conversion rate lift from a single idea or test. But those are aberrations and, when they are the result of a highly disruptive test, were no doubt earned with great risk. Digital optimization is about the accumulation of data in the interest of making decisions that are most likely to “connect” and less likely to “strike out.” Pure and simple. This applies to every aspect of digital business — marketing, communication, design, experience, product development, engineering, etc. Every decision, every expense, every resource allocated in these areas constitutes an at bat that can either lead to more productive future at bats and eventual value (runs) or a wasted opportunity (outs).
If you want to employ Moneyball-thinking in your digital business, make sure that you are capturing and testing every distinct hypothesis constantly. Be honest with yourself and your team about how you approach your at bats. Do you take big swings based on gut and habit? Or do you quickly and continuously iterate and learn, accumulating data so that your “big swings” are more likely to connect based on the data earned? The transition from the former to the latter is not easy. But it’s very much possible. In fact, that’s in no small part why Clearhead exists.
Companies that think digital optimization can be pursued as a series of forced, big swings at the plate in search of home runs should know that this approach comes with great risk of negative performance impact and great expense. Without doubt, there will be occasional “winners,” but they will inevitably be burdened by the cost of risk and waste. When do we believe in swinging for the fences? When you have a lot of data about what works — when you know the pitcher’s strengths and weaknesses and understand the context and are in a hitter’s count. You arrive here with the benefit of many at bats and a lot of data to inform your guess.
All of which takes me to a prevailing concept for the success of a digital optimization program. While it is common to chase and glorify the big testing and conversion optimization wins, a digital optimization program’s value should be equally correlated to the waste and “outs” it avoids. Everybody wants winners. But if analytics and testing are not fully integrated into the design and build philosophy and rationalized against real cost and opportunity cost, they can devolve into dabbling that does not drive the big bets. When the program is properly designed, you will see a team of smarter hitters across the board. A team informed by a wealth of data. A team able to get on base with increased frequency. And a team less likely to make big outs in crucial situations. A team more likely to score through walks, singles and doubles. And, occasionally, they will hit one out of the park. And they will be able to describe exactly why they swung for the fences that time.