CASE STUDY

How the Weather Channel Tested the Wind on Site Design Variables

You just launched a new interactive ad campaign and corresponding landing page. You discussed the page design with your team. You know your customers. You’ve built out the landing page to meet their expectations and be consistent with the campaign. All done, right?

Businesses undercut the potential success of their online campaigns all the time. They design landing pages based on what they know — or think they know — about their customers or products, rather than testing to see how customers actually respond to different versions and combinations. Sometimes they create a really brilliant feature for a Web page, but then they unknowingly add something else that undercuts it.

Everyone is surprised and confused about why the brilliant new feature is not getting results. This is understandable, but you need data to assess these performance dynamics, not just opinions. The only sure way to determine whether the headline, hero shot, call-to-action button, or something as simple as the color of a button is helping or hurting performance is to test and optimize.

It’s impossible to know how your assumptions will play out if you don’t test them. That’s the approach Widemile took in working with The Weather Channel Interactive (TWCi) to find success with landing page optimization and multivariate testing. We questioned what we thought to be “true” and, most importantly, tested to see if those assumptions were right, leading ultimately to an optimal design that significantly outperformed the original landing page.

Success Can Blind Us

If you’ve had success, it’s sometimes difficult to think that it could be any other way. You know the adage: If it ain’t broke, don’t fix it. Business history is littered with once-great companies that failed to recognize that the rules of the game had changed, because the change ran so counter to their earlier formula for success.

This can also happen on a smaller scale within today’s successful companies. Our client, The Weather Channel Interactive, was already the leading provider of online weather, news and information — reaching more than 25 million unique users each month. As the clear leader in providing weather-related information, it began to branch out into other services from its core cable and Web properties. Should be a recipe for success, correct?

We worked with The Weather Channel Interactive on its weather alert service, Notify. Severe weather alerts are sent to customers’ phones, and for an additional fee (a premium subscription called “Notify Plus”), they can also receive alerts by text message or e-mail. Product sales peak seasonally for a short period of time, and severe weather events cause spikes in site traffic that drive demand even further.

Notify was already successful, but TWCi wanted to see if it could do even better. Brad Bacon, director of distribution for consumer applications at TWCi, proactively came to Widemile and said, “We’ve heard about this testing stuff. How can we improve our conversion rate further to build our subscriber base?”

We worked together to complete a sophisticated landing page optimization test for Notify. Our objective was to increase conversions by 30 percent during its short 10-week buying cycle.

Revisit What You Know to Be True

Before just jumping into an optimization program and testing different page elements willy-nilly, we questioned everything about the original landing page — including things the team felt strongly about and “knew” to be effective. We wanted to verify these assumptions were, in fact, correct — or identify a better approach. The original design included a detailed table that compared features for Notify and Notify Plus. In addition, bulleted lists of benefits and features were all over the page, perhaps at the expense of a central focus (see Figure 1).

The design was very feature-driven — based on the assumption that customers would want to know everything they could get for their money. It was a case of “We developed all these cool features, so everyone must want to know all about every single one of them.” The result was a comprehensive page, which perhaps had both too much information and too little information at the same time. We wanted to test to find out.

[Figure 1: The original Notify landing page design]

We challenged ourselves to really think about the customers, who were mid- to high-income families and homeowners, primarily in the Midwest. We asked questions like these: What are they looking for when they come to this page? What problem are they trying to solve? In this case, we hypothesized that they wanted a way to know when a tornado or other severe weather event was heading toward their home or community. Then the next questions became, how do you really speak to that? And how can we fill that desire with something substantial?

It seemed reasonable to position Notify visually as an early warning system for severe weather conditions. We know that site visitors tend to respond to a clear offer, so we removed the non-core content and simplified the key elements to send a clear message: If a severe weather event or tornado is heading toward you, we’ll send an alert to your phone in plenty of time for you to protect your home and family. Basically, we attempted to make the choice to sign up easy. We designed a simple page that spoke to the lifestyle benefit of keeping families safe. We now had to test this hypothesis.

Always Test and Validate

Hypotheses are just informed opinions; they’re a good place to start, but need verification with objective data. After redesigning the page, we simulated an A/B test (looking at version A vs. version B) to verify that the new positioning and design were superior to the original. We did this by comparing the conversion rates of the new page to three months of conversion data from the original page, using the highest historical rate as our baseline, just to be conservative. The new design was an immediate improvement, and once that was validated, we jumped into the multivariate test to fine-tune the design even further.
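The article doesn’t spell out how that rate comparison was evaluated, so the following is only a minimal sketch, assuming a standard two-proportion z-test and purely hypothetical visitor and conversion counts, of how a new page’s conversion rate might be checked against a baseline:

```python
import math

def compare_conversion_rates(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does page B convert at a different rate than page A?"""
    p_a = conv_a / n_a                              # baseline (original page)
    p_b = conv_b / n_b                              # challenger (new design)
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))      # two-sided tail probability
    return p_a, p_b, z, p_value

# Hypothetical counts -- not TWCi's actual traffic or conversion numbers.
p_a, p_b, z, p = compare_conversion_rates(conv_a=400, n_a=20_000,   # original page
                                          conv_b=560, n_b=20_000)   # redesigned page
print(f"original: {p_a:.2%}  new: {p_b:.2%}  z = {z:.2f}  p = {p:.4f}")
```

In Widemile’s case the baseline came from three months of historical conversion data (taking the highest historical rate) rather than a concurrent split, which is why the article describes it as a simulated A/B test.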

Multivariate testing gave us a dynamic and expedient way to find the optimal page. We could look at real data to understand how visitors were reacting and converting with various design combinations. We could modify key elements of the new page — the banner, the headline, the button, etc. — to determine the best-performing combination under actual user conditions.

Take the button, for example. Depending on how it works with the other elements on the page, a button can affect whether the customer takes the offer or leaves. With multivariate testing, you can try buttons before or after the text, at the top of the page or the bottom. You can figure out what the button should say and what color it should be to get the best possible results.

We were able to test our hypothesis and end the guesswork. We tested the equivalent of 1,000 possible Web page combinations using up to five versions of each page element (headline, banner, button and so on). Through a sophisticated analytical process and using our optimization platform technology, we were able to see how actual customers responded to specific combinations. Within 14 days, we knew exactly which combination drove the highest number of conversions — and we had our optimal page. See Figure 2.

[Figure 2: The optimized Notify landing page]
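The article doesn’t disclose the actual page elements, version counts or platform internals, so the sketch below is purely illustrative: hypothetical versions of each element multiply into roughly 1,000 full-factorial page combinations, visitors are assigned to combinations, and conversions are tallied per combination.

```python
import itertools
import random
from collections import defaultdict

# Hypothetical page elements and version counts -- not TWCi's actual test plan.
elements = {
    "banner":     ["static", "flash_tornado", "flash_family", "map", "radar"],
    "headline":   ["h1", "h2", "h3", "h4", "h5"],
    "button":     ["b1", "b2", "b3", "b4", "b5"],
    "layout":     ["wide", "narrow", "split", "stacked"],
    "price_copy": ["monthly", "seasonal"],
}

# Full factorial: every combination of one version per element.
combinations = list(itertools.product(*elements.values()))
print(f"{len(combinations)} possible page combinations")  # 5 * 5 * 5 * 4 * 2 = 1,000

# Simulate assigning visitors to combinations and tallying conversions.
# A real testing platform serves live pages and records real outcomes instead.
visits = defaultdict(int)
conversions = defaultdict(int)
for _ in range(50_000):                    # hypothetical traffic volume
    combo = random.choice(combinations)    # random assignment per visitor
    visits[combo] += 1
    if random.random() < 0.02:             # flat placeholder conversion behavior
        conversions[combo] += 1

# Rank combinations by observed conversion rate.
best = max(visits, key=lambda c: conversions[c] / visits[c])
print("best combination:", dict(zip(elements, best)),
      f"rate: {conversions[best] / visits[best]:.2%}")
```

In practice, testing platforms generally use fractional or adaptive designs rather than serving every cell with equal traffic, which is presumably what testing “the equivalent of” 1,000 combinations refers to; the brute-force version above is only meant to make the combinatorics concrete.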

The Weather Channel Interactive had its new page in plenty of time to take advantage of its 10-week peak season. “Every day counts,” Bacon says, “and now I know that I never want to leave money on the table by not optimizing beforehand.”

It Pays to Question

Our hypothesis turned out to be correct: The new landing page performed significantly better than the original. The best-performing combination increased Notify subscription rates by 225 percent over the original.

One of the unexpected lessons we learned through this process was how much it pays to question conventional design wisdom — and with multivariate testing, we could easily do that. When we ran these tests, the conventional wisdom was not to use Flash animation on a landing page. The thinking was that Flash distracted visitors and that people didn’t respond well to it. These ideas emerged because designers were using Flash to embellish or add fake movement to a page, which sometimes turned off customers and worked against conversion goals.

We wanted to learn more about how Flash animation could work on a landing page, so we asked the question: What if we used Flash in a very simple way? We were able to quickly and easily test the performance of a Flash-animated banner against an identical static banner. Rather than distracting from the message, the Flash-powered banner contributed to the key message by dynamically illustrating a tornado heading toward a house — with a Notify message in between. The Flash-enabled version beat the static version hands down, and it turned out to be one of the big surprises behind the campaign’s success.

“We learned more about the specifics on what was driving or hurting performance — a new level of granularity and insight — because we looked at things differently,” Bacon says, “and it paid off.”

There is always more to learn. While Notify subscription rates remain healthy, The Weather Channel Interactive could almost certainly see another bump by questioning conventional assumptions. For example, when we initially ran these tests, we found that the seasonal peak drove the best-performing pages to outperform the worst by a disproportionate margin — almost 300 percent. How might landing pages that are modified for different seasons (e.g., severe heat in summer, blizzards in winter) perform against the current winner?

Based on what we know about landing page design, it seems likely that this would have the potential to reach an entirely new set of users. Since it’s something we “know,” we would definitely want to test it!


Frans Keylard is director of Web optimization for Widemile, a provider of Web testing and landing page optimization technologies and services.

