Nov 14 2013

Designing for Success with New Relic: How Testing Simple Landing Page Elements Yielded a Triple-Digit Improvement

Small changes can yield outsized results. Our recent work with New Relic showed how far detail-oriented, metrics-supported design can go: conversions jumped an astounding +118% after a simple layout change. Using Optimizely, we tested and tracked the performance of two variations on an existing design for four weeks. Read on to learn more about data-backed design.

How the Design Worked

Every business wants more leads, sales, and (as a result) more revenue and growth. With improved engagement, businesses on the web can get the results they want.

When we looked at New Relic’s library of landing pages, we wanted to increase product deploys through effective design. The task was both exciting and daunting: dozens of active pages, each with varying lengths, colors, and calls to action, offered countless things to test. With that host of options before us, we decided to start small: we took on the most trafficked pages and tested one element at a time.

Once we decided to take on New Relic’s most trafficked acquisition pages, we came up with an A/B/C method, with “A” as the baseline “original/existing” design. In our first test, the first alternative, “B,” performed poorly, and the “C” option took off, earning more than twice the product deploys of the baseline “A.” Thanks to the ability to isolate the variables, we could make an informed design decision.
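For readers curious about the mechanics, here is a minimal sketch of how an A/B/C split like this can be wired up. This is not the Optimizely snippet we used; the cookie key, variant names, endpoint, and the trackDeploy helper are all hypothetical, shown only to illustrate sticky random assignment and per-variant conversion counting.

```ts
// Minimal A/B/C assignment sketch (hypothetical helper names; not the Optimizely API).
type Variant = "A" | "B" | "C";

const COOKIE_KEY = "lp_variant"; // assumed cookie name for sticky assignment

function assignVariant(): Variant {
  // Reuse an existing assignment so a returning visitor always sees the same version.
  const existing = document.cookie
    .split("; ")
    .find((c) => c.startsWith(`${COOKIE_KEY}=`))
    ?.split("=")[1] as Variant | undefined;
  if (existing === "A" || existing === "B" || existing === "C") return existing;

  // Otherwise split traffic evenly across the three variations.
  const variants: Variant[] = ["A", "B", "C"];
  const chosen = variants[Math.floor(Math.random() * variants.length)];
  document.cookie = `${COOKIE_KEY}=${chosen}; path=/; max-age=${60 * 60 * 24 * 28}`; // 4 weeks
  return chosen;
}

// Report a conversion (e.g., a product deploy) tagged with the visitor's variant,
// so each version's conversion rate can be compared later.
function trackDeploy(variant: Variant): void {
  navigator.sendBeacon("/analytics/deploy", JSON.stringify({ variant }));
}
```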

So, what was the defining factor that made “C” skyrocket? A button. Yep, the change from a form to a button made people click, sign up, and then deploy the New Relic product at a markedly higher rate. But it all started with the initial engagement: a button asking the user to sign up. When visitors saw an empty form, they didn’t sign up. Is a form that asks for information up front a turnoff for users? Do people prefer to be asked via a button first? Maybe.

If the user encountered an empty form, they didn’t engage. But there’s something about a button that gets users to a form to try a product.
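For the curious, the pattern behind “C” is simple to sketch. The snippet below is a hypothetical illustration, not New Relic’s actual page code: the element IDs are placeholders, and the only point is that the visitor’s first decision is a single click, with the form revealed afterward.

```ts
// Hypothetical sketch of the button-first pattern: the form stays hidden until
// the visitor clicks the call-to-action, so the first decision is one click,
// not a wall of empty fields.
const cta = document.querySelector<HTMLButtonElement>("#signup-cta"); // assumed button id
const form = document.querySelector<HTMLFormElement>("#signup-form"); // assumed form id

if (cta && form) {
  form.hidden = true; // start collapsed: only the button is visible

  cta.addEventListener("click", () => {
    cta.hidden = true;   // swap the button out...
    form.hidden = false; // ...and reveal the form in its place
    form.querySelector<HTMLInputElement>("input")?.focus();
  });
}
```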

Exhibit A

The “original/existing” design (our baseline “A”) greeted the visitor with the form and a call-to-action further down the page.

Baseline "A"

Exhibit B

Our next option was the unsuccessful “B,” which simply moved the form up, so it was now clearly in the header.

 Version "B"

Exhibit C

Finally, the stellar +118% option “C” collapsed the form into a button instead.

 Version "C"

And here’s an excerpt of our results from Optimizely.

Optimizely results

In Conclusion

Data-backed designs make us, our users, and our clients happiest. When performance increases on a site, it means objectives are being met: the site’s visitors are finding what they’re looking for, the site’s hosts are meeting their guests, and there’s a well-designed place for that interaction.

To guide visitors to their goal, you need to gauge their reactions, iterate with agility, and prioritize your work against the company’s goals. Tools like Optimizely help us make educated design decisions, as we did with New Relic’s acquisition pages.

What testing tools have streamlined the design process for your team or site? Let us know in the comments — we’re always up for learning about new ones!

About the Author:

Aron is an account strategist at digital-telepathy, a political junkie, a competitive surfer, and an evangelist for the convergence of design and analytics. You can follow Aron on Twitter.

4 Responses

  1. Nov 15 2013
    Michael

But wait: you say the button engaged them, but the button leads to a form, right? Because how else would you actually get the customer’s information?

    • Nov 21 2013
      Aron Schuhmann

@Michael, you’re correct. Clicking the button opens a form that captures the customer’s information. You can see a live example of the winning page here: http://newrelic.com/lp/mobile-monitoring

    • Jan 08 2014
      Dave Grow

      This unfortunately looks like a flawed analysis. To Michael’s earlier comment, the button then took them to a form. At that form, there will undoubtedly be drop-off. So it seems as though this is comparing apples to oranges (i.e., did they click the button vs. did they actually sign up).

      You would have to measure the next step of the button flow to determine whether it’s actually better. Did more people actually finish the sign-up flow in the button flow vs. the other?