“Let’s A/B test that.”

You hear it over and over because, according to Steelhouse, A/B testing is the most widely used testing method for improving conversion rates. And ya know why? Because it works. The Obama campaign raised an extra $61 million through A/B testing. You can’t argue with that.

And you can’t argue this either: whether you’re writing email subject lines, website copy, or video content, you want the best possible version to be the one in front of your prospects. Yeah you do.

Which is why we A/B test. A/B testing, also known as split testing, is a marketing experiment that involves comparing two versions of something to determine which one produces better results.

Fundamental Components to Any A/B Test

This approach to testing content and design attributes has been known to drive some pretty compelling results, but that all assumes one thing: that it’s done correctly. The three keys to running an A/B test (of any sort) that’s accurate and will therefore drive long term results are:

  1. Understand what result you’re aiming for: is your goal to drive click-throughs? Views? Shares? Selecting one goal at the outset is critical to picking a winner. For example, the email subject line that wins on open rate may differ from the one that wins on click-through rate. Know your goal beforehand.
  2. Isolate one variable: this is the crux of any scientific experiment, and of determining what’s the cause and what’s the effect. Every variable other than the one you’re testing must stay the same in both versions A and B so you can attribute the result to your change. For example, say you publish one video at 10:00 a.m. on Monday with CTA 1, then publish a second video with CTA 2 at 1:00 p.m. on Monday, and the second completely blows the first out of the water. How will you determine what drove the improved views? Was it the different posting time, or the different CTA?
  3. Make sure it’s actually random: if your existing customers see Video Thumbnail A and your prospects see Video Thumbnail B, can you reasonably attribute A’s success to the thumbnail itself? No, because your groups were not random, and it’s more than likely that your customers will react differently to your content than prospects will. Make sure your test groups are random! (This is generally where marketing technology comes in handy.)
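To make point 3 concrete, here’s a minimal sketch of a random split in Python. It’s a generic illustration, not any particular tool’s implementation, and the recipient names are made up:

```python
import random

def assign_variants(recipients, seed=None):
    """Randomly split recipients into two equal-size test groups.

    Shuffling the full list first mixes customers and prospects across
    both groups instead of letting one type cluster in a single group.
    """
    rng = random.Random(seed)
    shuffled = list(recipients)   # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {"A": shuffled[:midpoint], "B": shuffled[midpoint:]}

# Hypothetical recipient list:
groups = assign_variants(["alice", "bob", "carol", "dave"], seed=7)
```

In practice your marketing platform handles this step, but the principle is the same: assignment should depend on chance, never on who the recipient is.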

Attributes to A/B Test in Video

There are probably a million and one things you could test in your video content, from scripts to talent to music to lighting.

You get the gist. But realistically, most of us have neither the time nor the budget to produce multiple versions of every marketing video we create just to test them against each other. (Cue the Beach Boys’ “Wouldn’t It Be Nice”.)

A good starting point is to A/B test the things that can easily be swapped out after production. The three main opportunities are:

  • Getting to the Video: Promotional copy
  • Playing the Video: Splash screens or thumbnails
  • Following through on the Desired Action: CTAs

Testing Video Promotional Copy

Okay, okay, so this isn’t your video per se. But promotional copy can make or break whether your audience even ends up in the right place to view your video. For example, email copy that drives readers to watch your video on a landing page needs to be compelling enough to move them out of the email, onto your page, and clicking play.

This opportunity requires particular discipline: make sure you tweak only one component of the copy at a time. That component will most often be:

  • subject lines (email)
  • headlines (email, social, landing page)
  • body copy (email, social, landing page)

Measures of success will most often be the total number of click-throughs and the click-through rate (the percentage who clicked through out of everyone who saw your copy). You could also look at the percentage who played the video, but generally speaking, we leave that to the thumbnail/splash screen testing.
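As a quick sketch, the click-through metrics work out like this; the click and impression counts below are made up purely for illustration:

```python
def click_through_rate(clicks, impressions):
    """Click-through rate: clicks as a fraction of everyone who saw the copy."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# Illustrative numbers, not real campaign data:
rate_a = click_through_rate(120, 2000)   # subject line A: 120 / 2000 = 6.0%
rate_b = click_through_rate(150, 2000)   # subject line B: 150 / 2000 = 7.5%
```

Note that the denominator is everyone who saw the copy, not everyone who eventually watched the video.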

Testing Splash Screens or Video Thumbnails

Splash screens or video thumbnails are the static images that show on your video before anyone hits ‘play’. Believe it or not, the image that’s shown can dramatically influence a person’s interest in viewing the video content.

When A/B testing splash screens, the shots used are generally frozen frames from within your video. With Vidyard’s A/B split testing for video splash screens, you can choose up to 8 different thumbnail images for your website, determine how often each one is shown, and automatically choose a winner.

Success in splash screen A/B testing is typically measured through views, or the number of people who clicked play on your video.
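Vidyard’s internals aren’t public, so here’s a generic sketch of how weighted thumbnail rotation and winner selection could work. The variant names, weights, and counts below are assumptions made up for illustration:

```python
import random

def pick_thumbnail(weights, rng=random):
    """Choose which splash screen to show next, honoring each variant's traffic weight."""
    variants = list(weights)
    return rng.choices(variants, weights=[weights[v] for v in variants], k=1)[0]

def current_winner(plays, impressions):
    """The winner is the variant with the highest play rate (plays / times shown)."""
    return max(plays, key=lambda v: plays[v] / impressions[v])

# Hypothetical rotation weights (percent of traffic) and results so far:
weights = {"thumb_1": 50, "thumb_2": 30, "thumb_3": 20}
plays = {"thumb_1": 90, "thumb_2": 75, "thumb_3": 30}
impressions = {"thumb_1": 1000, "thumb_2": 600, "thumb_3": 400}

shown_next = pick_thumbnail(weights)
leader = current_winner(plays, impressions)   # play rates: 9%, 12.5%, 7.5%
```

Comparing play *rates* rather than raw play counts matters here: the variant shown most often will usually rack up the most plays even if it converts worse.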

Testing Video CTAs

After pouring your heart out on camera and watching your team put all that effort into video production, I’m not surprised you’d like some people to follow through on your call-to-action … whatever it may be. And really, what’s the point of your marketing video if it’s not driving action?

Which is why A/B testing is a smart idea here too.

The best CTA attributes to test include:

  • color
  • text/copy
  • size
  • font

But make sure you don’t test them all at once! You’ll notice in the example below from Content Verve that only one word was changed while everything else remained the same. And even though the color, size, and font were identical, there was a 24.91% decrease in conversions just from changing the word “My” to “Your”. Change that guy back!

Measures of success in CTA A/B testing will usually involve click-throughs: either the total number of click-throughs or the click-through rate.
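Before declaring a winner on click-through rate, it’s worth a quick sanity check that the gap isn’t just noise. One common approach (an assumption on my part, not something this post prescribes) is a two-proportion z-test; the click and viewer counts below are made up:

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z-score for the difference between two click-through rates.

    |z| above roughly 1.96 suggests the difference is unlikely to be
    random noise at the conventional 95% confidence level.
    """
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical CTA results: A got 200 clicks from 4,000 viewers, B got 150.
z = two_proportion_z(200, 4000, 150, 4000)
```

If |z| falls below the threshold, the honest answer is “keep testing,” not “ship the current leader.”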

A Note on Mid-Way Content

You probably noticed that the opportunities listed here cover getting viewers to the video, getting them to click play, and carrying them through the CTA. But the meaty, mid-way content that takes them from clicking play to the CTA isn’t listed. So what gives?

A/B testing that portion of your video can be particularly complex, given the effort involved in producing multiple versions for testing, as I described at the beginning of this post.

But all is not lost! A best practice in this scenario is to keep an eye on your video’s engagement data. If you notice large drop-offs in one particular area of your video (see example at right), it might be time to make an adjustment.

You have the tools, now go forth and … test, test, test … then conquer!

Kimbe MacMaster