Data By Design

The grey background or the purple one?

This headline or a different one?

More copy or less?

These decisions might seem trivial, but they can stall marketing campaigns, cause internal rifts within teams, and sometimes get ugly, especially when a lot of people are involved.

Our advice: Don’t bog down the campaign with decisions metrics can make for you.

Dell Technologies World has three simple goals for its pre-event marketing initiatives: Registrations, Registrations, Registrations. So when the Dell team consulted with Opus Creative to generate e-newsletter content, the ultimate objective was crystal clear. The most effective route to achieving it, however, required some thoughtful testing.

E-newsletter campaigns are data gold mines. Not only can marketers track metrics such as open rate and click-throughs, but they can also test content to determine the best path forward on every send.

The tactic is commonly referred to as “A/B testing”, with “A” generally being the safer or previously used content and “B” being new, fresh, or creatively progressive content.

Todd McIlhenny, Opus Creative’s Business Director, sees it as a necessary step in any newsletter send, and also as a pulse check on the broader marketing campaign.

“It’s about getting confidence.  It takes personal preference out of the equation,” Todd says. “We sometimes call it ‘data by design.’”

The process is simple. With more than 2 million people on the 2019 campaign’s overall list, Opus Creative and Dell select particular segments, generally the people most engaged with the event, and use them as creative guinea pigs.

Out of 2 million, about 100,000 are selected for each test. 

It’s important that each test goes to enough people for the results to be statistically meaningful.

If your list isn’t that large to begin with, aim for 5 to 10 percent of your overall list.
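If you want to codify that rule of thumb, a minimal Python sketch might look like the one below. The function name, the 5 percent default, and the 100,000 cap are illustrative assumptions drawn from the numbers above, not Opus Creative’s actual tooling.

```python
def test_segment_size(list_size: int, fraction: float = 0.05, cap: int = 100_000) -> int:
    """Pick a test audience: roughly 5 percent of the list, capped at 100,000.

    Mirrors the rule of thumb above: a 2-million-person list tests about
    100,000 people; smaller lists test 5 to 10 percent of the total.
    """
    return min(cap, max(1, round(list_size * fraction)))

# A 2,000,000-person list yields a 100,000-person test segment;
# a 50,000-person list yields 2,500.
print(test_segment_size(2_000_000))  # 100000
print(test_segment_size(50_000))     # 2500
```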

Of those 100,000 prospective attendees, half receive test A and the other half receives test B. After a designated amount of time, from 2 hours to 2 days depending on the campaign timeline, data is collected to determine which email performed better.

Performance is based on recipient engagement: open rate and clicks.
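As a rough illustration of how that comparison might be scripted, here is a Python sketch. The equal weighting of open rate and click rate, the variant structure, and the sample numbers are assumptions for the example, not Dell’s or Opus Creative’s actual reporting.

```python
from dataclasses import dataclass

@dataclass
class Variant:
    sends: int   # how many recipients received this version
    opens: int   # how many opened it
    clicks: int  # how many clicked through

    @property
    def open_rate(self) -> float:
        return self.opens / self.sends

    @property
    def click_rate(self) -> float:
        return self.clicks / self.sends

def pick_winner(a: Variant, b: Variant) -> str:
    """Compare the two halves on opens and clicks and return the stronger variant.

    Weighting open rate and click rate equally is an assumption for this sketch.
    """
    score_a = a.open_rate + a.click_rate
    score_b = b.open_rate + b.click_rate
    return "A" if score_a >= score_b else "B"

# Hypothetical results for a 100,000-person test, split 50/50.
a = Variant(sends=50_000, opens=9_000, clicks=1_100)
b = Variant(sends=50_000, opens=10_500, clicks=1_400)
print(pick_winner(a, b))  # "B"
```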

Perhaps most telling: Test B won the day in almost every case.

“(The B test) wins all the time, to the point where I thought there was a bug or something wrong,” said Lauren Featherstone, Opus Creative’s Program Manager. “The truth is that users embrace new designs, so it’s a good reminder to keep stretching your comfort zone.  You can’t assume that you have it all figured out.”

[Image: Dell Technologies World 2019 general e-newsletter, March send]