Want more event attendees? Test your email invitation!


I just got results back from a test we ran on the email invitations we send to subscribers on our Email & More, A Q&A with ... list, and I wanted to share them with you, my fellow Only Influencers, because so many of us sponsor webinar series and other events.

A little background first:

My agency, Holistic Email Marketing, is in Season Five of Email & More, a monthly panel discussion that brings together email and CRM marketers, thought leaders (including OI-ers!), and a live audience to discuss industry topics of the day.

Although we usually draw good crowds, we were curious to learn whether we could improve our email invitations to draw more attendees. So we took the same advice we give clients and tested our event emails to discover which approach was more likely to persuade people to click to the registration page.

Curiosity versus loss aversion

We wondered whether emails that pique curiosity would be more attractive than emails that promise to help recipients avert loss. These are two of the seven psychological principles, as defined by psychologist Robert Cialdini, that motivate people to act.

Methodology: First, we chose to test curiosity and loss aversion in our email invitations to see which one generated more clicks on our registration call to action.

Next, we set up a holistic testing programme using A/B split tests to determine which message style was more effective. With holistic testing, we can set up our control and variant messages as a unit in which each element – subject line, image, call to action, body copy – aligns with our hypothesis. This approach, which I developed, lets us test entire messages against each other instead of testing each element separately.

Here's the hypothesis we used to guide our testing plan and choices:

"Emails that utilise loss aversion-oriented elements (subject lines, body copy, and calls-to-action) will result in higher open and conversion rates than emails that utilise curiosity-oriented elements. This is because the psychological impact of potential losses will create a stronger motivation for recipients to engage with the email content."

In other words, we expected marketers would be more motivated by avoiding loss than by satisfying curiosity to check out our discussion.

Testing process: Using holistic testing, we created Version A and Version B of the same message:

  • Version A (control) used a subject line, preheader, body copy, and call to action that invoked loss aversion.
  • Version B (variant) changed those four elements to invoke curiosity.

All other elements of the messages, including click destination, design, images, and supporting copy with speaker details, times, etc., were identical. This made us confident that uplifts resulted from our changes and not from other elements in the message.
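To keep the comparison clean, each subscriber should also see the same variant on every send. One common way to do that is a deterministic hash-based split; this is a minimal sketch (the article doesn't describe the splitting mechanics, so the function and its 50/50 rule are illustrative assumptions):

```python
import hashlib

def assign_variant(email: str) -> str:
    """Deterministically assign a subscriber to variant A or B.

    Hashing the (normalised) address means the same subscriber lands
    in the same bucket on every weekly send, so differences in clicks
    come from the message, not from bucket churn.
    """
    digest = hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("subscriber@example.com"))  # stable per address
```

Most ESPs handle this split for you; the point of the sketch is simply that the assignment should be stable and independent of anything else in the message.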

Testing period and cadence: We sent three invitations over three weeks (one message per week). This gave us three tests for gathering data. Each test had two versions.

Which test won?

We predicted Version A (loss aversion) would show higher open and click rates than Version B (curiosity). This would tell us that emphasising potential losses is more effective than piquing curiosity to motivate our recipients to click on the messages and register for an Email & More panel discussion.

But surprise! Curiosity generated more clicks than loss aversion in two of the three tests. The third test showed no statistically significant difference between the two.

| Message | Variant | Open rate | Click rate | Uplift |
|---|---|---|---|---|
| Message 1 | Loss | 39% | 3% | None |
| Message 1 | Curiosity | 35.9% | 2.8% | |
| Message 2 | Loss | 42.5% | 1.5% | 47% for curiosity with 95% significance |
| Message 2 | Curiosity | 34.5% | 2.2% | |
| Message 3 | Loss | 34.9% | 1.6% | 56% for curiosity with 99% significance |
| Message 3 | Curiosity | 30.01% | 2.5% | |

Note: Loss aversion had higher open rates in each of the three tests. However, our success metric was click rate, and curiosity achieved the higher click rate in two of the three tests. This aligned with our objective (gaining more attendees).
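If you want to run the same numbers on your own results, relative uplift is simply the difference in click rates divided by the baseline rate, and significance can be checked with a two-proportion z-test. A minimal sketch, using the Message 2 and Message 3 rates from the table (the article doesn't publish list sizes, so the subscriber counts below are illustrative assumptions, not our actual data):

```python
from math import sqrt, erf

def uplift(rate_variant, rate_control):
    """Relative uplift of the variant over the control."""
    return (rate_variant - rate_control) / rate_control

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-sided p-value for a two-proportion z-test on click rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF tail
    return z, p_value

print(round(uplift(0.022, 0.015) * 100))  # Message 2: 47 (% uplift)
print(round(uplift(0.025, 0.016) * 100))  # Message 3: 56 (% uplift)

# Hypothetical split of 3,000 subscribers per variant:
z, p = two_proportion_z(45, 3000, 66, 3000)
print(p < 0.05)  # significant at the 95% level with these assumed counts
```

Note that whether a given uplift clears the 95% bar depends heavily on list size, which is why smaller lists often need bigger observed differences (or more sends) before a winner can be declared.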

Conclusion

Our testing regimen achieved a dual objective. We learned which of two psychological principles prompted more email recipients to click on the message and get details about the event. But it also reminded us that a test that doesn't support your hypothesis isn't a failure. Now we know (although we will keep testing to confirm) that curiosity beats loss aversion for our audience.

I've provided our methodology and logic so you can set up a similar test for your own event invitations. But, as always, your results might vary from ours because your audience is different from ours and your event might have different goals.

If you do try a test, I would love to know how it worked out for you!

Photo by Anastasia Zhenina on Unsplash