The long tail isn't just for sales and SEO anymore!
That's probably what you think about when someone mentions a "long tail" strategy. The concept comes from Chris Anderson, whose 2004 Wired article (and later book, The Long Tail: Why the Future of Business Is Selling Less of More) explored the profit potential in selling a wide range of less-popular goods with long shelf lives instead of high-demand, high-cost merchandise.
Anderson applied the long-tail concept to search marketing in 2006 when he argued that less-popular but more specific keywords in both paid and organic search could be more effective than generic or expensive, high-traffic keywords. They might get fewer hits, but they're more likely to lead searchers to the right results faster.
The long tail of email marketing and how attribution is affected
An email campaign has always had a long tail, too, although we typically haven't framed the discussion that way.
The email long tail comes into play when someone acts on your email days, weeks or even months after you send it. It doesn't matter whether your subscriber opened your email the day you sent it and then went on to something else or let it sit unopened, just waiting for serendipity to strike.
This long tail is one of email's many benefits, but you probably aren't accounting for it right now. That does your email program a disservice, because it isn't getting all the credit it should for driving revenue, engagement and value for your company. And when that credit goes missing, your attribution and, in turn, your budget can suffer.
Why email's long tail is overlooked
It all has to do with the way we report on campaign results. We typically look only at the initial reporting period and then move on to the next campaign. Often the reporting period is chosen arbitrarily, based on send frequency rather than on data.
But conversions and revenue happen over a longer period. Low-commitment actions like downloads or registrations tend to happen quickly; purchases, especially long-consideration ones, can take weeks or months.
The problem is that we don't continue to measure performance over the long term. Often, any activity that happens after the reporting window closes – say, a week to a month or even more after the campaign goes out – doesn't get counted. That's what I mean when I say our campaigns could have performed much better than our official reports show.
New research from the Data & Marketing Association's Consumer Email Tracker 2023 highlights the need to keep revisiting campaign performance. Here's one finding: 19% of consumers save their discount, offer or sales emails for a later date. This means they're interested, but now isn't the right time for them to act.
[Chart from the DMA report: what consumers are most likely to do with interesting emails before they buy]
What you gain when you revisit campaigns
When was the last time you went back to your results long after the reporting window closed? I'll wager you'll find email activity: opens, clicks and very likely conversions. They won't match the volume of your initial reporting window, but they all contribute to your campaign's performance.
In fact, we’ve found at Holistic Email Marketing that if someone has saved an email to act on it later, their intent is higher; therefore, the conversion rate is higher than in the initial reporting window.
The result is campaign metrics that are likely more positive than you initially thought, especially taking conversions into account.
Fellow OI-er Elizabeth Jacobi of MochaBear Marketing took up the cause for long-tail email reporting as well after I spoke at the Email Innovations Summit this June. I discussed revisiting analytics long after a campaign has ended and shared the client case study below, which revealed the need to include long-tail results in your reports.
The client case study:
Our client sends campaigns twice per week and was tracking each campaign over a 4-day period. Delving into their Google Analytics reports, we found that for a single campaign they were under-recording their email revenue by 128%. This wasn't an anomaly: checking multiple campaigns produced very similar results.
Using this data, we were able to define the appropriate reporting period, which resulted in increased attribution to email as well as increased budget.
- Initial date range tracked (8–11 March): 114 transactions, 1,294 web visits, 9% CR, £8,326 revenue
- Expanded date range (8 March – 31 May): 303 transactions, 2,317 web visits, 13% CR, £19,022.30 revenue
What was astonishing was that users who visited the website from this email in April, although lower in volume than in the previous 23-day period, showed high intent: their conversion rate was a whopping 37%!
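As a quick sanity check, the headline figures above can be recomputed directly from the case-study numbers. This is a minimal sketch, using only the transaction, visit and revenue figures reported in the bullets (the 37% April rate comes from a subset of tail traffic not broken out here, so it can't be reproduced from these totals):

```python
# Case-study figures from the report above
initial = {"transactions": 114, "visits": 1294, "revenue": 8326.00}    # 8-11 March
extended = {"transactions": 303, "visits": 2317, "revenue": 19022.30}  # 8 March - 31 May

# Revenue uplift: how much the initial window under-reported email revenue
uplift_pct = (extended["revenue"] - initial["revenue"]) / initial["revenue"] * 100
print(f"Revenue under-reported by {uplift_pct:.0f}%")  # ~128%

# Conversion rate of the long-tail activity alone (after the 4-day window closed)
tail_tx = extended["transactions"] - initial["transactions"]
tail_visits = extended["visits"] - initial["visits"]
print(f"Long-tail-only CR: {tail_tx / tail_visits:.0%}")
```

Note that the long-tail-only conversion rate across the whole extended period is already well above the initial window's 9%, even before isolating the high-intent April visitors.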
Long-tail research in 5 steps
It's easy to discover whether you're missing campaign activities, especially conversions and revenue. Follow this 5-step procedure.
1. Ensure your analytics software is set up correctly and all your email campaigns and programs are tagged to track email success.
2. Check your dashboard: In Google Analytics, look at 15 to 20 campaigns you sent within the last year.
3. Look for campaign activity: For each campaign, review all activity up to the present, or until nothing else registers. Note how many weeks out activity continues; this will give you an idea of your long-tail potential.
4. Expand your monthly reporting: Don't stop at the regular reporting period. Go back a month and pull updated metrics for those earlier campaigns as well. If you discover long-tail activity, add a page to your new report with the refreshed metrics for the previous campaigns.
5. Find your cutoff: At some point you will have to decide when to stop looking for activity. Remember you need to recoup your investment in time, too, so be realistic with this. But your long-tail research will help you find the right cutoff point.
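Steps 3 and 5 above can be sketched in code. This assumes a hypothetical export of campaign-tagged conversions (for example, sessions pulled from Google Analytics via your email UTM tags) shaped as (campaign, days since send, conversions); the row data, function names and the 2% threshold are illustrative, not a prescribed method:

```python
from collections import defaultdict

# Hypothetical export: (campaign_id, days_since_send, conversions) rows
rows = [
    ("spring-sale", 0, 40), ("spring-sale", 3, 12),
    ("spring-sale", 14, 6), ("spring-sale", 45, 5),
    ("newsletter-12", 1, 20), ("newsletter-12", 30, 4),
]

def weekly_activity(rows):
    """Bucket conversions by weeks since send, across all campaigns."""
    weeks = defaultdict(int)
    for _campaign, days, conv in rows:
        weeks[days // 7] += conv
    return dict(sorted(weeks.items()))

def suggest_cutoff(weekly, min_share=0.02):
    """Last week whose share of total conversions still meets min_share."""
    total = sum(weekly.values())
    kept = [week for week, conv in weekly.items() if conv / total >= min_share]
    return max(kept) if kept else 0

activity = weekly_activity(rows)
print(activity)  # conversions per week-since-send
print("suggested cutoff week:", suggest_cutoff(activity))
```

The idea is simply to make the cutoff decision data-driven: keep extending the window for as long as a meaningful share of conversions is still arriving, rather than picking a reporting period by habit.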
Implications for A/B testing and long-tail conversions
I was talking about this recently with a client: how risky it is to send the control to 10% of the list, the variant to another 10%, and the winning version to the remaining 80%, because the "winner" could turn out to be the loser once you judge on conversions and factor in the long tail.
This standard A/B testing procedure works well enough when opens and clicks are your success metrics, but not for conversions, because it doesn't account for conversions that happen long after the short 2-3 hour testing period.
To account for this, run a 50-50 split instead: record the initial results, then update them over time so that long-tail conversions are reflected before you act on the outcome.
As with your reporting, keep checking the results, and factor in the long tail before you declare a winner, so that the lessons you take from the test include it.
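To make the point concrete, here is an illustrative sketch of re-checking a 50-50 split as long-tail conversions accrue. All numbers are invented for illustration; the scenario simply shows how a variant that leads on early conversions can lose once the tail is counted:

```python
# Hypothetical 50-50 split test, re-checked at several points in time
recipients = 10_000  # per arm

# Cumulative conversions per arm at each check-in (illustrative numbers)
snapshots = {
    "day 1":  {"A": 120, "B": 150},  # B leads on early conversions
    "week 2": {"A": 310, "B": 330},
    "week 8": {"A": 520, "B": 480},  # the long tail flips the result
}

for when, arms in snapshots.items():
    leader = max(arms, key=arms.get)
    rates = {arm: f"{conv / recipients:.2%}" for arm, conv in arms.items()}
    print(f"{when}: {rates} -> current leader: {leader}")
```

Declared at day 1, B wins; declared at week 8, A wins. That gap between the early and final leader is exactly the attribution error a short testing window bakes into conversion-based tests.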
Only you know if the effort will be worthwhile
Of course, some campaigns, brands or products likely don't have much of a long tail. A daily flash sale probably generates less long-tail activity than a cruise-line campaign. But you won't know until you start checking.
My prediction is you will find more conversions happening and unaccounted for than you expect. Collecting and reporting that data will give you a truer picture of your email program's actual performance and how it contributes to your company's bottom line, attribution and budget.
Photo by Jason Leung on Unsplash