The Great Content Debate: Defining What Online Engagement Really Is
The term “engagement” or “content engagement” is tossed around the blogosphere and in publisher circles around the world, but the stakeholders don't seem to be able to agree on what really defines engagement.
Is it a click, a conversion, a purchase?
Ultimately, brands must decide what engagement means to them, based on their own defined goals and strategies. Yet at the same time, some measures are more meaningful than others.
According to a study sponsored by Pitney Bowes and reported on by eMarketer, the number one business objective marketing professionals in North America cite when enhancing the online customer experience is increased engagement, named by 58% of those surveyed.
It’s likely that these marketers run the gamut as to which metrics they use to measure engagement. However, they know that it’s important.
Ultimately, there are two types of engagement which can be measured – tactical and strategic. Tactical engagement is where most of the confusion lies.
Many of the metrics below are considered vanity metrics, but they still have their uses. In isolation, most of them don’t paint a clear macro engagement picture; however, once the true vanity elements are stripped away, the picture starts to become much clearer.
Views/traffic – Back in the day we used to call this “hits” - in fact, it was quite common for websites to have a hit counter at the bottom of their homepages. This screams vanity metric. Keeping track of views and/or traffic is important, but it’s far from an engagement metric. Besides, bots and click-fraudsters can really skew this measurement.
Clicks – This can be a very valuable thing to track when used to measure the health of the on-site buyer’s journey. Are users clicking on the appropriate links in order to move from one stage to another? Is the information architecture flat enough?
Click-through rate is also important to track because it informs marketers what messaging, and/or calls to action, work best to drive users to a desired action.
However, just like views, clicks can be skewed by bots and other nefarious online activity. It also doesn’t really tell marketers if someone actually consumed the content on the other side of the click. While a valuable metric to track, when it comes to engagement it’s mostly just vanity.
Social Acknowledgement – This includes shares, likes, retweets, social comments, follows, etc. This area of measurement has a black eye on the marketing playground, and has been sufficiently beaten up by bloggers over the years. As mentioned in the categories above, bots play a role with these numbers, too.
Some of these numbers can indeed inform a content team’s editorial decisions, but studies show little to no correlation between the number of shares and actual engagement/consumption. Most of these metrics tell the content team about the quality of their headlines or the trust their content has earned over time. This category is mostly just vanity, though.
Blog Comments – This is indeed a form of engagement. Some argue that it’s the most important measure of content engagement - the Moz blog can get dozens of comments on one post, while others rarely get a comment. It's my belief that this has to do with the culture a brand develops with its audience.
Some audiences weren’t built to have a commenting culture, while others were. Anecdotally, it seems the more mature (older) a blog is, the more likely it is to have a commenting culture with its audience.
Comments are valuable because they provide direct feedback to the content team. However, in the grand scheme of things, the data points they provide tend to be minuscule compared to other engagement metrics. This is not a vanity metric.
Links – Naturally earned links are great for SEO, and should absolutely be tracked. These citations mean that the person linking actually read some of the content - and that’s definitely a form of engagement, too. However, like blog comments, the data set is too small to make macro conclusions about content engagement.
Conversions – The ultimate form of engagement, and one that helps drive business outcomes. Whether it’s a product or an eBook, marketers love getting conversions. When conversions happen, the audience grows, which makes this one of the most important metrics to track and measure.
Unfortunately, conversions don’t really inform marketers as to the quality of their overall editorial, so they’re not much of an overall macro engagement measure.
Subscribers – I, like many, subscribe (pun intended) to the idea that content marketing’s first purpose is to build an audience that can later be harvested as enthusiastic customers. When someone subscribes to a blog or another type of website, they’re showing an intent to engage with content. However, intent alone doesn’t equal macro content engagement.
Scroll Depth – This is a very good way to look at a website’s overall engagement level. While not perfect, there’s a very good chance that if someone took the time to scroll to the end of a post, they read and engaged with at least a portion of it.
A limitation here is that very few - if any - analytics platforms report this information out of the box. However, Crazy Egg’s “Scrollmap report” overlays a heat map on every website page to show how far visitors scroll.
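The underlying calculation is straightforward: how much of the page has passed through the visitor’s viewport. Here’s a minimal sketch in Python of that math (the function name and inputs are hypothetical illustrations, not any vendor’s actual API):

```python
def scroll_depth_percent(scroll_top: float, viewport_height: float,
                         page_height: float) -> float:
    """Return how far down the page a reader has scrolled, as a percentage.

    scroll_top: pixels scrolled past the top of the page
    viewport_height: visible window height in pixels
    page_height: total rendered page height in pixels
    """
    if page_height <= 0:
        return 0.0
    # The bottom edge of what the reader can currently see.
    visible_bottom = scroll_top + viewport_height
    return min(100.0, 100.0 * visible_bottom / page_height)

# A reader who scrolled 2,600px down a 4,000px page in an 800px-tall
# window has seen 85% of the page.
print(scroll_depth_percent(2600, 800, 4000))  # → 85.0
```

In practice, a tracking script samples this value on scroll events and reports the maximum depth reached, which is what heat-map tools visualize.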
Dwell Time – Like scroll depth, this is a very good way to track overall site and page engagement. Chartbeat Analytics found that users who spend 15 seconds or longer on a page consume 80% or more of the content. But, also like scroll depth, most analytics platforms du jour aren’t built to measure this accurately.
Google Analytics’ (GA) time-on-site measurement doesn’t include bounces. A bounce is a single-page session on a site. Session duration is calculated only once a user clicks through to another page; without that second pageview, GA can’t calculate dwell time.
To overcome this, analytics software like Moat can track real dwell time without needing another session to calculate it.
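The limitation described above can be sketched in a few lines of Python. This is a simplified illustration of the timestamp-subtraction logic, not GA’s actual implementation; the function name is hypothetical:

```python
from datetime import datetime, timedelta

def ga_style_session_duration(pageview_times):
    """Approximate the pageview-to-pageview duration logic (a sketch).

    Duration is the gap between the first and last pageview timestamps.
    A bounce is a single-pageview session: with no second timestamp there
    is nothing to subtract, so the duration is unknown (None here), which
    is why bounces contribute no time-on-site.
    """
    if len(pageview_times) < 2:
        return None  # bounce: no second pageview, no measurable duration
    return (pageview_times[-1] - pageview_times[0]).total_seconds()

start = datetime(2024, 1, 1, 12, 0, 0)
# Two pageviews 90 seconds apart: duration is measurable.
print(ga_style_session_duration([start, start + timedelta(seconds=90)]))  # → 90.0
# One pageview: a bounce, duration unknown.
print(ga_style_session_duration([start]))  # → None
```

Tools that measure “real” dwell time sidestep this by pinging the server on a timer or on user activity, rather than waiting for a second pageview.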
This is a more strategic, senior-management view of engagement. It represents the total engagement over time of an individual IP address, person, or subscriber, and is generally an indicator of that content consumer’s propensity to become a customer.
It’s usually captured with a scoring scheme - engagement scoring is also known as “lead scoring” - assigning points based on tactical engagements with online content and/or in-app activity. Through geo-fencing, apps can also track brick-and-mortar touches.
Engagement Scoring – According to Google’s 2011 report on the Zero Moment of Truth (ZMOT), the average customer becomes one after 11 touches. These touches can happen on a website, blog, social media, email, native advertising, brick and mortar, etc. When these touches add up to seven or more hours, the likelihood of the user becoming a customer is optimized. This argues for longer content types, such as videos, demos, and ebooks.
According to Marcus Sheridan in his book “They Ask, You Answer,” a prospect’s close rate soars from less than 5% to 80% if they consume 30 pages of content or more.
However, for those products and services that require very specific targeting characteristics (e.g. annual revenue thresholds, specific titles, decision-maker status, industry, etc.) measured through propensity scoring, touches don’t matter because the prospect is not qualified. That said, it doesn’t mean they won’t become qualified someday in the future.
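A typical scoring scheme of the kind described above can be sketched in Python. The point values, touch names, and qualification gate here are hypothetical examples; real schemes are tuned per brand and per marketing-automation platform:

```python
# Hypothetical point values per touch type; real values are brand-specific.
TOUCH_POINTS = {
    "blog_view": 1,
    "email_click": 2,
    "ebook_download": 5,
    "demo_request": 10,
}

def lead_score(touches, qualified):
    """Sum engagement points across a prospect's touches.

    `qualified` stands in for the propensity/firmographic checks
    (revenue thresholds, title, industry): touches don't matter if
    the prospect isn't qualified, so unqualified prospects score 0.
    """
    if not qualified:
        return 0
    # Unknown touch types contribute nothing rather than raising an error.
    return sum(TOUCH_POINTS.get(touch, 0) for touch in touches)

touches = ["blog_view"] * 3 + ["email_click", "ebook_download"]
print(lead_score(touches, qualified=True))   # → 10
print(lead_score(touches, qualified=False))  # → 0
```

Sales teams then act on prospects whose score crosses an agreed threshold, which is where the 11-touch and seven-hour heuristics come into play.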
Engagement scoring, especially in a B2B setting, is an absolute must for tracking potential strategic business outcomes. Without question, this is a bottom-line way to properly measure engagement over the digital half-life of a prospective or repeat customer.
That said, from a tactical point of view, everything mentioned above has some value associated with engagement. The question is – how much value?
The vanity metrics – not so much. The two most important engagement identifiers on the tactical list are scroll depth and dwell time.
In a perfect world, marketers would have access to both, but realistically, due to the limitations of most analytics software, this may not be the case.
This post originally appeared on inPowered’s blog.
Follow Chad Pollitt on Twitter