The other day I got a short email:
Ken, have you seen this article? Do your findings or observations in the industry back this up? http://www.clickz.com/clickz/column/2272468/only-16-percent-of-b2b-consumers-prefer-live-webinars
It turns out I had indeed seen that article, and I kept hesitating over whether to address it. Then I saw industry colleague Roger Courville retweet an excellent quote from Doug Devitre: "Sharing more statistics doesn't deliver value. Interpretation creates value."
That pushed me over the edge. Time to talk about webinar statistics. There is a desperate need for more rigor, definition, and transparency in those percentages people like to bandy about.
Let's start with the linked article at the top of this post. The author, Mathew Sweezey, says he asked 400 B2B consumers how they prefer to watch a webinar. "While only 16 percent said they prefer to watch it live, the other 84 percent either didn't care if it was live or preferred to watch it at another time."
Mr. Sweezey goes on to make some very good points and offers useful tips on repurposing your content effectively. Then he closes with this sentence: "Just remember that if only 16 percent of people can watch your webinar live, you still have that valuable 84 percent to influence." This interpretation of his own data is unsubstantiated. Nothing in the data even implies that only 16% of your audience can watch your webinar live.
Sweezey's survey asked respondents about their preferences. He doesn't separate the numbers for people who gave no preference from those who preferred to watch recordings. And none of those figures reflects actual viewing behavior… just a self-stated preference, a type of data point that often (usually?) diverges from actual observed behavior.
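To make that gap concrete, here is a toy calculation (every number below is invented for illustration; none of it comes from Sweezey's survey). It shows that a 16% "prefer live" answer puts no ceiling at all on actual live attendance:

```python
# Invented numbers for illustration only -- not from Sweezey's survey.
respondents = 400
prefer_live = int(0.16 * respondents)   # 64 say they prefer to watch live
other = respondents - prefer_live       # 336 don't care or prefer a recording

# Suppose half of the "don't care / prefer recorded" group still ends up
# watching live (the schedule happens to fit, a colleague forwards the link).
actual_live = prefer_live + other // 2  # 64 + 168 = 232

print(f"Stated preference for live:  {prefer_live / respondents:.0%}")  # 16%
print(f"Plausible actual live views: {actual_live / respondents:.0%}")  # 58%
```

Under those made-up assumptions, 58% of the audience watches live even though only 16% said they prefer it. Preference data simply cannot tell us which scenario is true.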
I dug through some other studies by webinar/webcast vendors to see what they reported on actual viewing behavior. Unfortunately I ran into the usual problems with interpreting and trusting the aggregated results.
Two vendors have been particularly good about sharing consolidated data from customer web events. ON24 and BrightTALK both present annual insights into attendee behavior patterns. I truly appreciate their openness and generosity in tracking, assembling, and reporting the data. But I still have problems with assigning too much weight to many of the results as shown.
BrightTALK in its early days put a lot of emphasis on recorded presentations. They feature topic-related "channels" where viewers can watch content on demand. In the past year or two they have been ramping up the emphasis on live content, often embedding live webinars on customer web pages. Unsurprisingly then, they show live attendance increasing steadily from 2008 to 2012 as a percentage of total viewers. But should this be extrapolated to mean that over the entire internet, there is a similar shift in viewing preferences? I don't think so, and I wouldn't make a statement one way or the other based on BrightTALK's data. (They don't attempt to make that extrapolation either, thank goodness.)
BrightTALK reported that in 2012 the split between viewing patterns for registrants on their system was 39% watching the recorded presentation, 30% watching the live webinar, and 32% never watching either. That "feels about right" to me from my own anecdotal experiences with my clients' events across different industries on different technologies. But for any one event, the numbers could be wildly different. (Incidentally, one other stat of theirs that interested me was that 44% of live viewers watch the entire event, while only 32% watch all of an on-demand event.)
ON24 in their latest Benchmarks Report says that 25% of those who preregister for a live webinar watch the archive. But they don’t say how many of those also watched the live event, so I have no idea how to break down their stats for comparison to Sweezey or BrightTALK. For me, it's an uninterpretable number.
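Here's why, as a quick sketch. Both scenarios below produce the reported 25%, yet imply very different audiences (the registrant counts are invented, not ON24's data):

```python
# Two invented scenarios, both consistent with "25% of preregistrants
# watch the archive." None of these counts come from ON24's report.
registrants = 1000
archive_viewers = 250  # the reported 25%

# Scenario A: no overlap -- every archive viewer skipped the live event.
live_a = 300
never_a = registrants - live_a - archive_viewers  # 450 never watched anything

# Scenario B: full overlap -- every archive viewer also attended live.
live_b = 400           # includes all 250 archive viewers
never_b = registrants - live_b                    # 600 never watched anything

print(f"Scenario A: {never_a / registrants:.0%} never watched")  # 45%
print(f"Scenario B: {never_b / registrants:.0%} never watched")  # 60%
```

The same headline statistic is compatible with anywhere from 45% to 60% no-shows in this toy example, which is exactly why I can't line it up against BrightTALK's three-way split.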
Just this week I saw some figures on NoJitter.com excerpted from the InformationWeek "State of Unified Communications" report authored by Michael Finneran. The first sentence declared: "When comparing attitudes and adoption for various Cisco and Microft products, Information Week research found that Cisco has a commanding lead among surveyed Web conferencing users, with 60% reporting they use the company's WebEx; that's almost twice the rate of Citrix Go-ToMeeting (36%) and Microsoft's Live Meeting (33%)."
I worry when I see statistics presented like this. First of all, in a single sentence the article managed to misspell both Microsoft and GoToMeeting. Trivial, but it undermines my trust in their quality control and reporting accuracy. Then there is the 33% usage rate reported for Live Meeting, even though Microsoft has discontinued the product and urged customers to switch to Lync. Maybe people are holding onto the old product; maybe they are counting Lync as an equivalent in their answers. I don't know. It deserves some kind of explanation or footnote.
Then I have no idea whether the survey made any attempt to differentiate "Web conferencing" from larger web events, online training, etc. Are they comparing combined usage for WebEx Event Center, Training Center, and Meeting Center to just GoToMeeting? Or does the GoToMeeting figure include GoToWebinar and GoToTraining usage as well? Was the decision on what to report left up to the respondent?
When presented with consolidated and summarized result percentages, my natural inclination is to take their significance with a very large grain of salt. I usually don't know how the question was phrased, whether respondents were given any guidance or definitions, whether they could select multiple answers or only one, what data was included in the figure, and so much more.
Before I close, I should add a final shout-out to Wainhouse Research, an industry research company that covers the web conferencing space in some detail. Wainhouse conducts a twice-yearly survey of businesses, asking them about their use of web conferencing products. The results are presented in an extensive research report that does a very good job of breaking out data points and showing trends over time. They send highlights to survey participants and reserve the full detail for engagements with their clients. But the statistics they track do not include viewer/attendee behaviors; Wainhouse concentrates on usage of web conferencing products from the host's perspective. And as with most of these surveys, little to no attention is paid to differences in usage among the various products in a vendor's suite.