Adrian wrote to me over the holidays to ask if I had any benchmarks or comparative statistics on "attention ratios" for webinars. It's an interesting question and one that opens a can of worms. So let's pop the lid and start 'em wiggling!
If you are not familiar with the concept of webinar attention ratios or attention meters, I had better start with a definition. They are a valiant attempt by webinar software vendors to give webinar hosts additional data about attendee behavior and interest in the session. Several webinar technologies have incorporated the idea.
Some vendors include a percentage figure for each attendee in the post-session reports. Others display an overall percentage or meter that presenters and hosts can see during the session, reflecting average audience engagement at that moment.
Our first problem is identifying what the software is measuring. I have asked various people in some of these organizations what the percentage is based on, and I have received conflicting answers. The easiest assumption is that the ratio merely looks at whether the conference window has focus or not (in other words, whether it is the last thing the user clicked on). The instantaneous in-room figure would be the ratio of in-focus sessions to background sessions, and the per-attendee figure in the post-session reports would indicate how much time each person kept the conference window in the foreground.
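Under that simplest interpretation, the per-attendee number is just foreground time divided by total session time. Here is a minimal sketch of that calculation; the function name, the focus/blur event format, and the starting assumptions are my own illustration, not any vendor's actual implementation:

```python
# Hypothetical sketch: compute a per-attendee attention ratio from
# window focus/blur events, assuming the simplest interpretation
# (ratio = time the conference window was in the foreground / total time).

def attention_ratio(events, session_start, session_end):
    """events: list of (timestamp, 'focus' or 'blur') tuples, sorted by time.
    The conference window is assumed to open in the foreground."""
    in_focus = True            # assumption: window has focus at session start
    last_ts = session_start
    focused_seconds = 0.0
    for ts, kind in events:
        if in_focus:
            focused_seconds += ts - last_ts
        in_focus = (kind == "focus")
        last_ts = ts
    if in_focus:               # account for the tail end of the session
        focused_seconds += session_end - last_ts
    total = session_end - session_start
    return focused_seconds / total if total else 0.0

# Example: an attendee clicks away at t=600s and comes back at t=1800s
# during a one-hour (3600s) webinar: in the foreground 2400 of 3600 seconds.
ratio = attention_ratio([(600, "blur"), (1800, "focus")], 0, 3600)
print(f"{ratio:.0%}")  # 67%
```

Even this toy version shows how blunt the metric is: a single stray click outside the window starts the clock against you, regardless of whether you are still listening.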
However, some vendors have been more mysterious, telling me that other factors are included in their calculations, perhaps things like the number of interactions with the software (Q&A, hand-raising, poll answering, etc.). I would like a lot more clarity before deciding how much emphasis to place on the statistic.
Our second problem is deciding whether the data tells us anything of value. If the figure looks entirely or mostly at whether the conference window has screen focus, is that important? An attendee could click once outside the conference window -- maybe by accident -- and still pay plenty of attention to the content. And let's be honest… Computers are inherently a multitasking environment. Is it even reasonable to expect attendees to do nothing else at all while watching a one-hour webinar? I might well be suspicious of an extremely high ratio for an individual. It makes me think they opened the webinar and then left the room or took a phone call!
The third problem is the one Adrian identified… Given a set of numbers, what constitutes high/low, or good/bad? How are you going to use the data you collected? Is a person with an 80% attention ratio a better sales lead than someone with a 70% ratio? Is a webinar with a cumulative attention ratio of 68% successful or not?
I'd be very interested in your comments. Do you pay attention to your attention ratios? Do you know your average results across a number of webinars? Do you have a sense of what is a good score and what is a bad score? The floor is yours.
UPDATE JAN 15 2014: The Senior Product Marketing Manager at Adobe Systems wrote in to share the fact that they do indeed expose what goes into their attendee engagement meter. You can download the whitepaper at http://j.mp/AdobeEngagement. Thank you, Rocky Mitarai and Adobe!