Blog reader Emma wrote me today to ask if there was a way to compare the webinar satisfaction ratings her organization collects on their educational/informational webinars with those of a larger cross-segment of the population. After all, you aren't going to please all the people all the time, but how do you know whether your approval ratings are good, bad, or average compared to industry norms?
I couldn't think of any easy way to accomplish what Emma wanted. Industry benchmark reports from companies like ON24 and BrightTALK typically don't include cumulative satisfaction ratings on their clients' content. It wouldn't really be fair to their clientele to share that kind of information, and since there is no standardization in satisfaction scales, it's difficult to make comparisons between results.
But Emma's question opened a much bigger can of worms related to surveys. First of all, a general question about "Satisfaction" doesn't tell you much. Sure, it's a number you can report and show to your manager, but what if it's low? What are you going to fix? I try very hard to ask for separate ratings on technology, content, and presentation skills.
You might give a fantastic webinar with a wonderful speaker, perfect technology, and a clearly presented topic. Some attendees will still respond "Very Dissatisfied" because they didn't read the invitation closely enough and expected different information. Or they were at a different level of expertise than the one the content was targeted for. Should you change anything about the webinar because of their rating? It probably means you need to change your marketing materials and description copywriting. But could you figure out that this is where your problem lies?
Another big problem with webinar feedback surveys is that they are often skewed toward positive results. If attendees are dissatisfied enough to leave the session early, they will never see the survey if it only pops up at the end of the session. Some webinar technologies (usually the ones that require local download and installation of client software, such as WebEx and GoToWebinar) can display the survey when an attendee closes the conference window. But if attendees feel you have wasted their time with an awful webinar, they often don't want to give you even more of their time by completing survey questions for your benefit.
If you really want to see how people feel about your webinars compared to others, maybe that's what you should ask them. In addition to finding out whether they thought yours was good, you could ask, "How did this webinar compare to webinars you have attended from other companies?" -- Much better, Somewhat better, Average, Somewhat worse, Much worse.
If you have some suggestions for Emma, I would welcome your comments. Or if you know of industry satisfaction benchmarks for webinars, I'd love to see a link! If you are reading this offline or in an RSS feed, you can comment on the original blog post at the following link: