VMworld 2011: The Power of Session Surveys

I wanted to share some statistics from a few of the sessions I presented to drive home how important these surveys are to us, the presenters.  David Hill and I presented CIM1264: Private VMware vCloud Architecture Technical Deep Dive three times throughout the week.  After each session we made changes based solely on audience feedback, and the results speak for themselves.  Below is a breakdown of each day with the key statistics from the responses.  You can see that as we made changes, the scores went up.  So if you are attending VMworld Copenhagen, please remember to fill out the surveys.  We actually use the information to make the sessions better for you, the attendees.

Tuesday CIM1264 10:00 Session (75 responses/510 attendees):

  1. How would you rate this session?
    • Average: 3.95
  2. How would you rate the speaker(s)' overall effectiveness?
    • Average: 4.09
  3. How likely are you to implement what you learned in this session?
    • Average: 3.84 (Likely)
  4. How likely are you to recommend this session to a friend or colleague?
    • Average: 7.74 (Pretty Likely)
Comments:

“Didn’t want to hear the business case. wanted to go deep about vcd”

“First 15 minutes was overview”

Wednesday CIM1264 8:00 Session (49 responses/309 attendees):

  1. How would you rate this session?
    • Average: 4.16
  2. How would you rate the speaker(s)' overall effectiveness?
    • Average: 4.43
  3. How likely are you to implement what you learned in this session?
    • Average: 3.96 (Likely to Extremely Likely)
  4. How likely are you to recommend this session to a friend or colleague?
    • Average: 8.1 (VERY Likely)
Comments:

“Best session so far! Looking forward to more content guiding lab manager customers down the right path to VCD.”

“Very good helpful tips for cloud architecture.”

Thursday CIM1264 1:30 Session (27 responses/196 attendees):

  1. How would you rate this session?
    • Average: 4.59
  2. How would you rate the speaker(s)' overall effectiveness?
    • Average: 4.56
  3. How likely are you to implement what you learned in this session?
    • Average: 4.33 (Likely to Extremely Likely)
  4. How likely are you to recommend this session to a friend or colleague?
    • Average: 8.85 (VERY Likely)
Comments:

“QA time at end of session was great. Super speakers, very knowledgable.”

What you can see is that each session was rated progressively better than the last.  This correlates directly with the adjustments we made as presenters based on the feedback received from the audience.  I personally want to thank all the attendees who took the time to fill out the surveys; however, I also want to point out the ratio of responses to attendees, which averaged only 13–15% per session.  Now, this data was pulled as of 1:00 today (9-7-11), and you can still access your login and submit results… I’m just saying 🙂  You can also see from the comments that as the sessions went on, they improved from the baseline thanks to the adjustments we made.  David and I took our survey feedback very seriously, and the results showed it.  Thank you again to all those who attended our sessions.
