How do you evaluate digital learning via the Kirkpatrick model without over-burdening users?

Hi

We are looking to start evaluating our digital content using the Kirkpatrick model.

However, we are aware that our users will not want to be faced with a long survey after completing a lesson, particularly if it takes longer to complete than the lesson itself.

We've got star ratings on our lessons, but want to get more detailed feedback.

Can anyone suggest techniques they use to obtain that feedback without annoying the users?

Thanks in advance

Lynn

Replies to This Discussion

Hi Lynn,

This is something I'll be dealing with in the near future as well, so I haven't actually tried anything yet, but my first idea was to put together a review of previous modules before learners can move on to the next. This obviously depends on how the learning is structured and which Kirkpatrick level you are looking at, but you could have a review of learning with some evaluation questions: you could list the key points, ask learners what they have applied in the workplace, and add some very quick yes/no/don't know buttons around the usefulness and relevance of the previous module.

That hopefully wouldn't be too burdensome for learners, would recap some of the knowledge they have gained, and would give you some information you can use.

Plus, I would guess they will be more willing to do that at the start of a new module than at the end of one.
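If it helps to picture it, here's a rough sketch of that kind of quick review in Python. Everything in it is illustrative: the questions, the answer options, and the tallying are just one way such a micro-survey could be wired up behind the scenes.

    # A minimal sketch of a start-of-module review: a handful of
    # yes/no/don't know questions about the previous module, tallied so
    # the L&D team can spot trends. All names here are illustrative.
    from collections import Counter

    REVIEW_QUESTIONS = [
        "Was the previous module relevant to your role?",
        "Have you applied any of its key points in the workplace?",
        "Was the module the right length?",
    ]

    VALID_ANSWERS = {"yes", "no", "don't know"}

    def record_response(tallies, question, answer):
        """Add one learner's answer to the running tally for a question."""
        answer = answer.strip().lower()
        if answer not in VALID_ANSWERS:
            raise ValueError(f"Expected yes/no/don't know, got {answer!r}")
        tallies.setdefault(question, Counter())[answer] += 1

    # Example: three learners answering the first question.
    tallies = {}
    for answer in ["Yes", "yes", "don't know"]:
        record_response(tallies, REVIEW_QUESTIONS[0], answer)

    print(tallies[REVIEW_QUESTIONS[0]])  # Counter({'yes': 2, "don't know": 1})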

Alternatively, you will know which staff have accessed the learning, so you could set up an automated email with a few survey questions to be sent to their inbox the day after they complete. In my experience, if you make the feedback quick and easy, you can get staff to respond.

I started a brief Survey Monkey (other online survey tools are available!) evaluation system, rather than making phone calls to learners, and we rapidly went from under 20% of learners' line managers feeding back to over 70%. It also helps if you make it very clear why you are asking for the information. The survey I designed had yes/no/don't know buttons and optional text boxes; I was surprised by how many managers actually wrote a few comments, even though they didn't need to.
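For anyone wanting to automate that day-after email, a minimal sketch of the idea follows. The completion records and the survey link are stand-ins; a real version would pull completions from your LMS's reporting API or a scheduled export, and hand the drafted message to your mail server.

    # Find everyone who completed a lesson yesterday and draft them a
    # short survey email. All data and URLs below are placeholders.
    from datetime import date, timedelta
    from email.message import EmailMessage

    SURVEY_URL = "https://example.com/survey"  # placeholder survey link

    # Stand-in for an LMS completions export: (email, completion date).
    completions = [
        ("alice@example.com", date.today() - timedelta(days=1)),
        ("bob@example.com", date.today() - timedelta(days=3)),
    ]

    yesterday = date.today() - timedelta(days=1)

    for address, completed_on in completions:
        if completed_on != yesterday:
            continue
        msg = EmailMessage()
        msg["To"] = address
        msg["Subject"] = "Two minutes of feedback on yesterday's lesson?"
        msg.set_content(
            "Thanks for completing the lesson yesterday. Your answers help "
            f"us improve the content. Quick survey (3 questions): {SURVEY_URL}"
        )
        print(msg)  # in production, pass to smtplib.SMTP(...).send_message(msg)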

Hope that helps. Let me know what you try and what works; as I said, I'll need to do something similar soon!

Thanks

Tristan

Hi Tristan, thanks for the reply.

For sequential content, I really like the idea of the quick review of the previous lesson at the start of the next. As you say, the audience would be more likely to be in the mood to complete a survey when they are starting content than when they've finished and need to get on to the next task.

We have some pretty good reporting tools on our LMS, so we could use these to identify people who have viewed a lot of content and those who have viewed little or none. We could then use this data to ask the frequent viewers what they like, and the avoiders why they are not viewing.
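Something like the following could do a first pass at that segmentation, assuming the LMS can export per-user view counts; the thresholds and names here are invented for illustration.

    # Split learners into frequent viewers ("consumers") and "avoiders"
    # from a per-user view-count export. Thresholds are arbitrary examples.
    view_counts = {  # stand-in for an LMS report: user -> lessons viewed
        "alice": 24, "bob": 0, "carol": 11, "dave": 2,
    }

    HEAVY_THRESHOLD = 10   # consumers: ask what they like
    LIGHT_THRESHOLD = 3    # avoiders: ask why they aren't viewing

    consumers = [u for u, n in view_counts.items() if n >= HEAVY_THRESHOLD]
    avoiders = [u for u, n in view_counts.items() if n < LIGHT_THRESHOLD]

    print("Survey about what works well:", consumers)  # ['alice', 'carol']
    print("Survey about barriers to use:", avoiders)   # ['bob', 'dave']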

We've just launched the 5 star rating system on the lessons and have announced it on our Yammer network, so it will be interesting to see if our colleagues start to apply the ratings (and see what we get!).

In my previous customer-side roles, and in conversations with customers at the moment, there has been a tendency to just use 5-star ratings for e-learning modules, though I realise that doesn't address your question directly.

Two roles ago, I came up with a simple set of 4 questions that offer more depth of evaluation while still minimising the time needed to complete it. I've written about these here:

https://digitallearningthoughts.wordpress.com/2016/01/28/creating-s...

Hope this helps.

Tim

Thanks Tim

I loved your article. I can see how the questions you ask will provide some really useful information for determining the value of the courses and gather some great feedback for making improvements.

I particularly liked the idea of using a Net Promoter Score question and asking how we could change the content to ensure a rating of 10.

We are looking at implementing NPS (http://www.medallia.com/net-promoter-score/) as one measure. It's widely understood and easy to implement.
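For anyone unfamiliar with how the score works: respondents answer 0-10, those scoring 9-10 count as promoters, 7-8 as passives, and 0-6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors (so it ranges from -100 to +100). A minimal Python sketch:

    def net_promoter_score(ratings):
        """Compute NPS from a list of 0-10 ratings."""
        if not ratings:
            raise ValueError("No ratings to score")
        promoters = sum(1 for r in ratings if r >= 9)   # 9-10
        detractors = sum(1 for r in ratings if r <= 6)  # 0-6
        return 100 * (promoters - detractors) / len(ratings)

    print(net_promoter_score([10, 9, 8, 7, 6, 3]))  # (2 - 2) / 6 -> 0.0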

I agree with the comments around pre- and post-course surveys, but they are tedious to fill in and collate.

Thanks to everyone for your replies.

I think that, as a result, we need to look at creating something very short and sweet that will provide maximum information for minimum user effort.

I guess the tricky parts will be determining exactly when to deliver a survey and in what medium: do we add it to the following lesson, use a stand-alone survey, or even leave the LMS and use Survey Monkey?

Lots of food for thought.

Thanks again.

Lynn
