We are looking to do some remote training at our company and I would like some advice on obtaining electronic feedback. There seem to be a few companies out there that do very comprehensive feedback packages; I wondered if there were any recommendations or hints and tips? I am after something that simply replaces our feedback form in an electronic format. Also, does anyone have an idea of the average costs of these things? Another question: how does one ensure that people actually complete the feedback if they are not in the same country as you? Any advice would be gratefully received.

Replies to This Discussion

Hi Olivia

There are some free online survey tools you could use, like SurveyMonkey (https://www.surveymonkey.com/) - others are available - though this does mean you have to create your own questionnaires.

You cannot ensure that people complete these surveys (but we found that more people responded electronically than on paper). If you need to, you can offer incentives like prize draws, but you probably have to accept that the completion rate will never be as high as you want.

Hi Olivia,

Coincidentally, we're about to pilot something along these lines (working with an analytics firm in Switzerland). Do let me know if you'd potentially be interested in being part of this (at no cost). We're trying to make the whole process as simple as possible for a trainer/organiser, using some bespoke programming at the front end, then a well-established survey tool, and some clever but simple reporting at the end.

Jonathan

http://www.careerinnovation.com/innovation-hub/

Hi Olivia

We use Survey Monkey, the tool that Julian mentions, for our evaluations and found that, although the completion rate fell compared to handing out paper forms and asking for them back in the room, the quality of responses increased. In addition, the time saved on manually inputting the survey responses was enormous. It also enables us to report on one programme at a time, or on all training delivered within a certain period. To do this, we use one survey but provide a drop-down menu of all training delivered for our delegates to choose from.
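If it helps anyone do the same kind of slicing, here is a rough sketch of filtering a single exported results file by programme or date range; the column names ("programme", "date_completed") are placeholders rather than SurveyMonkey's actual export headings.

    # Rough sketch of slicing one survey export by programme or date range.
    # Column names ("programme", "date_completed") are placeholders -- adjust
    # them to match whatever your survey export actually contains.
    import csv
    from datetime import date, datetime

    def load_responses(path):
        with open(path, newline="", encoding="utf-8") as f:
            return list(csv.DictReader(f))

    def filter_responses(rows, programme=None, start=None, end=None):
        """Keep rows matching a programme name and/or a completion-date window."""
        selected = []
        for row in rows:
            if programme and row["programme"] != programme:
                continue
            completed = datetime.strptime(row["date_completed"], "%Y-%m-%d").date()
            if start and completed < start:
                continue
            if end and completed > end:
                continue
            selected.append(row)
        return selected

    if __name__ == "__main__":
        rows = load_responses("survey_export.csv")
        q1 = filter_responses(rows, programme="Induction",
                              start=date(2014, 1, 1), end=date(2014, 3, 31))
        print(f"{len(q1)} responses for Induction in Q1")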

As an organisation, we pay for an account with Survey Monkey as it enables us to brand our surveys and utilise some of the more advanced features. Though we don't often use these features for our evaluations, they can be really useful for making a survey more relevant to the person completing it.

Jonathan - I would be interested to know more about the project that you are working on.

Kyla, thanks. The broad context is our interest in metrics and the business case for training and other interventions - real impact is so hard to measure. We've used tools like Survey Monkey in the past, but want to make it even easier for a course organiser to run the process, so we're creating a way for the survey to be launched simply by cc-ing our address when sending details to course participants. Ultimately we will use the same system to automatically trigger follow-up surveys, e.g. 60 days later (and possibly to participants' managers/colleagues) too. While we're piloting, the reporting will be done 'manually', but as soon as we're sure the front-end process is working we'll automate that as well, so we can hopefully run this on a 'freemium' basis and give a basic feedback survey at no cost.
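To make the automated-trigger idea a little more concrete, here is a very rough sketch of just the scheduling side (the mailbox parsing and the survey tool integration are left out, and the addresses, URLs and 60-day offset are purely illustrative, not our actual system):

    # Minimal sketch of the follow-up scheduling idea described above.
    # Everything here (SMTP host, addresses, survey URLs) is illustrative;
    # in practice the course record would come from parsing the cc'd email.
    import smtplib
    from dataclasses import dataclass
    from datetime import date, timedelta
    from email.message import EmailMessage

    FOLLOW_UP_DELAY = timedelta(days=60)

    @dataclass
    class CourseRun:
        course_date: date
        participants: list
        initial_survey_url: str
        follow_up_survey_url: str

    def send_survey(recipients, url, subject):
        msg = EmailMessage()
        msg["From"] = "feedback@example.com"
        msg["To"] = ", ".join(recipients)
        msg["Subject"] = subject
        msg.set_content(f"Please complete this short survey: {url}")
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)

    def run_daily(course_runs, today=None):
        """Intended to run once a day (e.g. from cron) and send anything due."""
        today = today or date.today()
        for run in course_runs:
            if run.course_date == today:
                send_survey(run.participants, run.initial_survey_url,
                            "Course feedback")
            if run.course_date + FOLLOW_UP_DELAY == today:
                send_survey(run.participants, run.follow_up_survey_url,
                            "60-day follow-up: impact on your role")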

At this stage we are keen to pilot with a few organisations to test both the questions and the process. If you (or Olivia) want to have a chat, drop me a line with a bit more detail about what you're wanting to do -  http://www.careerinnovation.com/contact-us/

I'm also keen to learn from other people (on this thread) what works, how to encourage responses, what else is out there, what's proving difficult etc.

Hi Jonathan

We use Survey Monkey to automate our follow-up process.

The first evaluation is mainly done electronically via an intranet link, but this doesn't work in a few areas of our work, where unfortunately we have to resort to paper and enter the evaluations manually when we are back in the office.

From this first evaluation, the web team have set up a process where completing it triggers a follow-up through Survey Monkey, which sends the appropriate second evaluation to the learner. Which survey is sent depends on the course attended, which the learner fills in on the first evaluation. This used to be sent out manually, but it is proving a lot easier now it is automated!
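For anyone curious how the "appropriate second evaluation" part can work in principle, here is a tiny illustrative sketch of mapping the course named on the first evaluation to the right follow-up survey; the course names and URLs are made up, and in practice the record would come from your survey tool's export or notification rather than a hard-coded dictionary.

    # Illustrative sketch only: choosing the follow-up survey from the course
    # named on the first evaluation. Course names and URLs are placeholders.
    FOLLOW_UP_SURVEYS = {
        "Presentation Skills": "https://example.com/surveys/presentation-follow-up",
        "Project Management": "https://example.com/surveys/pm-follow-up",
    }

    def follow_up_for(first_evaluation):
        """Pick the second evaluation based on the course the learner entered."""
        course = first_evaluation["course"].strip()
        try:
            return FOLLOW_UP_SURVEYS[course]
        except KeyError:
            # Unrecognised course names should go to a person, not be dropped.
            raise ValueError(f"No follow-up survey configured for {course!r}")

    print(follow_up_for({"course": "Presentation Skills",
                         "email": "learner@example.com"}))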

We have been using this for about three months now and results so far are positive with a reasonably high return rate.

Hope this helps.

Thanks Sarah. I think the idea of automated follow-up is great, to help assess the impact of training. I guess the use of tools like Survey Monkey should make it accessible to more and more people via mobile etc.? It's interesting to hear you're still finding some people need/prefer paper, though.

We've had quite a lot of interest since I first posted above, and our Swiss-based colleague has now created a website: www.workometry.com. The focus - as in your example above - is on ease of use for the administrator.

Interesting to hear about other tools too, like TrainingCheck, which Bob mentions below.

Hi Sarah

This is really interesting - how does Survey Monkey automate the follow-up emails? And presumably these follow-ups link learners directly to a new survey for their second evaluation form?

Sorry if I've misunderstood...

Kyla

I have used Google Forms, which has the advantage of Google's built-in analytical features, and responses can be exported to a range of formats for analysis.

We use Training Check (http://trainingcheck.com/) for our electronic feedback. It allows you to set up surveys and feedback forms, and then creates comprehensive reports from your answers. It is quite cost-effective as well.

Hi Olivia,

We use the same 'learner survey' across the board for all training (facilitated (UK/offshore), remote (UK/offshore), in-house and outsourced). We receive an average of 4,125 survey responses monthly, which is roughly a 95% response rate for our training. Our user-generated error rate is 0.07% (i.e. feedback at the wrong end of the scales not matching the verbatim comments). The survey tool was built in-house using a free, open-source JavaScript library, which we used to create sliders (recording feedback on a 0-10 scale), free-text boxes and so on. Learner responses are submitted to a SQL database, which can be exported to Excel for us to investigate the data. We are trainers, not web developers/designers, and if we have managed it, that proves it is not impossible. The above is just to give some credibility to my response.
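For anyone tempted to try something similar, here is a cut-down sketch of the storage and export side only, using SQLite and a CSV file (which Excel opens directly) as stand-ins for whatever database and export your organisation actually uses; the column names are illustrative, not our real schema.

    # Cut-down sketch of the storage and export side described above, using
    # SQLite and a CSV file (which Excel opens directly) as stand-ins for
    # whatever database and export format you actually have.
    import csv
    import sqlite3

    def init_db(path="feedback.db"):
        conn = sqlite3.connect(path)
        conn.execute("""
            CREATE TABLE IF NOT EXISTS responses (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                course TEXT NOT NULL,
                trainer_impact INTEGER CHECK (trainer_impact BETWEEN 0 AND 10),
                job_impact INTEGER CHECK (job_impact BETWEEN 0 AND 10),
                comments TEXT
            )
        """)
        conn.commit()
        return conn

    def save_response(conn, course, trainer_impact, job_impact, comments=""):
        conn.execute(
            "INSERT INTO responses (course, trainer_impact, job_impact, comments) "
            "VALUES (?, ?, ?, ?)",
            (course, trainer_impact, job_impact, comments),
        )
        conn.commit()

    def export_to_csv(conn, path="responses.csv"):
        rows = conn.execute("SELECT course, trainer_impact, job_impact, comments "
                            "FROM responses").fetchall()
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(["course", "trainer_impact", "job_impact", "comments"])
            writer.writerows(rows)

    if __name__ == "__main__":
        conn = init_db()
        save_response(conn, "Induction", 9, 8, "Clear and well paced.")
        export_to_csv(conn)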

Our experience is that a single survey, regardless of facilitation method, is the best solution. This is why:

A single solution means you can compare all courses/facilitators equally. If surveys are completed in a mix of paper and electronic formats, the validity of the data could be challenged because of learner perceptions of the question/survey format. It also allows the survey to be presented as business as usual - learners are not getting an amended or different survey - which is really important offshore.

Our survey is an evolution of a paper survey. We have worked hard to minimise the questions, to make the survey as quick as possible to complete and, more importantly, to gather feedback only on relevant subjects. To do this, define an aim and then group up to three questions around it, with a free-text box to gather feedback directly against that aim. It is really rewarding to read the genuinely constructive feedback. From experience, this is less likely if you have too many opportunities for free text or have not positioned what area you are interested in (the aim). Our survey aims are trainer impact (the learning experience) and impact on job role (will you be better at your current job after this training?).

If you are using a web-based training tool like Webex, you can send the URL to all learners in the session and allow training time to complete it. Checking the database for responses coming in can drive a 100% completion rate, with the trainer telling learners the number/percentage of surveys received.
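As a small illustration of the "check the database during the session" idea, here is a sketch that counts responses received so far against the number of attendees; it reuses the simple table from the sketch above, and a real version would also filter by session or date rather than just course name.

    # Illustrative only: checking the database during the session so the
    # trainer can tell learners how many surveys have come in so far.
    # Reuses the "responses" table from the earlier sketch; a real version
    # would also filter by session or date, not just the course name.
    import sqlite3

    def completion_so_far(conn, course, attendees):
        (submitted,) = conn.execute(
            "SELECT COUNT(*) FROM responses WHERE course = ?", (course,)
        ).fetchone()
        return submitted, 100 * submitted / attendees

    conn = sqlite3.connect("feedback.db")
    submitted, pct = completion_so_far(conn, course="Induction", attendees=12)
    print(f"{submitted} surveys received ({pct:.0f}% of attendees)")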

I agree with other responses that Survey Monkey is a great tool; it definitely has a role in a comprehensive evaluation package. We have used it as a post-training follow-up on specific programmes where more time is available; its drawback is the time it takes to complete.

Hope this helps.

 

Hi Olivia

We use Survey Monkey for post-learning and performance feedback. It's improved a lot and you can set it up very easily.

It won't guarantee that people complete it; however, if you make this a condition of receiving training, that could help increase the sign-up.

Good luck, 


Blake
