Our learning objectives may be solid and our content may look beautiful, but what about our assessments? In these difficult times we must measure the effectiveness of our training and prove that all-important ROI. How can our assessment strategies help?

Comments

  • e-Assessment in Practice, 11-12 November 2009, Shrivenham. The list of confirmed speakers is posted at http://www.cranfield.ac.uk/cds/symposia/ea09.jsp
    Please alert me if you are aware of any exciting projects in the field of e-Assessment.
  • The Chartered Institute of Educational Assessors (CIEA) National Assessment Conference is on 6th May:
    http://www.ciea.org.uk/news_and_events/events_listing/national_conf...
  • I would like to add a slightly different slant to Casson’s suggested approach. We have recently completed and launched a project where we use pre-assessment to structure the content. It is a certified course which will be delivered to the thousands of experienced Offshore Oil & Gas workers in the UK sector and is part of the Minimum Industry Standards Training initiative.
    This online course has all of the pre-assessment, instructional content and summative assessment electronically tied back to Learning Objectives (typically 1 to 3 pages per LO).
    Keep in mind that these are experienced workers (new starts are covered in a separate course) and the course is 6.5 hours long. The thought of having to work through all of this content, most of which they already know, was unappealing from the workers' perspective, and the downtime was a concern from the employing company's point of view.
    The difference here is that, based on the learner's results in the pre-assessment, the course is built dynamically, delivering only the content needed to cover their knowledge gaps. The result is that a worker could be out of the invigilated centre in an hour rather than more than six. In my opinion this innovative approach is a win-win scenario, and a good example in support of the argument for testing first.
    One final point: the pass mark is 100%, and to help learners achieve this the course continues to rebuild itself dynamically based on their answers to the summative questions. So far it has been very well received. (A rough sketch of this gap-driven build-and-rebuild loop appears after the comment thread below.)
  • The forthcoming conference on e-Assessment may be of interest to group members. Visit http://www.cranfield.ac.uk/cds/symposia/ea09.jsp for further details.
  • I just knew this was going to be a great discussion. I do hope others will come along and contribute too.
  • I certainly agree with you, Phil, about checking someone's ability to DO, and you were quite right that aviation technicians do undergo rigorous skills tests.

    I am intrigued by Casson's idea of using a 'test' to get students thinking about the subject and can see that this idea would be very useful. Anything that stimulates interest in the subject is a bonus, and I can see that this would also give the learner self-confidence. If a subject is too 'alien' to them, they will be reluctant to engage, whereas I think Casson's idea will actually make learners realise they have a contribution to make and will encourage them to take part not only in the e-learning but also in discussions. Thanks Casson, I think I am going to add that one to my own teaching toolbox. All I have to do now is get my design team to produce another item for me!
  • I applaud your approach, Casson, but I'd try to cast off the shackles of assuming that because e-learning was an element of the "performance solution strategy", it must also be the channel for assessment. I believe that e-learning, in its many forms, can be a useful means of testing knowledge and attitudes. It can also be used to test the acquisition of a mental or physical skill (for example reaching a timely conclusion, observing a deviation, selecting an object from a mixed background, deciding upon a course of action using given data). But because I KNOW something or BELIEVE something, it does not mean I can DO something.

    My point is that you can get to mastery of a performance through progressive stages and test each stage in turn (I can recognise an instrument by picking it from amongst others; I can turn it on; I can use it to perform a variety of tasks; and so on). Ultimately, however, the criterion test is: can I use instrument X to perform task Y under conditions Z? Scenario-based e-learning might be a closer approximation, but it is still not the ultimate test, so what's wrong with using e-learning as a way of managing work-based skill checks? Why not have the trainee view or download a checklist or model of the required outcome, and have an experienced colleague use it to observe the trainee at work? "Hard" and "soft" data could be used to measure conformance, and the results could be recorded on paper or through an LMS, with options for further support and improvement.
  • Richard, thank goodness for that! I'd hate to be a passenger on a plane that had been checked by an engineer who understood 95% of how to get aircraft ready to fly safely; I'd be very worried about the 5% he'd failed on. I think I'd be reassured not by his or her understanding, but by the manifestation of that understanding, and probably that would mean he or she had been skill-checked preparing or repairing planes that were to take to the air under conditions similar to those of my flight.
  • Hi all

    Phil, I really should have said that when they get back to the shop floor I want them to use the knowledge they have gained to deliver the expert customer care that my organisation is known for!

    Leading on from Phil's first comment, and trying to answer Richard too: I am no expert here, but I am trying to understand and exploit the tools we can use when developing e-learning content, and I use what I would call a 'test to teach' approach. I have been experimenting with this approach in a number of ways.

    For instance, I have used it in situations where the content on the page and the product information referred to reveal enough for learners to 'have a go' at a question about the subject. I am trying to make them reflect and think, and I also want to give them a little confidence, so that they show the same confidence in themselves in front of their customers. Does that make sense?

    I have also produced a short module, five minutes or so, for new joiners. It is a series of six scenario-based questions. The learner is given two chances to find the 'model' answer (all of the answers are right in some way) and on each attempt is given feedback in a way that 'layers' the learning; all of the learning is in the feedback. In the whole piece we introduce learners to everything from the way we treat customers through to how our Advantage Card works, in six screens. (A rough sketch of this question-and-feedback structure also appears after the comment thread below.)
  • Some interesting comments there, Phil, and I can see where you are coming from. However, in my field, which is training personnel to carry out maintenance on aircraft, I am very interested in how much has been learnt but even more interested in how much has been understood.

    I usually find that understanding only comes when the knowledge has been applied. As a result, e-testing can only go part of the way towards ensuring that I have given the student the knowledge they need to carry out the job.
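
The adaptive, pre-assessment-driven course described in the Offshore Oil & Gas comment above amounts to a simple gap-driven loop: score the pre-assessment per learning objective, deliver only the content for objectives not yet mastered, then keep rebuilding from the summative answers until the 100% pass mark is met. The sketch below is only an illustration of that idea under assumed names (LearningObjective, build_course, run_course and the deliver stub are hypothetical, not the project's actual implementation).

```python
# Minimal sketch (assumed names) of a course that is built dynamically from
# pre-assessment results and rebuilt from summative answers until every
# learning objective (LO) reaches the 100% pass mark described above.
from dataclasses import dataclass, field

PASS_MARK = 1.0  # the comment cites a 100% pass mark


@dataclass
class LearningObjective:
    lo_id: str
    content_pages: list[str]                                # typically 1-3 pages per LO
    question_ids: list[str] = field(default_factory=list)   # questions tied back to this LO


def score_by_lo(answers, objectives):
    """Proportion of questions answered correctly per LO (answers: question_id -> bool)."""
    scores = {}
    for lo in objectives:
        marks = [answers[q] for q in lo.question_ids if q in answers]
        scores[lo.lo_id] = sum(marks) / len(marks) if marks else 0.0
    return scores


def build_course(objectives, lo_scores):
    """Return only the content pages for LOs the learner has not yet mastered."""
    pages = []
    for lo in objectives:
        if lo_scores.get(lo.lo_id, 0.0) < PASS_MARK:
            pages.extend(lo.content_pages)
    return pages


def deliver(pages):
    """Stand-in for the LMS/player presenting the selected content to the learner."""
    for page in pages:
        print("Showing:", page)


def run_course(objectives, pre_assessment_answers, take_summative):
    """Deliver gap content, sit the summative, and rebuild until every LO is at 100%."""
    lo_scores = score_by_lo(pre_assessment_answers, objectives)
    while True:
        deliver(build_course(objectives, lo_scores))
        lo_scores = score_by_lo(take_summative(), objectives)
        if all(score >= PASS_MARK for score in lo_scores.values()):
            return  # all gaps closed; the learner can leave the invigilated centre
```

An experienced worker who already clears most objectives in the pre-assessment would see only a fraction of the 6.5 hours of content, which is the win-win described in the comment.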
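
Casson's six-screen module, where every answer is right in some way and all of the learning sits in the feedback, suggests an equally simple structure: each option carries its own feedback, and the learner has two attempts to land on the model answer. Again this is only a sketch with assumed names (Option, ScenarioQuestion and the example text are illustrative, not taken from the actual module).

```python
# Minimal sketch (assumed names) of a scenario question in which every option
# is right in some way, each attempt returns feedback that layers the learning,
# and the learner has two chances to find the 'model' answer.
from dataclasses import dataclass


@dataclass
class Option:
    text: str
    feedback: str        # the teaching lives in the feedback
    is_model: bool = False


@dataclass
class ScenarioQuestion:
    scenario: str
    options: list
    max_attempts: int = 2

    def attempt(self, choice_index, attempt_number):
        """Return (finished, feedback) for a single attempt."""
        option = self.options[choice_index]
        finished = option.is_model or attempt_number >= self.max_attempts
        return finished, option.feedback


# Illustrative usage: one screen of a six-question, feedback-driven module.
question = ScenarioQuestion(
    scenario="A new customer asks how they would benefit from the Advantage Card.",
    options=[
        Option("Hand them a leaflet to read later.",
               "Helpful, but a conversation builds more trust; try again."),
        Option("Explain the benefits and offer to sign them up now.",
               "This is the model answer: explain, then invite action.", is_model=True),
    ],
)
finished, feedback = question.attempt(choice_index=0, attempt_number=1)
print(feedback)  # first-attempt feedback layers the learning before the second try
```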

Accreditation/certification framework

Has anyone worked on developing an accreditation or certification framework? We are exploring ways of setting something up and need a framework as a structure for the organisation, which hasn't done this before. We do have assessments built into our training courses, but nothing formal. Any advice would be appreciated.

2 Replies

IT Screening tools

I am looking for a tool that will allow us to test the IT competence of both existing staff and those applying to join my organisation. It would therefore need to be available via the web. Ideally the tool would have simulations to test competence in Microsoft apps such as Word, Excel, etc., but it would also be beneficial if there were an authoring element that would allow us to create tests for our own bespoke applications. It would also be advantageous if it could link to an LMS we are…

0 Replies

e-competence assessment

We're embarking on a project to develop shared competence assessment tools across our region. So far, we have developed a region-wide Statutory and Mandatory Training Framework which defines learning outcomes and refresher intervals in nine essential skills areas. All our hospital Trusts in the South Central area have signed up to the Framework. We are now progressing to the next step, which is to create online assessments for each of the nine skills areas, for staff to undertake prior to…

1 Reply

e-Assessment in Practice, 10-11 November 2010

First Call for Papers: please consider actively participating in the conference, either by presenting a paper or poster or by demonstrating your recent research on e-Assessment. Please feel free to circulate this among your colleagues as you see fit. I look forward to meeting you in November. I am more than happy to contact any individuals if you want me to pursue them for a presentation. Please find further information at http://www.cranfield.ac.uk/cds/Symposia/EA10.html
Confirmed speakers include ... Dr. Kerry…

1 Reply