Our learning objectives may be solid and our content looks beautiful, but what about our assessments? In these difficult times we must measure the effectiveness of our training and prove that all-important ROI. How can our assessment strategies help?

Members: 127
Latest Activity: Jun 14, 2019

Discussion Forum

accreditation/certification framework

Started by Hootan Zahraei. Last reply by Brian Fox May 4, 2011. 2 Replies

IT Screening tools

Started by Karen Smout Oct 15, 2010. 0 Replies

e-competence assessment

Started by Alison Wright. Last reply by Barry Sampson Sep 24, 2010. 1 Reply

Comment Wall


Comment by Venkat on May 13, 2009 at 9:25
e-Assessment in Practice, 11-12 November 2009, Shrivenham. The list of confirmed speakers is posted at
Please alert me if you are aware of any exciting projects in the field of e-Assessment.
Comment by Richard Gott on April 29, 2009 at 10:03
The Chartered Institute of Educational Assessors (CIEA) National Assessment Conference is on 6th May:
Comment by Ken Jones on April 28, 2009 at 16:55
I would like to add a slightly different slant to Casson’s suggested approach. We have recently completed and launched a project where we use pre-assessment to structure the content. It is a certified course which will be delivered to the thousands of experienced Offshore Oil & Gas workers in the UK sector and is part of the Minimum Industry Standards Training initiative.
This online course has all of the pre-assessment, instructional content and summative assessment electronically tied back to Learning Objectives (typically 1 to 3 pages per LO).
Keep in mind that these are experienced workers (new starts are covered in a separate course) and the course is 6.5 hours long. The thought of having to go through all of this content, most of which they already know, was not very desirable from the workers' perspective, nor was the downtime from the employing company's view.
The difference here is that, based on the learner's results in the pre-assessment, the course is dynamically built, delivering only the content needed to fill the knowledge gaps. The result is that the worker could be out of the invigilated centre in an hour rather than over six hours. In my opinion this innovative approach is a win-win scenario, and a good example for the argument for testing first.
One final point: the pass mark is 100%, and to help achieve this the course will continue to dynamically rebuild itself based on the answers to the summative questions. So far it has been very well received.
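The gap-driven assembly Ken describes can be sketched roughly as follows. This is a minimal illustration in Python, not his actual system: the function names, the content model (pages tied to learning objectives) and the result dictionaries are all hypothetical.

```python
# Hypothetical sketch of pre-assessment-driven course assembly:
# deliver only the content for learning objectives (LOs) the learner
# has not yet demonstrated, and re-queue content for any summative misses.

def build_course(learning_objectives, results):
    """Return content pages only for LOs the learner did not pass.

    learning_objectives: list of {"id": str, "pages": [str, ...]}
    results: {lo_id: True if passed, False otherwise}
    """
    course = []
    for lo in learning_objectives:
        if not results.get(lo["id"], False):
            course.extend(lo["pages"])  # typically 1 to 3 pages per LO
    return course

def rebuild_after_summative(learning_objectives, summative_results):
    """Pass mark is 100%: any missed summative question re-queues
    the content for its learning objective."""
    missed = {lo_id for lo_id, correct in summative_results.items() if not correct}
    passed = {lo["id"]: lo["id"] not in missed for lo in learning_objectives}
    return build_course(learning_objectives, passed)
```

An experienced worker who passes most of the pre-assessment would therefore see only a fraction of the 6.5 hours of content, which is exactly the hour-versus-six-hours saving described above.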
Comment by Venkat on April 8, 2009 at 14:08
The forthcoming conference on e-Assessment may be of interest to group members. Visit for further details.
Comment by Phil Green on January 26, 2009 at 13:56
I just knew this was going to be a great discussion. I do hope others will come along and contribute too.
Comment by Richard Clewer on January 26, 2009 at 13:17
I certainly agree with you, Phil, about checking someone's ability to DO, and you were quite correct in that aviation technicians do undergo rigorous skills tests.

I am intrigued by Casson's idea of using a 'test' to get students thinking about the subject, and I can see that this idea would be very useful. Anything that stimulates interest in the subject is a bonus, and I can see that this would also give the learner self-confidence. If a subject is too 'alien' to them then they will be reluctant to engage, whereas I think Casson's idea will actually make the learner realise they have a contribution to make, and will encourage them to take part not only in e-learning but also in discussions. Thanks Casson, I think I am going to add that one to my own teaching toolbox. All I have to do now is get my design team to produce another item for me!
Comment by Phil Green on January 26, 2009 at 12:54
I applaud your approach Casson, but I'd try to cast off the shackles of assuming that because e-learning was an element of the "performance solution strategy", it must be the channel for assessment. I believe that e-learning, in its many forms, can be a useful means of testing knowledge and attitudes. It can also be used to test the acquisition of a mental or physical skill (for example reaching a timely conclusion, observing a deviation, selecting an object from a mixed background, deciding upon a course of action using given data). But because I KNOW something or BELIEVE something, it does not mean I can DO something.

My point is that you can get to mastery of a performance through progressive stages, and test each stage in turn (I can recognise an instrument by picking it from amongst others; I can turn it on; I can use it to perform a variety of tasks, and so on). Ultimately, however, the criterion test is: can I use instrument X to perform task Y under conditions Z? Scenario-based e-learning might be a closer approximation, but it's still not the ultimate test, so what's wrong with using e-learning as a way of managing work-based skill checks?

Why not have the trainee view or download a checklist or model of the required outcome, and have an experienced colleague use it to observe the trainee at work? "Hard" and "soft" data could be used to measure conformance. The results could be recorded on paper or through an LMS, with options for further support and improvement.
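The checklist-based skill check Phil suggests could be recorded along these lines. A minimal sketch only: the function and field names are hypothetical, not any particular LMS API.

```python
# Hypothetical record of an observed work-based skill check: an experienced
# colleague marks each checklist item while watching the trainee, and the
# summary flags items needing further support and improvement.

def record_skill_check(trainee, observer, checklist_results):
    """Summarise an observed skill check.

    checklist_results: {item_description: True if conformant, False otherwise}
    """
    needs_support = [item for item, ok in checklist_results.items() if not ok]
    return {
        "trainee": trainee,
        "observer": observer,
        "conformance": (len(checklist_results) - len(needs_support))
                       / len(checklist_results),
        "follow_up": needs_support,
    }
```

The point of the structure is that assessment of DOING happens at work, under observation, while e-learning simply distributes the checklist and stores the result.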
Comment by Phil Green on January 26, 2009 at 12:30
Richard, thank goodness for that! I'd hate to be a passenger on a plane that had been checked by an engineer who understood 95% of how to get aircraft ready to fly safely; I'd be very worried about the 5% he'd failed on. I'd be reassured not by his or her understanding, but by the manifestation of that understanding, and probably that would mean he or she had been skill-checked preparing or repairing planes that were to take to the air under conditions similar to those of my flight.
Comment by Casson McRae on January 26, 2009 at 12:02
Hi all

Phil, I really should have said that when they get back to the shop floor I want them to use the knowledge they have gained to deliver the expert customer care that my organisation is known for!

Leading on from Phil's first comment, and trying to answer Richard too: I use what I call, in my own words (I am no expert here, but am trying to understand and exploit the tools we can use when developing e-learning content), 'test to teach'. I have been experimenting with this approach in a number of ways.

For instance, I have used this in situations where the content on the page and the product information referred to are revealing enough for them to 'have a go' at a question about the subject. I am trying to make them reflect and think, and I also want to give them a little confidence, so that they have the same confidence in themselves in front of their customers. Does that make sense?

I have also produced a short module, five minutes or so, for new joiners. It is a series of six scenario-based questions. The learner is given two chances to find the 'model' answer (all of the answers are right in some way) and on each attempt is given feedback in a way that 'layers' the learning. All of the learning is in the feedback. In the whole piece we introduce learners to everything from the way we treat customers through to how our Advantage Card works, in six screens.
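The two-attempt, layered-feedback question Casson describes might look something like this in outline. A rough sketch with made-up names and feedback text, not the actual module.

```python
# Hypothetical two-attempt scenario question: every option is "right in
# some way", so each wrong pick earns a feedback layer that teaches,
# and finding the model answer ends the question.

def run_question(question, choices):
    """Return the feedback shown across up to two attempts.

    question: {"model_answer": str,
               "feedback": {"model": str, "layer": [str, str]}}
    choices: the learner's picks, in order (at most two are used)
    """
    shown = []
    for attempt, choice in enumerate(choices[:2]):
        if choice == question["model_answer"]:
            shown.append(question["feedback"]["model"])
            break
        shown.append(question["feedback"]["layer"][attempt])
    return shown
```

Since all of the learning is in the feedback, even a learner who never finds the model answer still reads two teaching layers per question.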
Comment by Richard Clewer on January 26, 2009 at 11:50
Some interesting comments there Phil and I can see where you are coming from. However in my field, which is training personnel to carry out maintenance on aircraft, I am very interested in how much has been learnt but, more importantly, I am even more interested in how much has been understood.

I usually find that understanding only comes when the knowledge or learning has been applied. As a result, e-testing can only go part of the way towards ensuring that I have given the student the knowledge they need to carry out the job.





© 2020   Created by Donald H Taylor.   Powered by
