Information

Assessment

Our learning objectives may be solid and our content may look beautiful, but what about our assessments? In these difficult times we must measure the effectiveness of our training and prove that all-important ROI. How can our assessment strategies help?

Members: 127
Latest Activity: Jun 14

Discussion Forum

accreditation/certification framework

Started by Hootan Zahraei. Last reply by Brian Fox May 4, 2011. 2 Replies

IT Screening tools

Started by Karen Smout Oct 15, 2010. 0 Replies

e-competence assessment

Started by Alison Wright. Last reply by Barry Sampson Sep 24, 2010. 1 Reply

Comment Wall


Comment by Ken Jones on June 25, 2009 at 14:32
Venkat, I believe your point about the "bottom-line knowledge they will accept from anyone claiming a certain level of expertise" hits the nail on the head; this should most definitely determine the desired Learning Outcome. The question is how, when assessing competence, we can emulate some of the complicated processes a human assessor would go through when physically assessing a learner's competence - maybe a bridge too far for safety-critical situations or tasks?
Comment by Stephanie Dedhar on June 25, 2009 at 14:31
I completely agree, Venkat! It's crucial that the questions in any assessment are developed with the input of the SME. I also think it's often useful to create the assessment, or at least a draft of it, before beginning to design the actual training (this seems to match your advice to identify the basic knowledge required early on). This ensures that what you are testing aligns with the agreed learning outcomes and objectives, and gives you a solid basis on which to design the training units.
Comment by Venkat on June 25, 2009 at 14:10
I have been following the conversation initiated by Stephanie and the comments from Ken Jones. From your comments I can see words of wisdom moderated by scarred and healed wounds!

I face this dilemma of who is best placed to develop the e-assessment question items. I see this situation as very similar to the days of developing expert systems in the late seventies and early eighties: the knowledge engineer (ID) sits with the domain expert (SME) and, through several iterations, develops the system. This means much stronger engagement with the SMEs, identifying early on the bottom-line knowledge they will accept from anyone claiming a certain level of expertise in the chosen field, and then mapping the items that guarantee that level of expertise and minimise the risk of 'lucky guesses'. I am not too hung up about MCQs per se - it is the quality of the items and the nature of the feedback that define success or failure.
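As a rough illustration of the 'lucky guesses' risk Venkat describes: assuming four-option items answered independently at random, the binomial distribution gives the chance of passing on guesswork alone. A minimal Python sketch (the question counts and pass marks are hypothetical):

    from math import comb

    def pass_by_guessing(n_questions, n_options, pass_mark):
        """Chance of reaching the pass mark by guessing every item at random."""
        p = 1 / n_options  # probability of guessing a single item correctly
        return sum(
            comb(n_questions, k) * p**k * (1 - p) ** (n_questions - k)
            for k in range(pass_mark, n_questions + 1)
        )

    print(f"{pass_by_guessing(10, 4, 8):.5f}")  # 10 items, pass mark 8: ~0.00042
    print(f"{pass_by_guessing(5, 4, 3):.5f}")   # 5 items, pass mark 3: ~0.10

A short quiz with a modest pass mark can be passed by pure guessing roughly one time in ten, which is why the quality of the items matters more than the MCQ format itself.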
Comment by Ken Jones on June 25, 2009 at 13:20
I agree, Stephanie; in fact I would say that the assessment should be considered an integral part of the learning process, as opposed to the ubiquitous test/quiz/exam style currently employed.
Comment by Stephanie Dedhar on June 25, 2009 at 8:43
I like your suggestions for how to approach an assessment to avoid short-term memory or lucky guessing, Ken, but as you say there's additional work involved, in both ID and development terms. I've written assessments in the past that follow the format of standard multiple choice questions but take a scenario- or behaviour-focused approach. I think this strikes a good balance: it produces an assessment that genuinely tests the learner on what is relevant (that is, the choices they make and the behaviour they display, rather than definitions of technical terms or the dates of particular laws), it is written in such a way that the right answer is neither immediately obvious nor near impossible to identify, and it doesn't add to the development time. Obviously it adds to the ID time, but my view is that the time for writing assessments needs to be factored in from the start and given just as much importance as writing the training - if it results in a more effective assessment, it's time well spent!
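One way to picture the scenario-focused format Stephanie describes is as a small data model in which each option is a decision rather than a fact, and each option carries its own feedback. A Python sketch - the scenario, options and feedback text are invented purely for illustration:

    from dataclasses import dataclass

    @dataclass
    class Option:
        text: str      # a decision the learner could take
        correct: bool
        feedback: str  # explains why, not just right or wrong

    @dataclass
    class ScenarioQuestion:
        scenario: str  # the workplace situation the learner is placed in
        prompt: str    # asks for a choice of action, not a definition
        options: list[Option]

    question = ScenarioQuestion(
        scenario="A contractor arrives on site without the permit to work you expected.",
        prompt="What do you do first?",
        options=[
            Option("Let them start and chase the permit later.", False,
                   "Work must not begin until the permit is in place."),
            Option("Stop the job and confirm the permit status with the supervisor.", True,
                   "Verifying the permit before work starts is the required control."),
            Option("Ask the contractor for a written assurance instead.", False,
                   "An assurance does not replace the permit-to-work process."),
        ],
    )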
Comment by Ken Jones on June 24, 2009 at 17:36
Whilst I like the concept of separating instruction and assessment by a stipulated time period, in reality it would be difficult to manage in our industry for two reasons: the transaction cost of downtime would be difficult to quantify (that is, assuming you would do something about filling the gaps in a learner's knowledge if they fail to answer correctly); and a lot of the eLearning in our sector is based around compliance, which means that if a worker has not passed the course (taken the assessment) they cannot go offshore - I'm not sure the individuals or the unions would be too happy about that.
Potentially you could ask the learners to volunteer to retake the test at a later date without the repercussion of being bumped from the platform should they fail.

However, on your second point Stephanie, what about having tiered questions? For example: spot the hazard in the scene (picture), then choose (MC) what type of hazard it is, and then how you would remedy the situation/hazard (MC). By employing this tactic you would be investigating their understanding at a deeper level, be less prone to the "multiple guessing" paradigm, and hopefully encourage longer retention of the information, given the thought that has to go into answering each question. The downside is that it takes a lot longer to create the questions from an ID and developer perspective.
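A minimal sketch of how such a tiered question might be represented and scored. Chaining the score so that credit at a later tier depends on the earlier tiers is one possible rule for damping multiple guessing - Ken doesn't specify one - and the hazards listed are invented:

    from dataclasses import dataclass

    @dataclass
    class Tier:
        prompt: str
        options: list[str]
        answer: int  # index of the correct option

    hazard_question = [
        Tier("Spot the hazard in the scene (picture).",
             ["The ladder", "The trailing cable", "The handrail"], 1),
        Tier("What type of hazard is it?",
             ["Trip hazard", "Electrical hazard", "Fire hazard"], 0),
        Tier("How would you remedy the situation?",
             ["Re-route the cable and cover it", "Put up a warning sign only"], 0),
    ]

    def score(tiers, responses):
        """Credit one point per tier, stopping at the first wrong answer, so a
        lucky guess at a later tier cannot mask a gap at an earlier one."""
        points = 0
        for tier, choice in zip(tiers, responses):
            if choice != tier.answer:
                break
            points += 1
        return points

    print(score(hazard_question, [1, 0, 0]))  # 3: all tiers correct
    print(score(hazard_question, [1, 2, 0]))  # 1: misses the hazard type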
Comment by Stephanie Dedhar on June 24, 2009 at 9:26
I've just skimmed previous comments to get an overview of how this discussion has developed, so apologies if I repeat anything that's been said (or dismissed!) before...

This is a topic that interests me, as I'm finding that many of the people I work with are realising that perhaps an end-of-training multiple choice assessment isn't the most effective approach - but they aren't yet comfortable making the leap to anything else. Which is, I guess, where I come in: how can I work with them to ensure they get the measures they need - measures that actually mean something?

What do people think about asking users to work through the training on one occasion, perhaps setting a period in which this must be done, and then having a second assessment period at a later date? Whatever the format, this would go some way to avoiding the problem of simply testing short-term memory. In terms of format, I think multiple choice assessments can be made more valuable by creating questions which put the user in a situation similar to those they'll experience in their day-to-day work - so we're asking them to select behaviours and actions rather than recall simple facts. This perhaps moves towards testing the right thing? Or perhaps the questions could all relate to a case study - again, we're then testing them in relation to actual performance.
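As a sketch of the two-period arrangement Stephanie suggests - complete the training first, then sit the assessment only after a delay - the delay and window lengths below are placeholder values, not anything agreed in the discussion:

    from datetime import date, timedelta

    def second_assessment_window(training_completed, delay_days=30, window_days=14):
        """Open the follow-up assessment a fixed delay after training finishes,
        so it measures retention rather than short-term recall."""
        opens = training_completed + timedelta(days=delay_days)
        closes = opens + timedelta(days=window_days)
        return opens, closes

    opens, closes = second_assessment_window(date(2009, 6, 24))
    print(f"Second assessment runs from {opens} to {closes}")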
Comment by Jacob Hodges on June 8, 2009 at 8:51
I work as a contractor to the defense agencies and support the development and funding of advanced distributed learning (ADL) science and technology (S&T) projects. In this vein, as a small UK company, I have developed an ADL S&T Roadmap covering both existing and future research projects. I can be contacted at jhodges@onrglobal.navy.mil or jhodges@qapltd.com.
Comment by Phil Green on May 29, 2009 at 16:46
Personally I never (well, seldom) conduct tests of immediate recall, except as a means to build the confidence of learners. I doubt they are much use as indicators of the likelihood of applying learning to some future task. I like Bob Mager and Peter Pipe's take on this - the simple question, "How will I know it when I see it?" (Thanks for the biochemistry lesson Charles - my greatest scientific achievement to date was a grade 9 O Level in Physics!)
Comment by Ken Jones on May 29, 2009 at 16:32
Short term, dare I say it! What is the value of a check on a compliance sheet? Not ideal, but all too often the fact. Would you consider that, in the long term, a measure of competence is perhaps more relevant? That is often done by assessing a portfolio of evidence or by seeing somebody do something in real (or virtual) life; either way, the assessment has to be more cleverly thought out.
 
