Our learning objectives may be solid and our content may look beautiful, but what about our assessments? In these difficult times we must measure the effectiveness of our training and prove that all-important ROI. How can our assessment strategies help?

Comments

  • I think you've made good points regarding the difference in the assessment of 'knowing' and of 'doing', Phil.

    I'd step a little further down the line (or onto the track in front of the train, as the case may be) and suggest that most assessment of 'knowing' that I've seen doesn't actually assess real knowledge acquisition at all.

    In 99% of situations where some form of assessment of the 'learning' is applied, the assessment is carried out during or immediately following some type of learning/training event, or immediately following the opportunity for 'learners' to revise what they've been through during the training.

    This simply tests short-term memory/retention. The pre-test/post-test model is particularly susceptible to misinterpretation and, I think, is potentially damaging as it can lead people to think that real learning has occurred when it hasn't, and that the training was 'successful' when it wasn't.

    All it's really assessing is the ability of the 'learner' to retain information in short-term memory. And short-term memory ipso facto isn't a lot of use once a bit of time has passed. (where did I leave my glasses?)

    There's a swathe of research into the difference between short and long-term memory. Learning professionals should know at least a bit about that stuff, and also appreciate that our job is to help people transfer learning into long-term memory which will result in changed behaviour. After all, real learning is nothing more than changed behaviours.

    To dive a little deeper (just because I spent a few years studying biochemistry in my youth)...

    Different proteins have been identified in the two processes of short-term memory retention and long-term memory retention. In the long-term memory process it's now known that cAMP (cyclic AMP), produced in response to serotonin, when present in high concentrations in the synapses between neurones, activates a protein/enzyme called PKA (cAMP-dependent Protein Kinase), which then targets CREB (the cAMP Response Element-Binding protein). CREB is the 'key' to long-term memory, forming more, and different types of, synaptic connections that persist. Short-term memory retention is a different process and doesn't do this.

    So, when we talk about assessment, are we talking about assessing short-term memory or long-term memory? If the former, what's the value?
  • Well I am heartened to see how this discussion has moved on. Being a bear of very little brain I like to simplify, so in the matter of assessment I believe the first question is: are we trying to learn about... or are we learning how to...? In the first instance we need a measure of knowledge, and in the second we need a test of mastery.

    It is contriving things too far to separate learning outcomes from performance outcomes, even if the performance is some demonstration of having acquired some knowledge (e.g. an exam or a quiz). My question always is, "Where do you think learning objectives come from?" Now I won't complicate matters by comparing different motivations to learn, but once again I'll simplify by saying they come directly from performance objectives, and both have the same criteria for assessment. It is only the conditions and standards of the objective that differ.

    For example, "inflate a punctured tyre sufficiently to allow the completion of a journey" may suggest an obvious measure of accomplishment at first glance. However, we may then ask for clarification: a tyre on a car? On a bike? On an HGV? On an aircraft? With a full payload on board? Across what distance? Under favourable conditions? In the dark? When it's wet? Whilst under fire? Some elements of the performance may be held in common; for example, using a manual pump might require the same physical force in a warm, well-lit garage as it does by the side of the road. However, reductionism in instruction has become much maligned and may be an issue too hot for me to argue all in one posting.
  • Is that a statement about our economic times, or the benefit we get from the learning?

    Maybe just "LO" and everybody can choose what it means to them.
  • Another term...

    How about 'Payoffs'?
  • Forgive me Neil, you are right. Having a more technical development bent, my mind generally thinks of self-contained Learning Objects which have meaning, are manageable and can be managed. It appears my brain decided that a natural extension would be to add "ive" :-(. Maybe there is another term not quite as big as Learning Outcome; it does feel rather grandiose for what might require one page of content and one question. Not that I am suggesting a change, as it can be as small as you like, but it is a perception thing, maybe because it is often used in the context of a whole course. Just a thought.
  • Assessment should be mapped to "the content between the lines". I hope that is explicit. I will paraphrase that when I gather some spare time.
  • Ah! Now you hit my button Ken...

    Objectives? No, surely those are yours, the corporation's, the designer's, the trainer's... You have objectives to ensure that the learner leaves with an understanding of the subject...

    We must consider Learning Outcomes first, not Objectives. They are very different.

    Should we not have our Objectives set in stone long before we get into the design of a course or the assessment? And why should the objective have any bearing on the assessment process? If our objective is to test the user ad nauseam, as we are doing presently in eLearning, then we will eventually alienate them altogether.

    Create the Learning Outcomes, and assessment can come from those outcomes, provided they happened... Semantics, I know, but a rather hot topic...

    Venkat:

    I agree we must not get into assessment-driven learning; somebody called it 'learning by inquisition'. I rather liked that term.
  • I think Neil has a good point; however, rather than focus on assessment, surely we should create the Learning Objectives first, as they should be key to what questions you develop and, of course, what instructional content should be created so the learner will understand enough to answer the question and meet the criteria set out in the LO.
  • I completely agree with Neil Lasher, with a minor but important proviso. You cannot swing completely towards assessment-driven content; then you are in danger of "teaching to the exam". The current view is certainly that assessment is an afterthought. You may ask why; my answer is that online assessment does not count towards anything in the current curriculum, at least in HE. I would like to be proved wrong if there are any universities that count online assessment for a non-zero percentage of the student's final mark. I had better stop there...
  • I would like to get into this mix...

    Running instructional design courses and speaking to many seeking the holy grail of assessment, I find one very interesting common occurrence. Now I don't suggest for a minute that everyone falls into this, but it appears to be common.

    Especially in eLearning, courses are often built by following one's own knowledge rather than by pure design, and then, as an afterthought, the content is trawled to try to find 20 questions to put into a bank, some or all of which are asked at the end. To prove what?

    I now suggest that the whole instructional design process should start with the formation of the assessment. Then, once you know what it is you are trying to assess, write the learning to teach it...

    For some of you this is teaching your grandmother... something about eggs!

    But I am always surprised how many light-bulbs come on when I say this.

accreditation/certification framework

Has anyone worked on developing an accreditation or certification framework? We are exploring ways of setting something up and need a good framework as a structure, as the organisation hasn't done this before. We do have assessments built into our training courses but nothing formal. Any advice would be appreciated.

2 Replies

IT Screening tools

I am looking for a tool that will allow us to test the IT competence of both existing staff and those who are applying to join my organisation. It would therefore need to be available via the web. Ideally the tool would have simulations to test competence in Microsoft apps such as Word, Excel, etc., but it would also be beneficial if there was an authoring element that would allow us to create tests for our own bespoke applications. It would also be advantageous if it could link to an LMS we are…

0 Replies

e-competence assessment

We're embarking on a project to develop shared competence assessment tools across our region.   So far, we have developed a region-wide Statutory and Mandatory Training Framework which defines learning outcomes and refresher intervals in nine essential skills areas.  All our hospital Trusts in the South Central area have signed up to the Framework.  We are now progressing the next step, which is to create online assessments for each of the nine skills areas, for staff to undertake prior to…

1 Reply

e-Assessment in Practice, 10-11 November 2010

First Call for Papers: Please consider actively participating in the conference either by presenting a paper or poster or by demonstrating your recent research on e-Assessment. Please feel free to circulate it among your colleagues as you see fit. Look forward to meeting you in November. I am more than happy to contact any individuals if you want me to pursue them for a presentation. Please find further information at http://www.cranfield.ac.uk/cds/Symposia/EA10.html. Confirmed speakers include ... Dr. Kerry…

1 Reply