One of the courses I run has 156 lessons grouped into 15 modules, taken over 6 to 8 years. Formative assessments at the end of each lesson and summative assessments at the end of each module are drawn from a single bank of 2,200 questions. The summative assessments must be passed in batches at various stages of the students' (junior Army officers') careers in order for them to be promoted. I have three main issues:
- The questions are mostly multiple choice or similar and thus do not engage students; they also mainly test knowledge rather than understanding.
- Because the summative assessments are high stakes, students focus on passing (by any means) rather than on learning. This is reinforced by a pervasive culture that sees the course as an encumbrance.
- Many of the questions were produced by subject matter experts rather than educators, so the quality is patchy.
I have a few ideas on how to address these issues but would be very interested in your views, particularly on the first two...

Replies

  • Graham
    Wow, talk about picking up a cold trail! Thanks for your very informative reply and the new examples. In particular, thanks for the link to your articles; the one on repairing MCQs has prompted me to reinvigorate an offer I had from an academic to do a statistical analysis of the MCQs I'm using (a sketch of what such an analysis might involve follows this reply). If you're really into this field, Cranfield University (our academic partner at the Defence Academy) are running an e-assessment symposium on 10 and 11 Nov (see here).
    For anybody else who has read this far and is still interested, here is a link to an excellent blog post by Cathy Moore that really brings the message home.
    Cheers, Jim:)
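
    For anyone curious what such an item analysis might involve, here is a minimal sketch of a classical approach: a difficulty (facility) index and a corrected item-total discrimination per item. It assumes responses are available as a simple 0/1 matrix of candidates by items; the data below is invented purely for illustration.

    import numpy as np

    # Responses: rows = candidates, columns = items; 1 = correct, 0 = incorrect.
    # (Invented data -- in practice this would come from the assessment system's logs.)
    responses = np.array([
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 0, 0, 1],
    ])

    total_scores = responses.sum(axis=1)

    for item in range(responses.shape[1]):
        item_scores = responses[:, item]
        # Difficulty (facility): the proportion of candidates answering correctly.
        difficulty = item_scores.mean()
        # Corrected item-total correlation: the item against the rest of the test.
        # Low or negative values flag items worth reviewing.
        rest_scores = total_scores - item_scores
        discrimination = np.corrcoef(item_scores, rest_scores)[0, 1]
        print(f"Item {item + 1}: difficulty={difficulty:.2f}, discrimination={discrimination:.2f}")
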
  • Hi Jim,

    Please could you explain how the SMEs differ from the educators in this instance, so that I can put it into context?

    I agree there are some issues with MCQs, but with a bit of imagination you can test higher levels of learning (using Bloom's taxonomy) by creating assertion/reason questions; a sketch of the format follows this reply. I would be happy to give you some examples if you have never come across them. They are still MCQs, but designed in such a way as to make the student think and do more than just recall facts and figures.

    Best Regards,

    Joan
    : )
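
    For anyone who hasn't come across assertion/reason questions, here is a minimal sketch of the standard format as a simple data structure. The response key is the conventional five-option scheme; the item content is invented purely for illustration.

    # The classic assertion/reason response key: the candidate must judge both
    # statements AND the logical link between them, not just recall a fact.
    RESPONSE_KEY = {
        "A": "Both statements true; the reason correctly explains the assertion",
        "B": "Both statements true; the reason does not explain the assertion",
        "C": "Assertion true, reason false",
        "D": "Assertion false, reason true",
        "E": "Both statements false",
    }

    # Illustrative item (invented content):
    item = {
        "assertion": "Map-reading errors are more likely at night.",
        "reason": "Depth perception and colour discrimination degrade in low light.",
        "answer": "A",
    }
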
    • Thank you for your interest, Joan. In this case the SMEs are generally military practitioners who are on a posting into a staff or training job. This means they are unlikely to have had any training or education in instructional design, and thus have produced functional but unengaging questions. We've weeded out the questions that are below standard, but there are still plenty that are only average; improving those is taking some time given the numbers.

      I am investigating using Intelligent Assessment's free-text questions in concert with the existing QMP package, but that is going to take time and money. I've been through the usual superficial articles on MCQ design, but I would be very grateful for any more learned advice you could provide.

      Cheers, Jim:)
    • I'm slow coming to this conversation and it may be that everything is done and dusted, but as it is a subject dear to my heart I can't resist adding my thoughts.

      I firmly believe there is a role for MCQs in both formative and summative assessments but, as you point out, if left to untrained SMEs they will instinctively produce trivial questions based on factual information, where difficulty equates to obscurity of the point being tested.

      Sadly, the only answer I have ever found to change this approach is to train the SMEs in how to write effective questions. A couple of examples I use to bring the point home are:

      What is 6 x 3?

      A poor set of distractors would be:
      a) 17
      b) 18
      c) 19
      d) 20

      Better would be:
      a) 2
      b) 3
      c) 9
      d) 18

      as each of these answers could be arrived at through a misunderstanding of the mathematical function to be applied.

      Better, though, would be to reverse the question and ask:

      "If I started with two, different two digit numbers and ended up with the answer 18, what were the numbers and what did I do to them?" as this question requires a different order of cognitive ability to answer correctly.

      Answers now might be:
      a) 1 and 8: addition
      b) 2 and 9: division
      c) 3 and 6: multiplication
      d) 1 and 9: subtraction

      I realise this is a simplistic example but the underpinning methodology works at any level.
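
      To make the point concrete in code, here is a minimal sketch of generating distractors from misconceptions of the operation, rather than by nudging the correct answer up or down (the misconception labels are illustrative):

      # For "What is a x b?", each distractor is the result of applying the WRONG
      # operation -- a plausible misunderstanding -- rather than correct answer +/- 1.
      def misconception_distractors(a: int, b: int) -> dict[str, int]:
          return {
              "divided instead of multiplied": a // b,    # 6, 3 -> 2
              "subtracted instead of multiplied": a - b,  # 6, 3 -> 3
              "added instead of multiplied": a + b,       # 6, 3 -> 9
          }

      correct = 6 * 3  # 18
      print(correct, misconception_distractors(6, 3))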

      Another example I use in training Item Writers is this:

      Read the following passage:

      The antibacterial effect of penicillin was discovered by Alexander Fleming in 1929. He noted that a fungal colony had grown as a contaminant on an agar plate streaked with the bacterium Staphylococcus aureus, and that the bacterial colonies around the fungus were transparent, because their cells were lysing.

      If I were to ask you to write a number of questions based on the text, the vast majority of people would produce questions along the lines of (I have not bothered to include the options):

      1. Who discovered the effects of penicillin?
      2. When did he discover it?
      3. What was the bacterium streaked on the agar plate?
      etc.

      I have repeated this experiment many times and the outcome has always been the same. Notice that all these questions are simply designed to test your memory of what you have just read.

      A much better question might be (assuming this was covered in the learning):

      1. Why does penicillin cause lysis in bacterial colonies?

      because this question cannot be answered correctly if the candidate does not understand the subject matter.

      You can find further articles on the subject (if you're interested) in our free resources page at http://grbps.com/articles.htm

      Graham
  • Please see our site, which has a lot of assessment-related resources; there are also a number of LinkedIn groups. Some resources are available to non-members, and we analyse the different types of assessment throughout the site.

    Rich.
  • Hi Jim.

    I have seen this problem myself, having been involved with training for the MOD. Personally, I was not a fan of multiple choice, partly for the reasons you mention.

    I attended the talk by Lt Cdr Paul Pine RN at the Conference last week (I am not sure if you did too), and he demonstrated the new assessment method they are using for the Type 45 Destroyer. It seemed very engaging, and the students' reactions to it were very positive. This was quite revolutionary to me, and I am hoping to use the ideas on my current project. It may be worth contacting him if you would like to know more about his methods.

    Paul isn't a member here yet, but I have his email. If you would like it then let me know and I will send it on to you.
    • Richard
      Thanks for the comment. Paul has picked up my thread in the MOD group so I will engage through there.
      Cheers, Jim:)
  • Here are a few observations (you might have thought of these already).

    Multiple choice questions that just test factual recall are usually boring and won't engage students. But m-c questions can be made more engaging by increasing their relevance and practicality: for example, embed each question in a scenario that mirrors the situation in which the knowledge would actually be needed. (You're working on a xxx engine and ask your mate for a tool to let you do yyy. He gives you a zzz. Do you:
    a) use it
    b) throw it back at him
    c) adjust it and use it
    d) ask him for an aaa to go with it)
    One way to structure such an item is sketched after this reply.
    If you haven't already done so, read up on Ruth Clark and 'use' vs 'remember' objectives for different types of information. That can also help.
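
    One way to operationalise this is to make the scenario a first-class part of the item, so question writers are prompted for context and a decision rather than a bare fact. A minimal sketch, with illustrative field names (not any particular tool's schema):

    from dataclasses import dataclass

    @dataclass
    class ScenarioItem:
        scenario: str       # the situation the knowledge would actually be used in
        decision: str       # the decision the candidate must make in that situation
        options: list[str]
        answer_index: int   # index into options

    item = ScenarioItem(
        scenario=("You're working on a xxx engine and ask your mate for a tool "
                  "to let you do yyy. He gives you a zzz."),
        decision="Do you:",
        options=[
            "use it",
            "throw it back at him",
            "adjust it and use it",
            "ask him for an aaa to go with it",
        ],
        answer_index=2,  # illustrative; the right call depends on the real content
    )
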
    • Thanks for the reply, Norman; it was just me and the tumbleweeds in here for a while. I'll dig out your reference, and I've noted Phil Green's excellent articles in the Learning Technologies magazine. I'm also talking with Intelligent Assessment, who seem to have a sound tool for free-text answers, which should be better than m-c at assessing understanding. What I'd really like to do is use serious games for summative assessment, but the industry doesn't seem to be there yet. Your profile has you as interested in games and simulation; have you come across anything in the assessment area other than Caspian Learning, who I'm already dealing with? See the attached Caspian link if you're interested...
      http://www.caspianlearning.co.uk/MoD_Defence_Academy_Serious_games_...