Assessment

Our learning objectives may be solid and our content may look beautiful, but what about our assessments? In these difficult times we must measure the effectiveness of our training and prove that all-important ROI. How can our assessment strategies help?

Members: 106
Latest Activity: Feb 13

Discussion Forum

accreditation/certification framework

Started by Hootan Zahraei. Last reply by Brian Fox May 4, 2011. 2 Replies

IT Screening tools

Started by Karen Smout Oct 15, 2010. 0 Replies

e-competence assessment

Started by Alison Wright. Last reply by Barry Sampson Sep 24, 2010. 1 Reply

Comment Wall


Comment by David Perring on April 22, 2013 at 11:20

I thought it would be useful to share our latest independent research into e-assessments with the forum.

The whitepaper can be found using the following link:

http://delivr.com/2q8k2

The report covers:

  • What does best practice look like in e-assessments?
  • How can you make the most of the available tools?
  • How do you effectively manage their deployment and sustain the value they contribute?

Let me know what you think.

David

Comment by Venkat on May 2, 2012 at 15:44

I am pleased to bring to your attention the two-day conference on e-Assessment in Practice, 13-14 June 2012. Visit http://www.cranfield.ac.uk/cds/symposia/ea.html for further details.

e-Assessment in Practice Brochure and Programme_2May2012.pdf

Comment by Venkat on October 8, 2010 at 9:27
e-ASSESSMENT IN PRACTICE SYMPOSIUM
PROGRAMME
Wednesday 10 November 2010
0900 - 1000 Registration and Coffee
1000 - 1005 Conference Administration
Michael Hewetson, Director Symposia at Shrivenham, Cranfield University, DA-CMT
1005 - 1015 Welcome and Introduction to the Defence Academy
Brigadier Mark Lacey, Head of Technology Division, DA-CMT
1015 - 1045 Beyond Technology: Developing Innovative Items
Dr Kirk Becker, Pearson VUE
1045 - 1115 Developments in Open Source e-Assessment to Support the Independent Learner
Dr Phil Butcher, Open University
1115 - 1145 Coffee
1145 - 1215 e-Assessment and Functional Skills
John Winkley, AlphaPlus Consultancy
1215 - 1245 e-CAF: A System for Efficient Quality Electronic Coursework Assessment and Feedback
Dr Shun Ha Sylvia Wong and Dr Tony Beaumont, Aston University
1245 - 1315 Mobile Phones – Delivering Authentic Assessment
Gavin Cooney, Learnosity and Patrick Craven, Cambridge Assessment
1315 - 1415 Buffet Lunch
1415 - 1445 Ensuring the Psychometric Quality of Questions and Assessments: Theory and Applications
Greg Pope, QuestionMark
1445 - 1515 Marking Complex Assignments Using Electronic Peer Assessment and an Automated Feedback System
Dr Trevor Barker, University of Hertfordshire
1515 - 1545 What can e-Assessment do for Education?
Prof Cliff Beevers (with John Anderson and Bill Foster), Heriot-Watt University
1545 - 1615 Afternoon Tea
1615 - 1730 Panel Discussion: e-Assessment is Doomed to Failure Unless……
1730 - 1930 Delegates’ Reception and Supper
1930 Delegates transported to Blunsdon House Hotel, Madison Hotel and Campanile Swindon
Thursday 11 November 2010
0830 - 0900 Morning Coffee
0900 - 0930 You May Start Typing Now – e-Assessment of Essay Style Questions
Dr Tom Mitchell, Intelligent Assessment
0930 - 1000 eTests in a Programming Course – Student’s View Versus Teacher’s View
Gerd Holweg, University of Applied Sciences Vienna
1000 - 1030 Raising Standards: Mobile e-Assessment Delivery for the Financial Services Industry
Denis Saunders, Calibrand and Joe Wilson, SQA
1030 - 1100 Coffee
1100 - 1130 Secure Assessments – How to Protect the Integrity of Assessment Programmes
Michael Kayser, Pearson VUE
1130 - 1230 Piloting Video Podcasting Feedback with Overseas MBA students
Nigel Jones, Panicos Georghiades and John Gunson, UWIC Cardiff School of Management
1230 - 1300 Panel Discussion
1300 - 1400 Buffet Lunch
1400 - 1430 The Assessment/Testing Dialogue
Piet Kommers, University of Twente
1430 - 1500 e-Assessment Policy and Practice: Making the Connection
Dr David Walker, University of Dundee
1500 - 1530 E-assessment using Simulation and Synthetic Virtual Reality Environments
Dr Majid Al-Kader, Skills2Learn Ltd and Nathan Baker, Holdfast Training
1530 - 1600 Afternoon Tea
1600 - 1630 Using Turnitin2 for e-Assessment
Anne Flood, nLearning Ltd
1630 - 1700 Proximal Solutions to Providing Student Feedback
Dr John Seeley, London South Bank University
Comment by Venkat on April 11, 2010 at 12:43
e-Assessment in Practice, 10-11 November 2010.
http://www.cranfield.ac.uk/cds/symposia/ea10.jsp

Members of the group might be interested in participating in the conference, either by presenting a paper or poster, or by demonstrating a product.
Comment by Venkat on June 25, 2009 at 16:22
Ken, we won't ever be able to match or replicate a human assessor's performance in its entirety. Talking to some of the SMEs, they are happy to come up with nuggets of questions or comments such as, "If someone has answered this question, or can differentiate this, I will take him with my eyes closed." There always appear to be some key questions or pieces of information a human assessor is looking for. All we have to do is find these extremes; then we can interpolate items that support those key questions. I agree with Stephanie - it does take time to develop; but once developed, the items are there forever. Another point worth bearing in mind is how much item analysis is performed subsequently, both to eliminate weak items and to inform upgrades. I am talking in the dark here, but item analysis for formative and summative assessments serves different purposes.
By the way, does anyone have any experience with short answer free text items?
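To make the item-analysis point concrete, here is a minimal sketch of the two classical statistics used to weed out weak items: a difficulty index (proportion correct) and a point-biserial discrimination (the correlation between an item score and the candidate's total score). It assumes a simple 0/1-scored response matrix; the function name and data layout are illustrative, not taken from any tool named in this thread.

```python
def item_analysis(responses):
    """Return (difficulty, discrimination) for each item in a
    0/1-scored response matrix (rows = candidates, columns = items)."""
    n = len(responses)
    totals = [sum(row) for row in responses]      # total score per candidate
    mean_t = sum(totals) / n
    sd_t = (sum((t - mean_t) ** 2 for t in totals) / n) ** 0.5

    stats = []
    for j in range(len(responses[0])):
        right = [totals[i] for i in range(n) if responses[i][j] == 1]
        wrong = [totals[i] for i in range(n) if responses[i][j] == 0]
        p = len(right) / n                        # difficulty: proportion correct
        if right and wrong and sd_t > 0:
            m1 = sum(right) / len(right)          # mean total of those correct
            m0 = sum(wrong) / len(wrong)          # mean total of those incorrect
            r_pb = (m1 - m0) / sd_t * (p * (1 - p)) ** 0.5
        else:
            r_pb = 0.0                            # everyone right or wrong: no signal
        stats.append((p, r_pb))
    return stats

# Hypothetical example: 4 candidates, 3 items.
matrix = [[1, 1, 0],
          [1, 0, 1],
          [1, 1, 1],
          [0, 0, 0]]
for i, (p, r) in enumerate(item_analysis(matrix), 1):
    print(f"item {i}: difficulty={p:.2f}, discrimination={r:.2f}")
```

In this framing, items with low or negative discrimination are the candidates for elimination or rework in subsequent upgrades, whatever the formative or summative purpose.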
Comment by Casson McRae on June 25, 2009 at 16:10
Great conversation! I wanted to pick up on a couple of points that resonate. We are currently having an issue with setting timelines to complete training: if more people are being pushed through it, we have fewer people on the shop floor keeping things going. Our learners feel they have to respond, especially around compliance training, so Stephanie's approach would be problematic for us here. Although I do concur that a delay before assessment would help both the learner and the validity of the assessment.

I would also agree about the importance of SME/designer involvement in creating the assessments: creating these things should be treated in the same way, and with the same sound design principles, as creating the rest of the training itself. And I am finding that doing it well is not easy and is time consuming.
Comment by Ken Jones on June 25, 2009 at 14:32
Venkat, I believe your statement "bottom-line knowledge they will accept from anyone claiming a certain level of expertise" hits the nail on the head; this should most definitely determine the desired learning outcome. The question is how, when assessing competence, we can emulate some of the complicated processes a human assessor would go through when physically assessing the competence of a learner. Maybe that is a bridge too far for safety-critical situations or tasks?
Comment by Stephanie Dedhar on June 25, 2009 at 14:31
I completely agree, Venkat! It's crucial that the questions in any assessment are developed with the input of the SME. I also think that it's often useful to create the assessment, or at least a draft of it, before beginning to design the actual training (this seems to match your advice to identify the basic knowledge required early on). This ensures that what you are testing aligns with the agreed learning outcomes and objectives, and gives you a solid basis on which to design the training units.
Comment by Venkat on June 25, 2009 at 14:10
I have been following the conversation initiated by Stephanie and the comments from Ken Jones. From your comments I can see words of wisdom, moderated by scarred and healed wounds!

I face the dilemma of who is best placed to develop the e-assessment question items. I see this situation as very similar to the days of developing expert systems in the late seventies and early eighties: the knowledge engineer (the ID) sits with the domain expert (the SME) and, through several iterations, develops the system. This means much stronger engagement with the SMEs, identifying early on the bottom-line knowledge they will accept from anyone claiming a certain level of expertise in the chosen field, and then mapping items that guarantee that level of expertise while minimising the risk of 'lucky guesses'. I am not too hung up on MCQs per se; it is the quality of the items and the nature of the feedback that define success or failure.
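To put the 'lucky guesses' risk in concrete terms, here is a small illustrative calculation (not from the thread): the probability of reaching a pass mark purely by random guessing on four-option MCQs follows a binomial distribution, and it collapses as the number of items grows, which is one argument for mapping enough quality items to the target level of expertise. The item counts and pass marks below are hypothetical.

```python
from math import comb

def p_pass_by_guessing(n_items, pass_mark, n_options=4):
    """P(score >= pass_mark) when every answer is a uniform random
    guess among n_options choices."""
    p = 1 / n_options
    return sum(comb(n_items, k) * p ** k * (1 - p) ** (n_items - k)
               for k in range(pass_mark, n_items + 1))

print(p_pass_by_guessing(5, 3))    # 5 items, 60% pass mark: ~0.10
print(p_pass_by_guessing(20, 14))  # 20 items, 70% pass mark: ~3e-5
```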
Comment by Ken Jones on June 25, 2009 at 13:20
I agree, Stephanie; in fact I would say that the assessment should be considered an integral part of the learning process, as opposed to the ubiquitous test/quiz/exam style currently employed.
 
