
Creating Effective Computer-Based Assessment

January 14, 2015

Computer-based quizzes are fairly inflexible. If you want to administer and grade an assessment on a computer, you are essentially limited to the multiple-choice concept. Your eLearning authoring software may dress it up in many different guises, but all the question formats are essentially the same (even pictorial ones): choose the correct response(s) from among one or more incorrect ones.

This seems pretty straightforward, but how can you be sure you are evaluating the performance metric you really want to test?

It's very easy to get stuck when writing a multiple-choice test question. First, you come up with the question, which is easy because you know what's important about the subject matter. Logically, the correct response comes next, although you probably had that in mind before writing the question. Job done? Now you just need to add some wrong answers. Depending on the subject matter this may be easy, particularly if you want to test the learner's grasp of technical data where all the answers can be very similar, for example:

Q: How many seconds does it take a 2003 Lamborghini Gallardo to reach 60 mph?
a) 5.4 seconds
b) 4.8 seconds
c) 4.0 seconds
d) 6.2 seconds

This is a good test of the learner's retention of numerical data with regard to a specific vehicle. However, we often want to assess a learner's behavioral understanding, and it becomes much more difficult to offer up wrong answers that are credible. In fact, it can be downright frustrating. Consider this example:

Q: Why should you not open the oven while baking a cake?
a) it will ruin the cake
b) you might get burned
c) it could fall out
d) a bird might fly in

You want answer a) to be the correct response. The problem is that answer b) is also possible, and while your other answers are clearly more ridiculous, even they aren't impossible. What should be a question designed to assess our would-be baker's understanding of a process has become bogged down with ambiguity.

Use Short Story Questions

One alternative to multiple choice is to write narrative questions followed by a simple Yes/No or True/False decision. Presenting the question as a story places it in a real-world situation and provides valuable context. For example:

Q: You pour the cake batter into the pan and place it in the oven. After 10 minutes, you wonder how it is doing but cannot see properly through the oven door. Would you open the oven to see how your cake is doing?
a) Yes
b) No

By changing the question, you now don't need to invent a list of incorrect answers and you're actually assessing the right metric: the learner’s understanding of the proper behavior.

Provide Feedback

In our short story question, we've stripped down our answers so much that they no longer supply any context. You could make them more elaborate, but beware of falling back into the trap of having to invent credible wrong responses. What we can, and should, do is provide feedback that reinforces the point. Instead of responding with a simple “correct” or “incorrect,” tell the learner why the answer is right or wrong. In our cake-baking example, effective feedback could be:

That’s right (or wrong)! Opening the oven door alters the baking temperature and affects the cake’s ability to rise.
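If your authoring tool exposes a scripting layer, the question-plus-feedback pattern can be modeled as a simple data structure that pairs the correct answer with an explanation. Here is a minimal Python sketch; all names are hypothetical and not tied to any particular eLearning platform:

```python
# A yes/no story question with explanatory feedback, rather than a bare
# "correct"/"incorrect" verdict. All names here are illustrative only.

QUESTION = {
    "prompt": (
        "You pour the cake batter into the pan and place it in the oven. "
        "After 10 minutes, you wonder how it is doing but cannot see "
        "properly through the oven door. Would you open the oven to see "
        "how your cake is doing?"
    ),
    "choices": ["Yes", "No"],
    "correct": "No",
    "feedback": (
        "Opening the oven door alters the baking temperature and affects "
        "the cake's ability to rise."
    ),
}


def grade(answer: str) -> str:
    """Return reinforcing feedback that explains *why*, not just a verdict."""
    verdict = "That's right!" if answer == QUESTION["correct"] else "That's wrong!"
    return f"{verdict} {QUESTION['feedback']}"
```

Whichever answer the learner picks, the same explanatory sentence is delivered, so the teaching point is reinforced even when the response is wrong.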

Bring the Message Home

Most assessments relate directly to training content that the learner has been through prior to taking the test, and while the feedback can be a reminder of that content, you can take it further. For example, in a recent question I wrote about sterile medicine manufacturing, I included in the feedback a tragic news story about premature babies who had died due to non-sterile drips. My intention was to remind the learner that the training wasn't just a box-ticking exercise, but that there are real-world implications to working in sterile manufacturing.

Contact information:
Leigh Heath
Learning Developer
MVI/MicronTraining
leighheath@mvitraining.com
www.mvitraining.com
