Anecdotal Evidence and Cheating: Insights from the Field

Introduction: The Power and Pitfalls of Anecdotal Evidence in Online Exam Integrity

When you hear the term “anecdotal evidence,” you might instinctively question its reliability, assuming it’s an attempt to generalize from a single data point. This skepticism is healthy, especially in the context of academic integrity. However, when numerous educators across different institutions report similar observations, these anecdotes may point to broader trends worth exploring.

In this article, we’ll delve into the recurring reports we receive from educators using online exam monitoring tools like LockDown Browser and Respondus Monitor. These tools are designed to uphold the integrity of online assessments by deterring cheating. What educators consistently tell us is this: when students take exams in non-proctored environments (such as at home) using a standard browser, their average scores are often inflated—by as much as 10 percentage points—compared to when these monitoring tools are employed.

The 10-Point Drop: A Consistent Trend

One of the most striking patterns we’ve observed is the consistent report of a 10 percentage point drop in average exam scores when LockDown Browser and Respondus Monitor are used. For example, if the class average for a non-proctored online exam is 85%, it typically drops to around 75% when these tools are enforced. This drop isn’t uniform across all classes or exams; some instructors report differences as small as 8 points or as large as 15, but 10 points remains the most frequently cited figure.

A Closer Look: Case Study from an Anonymous University

Consider the case of an instructor from a university that prefers to remain unnamed. She conducted a more in-depth analysis of her students’ performance across several exams. The first exam of the term was non-proctored, and students used a standard browser. The average scores were consistent with those from previous terms. However, for the next three exams, she required students to use Respondus Monitor. Across these exams, she noticed a sharp decline—about 10 percentage points on average—compared to previous terms.

What’s particularly intriguing is that the score decline wasn’t uniform across the entire class. Students who performed exceptionally well on the first exam continued to do so in subsequent exams, even with the monitoring tools in place. However, those who scored in the middle to lower range on the first exam experienced significant drops in their scores—sometimes by as much as 17 percentage points. For instance, a student who scored 82% on the first exam might plummet to 65% or lower on the second exam.
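An instructor comfortable with a little scripting could reproduce this kind of analysis directly from a gradebook export. The sketch below computes each student's change from a non-proctored exam to a proctored one and lists the largest drops first; the student names and scores are invented purely for illustration, not data from the case study above.

```python
# Hypothetical per-student scores (percent) for two exams in one course.
# Exam 1 was non-proctored; Exam 2 used a proctoring tool.
# All values are invented for illustration only.
scores = {
    "Student A": {"exam1": 95, "exam2": 94},  # top performer: little change
    "Student B": {"exam1": 82, "exam2": 65},  # mid-range: sharp drop
    "Student C": {"exam1": 74, "exam2": 60},  # lower-range: sharp drop
    "Student D": {"exam1": 91, "exam2": 90},  # top performer: little change
}

def score_changes(scores):
    """Return (student, change) pairs from exam 1 to exam 2, largest drops first."""
    deltas = {name: s["exam2"] - s["exam1"] for name, s in scores.items()}
    return sorted(deltas.items(), key=lambda item: item[1])

for name, delta in score_changes(scores):
    print(f"{name}: {delta:+d} points")
```

Sorting by the signed change surfaces exactly the pattern the instructor described: strong students cluster near zero, while a subset shows double-digit declines once monitoring is in place.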

Interpreting the Data: What Does It Mean?

This instructor’s findings suggest that while top-performing students maintain their performance regardless of monitoring, those who are more likely to cheat see their scores drop dramatically when deterrents are in place. Moreover, some students seemed to adjust their behavior after being caught off guard by the introduction of Respondus Monitor. A pattern emerged where a few students performed moderately well on the first non-proctored exam, poorly on the first proctored exam, and then bounced back on the remaining exams. This could indicate that they were unprepared for the stricter conditions of the second exam and subsequently adapted by studying more diligently.

Call to Action: Share Your Experiences

If you’ve observed similar trends in your own courses, we’d love to hear about your experiences. While formal studies on this topic are still in their infancy, the collective insights of educators like you are invaluable. Whether you’ve noticed a significant score drop or have conducted a more formal analysis, your contributions can help shape a more comprehensive understanding of the impact of online exam monitoring tools on academic integrity.

Conclusion: The Importance of Data in Upholding Academic Integrity

Anecdotal evidence, while not definitive, provides a crucial starting point for understanding the complex dynamics of online exams. As more educators share their experiences, we can move towards more formalized studies that offer concrete data on the effectiveness of tools like LockDown Browser and Respondus Monitor in deterring cheating. Ultimately, the goal is to ensure a level playing field for all students, maintaining the credibility and fairness of online assessments.
