The biggest challenge facing academic integrity today is compromised or leaked content. Richard, a professor of Strategic Management, recently found out that all of his exam material was available on websites such as CourseHero, Chegg, and Quizlet. The problem is that current market solutions (e.g., proctoring and lockdown browsers) don't adequately address leaked content, and there is evidence to suggest that such solutions lead to negative student outcomes. Richard knew that rewriting all of his exams would be too time-consuming, and that his new questions would also be vulnerable to being posted online. That is when Richard partnered with EXAMIND to investigate the impact of leaked content and to begin creating dynamic questions so he wouldn't have to rewrite his exam questions.
In this case study, we investigate how student exam performance varies with dynamic questions and real-time, non-invasive tactical deterrents compared to exams with compromised content and webcam-based proctoring. Dynamic questions are questions with randomized parameters that are unique to each student. Tactical deterrents are real-time, targeted warnings shown to students who take inappropriate actions during an exam (e.g., attempting to take screenshots or copying questions).
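To make the idea of dynamic questions concrete, here is a minimal sketch of per-student parameter randomization. This is an illustration only, not EXAMIND's actual implementation: the break-even question, the parameter ranges, and the function name `make_dynamic_question` are all hypothetical. Seeding the random generator with the student ID gives every student a unique but reproducible variant, so the question can be regenerated for grading.

```python
import random

def make_dynamic_question(student_id: str):
    """Generate a per-student variant of a simple break-even question.

    Seeding the RNG with the student ID makes each student's parameters
    unique to them yet reproducible when the exam is graded.
    """
    rng = random.Random(student_id)
    fixed_costs = rng.randrange(50_000, 100_001, 5_000)   # e.g., $50,000-$100,000
    price = rng.randrange(20, 51)                          # unit price $20-$50
    variable_cost = rng.randrange(5, price - 4)            # keeps the margin positive
    question = (f"Fixed costs are ${fixed_costs:,}, unit price is ${price}, "
                f"and unit variable cost is ${variable_cost}. "
                f"What is the break-even volume in units (rounded up)?")
    answer = -(-fixed_costs // (price - variable_cost))    # ceiling division
    return question, answer
```

Because each student's numbers differ, a posted answer from one student's exam is useless to another, which is the property that makes such questions resistant to content leakage.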
#1: Does using uncompromised dynamic questions vs. compromised questions result in better academic integrity and outcomes in open book exams? (TESTS 1 & 2)
#2: Do non-invasive tactical deterrents coupled with uncompromised dynamic questions vs. webcam-based proctoring with compromised questions result in better academic integrity and outcomes in closed book exams? (TESTS 3 & 4)
To evaluate our research questions, we selected four sections of the same capstone class in Strategic Management within a College of Business at a major university in the United States. All sections were taught by the same professor, Richard. The professor's exam questions were all compromised and widely available on the internet (e.g., CourseHero, Chegg, and Quizlet). These existing questions were converted into dynamic questions. The professor, together with another PhD in Strategic Management, developed and reviewed the new questions to ensure that there was no substantive difference in difficulty or fairness between the two sets and that they tested the same concepts. They also ensured that the newly created dynamic questions and their solutions could not be easily found online.
According to the results of Tests 1 & 2, we find that using uncompromised dynamic questions resulted in average grades that were 20% lower. These results suggest that many students did not know the course material but instead relied on their ability to look up the answers online. As such, it is evident that leaked content has a serious negative effect on learning and on the accurate assessment of a student's knowledge.
According to the results of Tests 3 & 4, we find that both webcam-based proctoring and tactical deterrents reduce average student performance. Noteworthy is that webcam-based proctoring did not reduce average scores to a level near the effect of using dynamic questions. This result suggests that webcam-based proctoring does not solve the problem of compromised or leaked content.
Student performance dropped by 20% when using dynamic questions compared to compromised questions with webcam-based proctoring.
With webcam-based proctoring and leaked content, student performance was not reduced to a level near the effect of using dynamic questions.
Students focus on finding answers online rather than learning the content when answers are available online.
Non-invasive tactical deterrents produced a slightly greater reduction in student scores (~8%) than webcam-based proctoring (~7%).