Posted by David Foster, Ph.D.
Record and review proctoring is a form of online proctoring. However, in record and review proctoring, there are no proctors watching the test taker while the exam is in progress. Instead, record and review proctoring uses recording equipment to capture the video and audio of the exam administration. The recordings are then reviewed (at a later time) to determine whether cheating occurred, whether exam questions were stolen, and whether the exam was in fact taken securely. The review could occur a day, a week, or even a year later. Unfortunately, many programs with limited resources never review the recordings at all.
Online proctoring (also referred to as remote proctoring) is defined as monitoring a test administration over the internet using a webcam and other security technologies. Most versions of online proctoring allow the proctor to watch the examinee take a test in real time. The proctor watches for infractions of the security rules, and when a violation is detected, the proctor is able to pause the test and initiate an immediate conversation with the examinee to determine what has happened. The proctor can then decide to either continue or suspend the exam. The ability of the proctor to immediately respond to a security incident is the main difference between online proctoring and record and review proctoring.
If a security infraction is detected during record and review proctoring, the testing program can respond by conducting an investigation, canceling the test score, initiating other sanctions, or doing nothing at all. Because of the delayed nature of the review, these decisions occur long after the test was taken.
Record and review proctoring has become increasingly popular as a result of the global pandemic of 2020. This is the result of both limited proctoring resources (proctoring companies were increasingly at capacity) and financial constraints (live proctoring is cost-prohibitive for many programs, and record and review proctoring is a more budget-friendly option).
But is record and review proctoring actually a good security solution? What are the test security benefits? What are the security concerns or limitations? What else can be done to improve overall test security when using record and review proctoring? Let’s deal with each of these questions individually.
There are security advantages to using record and review proctoring, particularly when compared to using no form of proctoring at all. With that said, there is an important caveat to keep in mind: many of the most critical types of cheating cannot be detected by record and review proctoring—or even by live proctoring. Read this white paper and jump to this section to learn more.
The best security benefit of record and review proctoring is that it deters cheating. With record and review proctoring, a test taker knows that their actions are being monitored and recorded. This knowledge influences test taker behavior during the exam. For some examinees (perhaps most), simply knowing they are being recorded is enough to convince them not to cheat. Those test takers who are determined to cheat will have to be more covert and creative to avoid detection.
To increase the deterrent effect of record and review proctoring, testing programs often publicize the record and review security measure. Additionally, programs often publicize incidents of cheaters being caught, along with the consequences the cheaters faced. Doing this lets examinees know that the security measures are in place and are effective, that test security is taken seriously, and that cheating simply isn’t worth the risk.
A second security benefit of record and review proctoring is that, when sessions are examined by well-trained and thorough reviewers, it can catch instances in which test takers violated the security rules. Examples of the types of test fraud that reviewers can catch include (but are not limited to) the use of a prohibited device to look up answers, the use of a camera to take pictures of test content, and receiving help from someone entering the room.
With record and review proctoring, it is best practice to inform examinees that their scores are provisional until the sessions have been reviewed, and that releasing official scores might take several weeks. Communicating this allows you additional time to review sessions, and even conduct further investigations when necessary. Always plan to review recorded test sessions quickly; if too much time has passed, it will be harder to cancel a test score or administer other sanctions when cheating is detected.
One major limitation of record and review proctoring is the delay between when the test is taken, when the review is conducted, and when a decision is made to confirm or invalidate the test score. These delays mean that security infractions occurring in real time will continue until the review has been completed and the problem confronted. A ripple effect from this could mean that test content has been unnecessarily exposed, which could facilitate further cheating by other examinees—all without the testing program even knowing it is happening.
For example, a student taking a video of each item on the test will be able to capture the entire test, share it online, and enable more students to cheat, all before the testing program has caught anything with their record and review proctoring measures. Furthermore, a person blatantly cheating by using their phone to look up answers will be able to continue to do so on other tests, and they will likely even receive a high score at the end of those exams as well. That examinee will have already realized the benefits of a higher score (such as grade promotions, increased GPA, etc.), all before the cheating incident is discovered and confronted.
Often, the test sessions recorded as part of record and review proctoring are never actually reviewed. This happens because reviewing sessions is a time-consuming and difficult process, so programs find it easier to simply not conduct any reviews. But this allows cheating and test theft to continue, completely unchecked and with no consequences, thus undermining the validity of test scores.
As mentioned above, any form of proctoring is an ineffective method of test security when conducted by itself. There are many security threats that simply cannot be detected by proctors, no matter how diligent and well trained they are (e.g., the use of hidden cameras, the use of pre-knowledge of test content, memorizing test content, and others). Record and review proctoring shares these limitations, and it is often even less effective because the review process is either delayed or never occurs. Additionally, record and review proctoring has even greater security weaknesses when the reviewers aren't trained to reliably detect security infractions. You can learn more about the security limitations of any type of proctoring in this white paper and this infographic.
Because of the limitations of record and review proctoring, a program that uses it must bolster its test security in other ways:
One option is to outsource the review process. By outsourcing this step, you can ensure your test sessions are always reviewed (and in a timely manner). Additionally, this step allows trained test security professionals to review your exams for you (check out this all-in-one service). These security folks are specifically trained to detect cheating and theft, and they know what key behaviors to look out for in test session reviews.
There are types of item and test designs that actually prevent cheating and theft by themselves. These designs bolster test security and make reviewing test sessions more efficient by significantly reducing the amount of cheating that could occur. Examples of such test designs include tests that use SmartItem™ technology and items that use the Discrete Option Multiple Choice™ (DOMC) format.
Utilize other test security detection systems to detect cheating. Two of the most effective solutions are monitoring the web for stolen and disclosed test content, and routinely running data forensics analyses that detect cheating by looking for unusual patterns in the testing data.
Think of deterrence as a communication plan meant to discourage and inhibit test takers from cheating. Deterrence can be as simple as letting examinees know that the security measures are in place, that test security is taken seriously, and that there are consequences to cheating and test theft.
For example, students may be required to sign an "oath" not to cheat, or a list of automatic sanctions for cheating can be given to examinees before the test starts. Get creative!
Make sure test takers are aware of, and agree to, the fact that the reviews of their test sessions will take time to complete (sometimes several weeks), and that their test scores are "provisional" until those checks have been completed.
Record and review proctoring has a few important security advantages, such as detecting and deterring some forms of cheating and test theft. With that said, record and review proctoring should only be used when other types of proctoring are not possible (e.g., due to budget or time constraints). Even then, record and review proctoring is not an effective stand-alone security solution, and it has many security flaws. Therefore, the overall security of your exam needs to be bolstered by the adoption of other test security measures, such as those described above: outsourced session reviews, secure item and test designs, web monitoring and data forensics, and a strong deterrence plan.
A psychologist and psychometrician, David has spent 37 years in the measurement industry. During the past decade, amid rising concerns about fairness in testing, David has focused on changing the design of items and tests to eliminate the debilitating consequences of cheating and testwiseness. He graduated from Brigham Young University in 1977 with a Ph.D. in Experimental Psychology, and completed a Biopsychology post-doctoral fellowship at Florida State University. In 2003, David co-founded the industry's first test security company, Caveon. Under David's guidance, Caveon has created new security tools, analyses, and services to protect its clients' exams. He has served on numerous boards and committees, including ATP, ANSI, and ITC. David also founded the Performance Testing Council in order to raise awareness of the principles required for quality skill measurement. He has authored numerous articles for industry publications and journals, and has presented extensively at industry conferences.
For more than 18 years, Caveon Test Security has driven the discussion and practice of exam security in the testing industry. Today, as the recognized leader in the field, we have expanded our offerings to encompass innovative solutions and technologies that provide comprehensive protection: Solutions designed to detect, deter, and even prevent test fraud.