Implementing a high-stakes test administration procedure for state tests can be daunting, even for the most experienced test coordinators and administrators. Unfortunately, testing irregularities and incidents happen, and when they do, they must be well documented and followed up on to ensure valid and reliable test scores and to meet federal Peer Review requirements. There are many ways to collect this information from the field, and some methods are easier than others.
A testing irregularity is any incident during a test’s administration or testing window that calls into question the security of the test or the accuracy and validity of the test scores. An incident can meet one or more of the following criteria:
Testing irregularities come in many forms, shapes, and sizes. They can include, but are not limited to:
One of the most common methods (that I’m sure you are all aware of) is creating a paper form for test administrators to complete when an incident occurs. The form exists to capture all relevant information from the incident—such as the name of each person involved, the name of the individual completing the form, their contact information to be used for follow-up (if needed), the room number or location where the incident occurred, the name of the test being administered, the time and date of the incident, and a thorough description of everything that happened.
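Whether you stay on paper or move to something electronic, it helps to think of each report as one structured record. Below is a minimal sketch of those same fields as a Python data class; the field names are my own shorthand for illustration, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class IncidentReport:
    """One testing-irregularity report, mirroring the fields a paper form collects.
    Field names here are illustrative, not a required standard."""
    reported_by: str            # name of the individual completing the form
    contact_info: str           # phone or email to be used for follow-up, if needed
    people_involved: list[str]  # name of each person involved in the incident
    location: str               # room number or location where the incident occurred
    test_name: str              # name of the test being administered
    occurred_at: datetime       # time and date of the incident
    description: str            # thorough description of everything that happened
```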
You can quickly see that collecting these reporting forms through email, a shared document site, or even fax can get overwhelming. This is where things get dangerous: when the collector gets overwhelmed, reported incidents go unnoticed, defeating the purpose of collecting them in the first place. I know that many of you (as I did during my tenure as Executive Director of Assessment and Accountability in Mississippi) have had more than enough of these paper forms.
A simpler approach to reporting and managing incidents is to collect them in an electronic database. A database can capture all of the same information as the paper forms, and can capture additional information more easily. It also lets you filter and sort the reports, alerts you to high-priority incidents, and can generate a report that gives busy executives a high-level overview to share with superiors whenever needed.
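As a rough sketch of how little is needed to move from paper to a database, the snippet below logs incidents in a SQLite table and pulls the open, high-priority items to the top for follow-up. The table layout, column names, and sample record are assumptions made for illustration, not a reference design.

```python
import sqlite3

# A minimal incident log in SQLite; column names are illustrative only.
conn = sqlite3.connect("irregularities.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS incidents (
        id          INTEGER PRIMARY KEY,
        reported_by TEXT,
        location    TEXT,
        test_name   TEXT,
        occurred_at TEXT,       -- ISO 8601 date/time of the incident
        priority    TEXT,       -- e.g., 'high', 'medium', 'low'
        description TEXT,
        status      TEXT DEFAULT 'open'
    )
""")

# Sample record; the values are made up purely for illustration.
conn.execute(
    "INSERT INTO incidents (reported_by, location, test_name, occurred_at, priority, description) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ("J. Smith", "Room 214", "Grade 8 Math", "2024-04-15T09:40", "high",
     "Student observed with a phone during testing."),
)
conn.commit()

# Surface open, high-priority incidents first for follow-up.
for row in conn.execute(
    "SELECT occurred_at, location, test_name, description "
    "FROM incidents WHERE priority = 'high' AND status = 'open' "
    "ORDER BY occurred_at"
):
    print(row)
```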
At the risk of getting too “salesy,” I want to direct you to one such database offered by Caveon, called Core. Core is specifically designed for logging and tracking testing irregularities and generating incident reports. It aims to simplify the intake of information and the collection process for all parties involved. It has sophisticated permissions that enable you to securely upload and share files while controlling who sees what, who can add or delete files, and who can see certain folders. Using a database like this is especially important when handling hundreds (or even thousands) of incidents during a test administration window. Core also gives you a data-visualization dashboard, allowing busy executives to see what has been collected and which high-priority items need to be acted on first when a serious threat to the validity of test scores is uncovered. A report of the dashboard data is also available at the push of a button, and that report becomes a piece of evidence the State Education Agency can submit to meet federal Peer Review requirements (which are outlined in this document).
If Core is not a tool available to you, there are other collection methods you could implement in lieu of the traditional paper form. For example, consider having test administrators and coordinators fill out an Excel document or answer questions through an online survey tool when a testing irregularity is uncovered. This will at least stem the seemingly never-ending stream of paper forms and provide some sorting and organizing capabilities.
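If you go the survey or spreadsheet route, even a small script can handle the sorting and organizing for you. Below is a minimal sketch that reads a CSV export of the responses and orders them so the oldest high-priority incidents surface first; the file name and column headings are assumptions about how such an export might be labeled, not a fixed format.

```python
import csv
from datetime import datetime

# Read a survey-tool or spreadsheet export saved as CSV.
with open("irregularity_reports.csv", newline="", encoding="utf-8") as f:
    reports = list(csv.DictReader(f))

# Sort so the highest-priority, oldest incidents are reviewed first.
priority_order = {"high": 0, "medium": 1, "low": 2}
reports.sort(
    key=lambda r: (
        priority_order.get(r.get("priority", "").lower(), 3),
        datetime.fromisoformat(r["occurred_at"]),
    )
)

for r in reports:
    print(r["occurred_at"], r["location"], r["test_name"], "-", r["description"])
```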
Whatever process you choose, the key is to stay organized and follow up on irregularities as they are reported. Don’t allow them to stack up, or worse, pretend they don’t exist! Instead, have a process, train on the process, and work your process.
Lastly, after each test administration, host an after-action review to document what worked and what didn’t. Listen to the field and make adjustments where needed in order to ensure a smoother process for managing and reporting testing irregularities going forward.