Read the Original Article at http://www.informationweek.com/news/showArticle.jhtml?articleID=197001914
The Association of American Medical Colleges' first computerized administration of medical school admissions tests experienced a glitch, but the error appears to be of human origin and not caused by robotic fish.
Some 787 of about 2,500 students taking the Medical College Admission Test on Saturday at multiple locations received questions that had nothing to do with the passages they read, says Robert F. Jones, AAMC's senior VP of medical school services and studies. Though it was the first time the test was fully computerized, it appears that the error occurred during the review process, Jones says.
"The passages were about robotic tuna or dolphins, and the questions were about warblers," Jones said during an interview Tuesday. "It would be obvious to most people, who would say, 'Gee, they don't relate to what I just read,' but I think people reacted to it differently."
Several students taking the exam told test supervisors about the problem and sought reassurance that they wouldn't be scored on the nonsensical questions. Jones said computerization actually helped speed communication to Prometric, the company that formatted the test. Prometric then contacted the AAMC, which in turn initiated e-mail contact that quickly reached all 341 testing sites, where students learned they could ignore the questions about warblers and passages about robotic fish. Some students may have tried to complete that portion before notice reached their testing sites, but they will not be graded differently, Jones said.
Students who believe the glitch affected their overall performance will be allowed to void the test results before grades are handed out. Normally, students can void the scores on their way out of testing centers. In this case, the AAMC is sending out letters to all students who received flawed tests and allowing them time to respond with a request to void the score.
Jones said that the AAMC follows an elaborate process to determine how publishing errors occur and that that process has just begun. He said the association is more concerned with addressing confusion among those who took the test.
No problems have been reported with the test-delivery software provided by Prometric. A new artificial intelligence scoring system from Vantage Learning, deemed more reliable than human scoring, has not yet been deployed, according to a company spokesperson.
Jones pointed out that publishing errors also occurred occasionally when students took paper tests. The paper tests took nearly 10 hours because booklets had to be passed out and handed back in several stages. Computerization has cut the time in half.
Computerization allowed all scores from the Saturday test to be tallied by 5 a.m. Monday, Jones said. It used to take 30 days to collect all of the exams at a single location and another 30 days to send students their scores. Computerization also is improving efficiency, so tests are being administered 22 times annually instead of twice a year. That means students who want to retake the test should find scheduling more convenient and flexible than when paper was used.
This article was edited on Feb. 2 to clarify that the computer error occurred at several test site locations.