TechWeb

Anti-Spyware Vendors Mad About Consumer Reports Test Methods

Aug 25, 2006 (02:08 PM EDT)

Read the Original Article at http://www.informationweek.com/news/showArticle.jhtml?articleID=192300458


Consumer Reports, the independent product review and rating publication, was slammed Friday for using what security experts called "mind-boggling" and "useless" tests of anti-spyware software in its current newsstand issue.

"This is beyond anything I've ever seen," said Alex Eckelberry, chief executive of Sunbelt Software, a Clearwater, Fla. security company. "They ran a test that is not a full test of anti-spyware software capability. Consumer Reports scanned for and removed functionality that isn't even real. When I heard what they did, I went 'huh? They did what?' This is just mind-boggling."

For a story on consumer-grade anti-virus and anti-spyware software in the September issue of Consumer Reports, the publication's testers put various products through their paces. To judge anti-spyware titles, the magazine used the public suite of Spycar scripts published by Intelguardians Network Intelligence. The Spycar site touts the suite as "tools designed to mimic spyware-like behavior, but in a benign form."

Eckelberry took exception to using Spycar in general, and to using only Spycar in particular.

"It's not a serious testing tool," he claimed. "The mantra of any type of security test is that you have to test against real-world scenario. Relevancy is critical."

Spycar fails on those points, he went on. "It does things like install fake registry keys, changes your start page and the like. It is specifically designed to test how well anti-spyware programs block unknown applications, not [how they] scan and remove."

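For readers curious what "spyware-like behavior, but in a benign form" looks like in practice, here is a minimal sketch in the same spirit as the behaviors Eckelberry describes, written in Python for illustration. This is not Spycar's actual code: the registry paths are the real Windows autorun and Internet Explorer start-page locations, but the marker name and placeholder URL are invented, and every change is reverted immediately.

    # Benign, reversible imitation of two classic spyware moves, so a
    # behavior blocker can be observed reacting to them. Windows-only;
    # uses only the Python standard library.
    import winreg

    RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"
    IE_MAIN = r"Software\Microsoft\Internet Explorer\Main"
    MARKER = "BenignBehaviorTest"  # invented marker name

    def simulate_and_revert():
        # 1. Write a fake autorun value (a common persistence trick),
        #    then delete it immediately.
        with winreg.CreateKey(winreg.HKEY_CURRENT_USER, RUN_KEY) as key:
            winreg.SetValueEx(key, MARKER, 0, winreg.REG_SZ, "cmd /c exit")
            print("Autorun value written -- did the anti-spyware tool react?")
            winreg.DeleteValue(key, MARKER)

        # 2. Change the browser start page (a classic hijack), then
        #    restore the original value.
        with winreg.CreateKey(winreg.HKEY_CURRENT_USER, IE_MAIN) as key:
            try:
                old, kind = winreg.QueryValueEx(key, "Start Page")
            except FileNotFoundError:
                old, kind = "about:blank", winreg.REG_SZ
            winreg.SetValueEx(key, "Start Page", 0, kind, "http://example.com/")
            print("Start page changed -- did the anti-spyware tool react?")
            winreg.SetValueEx(key, "Start Page", 0, kind, old)

    if __name__ == "__main__":
        simulate_and_revert()

As the sketch suggests, a suite built this way can only measure whether a product blocks suspicious behavior as it happens; it says nothing about scanning for and removing spyware already on a machine, which is the gap Eckelberry points to.
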
Randy Abrams, for seven years the man responsible at Microsoft for ensuring that all software it released was malware-free, was even more blunt. "I was livid about the testing [Consumer Reports] did. They tested anti-spyware software without ever testing how it detected and removed spyware."

F-Secure's Anti-Spyware and Webroot's Spy Sweeper 4.5 (now superseded by 5.0) tied for first place in Consumer Reports' tests with matching scores of 89 out of a possible 100. Sunbelt's CounterSpy clocked in at seventh with a score of 70, while Microsoft's free Windows Defender came in dead last with a score of 43.

Eckelberry said it wasn't sour grapes over a low score that led him to take on Consumer Reports' testing.

"No test is perfect," he said. "But there are certainly degrees [of imperfection]. It should be all about relevancy, but here it's not."

Consumer Reports' September issue was already under fire over its anti-virus testing procedures when Eckelberry raised the flag on anti-spyware. Within days of the issue reaching newsstands, McAfee posted an open letter taking the publication to task for hiring a lab to create 5,500 new virus variants derived from half a dozen malicious-code categories.

"Creating new viruses for the purpose of testing and education is generally not considered a good idea," wrote McAfee's Igor Muttik in an entry on the security company's Avert Labs blog. "Viruses can leak and cause real trouble."

Consumer Reports defended its actions by saying it was the best way to test anti-virus software against "novel threats not identified on signature lists."

McAfee and others, including Abrams, now the director of technical education at Eset, a Slovakian security company, rejected that claim. They pointed instead to "retrospective" testing -- in which an anti-virus product is installed but then not allowed to update its definitions for several weeks or months -- as a more realistic way to test how software handles new threats. Products can be objectively ranked, said Abrams, when newly discovered malware is thrown at out-of-date definition files. The practice quickly shows how well each anti-virus program sniffs out new attacks using its behavior-based tools.
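
As a concrete illustration of how such a retrospective test can be scored, here is a hypothetical harness sketch in Python. The "scanner" command and its "--no-update" flag are invented for illustration and do not refer to any real product; the only assumption is a command-line scanner whose exit code signals a detection.

    # Retrospective-test scoring sketch: freeze definitions at time T,
    # then scan only samples first seen after T, so every detection
    # must come from heuristics rather than signatures.
    import subprocess
    from pathlib import Path

    def retrospective_score(sample_dir: str) -> float:
        """Return the fraction of post-freeze samples the scanner flags."""
        samples = sorted(p for p in Path(sample_dir).iterdir() if p.is_file())
        detected = 0
        for sample in samples:
            # Hypothetical CLI contract: exit code 1 means "detection",
            # 0 means "clean"; --no-update keeps definitions frozen.
            result = subprocess.run(["scanner", "--no-update", str(sample)])
            if result.returncode == 1:
                detected += 1
        return detected / len(samples) if samples else 0.0

    if __name__ == "__main__":
        rate = retrospective_score("samples_seen_after_freeze")
        print(f"Heuristic detection rate: {rate:.1%}")

The design point is the freeze itself: because none of the samples existed when the definitions were last updated, the resulting detection rate isolates exactly the behavior-based capability that Consumer Reports' created-variant approach was trying to measure, without manufacturing new malware.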

"I thought that their anti-virus testing was bogus and useless," said Abrams. "But their anti-spyware testing was worse."

Consumer Reports did not respond to a call for comment, but in a letter from Dean Gallea, its test program leader, to Eckelberry, which the latter shared with TechWeb, the publication didn't sound like it was about to change its mind or its anti-virus and anti-spyware conclusions.

"Thanks for your insights on the use of behavior simulation to test the performance of anti-spyware programs. We believe we understand your concerns, however we chose this approach because we felt it best captured the flexibility of the software," Gallea wrote.

"We are constantly re-evaluating our test program, and will take these and other considerations into account in future tests," he added.

"I grabbed a copy [of Consumer Reports]," said Abrams. "On the cover it said 'sleeping pills, the facts they don't tell you,' or something along those lines. But if they do such an incredibly bad job in testing consumer anti-virus and anti-spyware, how could I ever trust them with something medical related? I was completely dumbfounded by the whole thing," he added.

"I am happy that NOD32 [Eset's anti-virus title] was not tested. There is no honor in claiming that you were number one in a worthless test. It would be like being rated best food by a critic who has no taste buds and doesn't understand nutritional content."