AV-TEST Product Review and Certification Report – Q2/2011

During April, May and June 2011 we continuously evaluated 22 security products using their default settings. We always used the most current publicly available version of each product for the testing. The products were allowed to update themselves at any time and to query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all of their components and protection layers.

Certified

Internet Security Suite

Version: 2011
Platform: Windows XP (SP3, 32 bit)
Report: 112275
Date: Q2/2011

Protection

Protection against malware infections
(such as viruses, worms or Trojan horses)

(values: industry average | April | May | June)

Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing), 108 samples used: 81% | 100% | 100% | 100%
Blocking of malware on or post execution (Dynamic Detection Testing), 34 samples used: 65% | 94.0%
Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set), 424,860 samples used: 98% | 100% | 100% | 100%
Detection of widespread malware (according to the WildList), 10,224 samples used: 100% | 100% | 100% | 100%
Protection Score 6.0/6.0

Repair

Cleaning and repair of a malware-infected computer

(values: industry average | result)

Removal of all active components of widespread malware (according to the WildList) from a computer, 23 samples used: 96% | 100%
Removal of further malicious components and remediation of critical system modifications made by malware, 23 samples used: 70% | 100%
Detection of deliberately hidden active malware (rootkits and stealth malware), 18 samples used: 78% | 83.0%
Removal of deliberately hidden active malware (rootkits and stealth malware), 18 samples used: 44% | 61.0%
Repair Score 5.5/6.0

Usability

Impact of the security software on the usability of the whole computer
(lower values indicate better results)

(values: industry average | April | May | June)

Average slow-down of the computer by the security software in daily use, 13 test cases: 140% | 94 s
False detections of legitimate software as malware during a system scan (false positives), 699,760 samples used: 9 | 3 | 2 | 1
False warnings of certain actions during the installation and use of legitimate software, 22 samples used: 1 | 0
False blockings of certain actions during the installation and use of legitimate software, 22 samples used: 0 | 0
Usability Score 5.5/6.0
