As the past year has made clear to the public, the claim that all AV solutions are dead reflected genuine public fear rather than just a marketing trend. Examining the available market solutions against the practical testing reports published by AV-Comparatives.org gives us a clear picture of what is still alive or dead in the area of static virus detection. These tests are conducted in the following areas:
1. Performance Tests
2. Dynamic Tests (analysis under proactive/normal conditions)
3. Cleaning Tests (running the solution on infected machines to measure its cleaning capabilities)
Bear in mind, however, that the tests are not limited to these areas; they are extended with other considerable factors such as retrospective detection rate (heuristic and signature based) and detection without user interaction. Looking at the latest report (February 2009), the following products were tested for speed and false alarm rates.
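To make the metrics concrete, here is a minimal sketch of how a detection rate and a false alarm rate are typically derived from raw test counts. The sample counts below are invented for illustration and do not come from the AV-Comparatives report.

```python
# Hypothetical illustration of the two core test metrics.
# All numbers are made up; they are not from the actual report.

def detection_rate(detected: int, total_samples: int) -> float:
    """Fraction of malware samples that the scanner flagged."""
    return detected / total_samples

def false_alarm_rate(false_positives: int, total_clean: int) -> float:
    """Fraction of clean files wrongly flagged as malicious."""
    return false_positives / total_clean

# Example: 1,200,000 of 1,300,000 malware samples detected,
# and 30 of 100,000 clean files wrongly flagged.
print(f"Detection rate:   {detection_rate(1_200_000, 1_300_000):.2%}")
print(f"False-alarm rate: {false_alarm_rate(30, 100_000):.4%}")
```

A high detection rate is only meaningful alongside a low false alarm rate, which is why the report weighs both.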
----
avast! Professional Edition 4.8.135
AVG Anti-Virus 8.0.234
AVIRA AntiVir Premium 8.2.0.374
BitDefender Antivirus 12.0.11.4
Command Anti-Malware 5.0.8
eScan Anti-Virus 10.0.946
ESET NOD32 Anti-Virus 3.0
F-Secure Anti-Virus 9.00.149
G DATA AntiVirus 19.1.0.0
Kaspersky Anti-Virus 8.0.0.506a
Kingsoft Antivirus 2008.11.6.63
McAfee VirusScan Plus 13.3.117
Microsoft Live OneCare 2.5.2900
Norman Antivirus & Anti-Spyware 7.10.02
Sophos Anti-Virus 7.6.4
Symantec Norton Anti-Virus 16.2.0.7
TrustPort Antivirus 2.8.0.3011
----
The overall test evaluation is provided in the report at:
http://www.av-comparatives.org/images/stories/test/ondret/avc_report21.pdf
The test bench given above is constructed and evaluated on the basis of two sets of tests described in the report itself. The most interesting factor to notice, however, is how many malware samples were tested to capture the static (and partially dynamic) behavior of next-generation badware.
Today's highly motivated attackers increasingly focus on turning detectable signatures into undetectable, transparent malware. This can easily be accomplished by applying the latest cryptors, protectors and/or packing techniques. Thus, this set of AV solutions remains viable for static virus detection rather than for complex, polymorphic malware.
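The point about packers is easy to demonstrate. The sketch below uses zlib compression as a stand-in for a real packer or cryptor (the payload string is, of course, a placeholder, not actual malware): once the bytes are transformed, neither a stored hash signature nor a byte-pattern match on the original payload will fire, even though the original can be fully recovered at runtime.

```python
# Why packing defeats naive signature matching: the packed bytes differ
# from the original, so hash- and pattern-based signatures miss, while
# the unpacked behavior is unchanged. zlib stands in for a real packer.
import hashlib
import zlib

payload = b"MZ...pretend this is a known malicious executable..."
signature = hashlib.sha256(payload).hexdigest()  # what a naive scanner stores

packed = zlib.compress(payload)  # stand-in for a cryptor/packer pass
packed_hash = hashlib.sha256(packed).hexdigest()

print(signature == packed_hash)            # False: the stored hash no longer matches
print(payload in packed)                   # False: the byte pattern is gone too
print(zlib.decompress(packed) == payload)  # True: runtime behavior is unchanged
```

This is exactly why signature-only engines need emulation or heuristic unpacking layers on top of static matching.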
Comparing the false alarm rates against the malware composition of Test-Bench "A" (April 2006 - 2008) and Test-Bench "B" (May 2008 - February 2009), the following outcome stands out:
As we can see, Microsoft won this round, but what could be the reason behind it? On closer examination, a justifiable explanation is that Microsoft has strong in-depth Win32 machine-learning capabilities at both the user and kernel layers. On the other hand, no matter how hard any AV vendor tries to protect its customers "at best" from rising malware threats, it has to eat the bits and pieces under the table before coming to market.