17.06.2012, 09:15
Let me quote a very interesting comment that made me realize on how small a scale this dynamic test is actually conducted:
Quote: Many people describe the "Whole Product Dynamic Test" as being the truly definitive test. Let me just state some of my thoughts about it. We are talking about 464 malicious URLs being tested, which were culled from known websites distributing malware.
My first question would be: what are the odds of landing on one of these infected URLs? In 4 years I've only run into one of these infected websites once, and one should consider the built-in security of, say, Chrome and IE9, which is quite effective.
In the last 3 weeks in my work environment, Avira and MBAM (mostly Avira, though) caught more than 30 real malware samples from around 120 USB flash drives. My computer was offline during these operations; can you imagine what would have happened using an AV that relies heavily on cloud scanning?
Back to the results of the "real test". Let's take 2 examples: Avira (my choice) and BitDefender, which has had excellent results lately.
Avira caught 97.8% of 464 malware items, that is 453 out of 464.
BitDefender caught 99.1% of 464, that is 459 out of 464.
We are talking about a difference of 6 pieces of malware in one month of looking specifically for malware. BitDefender caught 6 pieces of malware more than Avira.
Let's take a look at the last "On-demand Detection Test", March 2012.
Avira caught 99.4% of 300,000 malware items, that is 298,200 out of 300,000
BitDefender caught 98.6% of 300,000, that is 295,800 out of 300,000.
Avira caught 2,400 pieces of malware more than BitDefender. Readers should draw their own conclusions about these tests.
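The arithmetic behind the two comparisons above can be sketched in a few lines. Note this is only an illustration of the post's numbers (the `detected` helper and its rounding are my own assumption; the published tests report exact sample counts, not back-calculated ones):

```python
def detected(rate_pct, sample_size):
    """Approximate count of caught samples from a detection rate in percent."""
    return round(rate_pct / 100 * sample_size)

# "Whole Product Dynamic Test": 464 malicious URLs
avira_dyn = detected(97.8, 464)
bitdefender_dyn = detected(99.1, 464)
print(bitdefender_dyn - avira_dyn)   # BitDefender ahead by roughly 6 samples

# "On-demand Detection Test", March 2012: 300,000 samples
avira_od = detected(99.4, 300_000)        # 298,200
bitdefender_od = detected(98.6, 300_000)  # 295,800
print(avira_od - bitdefender_od)          # Avira ahead by 2,400 samples
```

The point the commenter is making falls straight out of the sample sizes: a 1.3-point gap on 464 samples is single digits of malware, while a 0.8-point gap on 300,000 samples is thousands.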
I'd like to point out that I'm comparing the results of two companies as an example to comment on the methodology of the tests. It is not intended as A versus B; as a matter of fact, most companies had better results than Avira.