Testing an antivirus walk on the wild side
Going online these days is a walk on the wild side: millions of bits of malware, infected websites, ransomware, and phishing emails are all waiting for your careless move. Given this level of danger, the next logical question is whether these risks and real online situations can be accurately simulated and tested – just as they are for your car.
Driving for a planned collision
Independent testers such as Euro NCAP and the American NHTSA do just that. They develop tests for likely collision scenarios from the front, side, and rear, factoring in distinctions between adult and child passengers. Then they run cars through these tests. While the tests aren’t perfect – no test is – the agencies work hard to make the test conditions closely approximate the risks faced in real life. And they are constantly adding new angles, such as autonomous braking checks, to cover potential future risks and dangers. From a commercial perspective, failure to get a high rating can doom a new car launch.
Now let’s test this online
Testing cars is one thing, since they are big and accidents can be fatal. But device security, where the threats are collections of binary code, would seem to be a completely different issue. However, several independent testing agencies, including AV-Comparatives and AV-Test, do just that – look at real user scenarios and develop tests that measure the ability of various antivirus programs to protect the user and their device. As specific examples, there are the latest AV-Comparatives test for the Apple Mac OS and AV-Test’s Consumer Product Report for Windows devices.
There’s more to the test than crashing
Detecting malware is a central part of these tests – “antivirus ‘X’ detects all 1,310 threats, including 310 MacOS and 1,000 prevalent examples of Windows malware” – but that is not all. The tests also look at performance and usability. To continue with the car metaphor, these tests are more like an in-depth Top Gear review of a Tesla than a simple collision rating. After all, people want to enjoy the experience of using their devices – not just keep hackers out of their accounts and private lives.
Protection is just the start
Protection is the beginning of the testing process. After all, that is what an antivirus program is supposed to do. AV-Comparatives threw a batch of Mac and Windows malware at the Apple devices, while AV-Test used a wider selection of Windows malware sourced from the previous month. “The frequency of this test also makes an up-to-date sample collection and fast creation of detections especially important,” said Alex Vukcevic, head of QA and the Virus Lab at Avira. “Especially for the 0-day malware, a good cloud detection strategy is crucial to achieving full points, as many of the samples have never been seen before by any vendor.”
The look is more than cosmetic
Testers also peered closely at the antivirus interface – how the user has to interact with the antivirus during installation and day-to-day operation. While some users just want their antivirus to work, others want to know the minute-by-minute progress of scans. AV-Comparatives described the Avira Antivirus for Mac UI as quiet, lightweight, and unobtrusive.
Measuring the annoyance factor
Having an antivirus program is a matter of need, not want – a difference from a car. Unlike a Tesla Model 3, you need an antivirus, and you also need it to stay out of sight and out of your way. And, in a stark difference from a car review, performance here measures how much the antivirus slows the user down, and usability measures the absence of false positives – mistaken malware identifications that can shut a device down. A comparable measure of Tesla usability would be the driving range and the density of charging stations.
We’ve got an award on the wild side
With all apologies to Lou Reed, we are pleased to point out that our core antivirus products have done quite well in two recent independent tests. Avira secured the “Approved” mark in AV-Comparatives’ look at antivirus products for the Mac OS. In addition, Avira Antivirus Pro received the “Top Product” rating in AV-Test’s latest Consumer Full Product Testing of Windows antivirus products. We’ve got the awards and recognition; it’s your choice to take them for a drive.
Avira Antivirus Pro: Real World Protection Test "Advanced+" award
In this test, the products’ ability to protect a typical real-world system from the newest malware threats is assessed monthly over a longer time span, to see which product has the best protection capabilities.
Because of this, consistently high detection quality, a fast reaction time to new threats, and a low false alarm rate are all necessary to achieve good results.
The Advanced+ seal shows how well Avira Antivirus Pro has mastered its task.
Avira Antivirus for Mac: "Approved" by AV-Comparatives
Avira secured the “Approved” mark in AV-Comparatives’ look at antivirus products for the Mac OS. The antivirus detected 100% of the Mac samples thrown at it.
“Normally you will see little of Avira in operation”, states the test. “The main app window is clear and clean in operation, and tells you immediately if something is wrong.”
Avira Antivirus: “Top Product” rating from AV-Test
Avira Antivirus Pro received the “Top Product” rating in AV-Test’s latest Consumer Full Product Testing of Windows antivirus products – and a seal to prove it.
During May and June 2018, AV-Test continuously evaluated 18 home-user security products using their default settings. The testers focused on realistic test scenarios and challenged the products with real-world threats. Each product had to demonstrate its capabilities using all of its components and protection layers.
The result: Avira Antivirus received perfect 6/6 scores for protection, performance, and usability.