While constructing a timeline for Dennis Technology Labs, the security testing venture I started in 2008, I realised that I've been involved in anti-malware testing for over a decade.
This is because, as a journalist, I used to test anti-malware software every year or so, probably from around 2000.
It seems that some of the rather 'edgy' testing techniques I used in the early days are coming into fashion now that 'Advanced Persistent Threats' (APTs) are perceived as being a significant threat.
Looking through my files I found one of the first anti-virus tests I ever published, in the now defunct Computer Shopper magazine.
The report (PDF), which appeared in 2004, contains this description of how I conducted the tests:
"Anti-virus programs should be able to detect the current, high-threat files being sent around the internet. Our test files included some of the most virulent and commonly found viruses, as well as well-known backdoor Trojans and harmful Visual Basic scripts that we generated using well-known virus-generation tools. None of these files should pose any problem to a decent scanner. We compressed copies of each into Zip files, too. Finally, just to add a bit of a challenge, we also used a few tricks to disguise the backdoor files. These tricks rely on freely available executable packers and a wrapper program that can attach the backdoor to another, more innocent file – in our case, Windows’ Minesweeper game. Whoever runs the game can play it, but will also unwittingly hand over control of their PC to the attacker. All the tests were run in Windows XP Professional."

What the article doesn't mention there, but alludes to in the individual product reviews, is that I ran all the files and monitored their behaviour to ensure that they were actually a threat to the system.
For example, I would try to connect to the BackOrifice Trojan used (but not named) in this test, and control the victim system remotely.
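For the curious, here is a rough sketch of the kind of liveness check a tester might script today, inside an isolated lab, to confirm that a remote-access sample is actually listening before crediting or blaming the scanner. This is purely my own illustration rather than the tooling used back then; the host address and TCP port are placeholders, though BackOrifice's default listener was famously UDP 31337.

```python
import socket

# Placeholder lab details -- adjust for your own isolated test network.
VICTIM_HOST = "192.168.56.101"   # hypothetical address of the isolated test VM
TCP_PORT = 12345                 # hypothetical listener port for a wrapped sample
BO_UDP_PORT = 31337              # BackOrifice's well-known default UDP port


def tcp_listening(host, port, timeout=2.0):
    """Return True if something accepts a TCP connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def udp_replies(host, port, timeout=2.0):
    """Send an empty UDP datagram and report whether anything answers.

    Silence is inconclusive for UDP, so treat False as 'unknown'
    rather than 'not running'.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        try:
            s.sendto(b"", (host, port))
            s.recvfrom(1024)
            return True
        except OSError:
            return False


if __name__ == "__main__":
    tcp_state = "open" if tcp_listening(VICTIM_HOST, TCP_PORT) else "closed/filtered"
    udp_state = "replied" if udp_replies(VICTIM_HOST, BO_UDP_PORT) else "no reply"
    print("TCP %d: %s" % (TCP_PORT, tcp_state))
    print("UDP %d: %s" % (BO_UDP_PORT, udp_state))
```

A positive result only tells you a listener is up; back then the real confirmation was issuing commands through the backdoor's own client and watching them take effect.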
Here's an extract from the Norton Antivirus 2004 product review:
"Even when we increased the heuristic level to the Highest setting, our backdoor Trojans were able to enter and operate on the system unhindered."Clearly such a small test provides the reviewer with time to play with configurations. This doesn't really scale when handling hundreds of malicious URLs and dozens of anti-malware programs.
I received a lot of angst from the anti-virus industry whenever one of these reviews was published. The people I spoke to at those companies really didn't like the idea of testers 'creating' threats, whether or not they used easily available tools and well-known techniques.
Judging by some of the conversations I'm having with companies producing anti-APT solutions, it might be time to start digging out the old BackOrifice, eLiTeWrap and UPX packer tools.
while i agree that there's a need to test anti-malware products' ability to deal with things they've never seen before, i question whether making new malware is really the best approach.
surely this kind of testing is the same kind of testing APTs perform, and keep performing until they find something that works. i don't see that doing less than an APT would really measure how well a defense would work against an APT, and i don't see how an independent testing organization can hope to do a comparable amount of work to an APT unless it has comparable resources.
the point in modifying existing malware for testing seems to be aimed at coming up with something that won't match the signatures and thus be harder to detect. there are a countably infinite number of ways to modify an existing piece of malware and no feasible way to account for all those possible modifications, so there will always be a way to modify it to bypass the scanner. maybe it's time for vendors to ship with a way to disable the portion of their products that is signature-dependent, specifically for testing how well products cope when the scanner is (inevitably) bypassed.
signature-based defenses cannot possibly hope to stand up to an APT worthy of that designation, so testing APT defenses should look at testing the *other* defenses vendors bring to the table.
Hi Si, it's Chris from MRG.
The bottom line is, testing needs to change and it is.
The problem at the moment, in my view, is the order in which the bad guys, the vendors and the labs operate:
Bad guys - Vendors - Labs
The bad guys create their stuff, the vendors respond with new technologies, then the labs create tests to test the technologies created by the vendors.
This model ensures the vendors are happy with the labs, as the labs are testing in accordance with the vendors' wishes and in ways appropriate for their current technologies.
The trouble with this is that there is an opportunity for the vendor to be the lab's customer.
As you know, due to the kinds of clients my lab has, we have worked in a different way.
I suggest we need to move from the bad guy - vendor - lab model to the bad guy - lab - vendor model.
In this game, the bad guys are the leaders: they are at the cutting edge and have the element of surprise. Labs need to keep their finger on the pulse, invest in getting their own direct malware feeds (NOT FROM VENDORS!), and develop their own internal R&D and investigative facilities so they can be in step with the bad guys, get inside information on them, their plans, the way they operate, and the emerging trends and threats.
By changing to the bad guy - lab - vendor model, the key changes are these:
1) The lab moves towards working for the consumer, as it is able to show the shortfall between the protection consumers really need, now and in the future, and what they are being provided with at the moment.
2) The vendors get genuinely useful testing that helps them improve the efficacy of their products - which will help their customers - instead of labs just producing tests for vendors to use as marketing collateral.
I love your quote "The people I spoke to at those companies really didn't like the idea of testers 'creating' threats" - LOL - you mean like how the bad guys create and modify threats!
This reminded me of a line from Blackadder, when he said that when he joined the army "the prerequisite for any battle was that the enemy should under no circumstances carry guns", and then went on to state that he was quite shocked when 4.5 million heavily armed Germans "hove into view".
This is the best analogy I can think of!
My lab will carry on and increase the proportion of testing we do using simulators based on new APTs, emerging threats and so on - and you would be surprised how many vendors, service providers and influencers are approaching us and saying this is what they want.
Cheers,
Chris.