Tuesday, September 04, 2007

file infecting viruses vs digital signatures

vesselin's comments to my previous article inspired me to consider actual attack scenarios rather than just weaknesses in a proposed system so i've turned the title of this entry around to indicate more focus on the threats rather than the vulnerabilities...

in one of the responses to the comments i posited a scenario where a malicious entity compromises a legitimate software vendor and infects their software in such a way that they distribute infected programs with valid digital signatures and which can in turn sign executables they infect... today nishad herath posted a similar scenario over at the mcafee avert blog so at least i'm not the only one who thought of this possibility...

but you know what, that's actually a rather complicated scenario so i got to thinking maybe there's a simpler one... then it dawned on me; the premise for this protective technique is to manage the integrity of files and the assumption is that you can't infect files without changing them and thus affecting their integrity... it turns out that this assumption is wrong - certain types of companion viruses can infect a host program without modifying it at all... so if a malicious entity were to get a certificate they could easily sign their companion virus with it and so long as no one figured out the entity was signing malicious code their certificate would never be revoked and the virus could spread unhindered... and in joanna's world where digital signatures replace all the tricks that are used to detect the presence of viruses there would be nothing to alert the general public that the code was malicious and the certificate should be revoked...
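to make that concrete, here's a minimal sketch (python, with made-up filenames, and a plain hash standing in for whatever integrity check the whitelist uses) of why the protected file never looks wrong to an integrity-based whitelist:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # stand-in for the integrity/signature check on a file's contents
    return hashlib.sha256(data).hexdigest()

host = b"legitimate program bytes"
whitelist = {"app.exe": fingerprint(host)}  # the vendor-blessed fingerprint

# a companion infector drops a NEW file that the shell finds first
# (think app.com beating app.exe in the dos search order);
# the protected host file is never touched
filesystem = {"app.com": b"companion virus bytes", "app.exe": host}

# the integrity check on the original host still passes...
assert fingerprint(filesystem["app.exe"]) == whitelist["app.exe"]
# ...so nothing about the whitelisted file ever looks wrong
```

the point isn't the hash, it's that the infection happens beside the protected file rather than inside it...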

ouch, that seems pretty damning, doesn't it... but you know what, companion viruses were always pretty obscure... it wasn't a particularly popular strategy in part because the extra files gave it away to those who were looking for extra files (oh, another one of those nasty virus detection tricks joanna thinks we can do without)... ok, so then how about a virus that inserts the host program into a copy of itself, rather than itself into the host program (the so-called amoeba infection technique) and then signs this new copy with the aforementioned maliciously obtained certificate? once again, a digital signature based whitelist isn't going to stop this from happening...
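here's a rough sketch of that amoeba scenario (python again; an hmac stands in for real public-key code signing, and the key name is made up):

```python
import hashlib
import hmac

# hmac stands in for real public-key code signing in this sketch
ATTACKER_KEY = b"key behind a maliciously obtained certificate"

def sign(data: bytes, key: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(data: bytes, sig: bytes, key: bytes) -> bool:
    return hmac.compare_digest(sign(data, key), sig)

host = b"original host program"
virus_stub = b"virus code that runs first, then unpacks and runs: "

# amoeba infection: the host is wrapped inside a fresh copy of the virus...
infected = virus_stub + host
# ...and the whole thing is signed with the attacker's perfectly valid key
signature = sign(infected, ATTACKER_KEY)

# the whitelist has no reason to reject it: the signature checks out
assert verify(infected, signature, ATTACKER_KEY)
```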

now that takes care of one of the aspects that kept companion viruses obscure but it doesn't really improve on the obscurity itself as this technique is even more obscure than companion infection... if you've followed me to this stage perhaps something else has dawned on you - if a virus can sign a copy of itself with a host program inside of it, why shouldn't it be able to sign a host program it has inserted a copy of itself into using a completely conventional infection technique and the malicious entity's certificate? the answer is there is no reason it can't...

so it would seem that a digital signature based whitelist where vendors sign their own programs (effectively vouching for the safety of their own code) wouldn't really prevent file infecting viruses at all if that were the only thing the world were using... you still need all sorts of tricks to figure out when a vendor's certificate can't be trusted anymore, which points back to a fundamental problem with this kind of self-signing system - it conflates identity (which is the only thing a certificate authority can test) with trustworthiness in spite of the fact that they don't actually have anything to do with each other... just because the vendor's front man is who he says he is and hasn't done anything bad in the past (that anyone knows of) doesn't mean the vendor itself isn't a malicious entity... currently, standard certificates (like the ones used for websites) have become so easy for anyone to get that they've become meaningless, and this led to the creation of extended validation certificates, which simply involve more in-depth investigation of the entity and which in turn have no bearing on what the entity will do after getting the certificate... i can see no way for a digital signature system for code to work any differently from the one used for websites so the same problem will apply; and then even if we somehow figure out that the entity cannot be trusted, their virus(es) will continue to spread until a revocation is issued for their certificate and that information trickles down to all the affected systems...
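that revocation lag is easy to sketch (illustrative names only) - a virus signed with the bad certificate keeps running on a given machine until the revocation is both issued and fetched by that machine:

```python
# each machine keeps its own copy of the revocation list and only
# learns about revocations when it next fetches an update
issued_crl = set()    # what the certificate authority has revoked
machine_crl = set()   # what this particular machine has heard about so far

def may_run(signer: str, local_crl: set) -> bool:
    # a signed executable runs unless its signer is on the LOCAL list
    return signer not in local_crl

# before anyone proves the signer is malicious, the virus runs everywhere
assert may_run("malicious-vendor", machine_crl)

issued_crl.add("malicious-vendor")               # revocation finally issued...
assert may_run("malicious-vendor", machine_crl)  # ...but hasn't trickled down yet

machine_crl |= issued_crl                        # the machine fetches the update
assert not may_run("malicious-vendor", machine_crl)
```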

trusting the vendor (or whatever else you want to call the software provider) to attest to the trustworthiness of their own software just seems far too naive from a security standpoint, which is why i originally didn't even consider it to be the model joanna had been talking about... a system where independent reviewers checked programs for malicious code before signing them (essentially certifying programs rather than program providers) seemed to be a safer solution, though it's got the same scaling problems that conventional centrally managed whitelists have... a system that certifies programs rather than program providers would be less vulnerable to the scenarios mentioned here (i think only a variation on the first one should be able to allow viruses to still spread) but either way, both options still allow viruses to operate if used on their own... at best (and by now this should sound familiar) digital signature based whitelists should be something we use with the more conventional tricks we're used to, not instead of them as joanna rutkowska would like you to believe...

4 comments:

Vess said...

Ah, still missing the point, I'm afraid. :-)

Re: companion viruses. Even if we leave aside that a companion virus is extremely unlikely to spread (not a single one has ever been ITW), this infection technique fools the user - not the OS. If a digitally signed executable becomes companion-infected, the OS will try to execute the companion instead of the original executable. Assuming that the verification of the digital signatures is closely integrated with the OS (it has to be), it will notice that the companion is not signed at all or is not signed with a trusted key and will refuse to execute it.

Re: the amoeba infection scenario. As you note later, it is meaningless - if the virus cannot sign the modified executable (that includes the original signed executable), the OS won't let it run due to signature verification failure. And if it can sign it, then there is no reason to resort to this infection technique - you can infect anything parasitically, in any way, and then sign it.

One point that you seem to be missing is that it is not sufficient to run only signed executables - that's not what Office does with the macros. You have to run only executables digitally signed with a key marked as trusted by the user.

This way, even if the malware author signs his malware, it won't help unless the user has explicitly indicated that executables signed with this key should be run. Yes, users will keep making mistakes. But, first, there is a whole lot of difference between explicitly indicating which producers to trust and the current situation of silently and blindly running anything. And, second, a properly implemented PKI should allow the offender to be traced easily and it should be easy to indicate (without user interaction) that his executables are not to be trusted by revoking his key.

Yes, just because a vendor says "trust me" doesn't mean that you should. But a digital signature establishes the identity of that vendor. Once his lack of trustworthiness has been established (by other means), it allows for easy propagation of this mistrust. You don't need any "tricks" for that - key revocation protocols are pretty straightforward.

You're right that code signing is equivalent (in terms of identity, trust and security) to Web certificates. How many phishing sites with valid certificates have you seen? :-) Oh, there are some - but the phishing problem would be hugely reduced if every site had a certificate and the browser refused to visit sites without valid certificates.

The drawbacks of digital code signing lie elsewhere: a rigorous procedure penalizes the small software producers (and the freeware producers) because it is expensive. Also, there are executables that can't be protected this way (e.g., a CodeRed-like worm). That's why it won't solve the malware problem. But it will help reduce it.

kurt wismer said...

"it will notice that the companion is not signed at all or is not signed with a trusted key and will refuse to execute it."

??? did you miss the part where i said the malicious entity signs the companion infector?

i realize that in the real world it would be very hard to keep a trusted certificate if you're distributing malware, but i don't see that there's any reason to believe it would be hard to get one in the first place...

and in the unlikely fantasy world where digital signatures have supplanted all other anti-virus techniques there would be nothing to indicate that the certificate should be revoked...

"One point that you seem to be missing is that it is not sufficient to run only signed executables - that's not what Office does with the macros. You have to run only executables digitally signed with a key marked as trusted by the user."

well gosh, if the certificate authority (microsoft, verisign, whoever) trusts them then why shouldn't i?... if the user can't figure out which code to trust (the reason for the digital signature scheme in the first place) why would one think they can do any better with keys?

"Yes, users will keep making mistakes."

exactly, and that means it is technically possible for file infecting viruses to self-replicate in spite of a digital signature based whitelist... perhaps not to the same degree as they can without it, but it doesn't stop it completely and so the tricks (as joanna puts it) that the av industry has spent so many millions on developing are still needed...

"But, first, there is a whole lot of difference between explicitly indicating which producers to trust and the current situation of silently and blindly running anything."

again, i'm not saying there isn't and i'm not saying such a system wouldn't help... what i'm saying is that such a system cannot stand alone... without something else to tell us when a cert has been given to a malicious entity such certs are never going to be revoked...

"And, second, a properly implemented PKI should allow the offender to be traced easily"

the person who gets investigated in the process of issuing the cert and the person who actually uses the cert don't have to be the same person... a malware creator could easily pay someone to get a cert for them... traceability is a red herring...

"Once his lack of trustworthiness has been established (by other means), it allows for easy propagation of this mistrust. You don't need any "tricks" for that - key revocation protocols are pretty straightforward."

what you need 'tricks' for is those 'other means'...

"You're right that code signing is equivalent (in terms of identity, trust and security) to Web certificates. How many phishing sites with valid certificates have you seen? :-)"

how many phishing sites need them... users don't pay attention to that so it's not necessary but if it were the phishers could easily get a cert... comodo, for example, hands out free ones that are good for 90 days...

"The drawbacks of digital code signing lie elsewhere: a rigorous procedure penalizes the small software producers (and the freeware producers) because it is expensive."

and this is exactly why it won't be rigorous... hurting small business doesn't happen without consequence, small business contributes to the economy so if you make it infeasible for them to get certs guess what happens and guess who they'll complain to...

"That's why it won't solve the malware problem. But it will help reducing it."

once again, not arguing that point - i don't know why you think i am... i feel i was very clear this time about the fact that i was simply arguing it can't do what joanna says it can do (solve the virus problem all by itself)...

Vess said...

"i realize that in the real world it would be very hard to keep a trusted certificate if you're distributing malware, but i don't see that there's any reason to believe it would be hard to get one in the first place..."

First of all, it can be hard - ever tried to get a Symbian S60 R3 executable or a Vista driver signed? Second, even if it is not, it would still mean that any malware author will be able to produce only one signed malware program before his key is revoked. Third, it will mean that the author is traceable instead of anonymous, as he is now.

All this will significantly reduce the malware problem. No, it won't eliminate it completely. But reducing it significantly is not bad.

"and in the unlikely fantasy world where digital signatures have supplanted all other anti-virus techniques there would be nothing to indicate that the certificate should be revoked..."

First, I am not saying that digital code signing will replace other anti-virus techniques - I am only saying that it is something worth doing, because it will help reduce the malware problem. Second, believe me, people tend to notice that something is amiss even without any anti-virus programs in place. If that weren't so, they wouldn't realize that they needed such programs in the first place - yet many people start looking for such programs after they realize that their machine has become infected. Ergo, they do realize it somehow.

"if the certificate authority (microsoft, verisign, whoever) trusts them then why shouldn't i?"

That's not how it works. I really suggest that you acquaint yourself with the subject more closely. The user would very rarely notice that the certification authority trusts the key - only if they choose to view the code signer's certificate, which is not that trivial. It works like this: foreign code arrives that's either not signed or signed with a key you haven't specified as trustworthy - it gets silently ignored. (As opposed to the current situation, when it is silently executed.) If you want it to be run, you have to manually undergo several steps to specify that you want to trust code signed with this producer's key. If the key is not signed by a certification authority, you might not even be able to do that (although this is not currently the case with Office macros).
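The flow above can be sketched in a few lines (an HMAC stands in for real public-key signing; all the names are illustrative):

```python
import hashlib
import hmac

def sign(data: bytes, key: bytes) -> bytes:
    # HMAC stands in for real public-key code signing in this sketch
    return hmac.new(key, data, hashlib.sha256).digest()

# keys the USER has explicitly marked as trusted - not merely CA-issued ones
user_trusted_keys = {b"key of a vendor the user explicitly trusts"}

def may_run(code: bytes, sig: bytes, trusted: set) -> bool:
    # foreign code runs only if its signature verifies under a trusted key;
    # everything else is silently ignored
    return any(hmac.compare_digest(sign(code, k), sig) for k in trusted)

# a validly issued key the user never chose to trust doesn't help the attacker
malware = b"malicious payload"
attacker_key = b"validly issued key, never trusted by this user"
assert not may_run(malware, sign(malware, attacker_key), user_trusted_keys)

# code signed with an explicitly trusted key runs
trusted_code = b"vendor update"
vendor_key = next(iter(user_trusted_keys))
assert may_run(trusted_code, sign(trusted_code, vendor_key), user_trusted_keys)
```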

"if the user can't figure out which code to trust (the reason for the digital signature scheme in the first place) why would one think they can do any better with keys?"

Because there is a huge quantity difference, which results in a quality of its own. Now, all code gets silently executed - often without the user's knowledge. With code signing, the user decides which producer's code to trust - on a producer-by-producer basis instead of on a program-by-program basis.

"that means it is technically possible for file infecting viruses to self-replicate in spite of a digital signature based whitelist..."

Oh, yes. But, remember, the idea is not to make any form of malware impossible - the idea is to reduce the malware problem. The population of machines on which the scenario you're talking about will be possible will be vastly smaller than the current population of infectable machines. In fact, it will be small enough to make it impossible for a virus to survive in the wild. You see, there are still DOS and Win9x machines in existence, which means that boot sector viruses are possible. But this population is small enough, meaning that such viruses are no longer a problem.

"perhaps not to the same degree as they can without it, but it doesn't stop it completely and so the tricks (as joanna puts it) that the av industry has spent so many millions on developing are still needed..."

Exactly! Forget Joanna - as I said, she's missing the point too. It will reduce the malware problem without making malware impossible. But just this reduction makes it worth doing.

"i'm not saying such a system wouldn't help... what i'm saying is that such a system cannot stand alone..."

We're in agreement, then. :-) Because what I'm trying to emphasize is that such a system will help. I never said that it should stand alone. To begin with, it needs heavy OS (and maybe even hardware or at least firmware) support. You can't have a trusted execution path if the system doing the checks can be compromised.

"traceability is a red herring..."

Not at all. Again, look at the real world, instead of using made-up theoretical arguments. How many Symbian S60 R3 viruses are there? Hint: none, just one spyware program. How many signed Office macro viruses? Hint: three and they can't spread anyway, because they aren't signed with a trusted key. How many phishing sites on the .gov domain? Hint: none. How many phishing sites on the .no domain? Hint: next to none, because at least until recently they required a verifiable postal mail address in Norway before assigning a domain. And so on.

Traceability, compared to anonymity, is a huge impediment to criminals. And, keep in mind, the idea is to reduce the problem of malware significantly - no reasonable person would claim that it can be eliminated completely.

"what you need 'tricks' for is those 'other means'..."

Not at all, unless you, like Joanna, label the whole AV industry "tricks". :-)

"how many phishing sites need them... users don't pay attention"

First, you're wrong about that - users do pay attention to that. When Microsoft moved the padlock icon in IE 7, do you know how many people noticed that it wasn't at the usual place when visiting their secure sites and started asking where it was? Check Yahoo! Answers sometime.

Second, it's not the users who have to be "paying attention" about that. It's the OS/browser/whatever. Office silently ignores macros that are not signed with a trusted key - no matter whether the users are paying attention or not. In fact, the users have to do some very explicit manual steps, in order to indicate that the key of a particular producer has to be trusted. If all sites were using Web certificates, the browser could similarly be configured to refuse visiting sites without a certificate that the user has explicitly marked as trustworthy. This would greatly reduce phishing - without completely eliminating it, of course.

"hurting small business doesn't happen without consequence"

Well, Symbian did it and seems to be getting away with it...

"it can't do what joanna says it can do (solve the virus problem all by itself)"

I don't think that even she is saying that. She's just claiming that it would make parasitic infection impossible. She's wrong, of course - which is why I said that she's missing the point too. The idea is not to make any form of malware impossible - that's impossible without making the machine unusable. The idea is to reduce significantly the malware problem - and digital code signing will help in that aspect.

kurt wismer said...

"First, I am not saying that digital code signing will replace other anti-virus techniques"

i realize... the point is that my arguments in these two blog posts are mostly directed towards joanna who is saying precisely that...

"That's not how it works. I really suggest that you should intimate yourself with the subject more closely. The user would very rarely notice that the certification authority trusts the key - only if they select to view the code signer's certificate"

what the user is likely going to notice is that they aren't prompted to trust a new CA, so the assumption will be that one of their existing trusted CAs trusts the vendor... a user is no more likely to research the vendor in order to decide whether to trust the key in this scheme than they are to research the vendor in order to decide whether to trust an app in a conventional application whitelisting scenario...

"Exactly! Forget Joanna"

sorry, that's a little hard to do - you see she's the media darling, she's the one everyone who doesn't know better is listening to... her bad arguments are the ones that need to be countered and i think they need to be countered in a higher profile way than just comments to her blog post which may or may not get published...

"We're in agreement, then. :-) "

to be honest, i think we're probably in agreement (more or less) on most things in this discussion... i think our major differences are in details such as the ease of getting a cert, the way people will interact with the keys, and the significance of traceability...

"'traceability is a red herring...'

Not at all. Again, look at the real world, instead of using made-up theoretical arguments."

my argument about traceability isn't so much made up as it is a re-application of a principle that's already being used to launder money... the malicious entity can simply anonymously pay some flunky to play the role of the traceable entity...

"Well, Symbian did it and seems to be getting away with it..."

because the symbian platform is such that the freedom to run unsigned code was not widely used (or used at all) prior to locking it all down with digital signatures...

on a platform with a huge existing software development industry, nobody is going to be able to get away with making certificates hard to get... i can't see software certs being significantly more difficult to get than ssl certs and we already know how easy that is (i.e. so easy that ev certs needed to be invented)...

"I don't think that even she is saying that. She's just claiming that it would make parasitic infection impossible."

i don't think it's an incredibly far stretch to say that if you could make parasitic infection impossible you would effectively solve the file infecting virus problem... and she most definitely referred to her proposal as an 'elegant solution'...