A Pact with the Devil
Mike Bond and George Danezis
Computer Laboratory, University of Cambridge,
15 JJ Thomson Avenue, CB3 0FD, UK
{Mike.Bond, George.Danezis}@cl.cam.ac.uk
Abstract. We study malware propagation strategies which exploit not the incompetence or naivety of users, but instead their own greed, malice and short-sightedness. We demonstrate that interactive propagation strategies, for example bribery and blackmail of computer users, are effective mechanisms for malware to survive and entrench, and we present an example employing these techniques. We argue that in terms of propagation, there exists a continuum between legitimate applications and pure malware, rather than a quantised scale.
1 Introduction
"The stars move still, time runs, the clock will strike,
The devil will come, and Faustus must be damned."
The Tragicall History of D. Faustus, Christopher Marlowe
Computer viruses and their payloads have developed through an interesting historical chain. Morris Jr.'s early example of a self-replicating program caused significant damage to the early Internet through poor calibration of its propagation strategy [5]. In the age of floppy disks, memory persistence across warm reboots and multiple disk drives were the vectors of the highly destructive viruses that evolved as weapons in a war of kudos between hackers. In the age of Internet worms, exploits are a dime a dozen, and the most successful viruses use multiple infection vectors in parallel. Meanwhile, the payloads (temporarily impotent during the exploratory "development" phase) now perform targeted acts of malice against one site on the net or another, or are simply about money making. Modern worms are more concerned with creating and controlling botnets and sending spam than with deliberately upsetting users.
The countermeasures used against the above schemes have also followed a similar evolution. They range from simple 'computer hygiene' (cold boot your computer), through user education, to sophisticated anti-virus software, which today includes full virtual machines [6], thousands of virus detection templates, and constant patching of online systems. It is unlikely that in the near future any of these techniques will bring an end to malware propagation; it is far more likely that the arms race between propagation and defence will continue ad infinitum. A key point is that, through all this evolution, it has been assumed that users are the enemies of the malware, which (nearly by definition) acts against their interests.
But in the study of all the selfish schemes that use replicating code to achieve petty human ends, have we in fact underestimated the enemy? To better understand the spectrum of propagation techniques, we introduce the concept of the Satan Virus (named following the tradition of [7]). This virus propagates and survives not only using the conventional arsenal of exploits and deception, but also by interacting with the user. We present a concrete example which employs bribery and blackmail to acquire and retain hosts.
Our key contribution, following recent work on the economics of information security [1], is to demonstrate that malware can provide enough incentives for users to willingly maintain it on their systems, and, in the medium term, enough disincentives to stop them removing it. Users can therefore enter into "a pact with the devil" that confers on them some powers, which the virus shares with them, but, as they soon realise, some heavy responsibilities too. Not surprisingly, it is the darker human traits that such malware seeks to foster and exploit: greed, curiosity, need for power, fear, shame and lust, to name but a few.
In section 2 we present and analyse an example design, and in section 3 we explore the full range of threats and rewards that viruses can use to influence user behaviour. In section 5 we consider countermeasures, and explore a disturbing truth: otherwise useful software containing malware or ad-ware, the free clients for peer-to-peer networks, and the numerous mainstream applications that bundle together desirable and undesirable features all come together to form a continuum from 'legitimate software' to the 'Satan Virus'.
2 The Satan Virus
The term 'Satan Virus' is somewhat symbolic: it is a conceptual super-virus that carries the malice of the devil, and will employ the most ruthless techniques to achieve its ends. While we hope that this ideal is beyond conception, many cruder images of it could be just around the corner. We can thus speak of a family of Satan Viruses, each credible and implementable. In this section, we explore one instantiation: a simple but concrete design for a virus that propagates and entrenches using bribery and blackmail; we then analyse its properties.
2.1 Design Principles
Satan Viruses are based on two fundamental design principles:
1. The carrot principle. First, the virus convinces the user to execute it by conferring on him or her a certain advantage. This advantage is true and tangible; it is backed up by evidence that clearly demonstrates it can be provided, and should ultimately satisfy the user. There is no deception involved at this stage, and the user knowingly "sells his soul to the devil" to acquire this advantage. As long as he honours his side of the "pact", the advantage is provided. This first principle provides incentives for the user to execute the virus and keep it alive.
2. The stick principle. Second, the virus, in its co-existence with the user, gathers information about the user's activities, lifestyle and habits. It then tells the user that if an attempt is made to remove the virus, the gathered information will be used to hurt the user. This provides further disincentives for the user to remove the virus.
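The interplay of the two principles can be modelled as a small state machine. The sketch below is entirely our own illustration (no such code appears in any real virus; the class and method names are invented): the advantage flows while the pact holds, evidence accumulates in the background, and a removal attempt triggers the threatened release.

```python
from dataclasses import dataclass, field

@dataclass
class Pact:
    """Toy model of the carrot and stick principles."""
    advantage_granted: bool = False               # carrot: the benefit conferred
    evidence: list = field(default_factory=list)  # stick: material gathered
    evidence_released: bool = False

    def accept(self) -> None:
        # The user knowingly executes the virus to gain the advantage.
        self.advantage_granted = True

    def observe(self, activity: str) -> None:
        # During co-existence, the virus records the user's activities.
        if self.advantage_granted:
            self.evidence.append(activity)

    def attempt_removal(self) -> bool:
        # Removal revokes the carrot and, if evidence exists, applies the stick.
        self.advantage_granted = False
        self.evidence_released = bool(self.evidence)
        return self.evidence_released
```

A user who accepts, is observed, and later tries to remove the virus ends up with the evidence released; a user who removes it before anything has been observed suffers no penalty, which is why the virus delays its threat until material has accumulated.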
In its purest form this Satan Virus does not deceive: it provides the advantages it claims, and does not gratuitously hurt the user; it fulfils its side of the contract. The main catch lies in the contract terms, which can be ever expanding: to keep the virus alive on the "possessed" computer, to assist the virus in spreading, or whatever else lies within the imagination of the virus author.
Naturally there is no reason to expect such pure (and in some twisted way, honest) strategies to be employed in isolation, when mixing them with deception and other more automated ways of executing malware would provide the virus designer with better propagation characteristics. We again stress that the strategies presented here may appear a poor substitute for traditional mechanisms: they are not intended to be a total replacement.
2.2 The Instantiation
For the threat to be tangible, the virus must implement the carrot and stick principles in a robust way: the principles must respectively seduce the user compellingly, and resist trivial bypass. For this instantiation, we will use access to another user's files as the carrot, and revelation of this access to the party spied upon as the stick. Assume there are three parties: Alice, Bob and Charlie. Alice is already infected with the virus, and Bob and Charlie are related to her (employees, colleagues, friends or family). The virus propagates in the following manner:
1. Temptation. The virus sends an email from Alice to Bob, offering access to all of Alice's emails and documents. To make the offer more enticing, extracts from these documents containing Bob's name, or other interesting keywords, can be included. Bob can choose to accept this offer by downloading the virus (which can be hosted on Alice's computer or bundled in the email) and executing it. As a result he should have full access to Alice's documents, with a search interface to help locate files of interest.
2. Monitoring. As soon as the virus has installed itself, it starts recording everything that Bob does, and in particular his accesses to Alice's information. Crucially, this includes the search queries performed as well as logs of the documents retrieved. This information is sent back to Alice or another infected third party (which can be known through Alice) for safekeeping, but it is not revealed. The key intuition is that the virus avoids the hard problem of automatically detecting 'blackmail' material on Bob's computer by instead collecting evidence of the unsavoury act of spying that it has tempted Bob to commit. The unauthorised access to Alice's computer, both in the files Bob views and in the search terms he uses (revealing his suspicions of Alice), should in most cases be incriminating material.
3. Blackmail. When a critical mass of incriminating evidence of unauthorised accesses from Bob to Alice's machine has been gathered, the virus emails Bob with a warning. The warning specifies that if an attempt is made to remove the virus, the information gathered will be revealed. A snippet of the information can also be provided to substantiate the threat. To safeguard itself against retaliation, the virus sets up a life-line between Bob's machine and Alice's (or a compromised third party holding the incriminating evidence) to monitor Bob's computer and ensure that it remains infected. If Bob's computer does not respond appropriately, the evidence is released.
4. Voluntary Propagation. Bob is asked by the virus to provide a target to which it might spread. Bob selects Charlie. Bob is told that Charlie would have the ability to read Alice's documents (not Bob's), and that he himself would gain the ability to read Charlie's documents. The 'invitation' will appear to come from the virus residing on Alice's machine, in the form of an email tempting Charlie to read her documents. Thus the incentives are aligned for Bob to assist, and the virus propagates.
5. Involuntary Propagation. In case the virus has not propagated enough through the addresses provided by Bob, it considers that Bob has breached his side of the "pact", and sends itself to Bob's contacts, as harvested from emails, contact lists, documents, etc. The virus now encourages recipients to install it, using the incentive of access to Bob's files.
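The five steps above can be caricatured as a toy simulation over a contact graph. This is entirely our own sketch (the graph, acceptance probability and round limit are invented assumptions): each infected node first tempts one contact of its choosing (voluntary propagation), and falls back to mass-mailing its remaining contacts (involuntary propagation) if the volunteer declines.

```python
import random

def propagate(contacts, seed, p_accept, rng, rounds=5):
    """Toy model of interactive propagation over a contact graph.

    contacts: dict mapping each node to a list of its contacts.
    p_accept: probability that a tempted target accepts the offer.
    """
    infected = {seed}
    frontier = [seed]
    for _ in range(rounds):
        nxt = []
        for node in frontier:
            targets = [c for c in contacts.get(node, ()) if c not in infected]
            if not targets:
                continue
            chosen = rng.choice(targets)       # step 4: Bob names Charlie
            if rng.random() < p_accept:        # the volunteer is tempted
                infected.add(chosen)
                nxt.append(chosen)
            else:                              # step 5: mail everyone else
                for c in targets:
                    if c not in infected and rng.random() < p_accept:
                        infected.add(c)
                        nxt.append(c)
        frontier = nxt
    return infected
```

With acceptance probability 0 only the seed stays infected; with probability 1 each frontier node recruits exactly its one chosen volunteer, so the infection traces a path through the graph rather than flooding it.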
As a side effect of the propagation method described above, the virus nodes can construct a peer-to-peer network, which can be used to propagate payloads or commands from their "master". The lifelines between them can also be used to manage the network and make sure that it stays connected. While such peer-to-peer design problems must be tackled by the virus writer, we shall not discuss them further, as they apply equally well to traditional non-interactive malware, which has already demonstrated some capability in this area. Some further implementation issues and open research problems are presented in section 4.
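The life-line of step 3 is essentially a dead man's switch. A minimal sketch, from the evidence holder's point of view, is given below (our own model; the timeout and the abstract clock are invented assumptions):

```python
class Lifeline:
    """Dead man's switch: the evidence holder expects periodic heartbeats
    from the infected machine; a missed deadline triggers release."""

    def __init__(self, timeout: float):
        self.timeout = timeout
        self.last_beat = 0.0
        self.released = False

    def heartbeat(self, now: float) -> None:
        # Called by the (still infected) machine to prove it is alive.
        self.last_beat = now

    def check(self, now: float) -> bool:
        # Called by the evidence holder; release if the host went silent.
        if now - self.last_beat > self.timeout:
            self.released = True
        return self.released
```

Note that `released` is never reset: once the deadline is missed the threat is carried out, so resuming heartbeats afterwards cannot undo the release. This irreversibility is what makes the threat credible.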
2.3 Analysis
We will spend some time considering the incentives of Alice, Bob and Charlie in each case where action is solicited.
The temptation step of the propagation is by far the most uncertain and risky from the point of view of the virus. At this stage, a tempted Charlie can simply say 'No', without any repercussions. The challenge here is incentive design: the author of a virus exploiting interactive propagation must design lures which appeal to Charlie, and a framework for soliciting assistance from Bob. This might require Bob to help customise an email to tempt Charlie with Alice's files, or require him to surreptitiously visit Charlie's PC and click to open the attachment. Strategies commencing with a threat of harm to an identified 'significant other' of Charlie lie beyond the scope of this example.
Just as important as the design of the lures, the virus author must calibrate the strategy to spread with the correct amplifying effect. Studying the influence of simple local tactics for involuntary propagation in existing virus code already creates a headache for analysts; the same challenge will meet the designer of interactive propagation routines. A partial solution could be to design lures which are good for propagation in particular directions through the social network; the virus will need lures to spread both up and down hierarchies in corporations, and from friend to friend across 'gossip bridges' between corporate networks.
The generic lure of spying in this virus instantiation will likely have immediate appeal to Charlie if there is an asymmetric power relation between him and the target (whether known to Bob, or data-mined [8]), so it will certainly travel effectively down hierarchies, and plausibly up as well. With the minor addition of a hint of what salacious material might be found (substantiated or unsubstantiated), people in peer groups can also be encouraged to spy. In essence, human curiosity is at the heart of this lure. We believe the algorithm proposed above is sufficiently simple to calibrate correctly for an amplifying effect.
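The calibration question can be made concrete with a simple branching-process estimate (our own back-of-the-envelope model; all counts and probabilities below are invented assumptions): the lure amplifies only if the expected number of new hosts each infected node recruits, summed over directions through the social network, exceeds one.

```python
def reproduction_number(n_down: int, p_down: float,
                        n_up: int, p_up: float,
                        n_peer: int, p_peer: float) -> float:
    """Expected new infections per host, split by direction through
    the social hierarchy; spread amplifies only if this exceeds 1."""
    return n_down * p_down + n_up * p_up + n_peer * p_peer

# E.g. a lure that travels well down a hierarchy but poorly up or sideways:
r = reproduction_number(n_down=3, p_down=0.4,   # 3 subordinates, 40% accept
                        n_up=1, p_up=0.1,       # 1 boss, 10% accept
                        n_peer=4, p_peer=0.05)  # 4 peers, 5% accept
# r = 1.2 + 0.1 + 0.2 = 1.5 > 1, so the infection amplifies
```

The designer's task is then to pick lures whose per-direction acceptance probabilities push this sum above one without attracting attention, which is exactly the trade-off the paragraph above describes.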
Fig. 1. Bob is tempted by Alice's files (1) and then uses Alice's files to tempt Charlie (2). As a result Bob can access Alice's and Charlie's files.
During the monitoring phase Bob is probably aware that the virus might be giving other people access to his files (it is not yet apparent that the incriminating information actually to be used will be his browsing patterns on Alice's files). This might lead him to moderate his activities, and to delete incriminating information from his computer. If this is done after the virus has been installed, the deletion itself might be used to discern what material is interesting for blackmail purposes (deleted material is bound to be more interesting than random files). The material could also have been deleted before the virus was installed, as a precaution. This illustrates quite vividly the "virtuous sinner" paradox: if Bob does not perceive that he has something to hide, genuinely or because he has deleted the information, he will be more tempted to spy on Alice, since he is less worried about it happening to him.