Technical Baseline Report
A Framework for Deception
by Fred Cohen, Dave Lambert, Charles Preston, Nina Berry, Corbin Stewart, and Eric Thomas *
July 13, 2001
Fred Cohen: Fred Cohen & Associates, University of New Haven, Sandia National Laboratories
Dave Lambert: SPAWAR Systems Center
Charles Preston: Information Integrity, University of New Haven
Nina Berry: Sandia National Laboratories
Corbin Stewart: Sandia National Laboratories
Eric Thomas: Sandia National Laboratories
This research was sponsored by the United States Department of Defense under MIPR1CDOEJG102
2112040 162-3825 P633D06 255X 633006.247.01.DD.00 JGBZZ.1 JOAN 1JG8CA
Executive Summary
This paper provides an overview of issues in the use of deception for information protection. Its objective is to create a
framework for deception and an understanding of what is necessary for turning that framework into a practical
capability for carrying out defensive deceptions for information protection.
Overview of results:
We have undertaken an extensive review of literature to understand previous efforts in this area and to compile
a collection of information in areas that appear to be relevant to the subject at hand. It has become clear
through this investigation that there is a great deal of additional detailed literature that should be reviewed in
order to create a comprehensive collection. However, it appears that the necessary aspects of the subject have been covered and that additional collection will likely consist primarily of further detail in areas that are now known to be relevant.
We have developed a framework for creating and analyzing deceptions involving individual people, individual
computers, one person acting with one computer, networks of people, networks of computers, and
organizations consisting of people and their associated computers. This framework has been used to model
select deceptions and, to a limited extent, to assist in the development of new deceptions. This framework is
described in the body of this report with additional details provided in the appendixes.
Based on these results: (1) we are now able to understand and analyze deceptions with considerably more clarity than we could previously, (2) we have command of a far greater collection of techniques for use in defensive deception than was previously available and than others have published in the field, and (3) we now
have a far clearer understanding of how and when to apply which sorts of techniques than was previously
available. It appears that with additional effort over time we will be able to continue to develop greater and more
comprehensive understanding of the subject and extend our understanding, capabilities, and techniques.
Further Work:
It appears that a substantial follow-on effort is required in order to systematize the creation of defensive
information protection deceptions. Such an effort would most likely require:
The creation of a comprehensive collection of material on key subject areas related to deception. This has
been started in this paper but there is clearly a great deal of effort left to be done.
The creation of a database supporting the creation and analysis of defensive deceptions, along with a software capability that allows the database to be used by experts in the creation and operation of deceptions.
A team of experts working to create and maintain a capability for supporting deceptions and sets of
supporting personnel used as required for the implementation of specific deceptions.
We strongly believe that this effort should continue over an extended period of time and with adequate funding,
and that such effort will allow us to create and maintain a substantial lead over the threat types currently under
investigation. The net effect will be an ongoing and increasing capability for the successful deception of
increasingly skilled and hostile threats.
Introduction and Overview
According to the American Heritage Dictionary of the English Language (1981):
"deception" is defined as "the act of deceit"
"deceit" is defined as "deception".
Since long before 800 B.C., when Sun Tzu wrote "The Art of War" [28], deception has been key to success in
warfare. Similarly, information protection as a field of study has been around for at least 4,000 years [41] and
has been used as a vital element in warfare. But despite the criticality of deception and information protection in
warfare and the historical use of these techniques, in the transition toward an integrated digitized battlefield
and the transition toward digitally controlled critical infrastructures, the use of deception in information
protection has not been widely undertaken. Little study has apparently been devoted to systematically exploring the use of deception for the protection of systems dependent on digital information. This paper, and the
effort of which it is a part, seeks to change that situation.
In October of 1983, [25] in explaining INFOWAR, Robert E. Huber begins by quoting Sun Tzu:
"Deception: The Key
The act of deception is an art supported by technology. When successful, it can have devastating impact on its intended victim. In fact:
"All warfare is based on deception. Hence, when able to attack, we must seem unable; when using our
forces, we must seem inactive; when we are near, we must make the enemy believe we are far away; when
far away, we must make him believe we are near. Hold out baits to entice the enemy. Feign disorder, and
crush him. If he is secure at all points, be prepared for him. If he is in superior strength, evade him. If your
opponent is of choleric temper, seek to irritate him. Pretend to be weak, that he may grow arrogant. If he is
taking his ease, give him no rest. If his forces are united, separate them. Attack him where he is unprepared,
appear where you are not expected." [28]
The ability to sense, monitor, and control own-force signatures is at the heart of planning and executing
operational deception...
The practitioner of deception utilizes the victim's intelligence sources, surveillance sensors and targeting
assets as a principal means for conveying or transmitting a deceptive signature of desired impression. It is
widely accepted that all deception takes place in the mind of the perceiver. Therefore it is not the act itself but the acceptance that counts!"
It seems to us at this time that there are only two ways of defeating an enemy:
(1) One way is to have overwhelming force of some sort (i.e., an actual asymmetry that is, in time, fatal to
the enemy). For example, you might be faster, smarter, better prepared, better supplied, better informed,
first to strike, better positioned, and so forth.
(2) The other way is to manipulate the enemy into reduced effectiveness (i.e., induced mis-perceptions
that cause the enemy to misuse their capabilities). For example, the belief that you are stronger, closer,
slower, better armed, in a different location, and so forth.
Having both an actual asymmetric advantage and effective deception increases your advantage. Having neither
is usually fatal. Having more of one may help balance against having less of the other. Most military
organizations seek to gain both advantages, but this is rarely achieved for long, because of the competitive
nature of warfare.
Overview of This Paper
The purpose of this paper is to explore the nature of deception in the context of information technology
defenses. While it can be reasonably asserted that all information systems are in many ways quite similar, there
are differences between systems used in warfare and systems used in other applications, if only because the
consequences of failure are extreme and the resources available to attackers are so high. For this reason,
military situations tend to be the most complex and risky for information protection and thus lead to a context
requiring extremes in protective measures. When combined with the rich history of deception in warfare, this
context provides fertile ground for exploring the underlying issues.
We begin by exploring the history of deception and deception techniques. Next we explore the nature of
deception and provide a set of dimensions of the deception problem that are common to deceptions of the
targets of interest. We then explore a model for deception of humans, a model for deception of computers, and
a set of models of deceptions of systems of people and computers. Finally, we consider how we might design
and analyze deceptions, discuss the need for experiments in this arena, summarize, draw conclusions, and
describe further work.
A Short History of Deception
Deception in Nature
While Sun Tzu's "The Art of War" is the first known publication depicting deception in warfare as an art, long before Sun Tzu there
were tribal rituals of war that were intended in much the same way. The beating of chests [44] is a classic
example that we still see today, although in a slightly different form. Many animals display their apparent fitness
to others as part of the mating ritual or for territorial assertions. [35] Mitchell and Thompson [35] look at human
and nonhuman deception and provide interesting perspectives from many astute authors on many aspects of
this subject. We see much the same behavior in today's international politics. Who could forget Khrushchev banging his shoe on the table at the UN and declaring "We will bury you!"? Of course it is not only the losers that 'beat their chests', but the example is starker when presented that way. Every nation declares its greatness, both
to its own people and to the world at large. We may call it pride, but at some point it becomes bragging, and in
conflict situations, it becomes a display. As with the ancient tribesmen, the goal is, in some sense, to avoid a fight.
The hope is that, by making the competitor think that it is not worth taking us on, we will not have to waste our
energy or our blood in fighting when we could be spending it in other ways. Similar noise-making tactics also
work to keep animals from approaching an encampment. The ultimate expression of this is in the area of
nuclear deterrence. [45]
Animals also have genetic characteristics that have been categorized as deceptions. For example, certain
animals are able to change colors to match the background or, as in the case of certain types of octopi, the
ability to mimic other creatures. These are commonly lumped together, but in fact they are very different. The
moth that looks like a flower may be able to 'hide' from birds but this is not an intentional act of deception.
Survival of the fittest simply resulted in the death of most of the moths that could be detected by birds. Those that happened to carry a genetic trait making them look like a particular flower got eaten less frequently. This is not a deception; it is a trait that survives. The same is true of the orca, whose coloring acts as a dazzlement to break up its shape.
On the other hand, anyone who has seen an octopus change coloring and shape to appear as if it were a rock
when a natural enemy comes by, and then change again to mimic a food source while lying in wait for prey, could not honestly claim that this was an unconscious effort. This form of concealment (in the case of
looking like a rock or foodstuff) or simulation (in the case of looking like an inedible or hostile creature) is
highly selective, driven by circumstance, and most certainly driven by a thinking mind of some sort. It is a
deception that uses a genetically endowed physical capability in an intentional and creative manner. It is more
similar to a person putting on a disguise than it is to a moth's appearance.
Historical Military Deception
The history of deception is a rich one. In addition to the many books on military history that speak to it, it is a
basic element of strategy and tactics that has been taught since the time of Sun Tzu. But in many ways, it is like
the history of biology before genetics. It consists mainly of a collection of examples loosely categorized into
things that appear similar at the surface. Hiding behind a tree is thought to be similar to hiding in a crowd of
people, so both are called concealment. On the surface they appear to be the same, but if we look at the
mechanisms underlying them, they are quite different.
"Historically, military deception has proven to be of considerable value in the attainment of national security
objectives, and a fundamental consideration in the development and implementation of military strategy and tactics.
Deception has been used to enhance, exaggerate, minimize, or distort capabilities and intentions; to mask
deficiencies; and to otherwise cause desired appreciations where conventional military activities and security
measures were unable to achieve the desired result. The development of a deception organization and the
exploitation of deception opportunities are considered to be vital to national security. To develop deception
capabilities, including procedures and techniques for deception staff components, it is essential that deception
receive continuous command emphasis in military exercises, command post exercises, and in training operations."
--JCS Memorandum of Policy (MOP) 116 [10]
MOP 116 also points out that the most effective deceptions exploit beliefs of the target of the deception and, in
particular, decision points in the enemy commander's operations plan. By altering the enemy commander's
perception of the situation at key decision points, deception may turn entire campaigns.
There are many excellent collections of information on deceptions in war. One of the most comprehensive
overviews comes from Whaley [11], which includes details of 67 military deception operations between 1914 and
1968. The appendix to Whaley is 628 pages long and the summary charts (in appendix B) are another 50 pages.
Another 30 years have passed since then, which means that it is likely that another 200 pages covering 20
or so deceptions should be added to update this study. Dunnigan and Nofi [8] review the history of deception in
warfare with an eye toward categorizing its use. They identify the different modes of deception as concealment,
camouflage, false and planted information, ruses, displays, demonstrations, feints, lies, and insight.
Dewar [16] reviews the history of deception in warfare and, in only 12 pages, gives one of the most cogent
high-level descriptions of the basis, means, and methods of deception. In these 12 pages, he outlines (1) the
weaknesses of the human mind (preconceptions, tendency to think we are right, coping with confusion by
leaping to conclusions, information overload and resulting filtering, the tendency to notice exceptions and
ignore commonplace things, and the tendency to be lulled by regularity), (2) the object of deception (getting the
enemy to do or not do what you wish), (3) means of deception (affecting observables to a level of fidelity
appropriate to the need, providing consistency, meeting enemy expectations, and not making it too easy), (4)
principles of deception (careful centralized control and coordination, proper preparation and planning,
plausibility, the use of multiple sources and modes, timing, and operations security), and (5) techniques of
deception (encouraging belief in the most likely when a less likely is to be used, luring the enemy with an ideal
opportunity, the repetitive process and its lulling effect, the double bluff which involves revealing the truth
when it is expected to be a deception, the piece of bad luck which the enemy believes they are taking advantage
of, the substitution of a real item for a detected deception item, and disguising as the enemy). He also (6)
categorizes deceptions in terms of senses and (7) relates 'security' (in which you try to keep the enemy from
finding anything out) to deception (in which you try to get the enemy to find out the thing you want them to
find). Dewar includes pictures and examples in these 12 pages to boot.
In 1987, Knowledge Systems Corporation [26] created a useful set of diagrams for planning tactical deceptions.
Among their results, they indicate that the assessment and planning process is manual, lacks automated
applications programs, and lacks timely data required for combat support. This situation does not appear to
have changed. They propose a planning process consisting of (1) reviewing force objectives, (2) evaluating your
own and enemy capabilities and other situational factors, (3) developing a concept of operations and set of
actions, (4) allocating resources, (5) coordinating and deconflicting the plan relative to other plans, (6) doing a
risk and feasibility assessment, (7) reviewing adherence to force objectives, and (8) finalizing the plan. They
detail steps to accomplish each of these tasks in useful process diagrams and provide forms for doing a more
systematic analysis of deceptions than was previously available. Such a planning mechanism does not appear to
exist today for deception in information operations.
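To illustrate how such a planning mechanism might be systematized in software, here is a minimal sketch (our construction, not part of the Knowledge Systems Corporation work; the class, step strings, and example plan name are assumed for illustration) that encodes their eight-step process as an ordered checklist a planning tool could track:

```python
# Hypothetical sketch: the eight-step deception planning process [26]
# as an ordered checklist. Step names paraphrase the source; the rest
# is an assumed illustration, not an existing tool.
from dataclasses import dataclass, field

PLANNING_STEPS = [
    "review force objectives",
    "evaluate own and enemy capabilities and situational factors",
    "develop concept of operations and set of actions",
    "allocate resources",
    "coordinate and deconflict against other plans",
    "assess risk and feasibility",
    "review adherence to force objectives",
    "finalize the plan",
]

@dataclass
class DeceptionPlan:
    name: str
    completed: set = field(default_factory=set)

    def complete(self, step: str) -> None:
        assert step in PLANNING_STEPS, f"unknown step: {step}"
        self.completed.add(step)

    def next_step(self) -> str | None:
        # Steps are sequential; return the first one not yet done.
        for step in PLANNING_STEPS:
            if step not in self.completed:
                return step
        return None

plan = DeceptionPlan("example feint")
plan.complete("review force objectives")
print(plan.next_step())  # -> "evaluate own and enemy capabilities ..."
```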
These authors share one thing in common. They all carry out an exercise in building categories. Just as the long-standing effort of biology to build up genus and species based on bodily traits (phenotypes) eventually fell to a
mechanistic understanding of genetics as the underlying cause, the scientific study of deception will eventually
yield a deeper understanding that will make the mechanisms clear and allow us to understand and create
deceptions as an engineering discipline. That is not to say that we will necessarily achieve that goal in this short
examination of the subject, but rather that in-depth study will ultimately yield such results.
There have been a few attempts in this direction. A RAND study included a 'straw man' graphic [17](H7076) that
showed deception as being broken down into "Simulation" and "Dissimulation Camouflage".
"Whaley first distinguishes two categories of deception (which he defines as one's intentional distortion of another's
perceived reality): 1) dissimulation (hiding the real) and 2) simulation (showing the false). Under dissimulation he
includes: a) masking (hiding the real by making it invisible), b) repackaging (hiding the real by disguising), and c)
dazzling (hiding the real by confusion). Under simulation he includes: a) mimicking (showing the false through
imitation), b) inventing (showing the false by displaying a different reality), and c) decoying (showing the false by
diverting attention). Since Whaley argues that "everything that exists can to some extent be both simulated and
dissimulated," whatever the actual empirical frequencies, at least in principle hoaxing should be possible for any
substantive area."
[29]
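Whaley's two-category, six-subcategory structure is straightforward to make mechanical. The following minimal sketch (our illustration, not from Whaley or RAND; the data structure and query are assumed) encodes the taxonomy so that candidate deception techniques can be tagged and queried:

```python
# Hypothetical sketch: Whaley's taxonomy of deception [29] as a
# queryable data structure. Category and subcategory names follow
# the quote above; everything else is an assumed illustration.
from enum import Enum

class Category(Enum):
    DISSIMULATION = "hiding the real"
    SIMULATION = "showing the false"

TECHNIQUES = {
    "masking":     (Category.DISSIMULATION, "making the real invisible"),
    "repackaging": (Category.DISSIMULATION, "disguising the real"),
    "dazzling":    (Category.DISSIMULATION, "hiding the real by confusion"),
    "mimicking":   (Category.SIMULATION,    "showing the false through imitation"),
    "inventing":   (Category.SIMULATION,    "displaying a different reality"),
    "decoying":    (Category.SIMULATION,    "diverting attention"),
}

# Example query: all of Whaley's ways of hiding the real.
hiding = [name for name, (cat, _) in TECHNIQUES.items()
          if cat is Category.DISSIMULATION]
print(hiding)  # ['masking', 'repackaging', 'dazzling']
```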
The same slide reflects Dewar's view [16] that security and counterintelligence attempt to deny the enemy access to information, while deception seeks to exploit intelligence. Unfortunately, the RAND depiction is not as cogent as
Dewar in breaking down the 'subcategories' of simulation. The RAND slides do cover the notions of observables
being "known and unknown", "controllable and uncontrollable", and "enemy observable and enemy
non-observable". This characterization of part of the space is useful from a mechanistic viewpoint and a
decision tree created from these parameters can be of some use. Interestingly, RAND also points out the
relationship of selling, acting, magic, psychology, game theory, military operations, probability and statistics,
logic, information and communications theories, and intelligence to deception. It indicates issues of
observables, cultural bias, knowledge of enemy capabilities, analytical methods, and thought processes. It uses
a reasonable model of human behavior, lists some well known deception techniques, and looks at some of the
mathematics of perception management and reflexive control.
Cognitive Deception Background
Many authors have examined facets of deception from both an experiential and a cognitive perspective.
Chuck Whitlock has built a large part of his career on identifying and demonstrating these sorts of deceptions.
[12] His book includes detailed descriptions and examples of scores of common street deceptions. Fay Faron
points out that most such confidence efforts are carried as as specific 'plays' and details the anatomy of a 'con'
[30]. She provides 7 ingredients for a con (too good to be true, nothing to lose, out of their element, limited
time offer, references, pack mentality, and no consequence to actions). The anatomy of the confidence game is
said to involve (1) a motivation (e.g., greed), (2) the come-on (e.g., opportunity to get rich), (3) the shill (e.g., a
supposedly independent third party), (4) the swap (e.g., take the victim's money while making them think they
have it), (5) the stress (e.g., time pressure), and (6) the block (e.g., a reason the victim will not report the crime).
She even includes a 10-step play that makes up the big con.
Bob Fellows [13] takes a detailed approach to how 'magic' and similar techniques exploit human fallibility and
cognitive limits to deceive people. According to Fellows (p. 14), the following characteristics improve the chances of being fooled: (1) under stress, (2) naivety, (3) in life transitions, (4) unfulfilled desire for spiritual
meaning, (5) tend toward dependency, (6) attracted to trance-like states of mind, (7) unassertive, (8) unaware of
how groups can manipulate people, (9) gullible, (10) have had a recent traumatic experience, (11) want simple
answers to complex questions, (12) unaware of how the mind and body affect each other, (13) idealistic, (14)
lack critical thinking skills, (15) disillusioned with the world or their culture, and (16) lack knowledge of
deception methods. Fellows also identifies a set of methods used to manipulate people.
Thomas Gilovich [14] provides in-depth analysis of human reasoning fallibility by presenting evidence from
psychological studies that demonstrate a number of human reasoning mechanisms resulting in erroneous
conclusions. This includes the general notions that people (erroneously) (1) believe that effects should resemble
their causes, (2) misperceive random events, (3) misinterpret incomplete or unrepresentative data, (4) form
biased evaluations of ambiguous and inconsistent data, (5) have motivational determinants of belief, (6) bias
second hand information, and (7) have exaggerated impressions of social support. Substantial further detailing
shows specific common syndromes and circumstances associated with them.
Charles K. West [32] describes the steps in psychological and social distortion of information and provides
detailed support for cognitive limits leading to deception. Distortion arises because reality contains an unlimited number of problems and events, while human sensation can only sense certain types of events in limited ways: (1) a person can only perceive a limited number of those events at any moment, (2) a person's knowledge and emotions partially determine which of the events are noted, and interpretations are made in terms of knowledge and emotion, (3) intentional bias occurs as a person consciously selects what will be communicated to others, and (4) the receiver of information provided by others is subject to the same set of interpretation and sensory limitations.
Al Seckel [15] provides about 100 excellent examples of various optical illusions, many of which work regardless
of the knowledge of the observer, and some of which are defeated after the observer sees them only once.
Donald D. Hoffman [36] expands this into a detailed examination of visual intelligence and how the brain
processes visual information. It is particularly noteworthy that the visual cortex consumes a great deal of the
total human brain space and that it has a great deal of effect on cognition. Some of the 'rules' that Hoffman
describes with regard to how the visual cortex interprets information include: (1) Always interpret a straight line
in an image as a straight line in 3D, (2) If the tips of two lines coincide in an image interpret them as coinciding
in 3D, (3) Always interpret co-linear lines in an image as co-linear in 3D, (4) Interpret elements near each other
in an image as near each other in 3D, (5) Always interpret a curve that is smooth in an image as smooth in 3D,
(6) Where possible, interpret a curve in an image as the rim of a surface in 3D, (7) Where possible, interpret a
T-junction in an image as a point where the full rim conceals itself; the cap conceals the stem, (8) Interpret each
convex point on a bound as a convex point on a rim, (9) Interpret each concave point on a bound as a concave
point on a saddle point, (10) Construct surfaces in 3D that are as smooth as possible, (11) Construct subjective
figures that occlude only if there are convex cusps, (12) If two visual structures have a non-accidental relation,
group them and assign them to a common origin, (13) If three or more curves intersect at a common point in an
image, interpret them as intersecting at a common point in space, (14) Divide shapes into parts along concave
creases, (15) Divide shapes into parts at negative minima, along lines of curvature, of the principal curvatures,
(16) Divide silhouettes into parts at concave cusps and negative minima of curvature, (17) The salience of a cusp
boundary increases with increasing sharpness of the angle at the cusp, (18) The salience of a smooth boundary
increases with the magnitude of (normalized) curvature at the boundary, (19) Choose figure and ground so that
figure has the more salient part boundaries, (20) Choose figure and ground so that figure has the more salient
parts, (21) Interpret gradual changes in hue, saturation, and brightness in an image as changes in illumination,
(22) Interpret abrupt changes in hue, saturation, and brightness in an image as changes in surfaces, (23)
Construct as few light sources as possible, (24) Put light sources overhead, (25) Filters don't invert lightness,
(26) Filters decrease lightness differences, (27) Choose the fair pick that's most stable, (28) Interpret the highest
luminance in the visual field as white, fluorescent, or self-luminous, (29) Create the simplest possible motions, (30)
When making motion, construct as few objects as possible, and conserve them as much as possible, (31)
Construct motion to be as uniform over space as possible, (32) Construct the smoothest velocity field, (33) If
possible, and if other rules permit, interpret image motions as projections of rigid motions in three dimensions,
(34) If possible, and if other rules permit, interpret image motions as projections of 3D motions that are rigid
and planar, (35) Light sources move slowly.
It appears that the rules of visual intelligence are closely related to the results of other cognitive studies. It may
not be a coincidence that the thought processes that occupy the same part of the brain as visual processing have
similar susceptibilities to errors and that these follow the pattern of the assumption that small changes in
observation point should not change the interpretation of the image. It is surprising when such a change reveals
a different interpretation, and the brain appears to be designed to minimize such surprises while acting at great
speed in its interpretation mechanisms. For example, rule 2 (If the tips of two lines coincide in an image
interpret them as coinciding in 3D) is very nearly always true in the physical world because coincidence of line
ends that are not in fact coincident in 3 dimensions requires that you be viewing the situation at precisely the
right angle with respect to the two lines. Another way of putting this is that there is a single line in space that
connects the two points so as to make them appear to be coincident when they are not in fact coincident. If the observer is not on that single line, the points will not appear coincident. Since people usually have two eyes, and the two eyes cannot both lie on the same line in space with respect to anything they can observe, there is no real three-dimensional situation in which this coincidence can actually occur. It can only be simulated by three-dimensional objects that are far enough away to appear to be on the same line with respect to both eyes, and there are no commonly occurring natural phenomena that pose anything of immediate visual import or consequence at that distance. Designing visual stimuli that violate these principles will confuse most human observers, and effective
visual simulations should take these rules into account.
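To make the geometry behind rule 2 concrete, the following minimal sketch (our illustration; the pinhole projection and the point coordinates are assumed for the example) projects two distinct 3D line tips from two eye positions. From an eye on the line through the points, they project to the same image location; from an eye displaced even slightly, as with a second eye, the coincidence disappears:

```python
# Hypothetical sketch: two distinct 3D points appear coincident in an
# image only when the eye lies on the line through them.
import numpy as np

def project(point, eye, focal=1.0):
    """Pinhole projection onto an image plane at distance 'focal'
    in front of an eye looking down the +z axis (axes assumed aligned)."""
    rel = point - eye
    return rel[:2] * (focal / rel[2])

a = np.array([0.0, 0.0, 5.0])  # tip of one line
b = np.array([0.0, 0.0, 9.0])  # tip of another line, farther away

on_line  = np.array([0.0, 0.0, 0.0])  # eye on the line through a and b
off_line = np.array([0.1, 0.0, 0.0])  # eye shifted slightly (a second eye)

print(project(a, on_line),  project(b, on_line))   # identical: tips coincide
print(project(a, off_line), project(b, off_line))  # different: coincidence breaks
```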
Deutsch [47] provides a series of demonstrations of interpretation and misinterpretation of audio information.
This includes: (1) the creation of words and phrases out of random sounds, (2) the susceptibility of
interpretation to predisposition, (3) misinterpretation of sound based on relative pitch of pairs of tones, (4)
misinterpretation of direction of sound source based on switching speakers, (5) creation of different words out
of random sounds based on rapid changes in source direction, and (6) the change of word creation over time
based on repeated identical audio stimulus.
First Karrass [33] then Cialdini [34] have provided excellent summaries of negotiation strategies and the use of
influence to gain advantage. Both also explain how to defend against influence tactics. Karrass was one of the
early experimenters in how people interact in negotiations and identified (1) credibility of the presenter, (2)
message content and appeal, (3) situation setting and rewards, and (4) media choice for messages as critical
components of persuasion. He also identifies goals, needs, and perceptions as three dimensions of persuasion
and lists scores of tactics categorized into types including (1) timing, (2) inspection, (3) authority, (4)
association, (5) amount, (6) brotherhood, and (7) detour. Karrass also provides a list of negotiating techniques
including: (1) agendas, (2) questions, (3) statements, (4) concessions, (5) commitments, (6) moves, (7) threats,
(8) promises, (9) recess, (10) delays, (11) deadlock, (12) focal points, (13) standards, (14) secrecy measures, (15)
nonverbal communications, (16) media choices, (17) listening, (18) caucus, (19) formal and informal
memorandum, (20) informal discussions, (21) trial balloons and leaks, (22) hostility relievers, (23) temporary
intermediaries, (24) location of negotiation, and (25) technique of time.
Cialdini [34] provides a simple structure for influence and asserts that much of the effect of influence
techniques is built-in and occurs below the conscious level for most people. His structure consists of
reciprocation, contrast, authority, commitment and consistency, automaticity, social proof, liking, and scarcity.
He cites a substantial series of psychological experiments that demonstrate quite clearly how people react to
situations without a high level of reasoning and explains how this is both critical to being effective decision
makers and results in exploitation through the use of compliance tactics. While Cialdini backs up this
information with numerous studies, his work is largely based on, and largely cites, Western culture. Some of
these elements are apparently culturally driven and care must be taken to assure that they are used in context.
Robertson and Powers [31] have worked out a more detailed low-level theoretical model of cognition based on
"Perceptual Control Theory" (PCT), but extensions to higher levels of cognition have been highly speculative to
date. They define a set of levels of cognition in terms of their order in the control system, but beyond the lowest
few levels they have inadequate basis for asserting that these are orders of complexity in the classic control
theoretical sense. The levels they include are intensity, sensation, configuration, transition / motion, events,
relationships, categories, sequences / routines, programs / branching pathways / logic, and system concept.
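The core PCT idea of stacked negative-feedback loops can be sketched compactly. Here is a minimal sketch (our construction, not code from Robertson and Powers; the two levels, gains, and simple physics are assumed for illustration) in which a higher-level unit controlling position sets the reference for a lower-level unit controlling velocity:

```python
# Hypothetical sketch of a two-level PCT hierarchy: each level is a
# negative-feedback loop, and the higher level's output becomes the
# lower level's reference signal.
def control_step(perception: float, reference: float, gain: float) -> float:
    """One PCT control unit: output proportional to error (reference - perception)."""
    return gain * (reference - perception)

position, velocity = 0.0, 0.0  # the controlled environment
position_goal = 10.0           # top-level reference signal
dt = 0.1

for _ in range(200):
    # Higher level perceives position; its output is the velocity reference.
    velocity_ref = control_step(position, position_goal, gain=0.5)
    # Lower level perceives velocity; its output acts on the environment.
    acceleration = control_step(velocity, velocity_ref, gain=2.0)
    velocity += acceleration * dt
    position += velocity * dt

print(round(position, 2))  # settles at the higher level's reference (10.0)
```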