Nice little book with practical techniques for protecting individual privacy; “getting your obfuscation work out into the world, where it can begin doing good by making noise.”
Quotes:
Obfuscation is the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection.
Obfuscation has a role to play, not as a replacement for governance, business conduct, or technological interventions, or as a one-size-fits-all solution (again, it's a deliberately small, distributed revolution), but as a tool that fits into the larger network of privacy practices. In particular, it's a tool particularly well suited to the category of people without access to other modes of recourse, whether at a particular moment or in general: people who, as it happens, may be unable to deploy optimally configured privacy-protection tools because they are on the weak side of a particular information-power relationship.
Obfuscation, at its most abstract, is the production of noise modeled on an existing signal in order to make a collection of data more ambiguous, confusing, harder to exploit, more difficult to act on, and therefore less valuable.
This is not an argument against other systems and practices; it is merely an acknowledgment that there are circumstances in which obfuscation may provide an appropriate alternative or could be added to an existing technology or approach.
obfuscation is, in part, a troublemaking strategy. Although privacy is served by the constraints of law and regulation, disclosure limits imposed by organizational best practices, protective technological affordances provided by conscientious developers, and the exercise of abstinence or opting out, the areas of vulnerability remain vast. Obfuscation promises an additional layer of cover for these. Obfuscation obscures by making noise and muddying the waters; it can be used for data disobedience under difficult circumstances and as a digital weapon for the informationally weak.
Individuals have good reason to question whether their privacy interests in appropriate gathering and use of information will be secured any time soon by conventional means.
When entering the realms of the political, obfuscation must be tested against the demands of justice. But if obfuscators are so tested, so must we test the data collectors, the information services, the trackers, and the profilers. We have found that breathless rhetoric surrounding the promise and practice of data does not say enough about justice and the problem of risk shifting. Incumbents have embedded few protections and mitigations into the edifices of data they are constructing. Against this backdrop, obfuscation offers a means of striving for balance, defensible when it functions to resist domination of the weaker by the stronger. A just society leaves this escape hatch open.
getting your obfuscation work out into the world, where it can begin doing good by making noise.
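The core definition quoted above, noise modeled on an existing signal, can be sketched in a few lines of Python, in the spirit of TrackMeNot's dummy search queries. Everything here is illustrative assumption: the seed terms, the noise_ratio parameter, and the function names are mine, not the tool's actual design (TrackMeNot draws its query pool from RSS feeds and popular-search lists, not a fixed word list):

```python
import random

# Assumed seed vocabulary for generating plausible decoys;
# a real system would refresh this pool from live sources.
SEED_TERMS = [
    "weather forecast", "chocolate cake recipe", "used cars",
    "jazz history", "hiking trails", "tax deadline",
]

def decoy_queries(n, rng=random):
    """Produce n plausible-looking dummy queries from the seed pool."""
    return [rng.choice(SEED_TERMS) for _ in range(n)]

def obfuscated_stream(real_queries, noise_ratio=3, rng=random):
    """Interleave real queries with decoys so an observer logging the
    stream cannot easily separate signal from noise.  noise_ratio
    (decoys emitted per real query) is an assumed knob, not a
    documented TrackMeNot setting."""
    stream = list(real_queries)
    stream += decoy_queries(noise_ratio * len(real_queries), rng)
    rng.shuffle(stream)  # destroy ordering cues as well as content cues
    return stream
```

The point of the sketch is the shape of the technique: the decoys are drawn to resemble the genuine signal and mixed into it, so the collected data becomes more ambiguous and less valuable, exactly as the quote describes.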
I. AN OBFUSCATION VOCABULARY
1 Core Cases
1.1 Chaff: defeating military radar
1.2 Twitter bots: filling a channel with noise
1.3 CacheCloak: location services without location tracking
1.4 TrackMeNot: blending genuine and artificial search queries
1.5 Uploads to leak sites: burying significant files
1.6 False tells: making patterns to trick a trained observer
1.7 Group identity: many people under one name
1.8 Identical confederates and objects: many people in one outfit
1.9 Excessive documentation: making analysis inefficient
1.10 Shuffling SIM cards: rendering mobile targeting uncertain
1.11 Tor relays: requests on behalf of others that conceal personal traffic
1.12 Babble tapes: hiding speech in speech
1.13 Operation Vula: obfuscation in the struggle against Apartheid
2 Other Examples
2.1 Orb-weaving spiders: obfuscating animals
2.2 False orders: using obfuscation to attack rival businesses
2.3 French decoy radar emplacements: defeating radar detectors
2.4 AdNauseam: clicking all the ads
2.5 Quote stuffing: confusing algorithmic trading strategies
2.6 Swapping loyalty cards to interfere with analysis of shopping patterns
2.7 BitTorrent Hydra: using fake requests to deter collection of addresses
2.8 Deliberately vague language: obfuscating agency
2.9 Obfuscation of anonymous text: stopping stylometric analysis
2.10 Code obfuscation: baffling humans but not machines
2.11 Personal disinformation: strategies for individual disappearance
2.12 Apple's "cloning service" patent: polluting electronic profiling
2.13 Vortex: cookie obfuscation as game and marketplace
2.14 "Bayesian flooding" and "unselling" the value of online identity
2.15 FaceCloak: concealing the work of concealment
2.16 Obfuscated like farming: concealing indications of manipulation
2.17 URME surveillance: "identity prosthetics" expressing protest
2.18 Manufacturing conflicting evidence: confounding investigation