
Trevor Paglen: Adversarially Evolved Hallucinations

How machine and computer vision produces contemporary images.

How do machines look at images? To ask such a question is to observe the degree to which images today are increasingly produced by machines for machines. Symbolically, these images are void of an aesthetic context. They are not propagandistic (they do not try to convince us), nor are they instructive (they are not interested in directing our attention). They exist as abstract binary code rather than pictograms and are not, crucially, content-based. Made by machines for machines, they make things happen in our world. However, void of anthropological or aesthetic intention, the functionality of such images anticipates the obsolescence of "perception" as a human-defined activity and, in turn, the ascendancy of "machine vision." In short, we cannot see how they make things happen in our world. All of which leaves us with a question: in the absence of being able to perceive such images, how do we think about their impact on societies more broadly? Taking Trevor Paglen’s series “Adversarially Evolved Hallucinations” as a starting point for an analysis of these and other questions, this volume explores the role of algorithms and Artificial Intelligence (AI) in these processes. More specifically, it examines Paglen’s research methods and the extent to which they encourage the viewer to think from within such apparatuses rather than merely reflect upon them. Can, we will ask, the black-box-like technologies that produce such images be negotiated with or, indeed, modulated by methods of envisioning/engaging with their operative logic? How can we, if at all, hold the post-digital, machine-produced image to account?

160 pages, Paperback

Published August 20, 2024

58 people want to read

About the author

Trevor Paglen

28 books · 46 followers
Trevor Paglen is an artist, writer, and experimental geographer whose work deliberately blurs lines between social science, contemporary art, journalism, and other disciplines to construct unfamiliar, yet meticulously researched ways to see and interpret the world around us.

Paglen's visual work has been exhibited at Transmediale Festival, Berlin; The Andy Warhol Museum, Pittsburgh; Institute of Contemporary Art, Philadelphia; San Francisco Museum of Modern Art (SFMOMA); Massachusetts Museum of Contemporary Art, North Adams; the 2008 Taipei Biennial; and the 2009 Istanbul Biennial. It has been featured in numerous publications including The New York Times, Wired, Newsweek, Modern Painters, Aperture, and Artforum.

Paglen has received grants and commissions from Rhizome.org, Art Matters, Artadia, and the Eyebeam Center for Art and Technology.

Paglen is the author of three books. His first book, Torture Taxi: On the Trail of the CIA’s Rendition Flights (co-authored with AC Thompson; Melville House, 2006), was the first book to systematically describe the CIA’s “extraordinary rendition” program. His second book, I Could Tell You But Then You Would Have to be Destroyed by Me (Melville House, 2007), an examination of the visual culture of “black” military programs, was published in Spring 2008. His third book, Blank Spots on a Map, was published by Dutton/Penguin in early 2009. In spring 2010, Aperture will publish a book of his visual work.

Paglen holds a B.A. from UC Berkeley, an M.F.A. from the School of the Art Institute of Chicago, and a Ph.D. in Geography from UC Berkeley.

Paglen lives and works in Oakland, CA and New York City.

Ratings & Reviews

Community Reviews

5 stars: 2 (22%)
4 stars: 3 (33%)
3 stars: 2 (22%)
2 stars: 2 (22%)
1 star: 0 (0%)
Will
83 reviews · 37 followers
June 27, 2024
Adversarially Evolved Hallucinations continues Trevor Paglen’s work exploring human-made systems that don’t necessarily want to be seen or understood. In this case, he creates AI-generated images as a vehicle for entering the dark heart of AI (see also: machine learning): what it is and how it works.

The book opens with a set of what look like low-resolution images that you can almost make out: the terrible arc of a rocket preceding its destruction, a sky dyed toxic, a mess of red and white like flesh and bone. The images are chunky and pixelated, as if distortions of something violent or pornographic, censored for our unsuspecting eyes. They’re disconcerting and apocalyptic but are robbed of significance through lack of context. Are the images related? How were they made? Are they actually AI-generated, or images made to replicate the strangeness of early AI-generated images?

The images are followed by an essay that briefly touches on their significance as standalone pieces and then takes its sweet time trusting the reader with the concepts needed to meaningfully understand and engage with the topic. One such sentence told me this book probably wasn’t meant for me: “In focusing on the latent, hallucinatory, and systemic domain of algorithmic computation, we can see how conventional, and increasingly instrumentalized, applications of machine learning systems can be provisionally uncoupled from their utilitarian applications” (56). Sure, dude.

The essay is followed by the image datasets the AI models were trained on and an interview between the essayist and Paglen that does a lot more to communicate the significance of this project. As I understand it, the AI-generated images are a means of communicating how AI works, both broadly and in terms of the models that Paglen used. To me, this feels like the heart of the book and should be read first for the rest to make sense. It’s a decent discussion of AI and how it has the potential to inform our perceptions of reality, but I’d guess that it’s introductory-level material for people interested in the ethical use of AI. And while the interview is insightful, I find it mildly frustrating that neither participant fully owns that AI is, at the end of the day, a tool whose use or misuse reflects the intent and beliefs of the user.

If I’m honest with myself, I’d give this book three stars, but since it’s such a new publication, I don’t want that to deter curious readers who may find value in it. Have a read if you’re curious about AI and its politics.
Nat
734 reviews · 91 followers
Read · January 1, 2025
Don't open this up unless you want to see some uncanny, nightmarish AI-generated images! Paglen discusses wanting to disrupt a naïve way of thinking about how the categories used to gather training images for AI image recognition/generation systems relate to the world, a way of thinking he calls "machine realism":

Creating a training set involves the categorization and classification, by human operators, of thousands of images. There is an assumption that those categories, alongside the images contained in them, correspond to things out there in the world. There are a few metaphysical assumptions here: First, that there is an uncomplicated correspondence between images and the things out there in the world... The second assumption is that the 'world out there' can be neatly organized into a bunch of self-similar categories (turtles, bananas, clouds, etc.). What follows from the first two assumptions is that you can use quantitative approaches to interpret images (i.e., you can build an algorithm to 'recognize' turtles). I refer to these assumptions as 'machine realism.' (p. 118)

The disruption is brought about by constructing corpora of images according to categories that aren't "middle-sized dry goods" (to use J.L. Austin's term), like OMENS AND PORTENTS, a category that includes black cats, rainbows, comets, etc., or THE INTERPRETATION OF DREAMS. It turns out the AI produces some super weird images in response to these categories; the resulting weirdness poses a challenge to the idea that the computer is simply representing things that are "out there". A rough sketch of the assumptions at stake follows below.
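
To make the "machine realism" assumptions concrete, here is a minimal, hypothetical Python sketch of the kind of pipeline the quoted passage describes: human-assigned category labels, a stand-in feature extractor, and a classifier fit on the assumption that the labels pick out self-similar things in the world. The category names, the fake_features helper, and the use of scikit-learn are illustrative assumptions on my part, not Paglen's actual tooling.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Conventional categories: the world is assumed to split neatly into
    # self-similar classes (the second assumption in the quote).
    CONVENTIONAL = ["turtle", "banana", "cloud"]

    # Paglen-style categories with no stable visual essence; swapping these in
    # changes nothing in the code path below.
    UNCONVENTIONAL = ["omens and portents", "the interpretation of dreams"]

    def fake_features(n_images, dim=64):
        """Stand-in for a feature extractor (e.g. embeddings of real photos)."""
        return np.random.default_rng(0).normal(size=(n_images, dim))

    # A labelled training set: each image is assigned exactly one category
    # (the first assumption: an uncomplicated image-to-label correspondence).
    X = fake_features(n_images=300)
    y = np.repeat(np.arange(len(CONVENTIONAL)), 100)  # 100 images per class

    # The third assumption: given the first two, a quantitative model can
    # "recognize" the category of a new image.
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print("predicted category:", CONVENTIONAL[clf.predict(fake_features(1))[0]])

Whether the labels are turtles and bananas or OMENS AND PORTENTS, the classifier will still partition feature space and emit confident predictions; that indifference is the gap between classification and the world that the Hallucinations images are meant to expose.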
