Guy Tal's Blog

October 9, 2025

Aboutness—Further Thoughts

Today, I share with you more thoughts on the topic of “aboutness,” which I discuss in a couple of my books. As always, I offer these essays free of the annoyances of advertising and paywalls. Still, if it is within your means to contribute as little as a couple of dollars to support this blog—the work and costs that go into publishing and maintaining it—please consider making a small financial contribution. It makes a difference for me.


The artist … knows that none of these other men will ever love and understand his work as he loves and understands it.


— Friedrich Nietzsche


In my book, More Than a Rock, I mentioned aboutness as the quality I strive for in my photographic work—making my images not just pictures of things but pictures about things. In my latest book, Be Extraordinary, I mentioned using the term concept in my teachings to refer to the thing that an image is about (distinct from content, referring to the things an image is of).

[Should you be interested, my books are available for purchase on my website, as well as from Amazon and other booksellers.]

I always make sure to qualify that a concept is not necessarily something that can be expressed in words. The important thing about a concept is that you—the artist—know what it is, even if you can’t communicate or explain it clearly and unambiguously to others. A concept may be a named emotion or a relatable context, but it may just as well be a vague and/or complex feeling that informs your creative choices as you work and that may be impossible to express—fully or even at all—to others.

It’s not uncommon, when I mention the distinction between images of things and images about things, for someone to point to one of my images and ask what it is about. It’s rare that I can offer an answer other than saying the image is about a meaningful feeling I’ve had, and to encourage the person asking to embrace whatever meaning, or aboutness, they find in the image rather than attempt to understand mine. I’d like to believe that a beholder’s impressions of my work overlap with mine to some significant degree (it is, after all, something I attempted to express), but I never concern myself with the extent of the overlap. I accept that it may be greater for some, with whom I share some similarities, than for others, who, as far as I am concerned, are entirely welcome to assert or infer their own meanings and, I hope, to find some value in seeing my creations.

Aboutness is only relevant to me in the sense that it guides my own choices and helps me distill in my own mind what I wish to express, which I often can’t state clearly in words: I know it in my mind, in my heart, in my gut, in the feelings and memories I wish to commemorate and re-experience when seeing my work at some future time.

Conversely, my own aboutness is never something I wish to impose on others, nor something I expect anyone to go to any effort to decipher. If anything, aboutness—regardless of whether it overlaps with my own—is a sense I hope viewers will experience intuitively when seeing my work, not attempt to arrive at by conscious analysis or guesswork. This is part of the reason I also admonish those who wish to gain the most benefit from beholding an artwork to avoid the “critic’s mindset,” or at least to defer it to a time beyond their first encounter with an artwork. Critical analysis—whether of content or form, technique or classification, correctness or faults—may be helpful to artists in an educational sense, and to viewers in the sense of improving and broadening their understanding of art or of specific artists, but it may defeat the purpose of experiencing art if, by defaulting to critical analysis, one unwittingly robs oneself of the power of first impressions.

~~~

When someone asks me what one of my images is about, I’m sometimes reminded of a story about the poet Robert Frost, who was known for his sharp humor. When asked about the meaning of one of his poems, Frost responded: “You want me to say it worse?” Wittiness aside, both the question and the answer betray misunderstandings about meaning in art.

The person asking for an explanation of meaning errs in believing that meaning—aboutness—in one medium can always be expressed or explained in another. This is sometimes the case, but often—especially when it comes to great art—it is not. Sometimes there may be words to describe some of the effects of music, visual art, poetry, dance, or other forms of artistic expression, but sometimes there are not. Sometimes words for certain effects may exist in one language but not in another. But regardless of whether the meaning of an artwork has a verbal correlate, it is almost always the case that the experience of encountering meaningful art in the raw, without explanation, is more powerful and nuanced when it arises intuitively in response to sensory stimulation, and is greatly diminished or oversimplified by any attempt to encapsulate it in words. This is what Frost meant by “saying it worse.”

On the other hand, Frost’s answer, for all its humorous cleverness, also rests on a common error: the false belief that a sufficiently skilled beholder must understand and experience a work of art as intended by the artist, and that failing to do so suggests some deficit or fault in the beholder. This is utter nonsense. Art is by nature ambiguous, and people coming from different backgrounds, possessing different personalities and sensibilities, different levels of experience, knowledge, maturity, etc., are all too likely to bring their own “beholder’s share” to any encounter with art, which may differ from the artist’s. Ironically, one of the best-known examples of this effect is Frost’s own poem, “The Road Not Taken,” containing the famous lines:

Two roads diverged in a wood, and I—

I took the one less traveled by,

And that has made all the difference.

Many readers consider the poem as an admonition or permission to follow risky, less-traveled paths. In fact, Frost’s poem was intended to poke fun at his friend, Edward Thomas, whom Frost believed was wasting too much time and energy lamenting and questioning his past choices. Frost’s own intended meaning—that taking a less-traveled path makes no difference at all and is not worth dwelling upon—is in this case the opposite of the common interpretation.

~~~

As artists, we may take one of two approaches to the ambiguity of aboutness: we may choose to limit ourselves to the narrow range of artistic expressions whose meanings are easily and readily shared and understood by most, or we may consider aboutness as our own inner guide and accept that our work may elicit different “aboutnesses” in different people. I am decidedly in the latter camp. One reason is that my primary motivation in making art is to elevate my own living experiences. Another is that I see no value in imposing my own aboutness on others, especially if they may find greater value and meaning in my work by aligning it with their own values and beliefs.

Making my work’s aboutness a personal matter rather than something I wish to impose on my viewers allows me the freedom to delve deeper into my own psyche when creating, without concern for who may understand it or to what degree. At times, I even know from the outset that what I wish to express is not something any other person is ever likely to understand—certainly not without a much deeper insight into my personality, thoughts, and emotions than I would be comfortable sharing.

These thoughts came to me after a recent re-reading of Fyodor Dostoevsky’s novel, The Idiot, in which he wrote, “There is something at the bottom of every new human thought, every thought of genius, or even every earnest thought that springs up in any brain, which can never be communicated to others, even if one were to write volumes about it and were explaining one’s idea for thirty-five years; there’s something left which cannot be induced to emerge from your brain, and remains with you forever.”

Besides the obvious relevance of Dostoevsky’s observation about each of us possessing memories, ideas, and feelings we can never fully convey to another no matter how much we may try to, there is another—much less obvious—reason his words inspired me to write this essay. I am writing these words on what happens to be the thirty-fifth anniversary of a meaningful day in my life: the day I was discharged from military conscription—an experience that has shaped the course of my life, the person I am, and consequently what my work is about in ways more complex and pervasive than I can begin to explain or would even know how to.

It occurred to me that no other person is likely to know this from seeing my work. It also occurred to me that even I am not always conscious of the effects of this history when I work or when I look back at my own images. And yet, as soon as I acknowledge it I can trace exactly how my mode of work, my choices, my subjects, my personality all tie into this history. Aboutness, it seems, is not only inexpressible to others, but at least in part may not even be entirely obvious to the artist.

Published on October 09, 2025 05:00

September 4, 2025

Those Great and Simple Images


Camera, camera, what do you do—and I damn your eye, damn your wink, damn your memory—for with all of that you still can’t think.


—W. Eugene Smith


In 1937, Albert Camus, then just 24 years old, published his first book under his own name—L’Envers et l’Endroit (variously translated into English as The Wrong Side and the Right Side, or as Betwixt and Between). The book consisted of several essays, deeply raw, introspective, and personal, recounting his early experiences growing up in French-occupied Algeria, and some of his early travels. The book received relatively little attention. Twenty years later, in 1957, at age 44, following the publication of his more famous titles and more mature philosophy, he became the second-youngest recipient of the Nobel Prize in Literature.

The following year, owing to Camus’ newfound fame, L’Envers et l’Endroit was republished with a new preface in which he reflected on his life and work in the two decades that had passed. In the new preface, he wrote:

Every artist keeps within himself a single source which nourishes during his lifetime what he is and what he says. When that spring runs dry, little by little one sees his work shrivel and crack. These are art’s wastelands, no longer watered by the invisible current. His hair grown thin and dry, covered with thatch, the artist is ripe for silence or the salons, which comes to the same thing. As for myself, I know that my source is in The Wrong Side and the Right Side, in the world of poverty and sunlight I lived in for so long, whose memory still saves me from two opposing dangers that threaten every artist, resentment and self-satisfaction.

He concluded the preface, writing, “At least I know this, with sure and certain knowledge: a man’s work is nothing but this slow trek to rediscover, through the detours of art, those two or three great and simple images in whose presence his heart first opened. This is why, perhaps, after working and producing for twenty years, I still live with the idea that my work has not even begun.” Less than two years later, he died in a car crash at age 46.

When I think about my own “great and simple images,” those in whose presence my heart first opened—my most meaningful and lasting memories—my first realization is that, despite spanning many different places, people, and experiences, and despite the broad range of emotions they inspire, from euphoric joy to deep despair, all have one thing in common: they stand out within the strange and intricate tapestry that is my life story as the most intensely beautiful.

Upon further reflection, another—less obvious—commonality emerged: none of these images is today manifested in any tangible form: in an actual, visible picture. This is a rather jarring conclusion for one who has dedicated much of his life to the study and creation of beautiful images.

I have long ago lost interest in images that merely, as Hermann Hesse put it, “attempt nothing except to be beautiful.” In photography, I have come to refer to images aiming no higher than to impart aesthetic pleasure as “photographic tautologies”: images that are beautiful for no other reason than that they depict beautiful objects. In previous writings I have referred to such images as representational—literally re-presenting aesthetics that would have been obvious to anyone who might behold the same subject or scene, the artist’s role limited merely to the skill of rendering them faithfully and/or at some fortuitous time, but adding no further personal, deliberate meaning to them beyond their obvious appeal to the senses.

This may seem at odds with my belief—affirmed often and without fail in nearly six decades of life—that the pursuit of beauty in any form may be the closest thing I can cite as my reason for living. But in fact it is not at odds. The beauty that Hesse referred to—aesthetic beauty, the simplest, most obvious, surface-level sort of beauty, one that may be a pleasant veneer on some images, experiences, and thoughts but does not penetrate deeper than that—is not the sort of beauty that Camus referred to as “great and simple.”

The greatness Camus referred to goes beyond surface-level aesthetics. Being a perception corresponding to a person’s most cherished memories, it must stand out from among all other memories in some significant way, whether manifested in depth, complexity, intensity, mystery, elation, perhaps even pain.

Simplicity as Camus intended it is likewise not meant to denote ease, superficiality, or lack of complexity. The simplicity he wrote of—the simplicity of a person’s most cherished memories—is of a different nature. It is the kind of simplicity contained in expressions such as “be your true self” or “do what you love”—a simplicity that, for each person, refers to things of inexpressible depth and complexity that, no matter how skilled a person may be in any expressive medium, can never be wholly conveyed to another in words or in artistic creations.

We may share with another person some aspects of deceptively simple words representing thoughts, feelings, experiences, beliefs. We may indicate to another person that they are important and meaningful to us. But in the end, the other person’s empathy, sensibilities, and own values must fill the gaps to form the perception of understanding: theirs, not ours. And so, for each of us our great and simple images are by necessity ones we can never render in all their depth and complexity in any tangible form: an object, words, or sounds. But if we are skilled in some expressive medium, we may still impart to another person the experience of meaningfulness—if not the same greatness and simplicity as we feel it, at least their vividness and beauty, and the intensity of our feelings toward them.

There are many examples I know of in art and literature (also in mathematics and science, for those trained in these disciplines). One that comes to mind is Norman Maclean’s famous passage in A River Runs Through It: “But when I am alone in the half-light of the canyon, all existence fades to a being with my soul and memories, and the sounds of the Big Blackfoot River and a four-count rhythm, and the hope that a fish will rise. Eventually, all things merge into one, and a river runs through it.”

There is also the famous scene from the film Citizen Kane in which the character of Jerry Thompson, puzzled by his inability to decipher the meaning of “Rosebud”—Kane’s most profound “great and simple image”—concludes, “I don’t think any word can explain a man’s life,” only to have the tragic error of his statement then rendered in one of the greatest, most heart-wrenching cinematic finales of all time.

“Alas! The memories that are swallowed up in the abyss of the years!,” wrote Paul Cézanne shortly before his death to his friend and patron Ambroise Vollard. He then mused, “I’m all alone now and I would never be able to escape from the self-seeking of human kind anyway. Now it’s theft, conceit, infatuation, and now it’s rapine or seizure of one’s production. But Nature is very beautiful. They can’t take that away from me.”

Another example is found in these simple lines from Rainer Maria Rilke’s poem, “Remembering”:

You think of lands you journeyed through,

of paintings and a dress once worn

by a woman you never found again.

And suddenly you know: that was enough.

You rise and there appears before you

in all its longings and hesitations

the shape of what you lived.

I tried to think of examples of “great and simple images” in the medium of photography but came up very short. I believe I have managed to come close to expressing a small handful of my own great and simple images in some of my photographs (perhaps five at best). Even for those, I doubt whether, or to what degree, they may result in an opening of hearts in anyone but me, since I have the memories of their making to bolster their effect. I considered, in contrast, examples such as those I quoted above, which seem to achieve their effect consistently and powerfully even in people who have never experienced the things or events they mention. The same is of course true for some music, which may impart profound emotions by form alone, often without reference to any material things.

I was moved to write this essay today because I have come across one such rarity: a photograph that struck me as a great and simple image and opened my heart. It was a photograph of a young boy and his dog alone in a beautiful field.

I have been a young boy with a dog in a field. It is without question one of, perhaps even the, greatest among my own great and simple images, looming vibrant from among so many memories of an otherwise confusing and fearful childhood, in a place and a time that no longer exist. So vivid it is for me that I can recall even the smells—of the field, of the air, of the dog’s fur. Also, the sounds and the breeze and the sunlight on my skin. For all I have experienced in life, I have never known a better thing to be than a boy with a dog in a field. If I could choose one experience from among my decades of life to spend all eternity in, that would be it.

But I can’t help wondering whether this same image would arouse the same feelings in people who do not share my memories, my love of dogs and fields, or the settings against which this experience stands out to me so boldly, in the same way that Maclean’s or Rilke’s words may move people who do not share a love of rivers or a yearning for unresolved memories of chance encounters.

Published on September 04, 2025 07:46

August 25, 2025

Classical Photographers and Jazz Photographers

I’ll play it and tell you what it is later. —Miles Davis

This is a re-edited version of an article originally published in the online magazine On Landscape. If you are not already a subscriber, I recommend it highly. (Note, this is an unsolicited, unpaid personal recommendation, not an advertisement.)

~~~

Ansel Adams spent much of his early years training to become a classical pianist. He often mused about the ways his musical training had influenced his photography. In one interview, Adams said, “Study in music gave me a fine basis for the discipline of photography. I’d have been a real Sloppy Joe if I hadn’t had that.” He also famously claimed, “The negative is the equivalent of the composer’s score, and the print the performance.”

Adams’s references to discipline and his analogy of performing a composer’s score are likely relatable to anyone familiar with the rigors of practicing and performing classical music. But such references may not apply in quite the same way to jazz musicians who, while still practicing relentlessly to achieve technical excellence, improvise in real time when performing rather than adhere precisely to a composer’s score.

The difference between classical and jazz musicians occurred to me some years ago while preparing a set of prints for an exhibition. Although I had printed the same images numerous times before, I found myself re-editing every one of them, some in quite different ways from my original visualizations. My original exposures—film and RAW files—did not point me to any singular “right” interpretation in the same way that a composer’s score might direct a classical performer. Instead, these exposures and the memories of their making set for me a general mood—a visual rhythm, a baseline to improvise around, and in some cases to depart from in quite undisciplined, spontaneous, and enjoyable ways.

Although I recalled my original intents and visualizations quite vividly, I also experienced new epiphanies, enjoyed experimenting with new interpretations, applied new tools and techniques I did not possess when making my initial edits, and in some cases ended up “performing” quite different “visual music” than I originally conceived. I realized then that performing a score was not a good analogy to describe my way of working. Instead, I felt more like I was jamming, riffing, improvising, experimenting in much the same way that a jazz musician may explore new possibilities while playing.

In both classical and jazz music, the act of playing may amount to executing a practiced routine by rote, or it may be an opportunity for a creative player to express their own interpretation of the music. Skilled classical and jazz musicians alike may enrich a composer’s original notes, adapt their performance to imbue the original score with their own feelings and style. Performing both forms of music also may yield the effect of flow, ensuing from intense attention-consuming concentration while performing.

The main difference, as far as the performer is concerned, is the degree of freedom that players allow themselves to depart from the score and from their practiced routines. Classical musicians aspire to perform the music on stage exactly as they rehearsed it. For jazz musicians, however, the rehearsed music is just a starting point they may choose to freely depart from at any time. When performing, jazz musicians do not feel beholden to any prior interpretation—not even their own—and may adapt their playing in real time into something different from any other performance of the same piece.

Thinking about the distinction in a broader sense, I believe that some photographers, like classical musicians, seek to elevate the emotional impact of their visual performances by way of technical excellence and by use of beautiful light, colors, and tonal transitions, but generally stick to “performing” their visualized—preconceived—compositions (whether these compositions are their own or copied from others). Other photographers, like jazz musicians, vary their compositions and performances in real time, aiming to express their own emotions as they arise by whatever means are available to them, in general disregard of their original visualizations or of how others may have photographed the same subjects.

“Classical photographers” may plan and practice intensely, aiming for a perfect performance of a preconceived score. “Jazz photographers” use their original “score” as a baseline but strive to make choices in real time, adapting quickly to unexpected conditions, new moods, emergent feelings, and serendipitous epiphanies. They may change their minds at any time, even when re-processing or re-printing images they previously considered “finished.”

Recent studies have validated Adams’s claim about the general benefits of musical training in improving discipline and accuracy. Brain imaging studies show that musicians’ brains have conspicuously greater connectivity between the left and right hemispheres compared with the brains of non-musicians. Similar studies also show that brains of classical musicians adapt in different ways than brains of jazz musicians.

A 2018 study done at the Max Planck Institute showed differences between classical and jazz pianists in the way they prepare themselves to play consecutive notes. According to the study, musicians constantly consider and predict how they will play upcoming notes to ensure proper technique and harmonious progression. This mental preparation involves considering what note to play next (i.e., what keys on the piano they will need to press), and how they will play this note (i.e., the fingering technique needed to press the keys correctly). The study showed that classical musicians focus more on the “how”—on technique—while jazz musicians, in contrast, focus more on the “what”—on deciding what the next note will be in real time, ensuring it will fit harmoniously with the previous notes, even if departing from them (and from the original score, if there is one) unexpectedly.

In classical music, emotional meaning comes largely from the composer. In jazz, meaning comes from the performer, who is free to express different meanings in different performances, even if ostensibly playing the same title.

When it comes to the separation of roles between composer and performer, photography as a medium lags considerably behind music. Most photographers and viewers of photography make no distinction between composer and performer, assuming implicitly that they are always the same person, despite this often not being the case. In landscape photography, especially, the case is almost always the opposite: few original composers make meaningful, novel creations, which are then “performed” repeatedly by many others (who usually have no qualms about claiming the entire production—composition, performance, and all—as their own).

The failure to separate composers from performers in photography has led some to question the artistic merit of the medium, and especially of such styles as “straight photography.” For example, in 1907, pictorialist photographer Robert Demachy posed this thought experiment:

Choose the man whom you consider the very first landscape artist photographer in the world; suppose he has, thanks to his artistic nature and visual training, chosen the hour and spot, of all others. Imagine him shadowed by some atrocious photographic bounder furnished with the same plates and lens as the master. Imagine this plagiarist setting his tripod in the actual dents left by the artist’s machine and taking the same picture with the same exposure. Now, suppose that both are straight printers? Who will be able later on to tell which is the artist’s and which is the other one’s picture?

The moral of this fable is twofold. It shows that a beautiful straight print may be made by a man incapable of producing a work of art, and that a straight print can not possibly be a work of art even when its author is an artist, since it may be identical to that taken by a man who is no artist.

I agree generally with Demachy’s sentiment. Rampant and commonly accepted plagiarism is a formidable hurdle for photographic artists wishing to be recognized for their original, creative efforts. Still, if we consider Demachy’s scenario with a clear separation between the roles of composer and performer, I think it is fair to say that the person who composed the original photograph is very much deserving of praise and artistic recognition. Bach, for example, is no less an artist just because countless performers have played his original scores in ways that may be difficult to distinguish among.

I know myself to be a “jazz photographer”—a real-time improviser, not a disciplined performer of preconceived, visualized “scores.” When I set out to make a print for myself or for an exhibition (i.e., not a print purchased by a customer expecting it to match the appearance of a digital version or of a previous “performance”), I consider it an opportunity to make a new creation—new “visual music”—not necessarily aiming to re-perform my original visualization according to some singular, fixed, “right” interpretation. Each “performance” is for me a chance to make something new and original.

“An original work, an aha! product or a fresh insight,” wrote journalist Timothy Egan in a 2014 New York Times piece about creativity, “is rarely the result of precise calculation at one end producing genius at the other. You need messiness and magic, serendipity and insanity.” To me, this description suggests quite a different endeavor—and attitude—than, as Adams put it, “performing a composer’s score.”

~~~

If you’ve enjoyed this article, please consider making a small contribution to support this blog. Thank you!

Published on August 25, 2025 05:00

July 1, 2025

Finding Creative Fulfillment – An Interview with Grant Swinbourne

It was my great pleasure recently to speak with Grant Swinbourne as part of his podcast, Landscape Photography World. My thanks to Grant for featuring me and for the excellent questions.

Be sure to check out Grant’s archives for more interesting conversations, and subscribe to be notified of future ones. (I happen to know at least a couple of Grant’s upcoming guests, and I can’t wait to hear what they have to say.)

I hope you enjoy our conversation.

Published on July 01, 2025 05:00

June 26, 2025

Discretionary Time

This article is offered in response to an anonymous reader who wrote to say that they “would love to spend more time philosophizing, but who’s got the time?”

Have a philosophical question you’d like me to write about? Please don’t hesitate to write.

Please note that if you purchase a book from Amazon using one of the links below, I will be paid a small commission. Alternatively, you will find a more comprehensive discussion of these and other philosophical thoughts in my recent book, Be Extraordinary.


Everything is public now, potentially: one’s thoughts, one’s photos, one’s movements, one’s purchases. There is no privacy and apparently little desire for it in a world devoted to non-stop use of social media. Every minute, every second, has to be spent with one’s device clutched in one’s hand. Those trapped in this virtual world are never alone, never able to concentrate and appreciate in their own way, silently. They have given up, to a great extent, the amenities and achievements of civilization: solitude and leisure, the sanction to be oneself, truly absorbed, whether in contemplating a work of art, a scientific theory, a sunset, or the face of one’s beloved.


—Oliver Sacks


Søren Kierkegaard believed that boredom is the root of evil. I agree with him. Boredom seems to me, in the purest sense, wasted living. It’s no wonder bored people constantly seek ways to alleviate boredom. Some do so by pursuing lifestyles that Kierkegaard referred to as aesthetic, characterized (among other things) by the constant pursuit of easy distractions: ways of relieving boredom that are not too demanding on either the body or the intellect. Other people, whom Kierkegaard referred to as ethical, take their life choices more seriously: they consciously contemplate all the options open to them, weigh these options for their morality and authenticity, and choose those that are most ethical and life-affirming, regardless of how difficult or risky they may be or how they may be judged by other people.

Kierkegaard also had the wisdom to distinguish boredom from idleness. In his book Either/Or, he wrote, “Idleness, it is usually said, is a root of all evil. To prevent this evil one recommends work. However, it is easy to see from the remedy as well as the feared cause that this whole view is of very plebeian extraction. Idleness as such is by no means a root of evil; quite the contrary, it is a truly divine way of life so long as one is not bored” (italics mine).

This sentiment is echoed also in the writings of many other thinkers. John Lubbock, for example, put it in terms I hope will resonate with readers of this journal, who I expect share my interest in nature and art. He wrote, “Rest is not idleness, and to lie sometimes on the grass under the trees on a summer’s day, listening to the murmur of water, or watching the clouds float across the blue sky, is by no means waste of time.”

Modern psychology and brain science give us deeper insights into the human mind than were available in Kierkegaard’s day. Impressively, in many cases these sciences validate and support some of his intuitions. Relying on science, we can now define more precisely what boredom is, and why the suggested remedy of “work” may not always… work. In fact, some kinds of work may make boredom worse. Some kinds of work may even induce boredom in a person who may otherwise be perfectly content being physically idle but whose mind is active.

So long as we are alive and not in a coma, our brains are never idle even if our bodies are. Our brains do, however, alternate between two distinct states of attention: the state of paying focused attention to something, and the state of drifting “mindlessly” without conscious focus. In this latter state, which I assume is what Kierkegaard referred to as “idleness,” boredom cannot exist. This is for the simple reason that when we feel bored, our attention doesn’t drift: it is focused consciously on the fact that we feel bored and on seeking ways to relieve boredom. Boredom only happens when our minds are actively seeking something to do, and may be exacerbated when we are engaged in an activity (e.g., work, conversation, social obligation) that is decidedly not interesting to us.

It may seem that boredom arises from things imposed on us by external circumstances—having to perform uninteresting jobs to earn an income, attending uninteresting events to curry favor with or to signal allegiance to other people, partaking in uninteresting conversations, waiting in line, being placed on hold, or being stuck in situations where we don’t see opportunities to engage in something interesting. But this is not always the case. Much boredom is rooted in self-imposed habits (addictive ones being the worst)—activities that may be only mildly interesting or rewarding but that don’t challenge us sufficiently to ward off boredom entirely. At times, we may choose to engage in such activities even when we know of better uses for our time, if these other uses demand an investment of physical or cognitive effort that we are loath to expend, such as reading a book, going for a walk, writing, attending a class, or volunteering to help at an animal shelter.

While idleness (in the sense of not focusing our attention on anything in particular) may indeed be a better use of time than boredom, it is alas an impossible state to accomplish at will. This is because true idleness engenders the same paradox that enlightenment does in some Eastern traditions, which is this: the very act—even just the intention—of trying to achieve a state of pure idleness (or enlightenment), puts this state beyond reach. (Note that what I refer to by “idleness” here is not the same as the state of detached equanimity—the state of “no mind,” also known as mushin in some Eastern traditions—that some meditators strive for, which is a highly-focused conscious state in which meditators strive to become aware of and to actively detach themselves from their own thoughts and feelings.)

Fortunately, our brains achieve idle, unfocused states on their own. Neurologically speaking, these states are characterized by coordinated activity in brain regions known collectively as the Default Mode Network (DMN). The word “default” should tell you that this is something your subconscious brain does on its own, with no conscious intervention (which would disrupt it). This is to say that mechanisms in your brain you are not even aware of, upon recognizing opportunities to free up conscious attention when it’s not needed, automatically activate the DMN, allowing your mind to wander until conscious attention is called for.

Given that idleness, in the sense of not paying conscious attention to anything specific, is beyond our ability to summon at will, it seems that any consciously chosen cure for boredom would have to be an activity we opt to engage in that is also interesting. At times, we may be fortunate to find such interest in our professional lives. (Alas, surveys show this is not the case for most workers.) Of course, when not working or otherwise obligated, if we are sufficiently motivated, we may opt to partake in rewarding hobbies. But what to do when life compels us to busy ourselves with activities that bore us?

As it turns out, there is something we can do to escape boredom even when we are engaged in uninteresting activities: we can summon excess attention to contemplate. Friedrich Nietzsche, after observing that “agitatedness is growing so great that higher culture can no longer allow its fruits to mature,” suggested this: “One of the most necessary corrections to the character of mankind that have to be taken in hand is a considerable strengthening of the contemplative element in it.”

So long as we have some cognitive “space”—attention—to spare, we are free to apply this excess attention toward contemplation. We are also free to choose the things we wish to contemplate. Contemplation seemingly requires no special resources beyond a working brain having some amount of spare capacity to think (which is often the case when engaged in boring activities). But in fact, contemplation requires a couple more ingredients that are harder to come by: intention and self-discipline. Thankfully, both are qualities that, even if we are lacking in them at the outset, we may train ourselves to acquire and to improve with exercise, which is to say: start where you are, with what you have, and then do it repeatedly, pushing yourself a bit harder each time you feel you can handle more. So long as you contemplate regularly and consistently, you will get better at it. In time, your thoughts will become more focused, deeper, and more complex. They will also become more effective at eliminating boredom, which is to say: more interesting.

When contemplating, think about big questions but don’t feel like you must find answers to them or arrive at any conclusion. Challenge yourself instead to understand things better, to consider as many aspects and nuances of the thing you are contemplating as you can, to see patterns and connections among bits of knowledge as you acquire them. Moments of discovery will arise as you go. A line in a novel may connect with the philosophy of some thinker, a scientific discovery may affirm or refute some belief or conjecture, information from unrelated sources may coalesce to enhance and clarify your understanding of some concepts, failing to answer some questions may lead you to ask different, perhaps better, questions.

In choosing topics to contemplate, don’t limit yourself to just the challenges of the day, suggested or dictated to you from the outside; consider what things you may wish you knew more about—for any or no reason. When you are engaged in some boring activity, use the information already in your memory to conjure up questions and try to answer them to the best of your capacity. Later, when you have the freedom and time to do so, research, read, learn, experiment, deepen your knowledge, so that when you are bored again you will have more raw materials for contemplation. Become the intellectual equivalent of a curious dog excitedly following its nose, not knowing where an alluring scent may lead but irresistibly excited to track it to its source.

This is the point where cynics may attempt to justify avoiding contemplation by a common excuse: who’s got the time? The answer: most likely, you do.

Nietzsche wrote, “men are divided into the slaves and the free; for he who does not have two-thirds of his day to himself is a slave.” It may seem that, by this characterization, most professionals are slaves. But this is not so, at least not by the measure of time one may have available for discretionary contemplation. This is easy enough to demonstrate when we consider that the average time internet users spend online is about 6.5 hours per day, which is more than two-thirds of a workday. (In some places, and among some segments, the number is greater still.) Inevitably, most of this usage occurs when a person is (or pretends to be) engaged simultaneously in other activities (e.g., work), which is to say that it taps into excess attention one has when performing other tasks. Excess attention while doing other things may perhaps not be sufficient for such activities as going for a hike, writing an essay, or practicing the violin, but it may be used for contemplation.

~~~

I would be remiss to not mention (with great trepidation, given the sensitivity of the topic) that discretionary time is also an area where, as Benjamin Franklin put it, “an ounce of prevention is worth a pound of cure.”

Think of the things that compel people to invest great amounts of time in activities that may, at least in significant part, be unenjoyable or uninteresting. To mention a few (without judgment): careers, family obligations, traditional rites. Consider also factors that may limit some people’s options for activities they might find interesting and rewarding. Again, to mention a few: the need to pay down a debt, experiences not possible when living in certain places, things considered inappropriate in one’s social circle or chosen lifestyle. Now ask yourself honestly which of these may be, or may have been, preempted by making certain deliberate choices—to do, or to not do, certain things—however difficult or consequential these choices may be. This is the premise underlying the distinction some existentialists have made between living in “good faith” (i.e., authentically, according to your own freely chosen values) and living in “bad faith” (i.e., inauthentically, according to other people’s expectations and imposed values).

Of course, considering such choices (or the failure to make them) may be pointless beyond the point where they are possible. But to not mention them at all, I believe, is to do a disservice to those for whom such choices are still open. In fact, existentialists and some of their precursors, including Kierkegaard, have warned sternly against allowing such opportunities to pass by passively: to avoid making choices during the “window of opportunity.”

Kierkegaard used the parable of a ship’s helmsman who considers whether to change the ship’s course. The helmsman knows that even if he fails to steer, the ship will not just remain still—the current and winds will carry it in some direction. “Similarly with a human being,” Kierkegaard wrote, “if he forgets to take that headway into account, the moment eventually comes when there is no longer any question of an either/or, not because he has chosen but because he has refrained from choice, which can also be expressed in another way: because others have chosen for him, because he has lost himself.”

Much as we may dislike feeling rushed or obligated to make consequential choices, some such choices may have profound implications for your quality of life regardless of whether you make them deliberately, allow yourself to be carried by the current of life, or kowtow to other people’s expectations. One of these implications is the amount of discretionary time you may have throughout the remainder of your life, and the ways you may put this time to use. Again, Kierkegaard (in the voice of his character, the wise Judge Wilhelm): “The moment of choice is for me very serious, less on account of the rigorous pondering of the alternatives, and of the multitude of thoughts that attach to each separate link, than because there is a danger afoot that at the next moment it may not be in my power to make the same choice, that something has already been lived that must be lived over again.”

What is having discretionary time truly worth to you? Don’t expect others to give you the answers, because there are some questions other people—justifiably—have no business asking you. It is up to you to ask them of and for yourself: What if you quit your job? What if you retire early? What if you change, end, or never commit to a given relationship? What if you don’t start a family? What if you choose to resign from certain clubs or institutions? What if you choose to live somewhere else? What if you don’t settle down?

There are no universal right or wrong answers to these questions, which is exactly the point: never assume that the right answers for another person will also be right for you. Don’t even assume that the right answer for the majority of people is necessarily the right answer for you. Outliers exist. You may be one. Also, when it comes to such questions there may not be an obvious, absolute “right” for any given person. Only “more right” or “less right.” Life is a finite resource. Allocate it wisely and deliberately. Like it or not, when you honestly consider all the choices you are truly free to make—not just those that are convenient, safe, or expected of you—you may realize that, as James Baldwin put it, “Nothing is more unbearable, once one has it, than freedom.” Or, as Kierkegaard famously quipped: “Anxiety is the dizziness of freedom.”

In his play, Huis Clos (No Exit), Jean-Paul Sartre wrote one of his most famous and most often misunderstood lines: “L’enfer, c’est les autres”—Hell is other people. Some may think that Sartre meant sarcastically that other people are burdens to us because of their habits, judgment, pettiness, irrationality, etc. But this is not exactly what he meant. As an existentialist par excellence, he believed that individuals must make choices for themselves in order to live in “good faith”—no other mode of life being as worthy. Making choices in good faith doesn’t mean making choices arbitrarily; it means making choices within the limitations of the reality available to each of us as an individual: what Sartre termed “facticity.” Our duty as individuals is to (using Sartre’s term) transcend our facticity according to our own values, but only when possible. Other people are “hell,” in his opinion, not just because they may have the power to limit our choices, but because we often judge our own actions, and make (or fail to make) authentic choices, not only according to our own values but according to what we think other people may think of our choices.

In his own way, Sartre was paraphrasing Nietzsche, who, in turn, was paraphrasing his own mentor, Arthur Schopenhauer. In his essay, Schopenhauer as Educator, Nietzsche wrote, “We are responsible to ourselves for our own existence; consequently we want to be the true helmsman of this existence and refuse to allow our existence to resemble a mindless act of chance. One has to take a somewhat bold and dangerous line with this existence: especially as, whatever happens, we are bound to lose it. Why go on clinging to this clod of earth, this way of life, why pay heed to what your neighbour says? It is so parochial to bind oneself to views which are no longer binding even a couple of hundred miles away.”

(Put more simply, Nietzsche was expressing the same sentiment intended by such common present-day ripostes as, “you only live once,” and, “in a hundred years, who’s gonna care?”.)

~~~

The person who asked the question that inspired this essay (diatribe?) is, as I’m sure many other readers of this journal are, a photographer frustrated by a lack of time to pursue photography. I should therefore also mention the dispiriting effect—all too common among photographers—of feeling rushed when making photographs during short-lived respites from work or other obligations. My advice: don’t do that. The reason? If you feel stressed and anxious when photographing, no matter the reason, you undermine the very joy and satisfaction you ostensibly seek to find in it. Alan Watts described the futility of not finding respite from work even in our so-called “spare time.” In his book, The Wisdom of Insecurity, he wrote:

This “dope” we call our high standard of living, a violent and complex stimulation of the senses, which makes them progressively less sensitive and thus in need of yet more violent stimulation. We crave distraction—a panorama of sights, sounds, thrills, and titillations into which as much as possible must be crowded in the shortest possible time. To keep up this “standard” most of us are willing to put up with lives that consist largely in doing jobs that are a bore, earning the means to seek relief from the tedium by intervals of hectic and expensive pleasure. These intervals are supposed to be the real living, the real purpose served by the necessary evil of work. Or we imagine that the justification of such work is the rearing of a family to go on doing the same kind of thing, in order to rear another family … and so ad infinitum. This is no caricature. It is the simple reality of millions of lives, so commonplace that we need hardly dwell upon the details, save to note the anxiety and frustration of those who put up with it, not knowing what else to do.

If being unproductive in photography makes you feel stressed in the same way you may feel stressed if unproductive in your job or other pursuits, it is because you bring the same attitude to your discretionary, creative, leisurely activities (including photography) that you apply in other situations. This is a mistake. Discretionary activities should offset and balance the stress of other activities, or else what is the point? One way to remedy the imbalance is this: favor consciously the quality of your experience over productivity. The calculus is simple: a good experience resulting in few (or no) photographs to show for it will have a greater benefit to your wellbeing than an entire portfolio of excellent photographs ensuing from a benign or miserable experience.

Susan Sontag observed this unfortunate effect, too. In her book, On Photography, she wrote: “The very activity of taking pictures is soothing, and assuages general feelings of disorientation that are likely to be exacerbated by travel. Most tourists feel compelled to put the camera between themselves and whatever is remarkable that they encounter. Unsure of other responses, they take a picture. This gives shape to experience: stop, take a photograph, and move on. The method especially appeals to people handicapped by a ruthless work ethic—Germans, Japanese, and Americans.”

~~~

In conclusion: you have more time than you think, even if your life may feel busy and hectic. Choose to claim this time rather than allow it to become usurped by unrewarding activities that may add to your stress. And, whatever discretionary time you may have, however you acquire it, and however you wish to invest it—whether in contemplation, travel, photography, watching clouds, or any other activity you find interesting and rewarding—beware of sabotaging yourself by bringing into it a “ruthless work ethic.”

Published on June 26, 2025 05:09

May 8, 2025

Random Morning Thoughts

My thanks to those of you who sent me philosophical questions (see previous post)! Please keep them coming. I’ll have some answers for you soon.

If you’d like to purchase any of my books, I’ll fulfill all new orders in the next couple of days. Please be aware that orders placed after Monday, May 12th will not ship until the week of May 26 due to travel.

As always, thank you for your interest and readership, and a very special thank-you to those of you who generously support my work as patrons and/or donors. You make these writings possible.

—Guy


The problem arises when people are so fixated on what they want to achieve that they cease to derive pleasure from the present. When that happens, they forfeit their chance of contentment.


Though the evidence suggests that most people are caught up on this frustrating treadmill of rising expectations, many individuals have found ways to escape it. These are people who, regardless of their material conditions, have been able to improve the quality of their lives, who are satisfied, and who have a way of making those around them also a bit more happy.


Such individuals lead vigorous lives, are open to a variety of experiences, keep on learning until the day they die, and have strong ties and commitments to other people and to the environment in which they live. They enjoy whatever they do, even if tedious or difficult; they are hardly ever bored, and they can take in stride anything that comes their way.


—Mihaly Csikszentmihalyi


Almost every day I find reason to feel grateful for being a short sleeper and an early riser. Most of my days, whether at home or out camping, begin with savoring fresh brewed coffee in the quiet pre-dawn hours, soaking in the darkness and peace, watching details of the world coming slowly into view, celestial lights dimming and disappearing, colors materializing, morphing, some fading, some intensifying, daytime life awakening.

Predictably, neighbors show up at various times. In some seasons, a herd of deer will make its way across the field toward the junipers on the ridgeline in the soft light before sunrise. I know they will return in the opposite direction in late afternoon. As the days grow longer and warmer, they will eventually depart for higher elevations and will not return until things begin to cool down again. The resident pair of ravens is here year-round, showing up to scout for food each morning, making several appearances throughout the day, frolicking and cawing, sometimes harassing a raptor. When the field becomes awash in sunlight, prairie dogs emerge, then scurry around throughout the day exchanging loud chirps.

As it is early spring now, passerines are starting to arrive. Bluebirds, sparrows, meadowlarks, phoebes, and starlings have been around for about a month already, and some are busily looking for nesting sites or collecting padding materials. Finches and mockingbirds are more recent arrivals. Small flocks of scarlet ibises traverse the sky a few times each day en route to nearby ponds. Hummingbirds should start arriving in the next couple of weeks.

Various raptors patrol the area regularly. Wintering bald eagles have already departed, but golden eagles hover high above throughout the day. Various hawks and falcons perch and hover here and there. A harrier (usually just one) skims the field back and forth several times each day. Sometimes in the dim light on the edge of night I may catch the dark silhouette of a great horned owl or a barn owl floating silently above the field, sometimes perching for a short while on a juniper tree. Soon, nighthawks will arrive for the warm season, too, and dart through the air with incredible speed and agility in the late afternoon hours and into the early night.

It had rained a lot yesterday and overnight, enriching the air with the intoxicating scents of wet earth and sagebrush. Rather than be content with watching the world through glass, I wrapped myself in a thick jacket and sat outside with my coffee to inhale deeply the smells along with the silence.

In these early hours of the day, I am especially mindful to not allow mundane or cynical thoughts to arise in my mind and spoil my peace and reverence, or steal attention away from the light, the scents, and the animals.

More rain is in the forecast for today. There were times when this would have been enough to prompt me to abandon prior plans (on the uncommon occasion that I had any) and to leave the house early, ready for a night or two of camping in case I decide to do so on a whim, to face the elements, prepared (sometimes hoping) to become stranded by inclement weather. Or voluntarily. But not today. Not in a while, in fact. I don’t know whether I’ve “become soft” in my elder years or whether at some point I had unwittingly crossed a threshold toward feeling I no longer have anything to prove to myself by it. (I stopped feeling I had to prove anything to others many years before that.) Knowing I might change my mind at any moment, perhaps more than once, I think I’ll stay in today. I feel like writing.

What should I write about? More philosophical musings about how best to live? or why our psychology prejudices us in many ways against such wisdom? or the beauty of Nature, or the nature of beauty? Perhaps about how life, science, and perceptions of beauty emerge from and are ruled by mathematics? All these subjects have been on my mind in recent days, but I concede with some regret that writing about them may come at the cost of diminishing or even extinguishing some readers’ interest, perhaps also the emotions inspired in me as I’m mindful of all the things I mentioned in the previous paragraphs. I’d really hate to lose them. Especially with the smells of the wet desert still heavy and sweet in the air and as the light has now become bright enough to see out to the mountain ranges on the horizon, where low white clouds hang below summits now adorned in fresh vernal snow, and as an early meadowlark has just burst into song. Another day, then. For now, putting my stream-of-consciousness to words seems more appropriate.

Every year, when spring arrives and winter comes to an end, the memory of the preceding months suddenly and viscerally feels to me like a long, surreal dream from which I had suddenly awakened. Not a bad dream entirely, yet marked with pervasive sadness, rumination, quiescence, austere beauty, and notably devoid of joyous forms of happiness. While not a stranger to euphoric moods such as those ensuing from joy, physical pleasure, even states of awe and rapture, none of these seem to me worthy of pursuing as life goals in themselves. To me, the most intense and life-affirming state of mind is not mere happiness but utter fascination.

In other writings I have extolled the virtues of flow—the state of mind ensuing from having attention consumed entirely by some activity, leaving none for such things as rumination, anxiety, anger, or guilt. When in a state of flow, paying no conscious attention to anything outside my immediate experience, I am liberated for a time from all sources of distress: worries regarding things that happened, or have not yet happened, or may perhaps happen in other times and places, even from physical pain.

Flow may arise from any activity, creative or tedious, from one’s professional pursuits or discretionary avocations, dutiful or playful, so long as one invests one’s attention entirely in this activity to the exclusion of all other preoccupations. Of all possible preconditions that may give rise to flow, however, to be utterly fascinated seems to me the most profoundly enjoyable and rewarding. I therefore am mindful to be (to the extent I have a say in the matter) in one of two states: either open and mindful as to notice something fascinating or being lost in fascination with such a thing when I find it. The thing may be anything: an object, a phenomenon, a work of art, an idea. Sometimes, not often, a person. Not necessarily a living one.

I consider it my greatest accomplishment to have attained—by effort and luck—the freedom to place myself in such states in most of my waking hours and on most days. I have always been perplexed by people who, having the means and opportunity to claim such freedoms, still chose the drudgery of routines they found tedious or meaningless. It always seemed odd to me to hear, upon asking why, such answers as “I wouldn’t know what to do with myself.” The thought always bubbled in my mind in response: I could live to be a thousand and fill every second with some interest and still feel like I did not have enough time for everything I wanted to do, learn, experience, contemplate, create.

Chief among the many reasons people may choose to stick with unfulfilling routines—to, as the poet Rumi put it, “stay in prison when the door is so wide open”—is not fear of change or the unknown (although these may certainly play a part), but the fact that they come to associate certain attitudes and behaviors with their identity, never stopping to wonder whether it truly is their identity or, if it once was, whether it still is. It is too common in our younger years to present ourselves to others as certain models of success we have been taught to pursue personally and professionally.

We signal to the world our successes in professions, titles, possessions, beliefs, and lifestyles we wish for them to associate with us. But then we continue to do so even as we mature and grow wiser and may aspire for different kinds of success. What will people think if we change course? If we “betray” our former identity and seemingly those who may consider us allies and colleagues in those forms of success? And so, we give up. When pressed, we may fall back to self-comforting rationalizations: “if I could start over, I would have done things differently, but I can’t.” No, we can’t start over. But in many cases, we can start anew. Like all choices, it will have consequences. But the same is true for all failures to choose.

“Nothing is more unbearable, once one has it, than freedom,” wrote James Baldwin. He was not alone. Practically all existentialists likely would have agreed, and all in one way or another have made it their business to admonish the rest of us with infuriating snootiness, “and yet, here you are, free. What are you going to do about it?”

“To know how to free oneself is nothing,” wrote André Gide, “the arduous thing is to know what to do with one’s freedom.”

“Whatever it be, whether art or nature, that has inscribed in us this condition of living by reference to others, it does us much more harm than good,” wrote the self-obsessed Michel de Montaigne, “We defraud ourselves out of what is actually useful to us in order to make appearances conform to common opinion. We care less about the real truth of our inner selves than about how we are known to the public.”

“I have too much to lose,” you may think. But perhaps on occasion it may be useful also to think, “of all the things I have to lose, which is the least severe?” There may be little you can do about years you feel you’ve wasted, but what about the ones you haven’t yet?

There I go with the philosophy again. And psychology. That was not the plan. But I did write. That was the plan! I’ll call it close enough. It’s so beautiful outside. I’m going hiking. I might stay out for a night. Or two. Oh heck, I won’t have to decide for a few more hours at least. I’ll leave that one to you, future self.

Published on May 08, 2025 12:40

April 18, 2025

Art and Nihilism

Announcements:

Ask Me Anything Philosophical:
Today’s post was inspired by Jason Pettit’s comments on my previous post, Sub Specie Aeternitatis. After many years of writing primarily about photography, hoping to inspire readers to pursue it with more seriousness and creativity, I’d like to extend my range of topics into one of my other passions: philosophy, hoping likewise to inspire more readers to take an interest in it.
So, if you have any questions related to philosophy, please send them over. I have a fairly broad interest in many branches of (mostly Western) philosophy from various periods, schools, movements, and thinkers, so please be as broad or as specific as you wish. (My only caveat: I prefer not to discuss politics or theology here.)

After a few years’ break, The Moab Photography Symposium is back for an encore this year, and just a few months away. I’ll be there to deliver the closing presentation, alongside several of my great friends and colleagues who will be presenting and teaching. Claim your spot before they are sold out.

I’ve recently had the pleasure of interviewing with Lewis James Phillips and Terry Livesey for their podcast, The Dirty Negative. If you’d like to watch our conversation, it is now available on YouTube and other platforms.

The nihilist is right in thinking that the world possesses no justification and that he himself is nothing. But he forgets that it is up to him to justify the world and to make himself exist validly.


— Simone de Beauvoir


Tell people you are interested in philosophy and soon enough one of them will respond jokingly, “So, what is the meaning of life?” Where you go from here depends on how far you want to carry the conversation. If you just want to go on about your day, you may chuckle knowingly and leave it at that. But, if you want to impress your interlocutor with your literary prowess (in addition to your seemingly casual penchant for using words like “interlocutor”), and perhaps draw them into a deeper philosophical exchange, you may instead quote Shakespeare’s Macbeth:

Life’s but a walking shadow; a poor player,

That struts and frets his hour upon the stage,

And then is heard no more: it is a tale

Told by an idiot, full of sound and fury,

Signifying nothing.

Shakespeare’s use of “signifying” instead of “meaning” is important. When people ask about the meaning of life, they generally don’t inquire about the meaning of the word “life” but about the meaning of the thing referred to—signified—by that word: specifically, whether this thing called life possesses deeper significance or purpose beyond just the mechanisms and qualities separating living things from non-living things. (That, I should add, is a thorny philosophical problem for another day, as there does not seem to be an obvious hard line separating life from non-life.)

To the dismay of some, and despite a plethora of beliefs and intuitions to the contrary, no scientific investigation to date has revealed that such “meaning of life” exists or even can exist objectively in the world. The more we learn about the nature of reality (at least those aspects of reality that present themselves to our senses, instruments, and intellect—the things philosophers refer to as “phenomena”) the more it seems that life in our universe is likely just a byproduct—an epiphenomenon—of the laws of nature evolving in time, making our material existence possible for a limited period before the universe moves on to spend the majority of its existence in states where life as we know it will no longer be possible.

Our universe is (in absolute terms) very young: a mere 13.7 billion years of age. According to prevailing theories about the future evolution of our universe, soon (also in absolute terms) space will expand to such immense extents that the very atoms that now make up material existence—stars, galaxies, planets, life forms, molecules—will break and decay, and their subatomic constituents will spread out over a vast and cold darkness for trillions of years to come, perhaps for eternity. The laws of thermodynamics suggest that energy will become so spread out throughout space that stars will cease to form and to emit light in about 100 trillion years. In about a googol (10^100) years, energy will become so spread out that the universe will cool to near absolute zero and no region of it will have sufficient energy to power mechanical processes, including those needed to form and sustain life.

Even in the present time, on a planet teeming with life, after thousands of years of scientific study, we have not found one iota of evidence to suggest that material existence has or strives for any deliberate purpose, let alone a purpose for the phenomenon of life. Meaning does not seem to exist anywhere outside our minds. Put another way, we make meaning out of our knowledge and experiences as conscious beings in the world. When the universe no longer has life, no meaning will exist because there will be nothing capable of making or experiencing meaning. The emergence, existence, and extinction of life—human and other—will ultimately be far less significant an episode in the history of our universe than, say, the eruption and loss of baby teeth is in the life of a person. More succinctly: life can have no meaning in a universe destined to spend most of its existence in lifeless states.

So, why worry about the meaning of life if, so far as we know (or can know), in the grand scheme of things, life is meaningless?

The answer: because, regardless of life’s insignificance in the grand story of the universe, we living beings absolutely can experience and find great significance in the thing signified by the word “meaning.” Put more simply: to a living being, many things—experiences, perceptions, acts, relationships, sensations, knowledge, hopes, beliefs, ethics, aesthetics, emotions—may be profoundly meaningful, even if they do not exist objectively in “the world” and are destined to ultimately fade out of existence, along with life itself. So, while there may be no such thing as a meaning of life, there is absolutely such a thing as meaning in life.

Meaning in life is no small thing. In fact, the presence or absence of meaning in life is known from studies in psychology to be correlated with physical and emotional well-being. Indeed, to some people, in some extreme cases, meaning in life may even hold the top position among factors that make their lives worth living.

What may not be obvious about the distinction between meaning of life and meaning in life is that the latter comes with an important implication, which is this: since the world doesn’t possess objective meaning for us to find and thus doesn’t impose any meaning on us, we are (at least to a degree) free to choose for ourselves what to regard as meaningful: to form our values and goals according to what we consider to be important and worthy, and strive to live according to these values with the overarching goal of maximizing meaningfulness and its benefits.

So, the answer to “what is the meaning of life?” is this: there is no such thing as a meaning of life. There are only, for each of us, according to our own self-chosen values, things that may make our lives feel meaningful.

~~~

Nihil is a Latin term meaning “nothing,” hence the term nihilism referring to an ideology of nothingness—more accurately, an ideology founded on the assumption that life means nothing (which is not the same as saying that a life can’t be meaningful to the person living it).

If you intuitively consider nihilism to be a “bad thing,” you are certainly not alone (and not entirely wrong). Still, it’s worth pointing out that the statement “nihilism is bad” doesn’t refute nihilism (i.e., it doesn’t offer an argument for the existence of objective meaning, or evidence for it). Saying that a thing is “good” or “bad” (or “better” or “worse” than another thing) is, in philosophical terms, making a normative (i.e., value-based) judgment about this thing. (In this case, an ethical judgment, which, like all ethical judgments, is by nature a matter of subjective opinion rather than objective truth—yet another philosophical problem for another day).

This is where we run into a problem: what does it mean to say that something which—as far as we know—is true (i.e., that life has no universal, objective, discoverable meaning) is also bad? Does it mean we’d be better off denying the truth or pretending not to know it? Thankfully, this question is easily resolved if we withhold judgment and ask instead why nihilism is bad, or, better yet, when is nihilism bad?

Consider these two statements: 1) life has no meaning, therefore I should not care about anything, not bother striving for anything, and not invest effort in anything beyond the minimum necessary to meet my basic needs and gain me hedonic pleasures; 2) life has no objective meaning but it can still feel meaningful, which is known to be positively correlated with physical and emotional well-being, therefore I should make it my goal to find things that amplify my feeling of meaningfulness: care deeply about things that feel important to me, strive for what I believe to be right and morally just, and study, experience, and investigate anything that might make my life more interesting.

Both statements take the truth of nihilism (in the sense of life having no objective meaning) as their foundation, but while the first may indeed be plausibly considered as “bad” in the sense of having little or no positive affect, it’s hard to say the same about the second.

The idea that we should strive to make our lives subjectively meaningful despite there being no evidence of an objective meaning for life is a core tenet of existential philosophy. As Jean-Paul Sartre put it very bluntly: “life has no meaning a priori. Before you come alive, life is nothing; it’s up to you to give it a meaning.”

The tension between the human capacity and inborn need for meaning, and the apparent lack of objective meaning we may hope to find in the world, is what Albert Camus characterized as the “absurdity” of human life. In The Myth of Sisyphus, he wrote, “The absurd is born of this confrontation between the human need and the unreasonable silence of the world.” Expressing his frustration with this absurdity, Camus wrote:

I don’t know whether this world has a meaning that transcends it. But I know that I do not know that meaning and that it is impossible for me just now to know it. What can a meaning outside my condition mean to me? I can understand only in human terms. What I touch, what resists me—that is what I understand. And these two certainties—my appetite for the absolute and for unity and the impossibility of reducing this world to a rational and reasonable principle—I also know that I cannot reconcile them.

My favorite summary of the absurd condition comes from an off-hand remark by the writer Charles Bowden. I doubt he even gave it much thought as he was rattling off a train of thought in response to an interview question, but something he said seemed so poignant to me that I had to write it down immediately and have quoted it to others many times since. He said, “Anybody can prove the world’s pointless. But so what? You’re in it.”

What do you do when you find yourself craving meaning in a pointless world—what existentialists refer to as “thrownness”: finding yourself thrown into a meaningless world, having had no choice in the matter, to make something of your living experience? You make your own meaning. As the painter Francis Bacon put it, “You must understand, life is nothing unless you make something of it.”

This is where nihilism meets art. Art can be many things. Among the most powerful and important of these things is making meaning. As psychologist Eric Maisel put it, “In the act of creation, they [creators] lay a veneer of meaning over meaninglessness and sometimes produce work that helps others maintain meaning. This is why creating is such a crucial activity in the life of a creator: It is one of the ways, and often the most important way, that she manages to make life feel meaningful.”

It is important to emphasize that art, like life, is not meaningful in itself. Merely making or beholding art is no guarantee that you will gain meaning from it. To gain meaning from art requires a deliberate, conscious effort to make the experience of engaging with art—whether as a creator or as a spectator—meaningful. This is far from the default way most people approach art.

In my decades of teaching and working with photographers, I have seen many who seek no greater purpose in their work than traveling to photogenic locations to capture pleasing images, perhaps somewhat inspired by the experience but almost never to such states as awe, reverence, or flow. Even in places of rare beauty, facing great feats of nature or human drama, most have no deeper interest in their subjects, in elevating the quality of their own experience, or in pursuing any purpose beyond just coming away with a “good shot.”

I have also seen painters working on commission to produce generic portraits, renditions of bucolic scenes designed to appeal to tourists, or abstract pieces made to fit into or augment a home decor. I have seen potters in a factory producing beautiful vessels in cold, assembly-line fashion. In the same vein, Hermann Hesse had this to say about poets of his day:

Because ‘beautiful’ poems make the poet beloved, a great quantity of poems come into the world that attempt nothing except to be beautiful, that pay no heed to the original primitive, holy, innocent function of poetry. These poems from the very start are made for others, for hearers, for readers. They are no longer dreams or dance steps or outcries of the soul, reactions to experience, stammered wish-images or magic formulas, gestures of a wise man or grimaces of a madman—they are simply planned productions, fabrications, pralines for the public. They have been made for distribution and sale and they serve to amuse or inspire or distract their buyers. So just this sort of poem finds approval. One does not have to project oneself seriously and lovingly into such poems, one is not tormented or shaken by them, rather one sways comfortably and pleasurably in time to their pretty, regular rhythms.

All these examples describe art made to impress others, to sell to others, to gain the approval of others. Too few artists by comparison create art with the goal of enriching their lives with meaning. Meaning may be found—by anyone, including professionals who earn their living in art—in the process of conceiving art, learning about art, making art, presenting art, and beholding the art of others. But meaning doesn’t “just happen.” It must be sought deliberately and it requires conscious investment of emotions, thought, time, and labor beyond just those needed to produce a finished piece.

I consider it unfortunate that the motives of pleasing, impressing, communing with and selling art to others—while undoubtedly valuable—are yet so powerful as to crowd out the meaning-making motive for most artists entirely. Unfortunate because, in my experience, the rewards to be had by seeking meaning and elevated states of mind in the experiences of contemplating, making, and beholding art eclipse—by a great margin—the sum of all other rewards to be found in art.

If making and experiencing art were not a source of meaning for me, I would not be an artist. But art for me doesn’t stand alone as a way of making meaning. I work in places and situations that inspire in me feelings like awe, reverence, and flow. I deliberately avoid work that requires me to rush or to worry about productivity or salability. I will not make a photograph or engage in creative writing unless I feel those elevated states, the desire to create, the significance—if only to me—of what I am doing, with the knowledge that the time and effort I invest in my work would likely not yield me greater rewards if I instead used them in other ways.

~~~

Anticipating the question of what I mean by “meaning,” I have a simple answer: something is meaningful if it feels meaningful. It doesn’t matter in the least whether others find the same things meaningful as you do. It doesn’t matter in the least what others expect you to find meaningful if you don’t. It either feels meaningful or it doesn’t. Likewise, something is important if it feels important, rewarding if it feels rewarding, and interesting if it feels interesting. None of these things are found objectively in “the world,” nor can they be imposed on you by fiat.

Other people may expect you to respect and to treat some things as important and meaningful even if they don’t feel that way to you, and it may well be in your interest to comply for the sake of maintaining good relationships and a desirable social order. But you don’t have to go beyond that. You are always free, at least to some degree, to choose for yourself what is meaningful in your own life and work, and to seek opportunities to depart from and transcend other people’s expectations and the objective meaninglessness of the world, so you can claim more of your time and resources toward making your own life as meaningful as it can be, according to your own sensibilities and values. This includes choosing the most meaningful ways to create your own art and to engage with other people’s art.

Nihilism and art are, in this sense, not opposing forces but complementary ones in the pursuit of freedom: one frees you from having to seek or comply with meanings outside yourself; the other frees you to make meaning of your own rather than succumb to despair in futile pursuit of meaning in an objectively meaningless world.

Camus, again:

Our aim is to shed light upon the step taken by the mind when, starting from a philosophy of the world’s lack of meaning, it ends up by finding a meaning and depth in it.

It was previously a question of finding out whether or not life had to have a meaning to be lived. It now becomes clear, on the contrary, that it will be lived all the better if it has no meaning. Living an experience, a particular fate, is accepting it fully.

Published on April 18, 2025 05:53

March 20, 2025

Sub Specie Aeternitatis — A Philosophical Meditation


If sub specie aeternitatis there is no reason to believe that anything matters, then that does not matter either, and we can approach our absurd lives with irony instead of heroism or despair.


— Thomas Nagel


Someone asked me if I had a “message of hope” to share. I do not. I think I have a better message—one that doesn’t require hope. To hope is to “borrow” imaginary pleasure from something that hasn’t happened—and may not: pleasure we believe we’ll feel if some imagined reality comes to pass. As in other situations, borrowing may help us transcend some hurdle. But if we can transcend the hurdle—or even eliminate it—using means we already possess, we may be better off not borrowing at all.

This is why hope is so treacherous. If what we desire doesn’t come to pass, our future disappointment may offset or exceed whatever positive affect we gained in the present by hoping. And, if what we hope for comes to pass, we likely will revel in it for a bit, but immediately begin hoping for something else, feel dissatisfied again, and be tempted to hope—to borrow—more.

Eventually, borrowing happiness and meaning from uncertain futures becomes habitual, even addictive. Addiction is something we do compulsively to assuage ourselves in the short term even if it leads to adverse consequences in the long term. In their definition of addiction, the American Psychological Association (APA) states, “The term is often used as an equivalent term for substance use disorder or substance dependence and can be applied to non-substance-related behavioral addictions.” Although it’s likely that no psychologist will say so outright, given the risk of worsening rather than helping a patient’s suffering, it remains true in the strictest sense that too many of us are addicted to hope. (This, by the way, is part of the reason I am skeptical of so-called “positive psychology” and other disciplines that promise healing by means of self-deception, and thus are only effective so long as the deception can be sustained.)

Our brain’s reward system is designed such that the pleasure we feel when we get what we hope for usually does not live up to the pleasure we believed we’d feel when we hoped for it. As Daniel Lieberman and Michael Long wrote in The Molecule of More, “Our worlds of fantasy can become narcissistic havens where we are powerful, beautiful, and adored. Or perhaps they’re worlds where we are in total control of our environment the way a digital artist controls every pixel on his screen. As we glide through the real world, half blind, caring only about things we can put to use, we trade the deep oceans of reality for the shallow rapids of our never-ending desires. And in the end, it might annihilate us.” Later in the book, they concluded bluntly, “Living our lives in the abstract, unreal, dopaminergic world of future possibilities comes at a cost, and that cost is happiness.”

To hope is, according to Albert Camus, “a sin against life.” The sin, Camus explained, “consists perhaps not so much in despairing of life as in hoping for another life and in eluding the implacable grandeur of this life.” If we could find sufficiently satisfying meaning in the present—in the experiences, beauty, and opportunities open to us—in reality as it is, not as we hope it to be—we would have no need for hope. To hope, therefore, we must first be dissatisfied. It follows that people who are perpetually hopeful are also perpetually dissatisfied. To what end?

The great pessimist philosopher Arthur Schopenhauer explained the futility of hope. In his essay, “The Vanity of Existence,” he wrote:

Even though we are always living in expectation of better things, at the same time we often repent and long to have the past back again. We look upon the present as something to be put up with while it lasts, and serving only as the way towards our goal. Hence most people, if they glance back when they come to the end of life, will find that all along they have been living ad interim: they will be surprised to find that the very thing they disregarded and let slip by unenjoyed, was just the life in the expectation of which they passed all their time. Of how many a man may it not be said that hope made a fool of him until he danced into the arms of death!

Unlike Schopenhauer (and perhaps to the surprise of some who may have assumed it), I am not a pessimist. This is for the same reason that I am not an optimist: both optimism and pessimism are forms of bias, which is to say they skew our judgment (and consequent perceptions) away from rational assessment. I strive consciously (admittedly, sometimes against my instincts) to exclude biases from my judgments when I become aware of them, or to self-correct when I realize too late that I had fallen victim to some prejudice.

One such realization came to me recently after noticing that on several of my recent hiking excursions—an activity I consider vital to my wellbeing and that I know from experience to have the power to heal and elevate my spirit like no other—I have become mired in feelings of frustration and anxiety. I would return from my outdoor explorations feeling depleted rather than energized, my mood and thoughts permeated with ennui rather than enriched with inspiration and new ideas as I had… ahem… hoped… for them to be. Upon reading Schopenhauer’s words above, I understood why. “We often repent and long to have the past back again,” he wrote. That was my problem.

After recent setbacks, seeking to recover my capacity for inspiration and to experience beauty, I went to familiar places to commune with familiar things, expecting them to affect me in familiar ways. But they did not. It was this expectation that prevented me from appreciating my experiences fully. I was trying to relive, to re-experience, to re-feel things I remember feeling as the person I was before the change—the person I remember myself being, and have unwittingly come to consider as “the real me,” or as “my true self.” But “the real me” is not an absolute, static, well-defined entity. It is an entity that is by nature constantly changing and evolving. It is not something that I am or something that I became; it is something that is always in the process of becoming. “This secret spoke Life herself unto me,” wrote Friedrich Nietzsche, “‘Behold,’ said she, ‘I am that which must ever surpass itself.’”

Without at first realizing why, I became anxious when I did not feel things as I expected to feel them, as I remembered feeling them. Rather than recognizing the need to form new, adaptive, feelings for these places, things, and experiences, I instead worried about the prospect that I may never feel again as I used to. But this worry is futile. It is a fact that I will never feel them in quite the same way again—that I will never again be the person I used to be, at least not entirely. I knew then that I must learn to feel as, and to love as, the person I now am. Someday, likely, I will have to do so again as the person I will become. Worrying about it will change nothing. “My formula for greatness in a human being,” Nietzsche wrote, “is amor fati [Latin for “love of fate”]: that one wants nothing to be different, not forward, not backward, not in all eternity. Not merely bear what is necessary, still less conceal it—all idealism is mendacity in the face of what is necessary—but love it.”

“The owl of Minerva first begins her flight with the onset of dusk,” wrote Georg Wilhelm Friedrich Hegel. The owl, symbolizing wisdom, only takes flight at the end of the day, after the day’s lessons have been learned, contemplated, and reflected upon. Not before. Wisdom is something gained by experience or contemplation, not by hoping.

To yearn for an unchangeable past or for an imagined future that we consider as “better” than the present is implicitly to consider the present—the only time we get to experience—as “not as good,” if not as outright “bad.” Yet it remains true that we are better off in the present, if for no other reason than that only in the present are we free to make choices, discharge our wisdom, learn new knowledge, experience with our senses, create, think.

We may consider some of our memories as better than our present experiences. But memories are parts of our present experience. If they are good, their goodness counts toward our pleasure in the present. If they are bad, we may also feel pleasure in the present for having outlived them. In the same sense, we may say that memories are to the past what hopes are to the future: things we may take pleasure in now despite them not happening now. But the laws of entropy and probability make memories and hopes different. Memories may be considered as earned values, as already-realized returns on investments already made. Hopes, on the other hand, must be considered as having speculative values—the value of a gamble, subject to the same laws of probability. This is why memories are better than hopes.

Value judgments—what philosophers call “normative” judgments—as well as judgments about how good or bad something is—what cognitive scientists call “valence”—come to us so intuitively that we rarely even think to doubt them. We believe we know innately what is good for us and what is not. We feel it in our guts as clear, unambiguous readings from our “moral compass,” which we assume to be always right. We rarely stop to compare these moral compass readings against our rational maps. We rarely consider the fundamental difference between moral compasses and magnetic compasses: that one points to something absolute outside of us, while the other points to something relative and subjective within us. One points to a magnetic pole we cannot change, the other points to a pole we are free to relocate by mere thought. We may even believe that to question our moral values—especially those that feel most noble—amounts to heresy and a moral wrong. So powerfully have our minds been shaped by doctrines, by society, by evolutionary selection. And yet, as William Shakespeare expressed it in the voice of Hamlet, “there is nothing either good or bad, but thinking makes it so.”

Ironically, Hamlet uttered his profound realization to his friend Rosencrantz, attempting—against Rosencrantz’s protest—to defend his previous claim that his kingdom, Denmark—and indeed the whole world—is a prison. He failed, however, to see the wisdom in his own words. By his own admission, he could, if he wanted, free himself from his imaginary prison by the powers of his own thoughts. So can I. So can you. But it is not easy. It requires not only changing our minds but changing the ways we make up our minds. As John Stuart Mill put it in his Autobiography, “no great improvements in the lot of mankind are possible, until a great change takes place in the fundamental constitution of their modes of thought.”

One such great change is a change of perspective: seeing, perceiving, and thinking about the world not from the perspective of a person surrounded by a vast reality of irresistible forces, subject to whims and turns of fate as the Stoics would have us do, but from the perspective of existence itself: the highest form of intellectual transcendence that the human mind is capable of.

In his Ethics, published after his death in 1677, Baruch Spinoza introduced the Latin term sub specie aeternitatis (which, for the purpose of this article, may be translated as, “from the perspective of eternity”) into the philosophical jargon. The term became popular with many later thinkers as a useful way to examine the world as phenomena (as it presents itself to our senses and intellect, distinct from the way it may “really be,” which may be beyond our ability to perceive or imagine) without bias and prejudice.

From the perspective of eternity, all things are of equal value—neither good nor bad, neither beautiful nor ugly, neither important nor unimportant. Such judgments may be made only within limited contexts: by certain people in certain places, situations, and times, based on subjective opinions, beliefs, dogmatic tenets, unprovable axioms, or utilitarian calculations, and not biased by unwarranted optimism or pessimism. From the perspective of eternity, things are just what they are, having no valences or normative values: qualities we associate with them, often without deeper consideration, often leading us to absurd beliefs, thoughts, and ways of life. “The absurd,” Camus wrote, “is born of this confrontation between the human need and the unreasonable silence of the world.”

To live our limited human lives under the assumption of eternity is the only form of freedom available to a human being. This is because freedom of the mind implies making judgments and choices, to the extent possible, by way of reason rather than under the constraints or impulses of opinions (one’s own or others’), superstitions, desires, biases—the very things that become meaningless when considered from the perspective of eternity. “It is the nature of reason,” Spinoza wrote, “to regard things under the assumption of eternity.” Elsewhere, he wrote, “I call free him who is led solely by reason; he, therefore, who is born free, and who remains free, has only adequate ideas; therefore, he has no conception of evil, or consequently (good and evil being correlative) of good.”

The perspective of eternity is polarizing because it implies something profound yet doesn’t suggest an obvious “right” way for us to feel about it. We strive for meaning in our subjective lives, but eternity implies objective meaninglessness. Whatever we do, think, or feel, no matter how profoundly meaningful—the fact that we even existed—will in time mean nothing at all to no one at all. Existence is nihilistic. And we may be so, too. But we don’t have to be. Just as anything we do, think, or feel will someday become meaningless, so will our choice to live meaningfully, even if in error. This may be the hardest thing of all to accept, especially for those who take comfort in rationality: in eternity, even the truth doesn’t matter. To live sub specie aeternitatis is to live in recognition of the bounded finality of our individual existence, with the goal of maximizing subjective meaningfulness within these boundaries, knowing they will ultimately dissolve to nothing, and mean nothing. “For each person,” wrote Derek Parfit, “there is one supremely rational ultimate aim: that his life go, for him, as well as possible.”

The realization of cosmic meaninglessness may lead some—those who fail to realize that within their bounded lifespans they have a choice in the matter—to despair. But, considered differently, it may also yield the greatest sense of equanimity, the greatest urgency in making one’s life meaningful and interesting, knowing that it is short and exactly because it is short. It may be considered as the most liberating notion, freeing one from arbitrary impositions of dogmas, traditions, and expectations, if only in one’s mind. But often, we may choose to free ourselves from such things in body, too: defy, rebel, depart from the herd.

No doubt, there may be risks, perhaps even great ones, in asserting such liberation. But the question should always be on our minds: which is the bigger risk? To spend one’s quota of living moments in safety, in compliance with expectations, comforted perhaps by imaginary hopes, or to live dangerously—at least for as long as one can—as intensely and as (subjectively) meaningfully as one may get away with? Nietzsche had no qualms offering his opinion on this matter. “For believe me,” he wrote, “the secret for harvesting from existence the greatest fruitfulness and the greatest enjoyment is: to live dangerously!”

What may not be obvious is that we get to choose which of these attitudes to adopt in making consequential life decisions: whether to live tacitly (as Thoreau put it) “a life of quiet desperation,” perhaps relying on hope more than (or to the exclusion of) certain experiential rewards for spiritual sustenance, or a life perhaps plagued by suffering and difficulties but underscored with such states of mind as deep awe, profound inspiration, and moments of great insight. One may think there may be a middle ground, a balance, a way of getting “just enough” of both. Perhaps there is. Perhaps this middle ground is only open to some people possessing certain personalities. I don’t know. I just know that for many years I have tried to find it and could not.

Although perhaps not a sine qua non requirement for a life of intense experiences, making and beholding art may fit well with, and greatly enrich, such a life. Hermann Hesse described a life of intense experiences as one out of which “like a precious, fleeting foam over the sea of suffering arise all those works of art, in which a single individual lifts himself for an hour so high above his personal destiny that his happiness shines like a star and appears to all who see it as something eternal and as a happiness of their own.”

Søren Kierkegaard wrote cryptically, “Only when one has thrown hope overboard is it possible to live artistically; as long as one hopes, one cannot limit oneself.” What I think he meant is that to make art and to behold art (at least when one approaches these endeavors seriously) is in a sense also to acknowledge and reify the importance of making meaning within our finite and limited lives. Art exists and is justified in the narrow, limited realm of experiences that are meaningful but not practical. That is a severe limitation, one we’d have no reason to self-impose if the results did not reward us with transcendence of our own practical limitations. Hope may also reward us with such transcendence, but art does so without the need to wager our happiness on future events. Tying together art, life, and the perspective of eternity, Ludwig Wittgenstein wrote, “The work of art is the object seen sub specie aeternitatis; and the good life is the world seen sub specie aeternitatis. This is the connection between art and ethics.”

My message, then, is not one of hope, but in a sense the opposite of that: rather than hope for some imaginary reality, live and strive such that you find meaning, interest, and beauty in the reality you have, even if difficult, even if tragic, even if not obviously beautiful. Where you encounter injustice or suffering, work and take meaning in trying to alleviate or oppose it, but don’t do so for hope—for the sake of taking pleasure in imagining some distant outcome. Take pleasure in how it makes you feel right now to have done the right thing according to your own values. Recognize that others may have their own.

Always keep in the forefront of your thoughts that the times you spend in dissatisfaction and despair, hoping for something better, come out of the same finite pool of living moments given to you as the times you invest in experiencing and contemplating beauty and meaning.

To live sub specie aeternitatis—to consciously and deliberately perceive ourselves and the world, and to consider our choices from the perspective of eternity and our finite, meaningless place in it—doesn’t mean that nothing matters; it means that nothingness matters.

Published on March 20, 2025 15:11

March 8, 2025

Beyond Storytelling


To speak of “reading” a picture is appropriate but dangerous at the same time because it suggests a comparison with verbal language, and linguistic analogies, although fashionable, have greatly complicated our understanding of perceptual experiences everywhere.


—Rudolf Arnheim


Generalizations are dangerous things. Sweeping statements, even if true in some cases (or even in a majority of cases), may belie important nuances and exceptions which, if we fail to acknowledge them, may lead us to incorrect conclusions, leaving us with an incomplete, flawed, or outright mistaken sense of understanding. What prompted this thought was a recent encounter with such a general statement in an article about photography. The statement was this: “photographers are visual storytellers.” Just that. No nuance, no qualification—no “some” or “most” or “aspire to be.” To my dismay, an internet search for that phrase brought up a disturbingly large number of results. I disagree.

In fact, I disagree vehemently enough that I consider this statement, indeed any generalization, potentially harmful, especially if taken at face value by those who have not yet found the form of photography most suitable and rewarding to them, and who may be discouraged by such statements from even considering forms of photography other than storytelling.

I am a photographer. (A professional one, even.) I am not and have no aspiration to be a visual storyteller. In fact, I believe that photography by itself is a very limited medium for storytelling. Rather than the bombastic and overly generalized statement “photographers are visual storytellers,” I propose that a truer and more accurate statement is this: some photographers, if they are also good wordsmiths, may—if they wish—be visual storytellers.

The purpose of storytelling is communication. Communication, to be effective, relies on parity between the information expressed by the author and the information inferred by the recipient. For such parity to exist, a certain degree of shared knowledge and meaning must exist a priori between the person who wishes to communicate a story and the person attempting to comprehend the story.

Communication, in turn, is just one of many goals that may be achieved or aided by photographs or other artistic creations. It is, by a long shot, not the only one. In fact, the goal of much art today is in some ways the opposite of communication: it is to present viewers with ambiguous, open-ended impressions from which they may (indeed, are encouraged to) proceed to conjure their own stories, pose and attempt to answer their own questions, experience their own feelings, and make their own meaning without any of these being overtly imposed or even intended by the artist, which is to say: such works are intended by design not to tell stories, not to communicate accurate, unambiguous information or precise meaning.

Photographs (or for that matter music or other nonverbal media) are certainly capable of inspiring or prompting viewers to perceive, to seek, or to conjure complex and powerful stories. However, I contend that photographs (or other nonverbal media) are very limited in their capacities for telling such stories.

It is the Achilles heel of general statements that all it takes to disprove them is just one example to the contrary. I will offer a couple, to make my point.

One example, not my own (alas, I don’t know its origin) is this: imagine a picture of a burly man striking a small girl in the street. The girl’s facial expression and tears leave no doubt that she is hurt and terrified. What is the story? A bad man hurts an innocent little girl? Perhaps. But what if you were told (verbally) that the man is a physician and the girl his patient, who is deathly allergic to bee stings? Seeing a bee landing on the girl’s shirt, the physician jumped to her rescue and swatted the bee, perhaps saving her life. Certainly, the image told a story, but it could not tell the whole story. Words may have, but perhaps not as powerfully as the visual rendering of the scared little girl. Insisting on using just one medium or the other would have fallen short of telling the story in all its nuances. The power of the image exceeds that of words in eliciting emotions. The power of the words is in communicating precise, unambiguous details. Visual storytelling does not simply mean telling complete stories in images (which I argue is impossible, at least if a story is complex); it implies that images may be used to enhance and increase the power of a story. The conclusion: being a good storyteller is not about the medium you choose; it’s about knowing which medium is best suited to a desired effect.

Now consider the surreal paintings of Salvador Dali. What is the story told by a melting clock or a deformed face? Before attempting an answer, consider my previous statement that stories are a form of communication relying on shared knowledge and understanding of certain words or symbols. Is Dali attempting to communicate something to his viewers with whom he shares a common understanding? Not according to Dali. In his essay “Conquest of the Irrational,” he wrote: “The fact that I myself, at the moment of painting, do not understand my own pictures, does not mean that these pictures have no meaning; on the contrary, their meaning is so profound, complex, coherent, and involuntary that it escapes the most simple analysis of logical intuition.”

How about Picasso’s cubist works? Would you know that his famous 1937 painting depicting an ox, a horse, and deformed, disembodied body parts was the story of a horrific war scene if he did not title it Guernica, or if you did not know what historic event is referred to by this title? The picture does not tell a story; it prompts and challenges interested viewers to research and to construct the story from information found outside the picture.

Consider the haunting photographs of Gregory Crewdson. Ask yourself “what is the story?” then ask someone else to answer the same question, and another. What are the odds that these stories will be consistent and accurate? Accurate to what? There is no one real story. The pictures suggest stories; they challenge viewers to unleash their imaginations, to come up with their own stories, but they don’t tell stories. This is the power of photography intended as art, rather than as a medium for storytelling.

My own goal in photography is not to tell stories. My goal as an artist working in the medium of photography is to create impressions, hoping they may inspire stories—perhaps complex, powerful, even ineffable stories. Certainly, viewers may attempt to unravel “the” story—my story—what I felt and wished to express in my photographs. They may even be successful in this to some degree. But no matter how much a viewer may wish to believe otherwise, there is no way that anyone other than me can infer from, say, an arrangement of rocks or qualities of light in my photograph exactly how this photograph may relate to my mindset and emotional state, how these may have been influenced by the preceding hours of solitary hiking in the desert, the music or the silence I’d been listening to. All are, to me, my story—a story I strove to express not with the hope of communicating it in all its details and nuances, but because it was meaningful to me: because I wished to intensify and to commemorate it, and because my experience was elevated by engaging in creative work—for my own sake. However, I do so also in full knowledge that the photograph will not—cannot—tell my story in all its dimensions—factual or imaginary—to anyone but me. If I felt compelled to share these aspects of my story with others, I would use words.

For nearly two decades I have been studying and teaching the expressive powers of images, investigating and striving to explain to my students the concept of a “visual language.” In my naïve early explorations, I had hoped to discover a means of translating “verbal language” into “visual language” with the goal of relating complex narratives—thoughts, feelings, and, yes, stories: things I could express in words, but using photographs instead. Inspired by Alfred Stieglitz’s idea of “equivalence,” I had hoped that there was sufficient overlap between verbal and visual language: a sufficiently broad range of visual cues that may be perceived as equivalent to verbal expressions, in the same way that two verbal languages may be used to express the same things using different words and expressions. This, I can say now with confidence after having studied the topic both as an artist and as a lay scientist, is not the case.

When it comes to storytelling, visual language may no doubt enhance and intensify stories, but, beyond a certain degree of complexity and desired precision, is no match for the expressive powers of verbal language. By this I mean to say that the more complex the story you wish to tell and the more precisely you wish to tell it (i.e., to ensure that audiences will infer and relate to your story in “specific and known” ways, to borrow the words of Minor White), the sooner you will exhaust the expressive capacities of visual language. On the other hand, some aspects of a story may be expressed much more effectively in visual language than in verbal language—but only some. Consider for example the difference in emotional response between describing a person being mistreated and showing an image of a person being mistreated.

Photographs may be roughly equivalent to, or even superior to, words in making such plain statements as, “this is what a White-crowned Sparrow looks like,” “this person is sad,” “this view is awe inspiring,” or “I saw this happen.” But stories told in words can go considerably beyond such simple statements—to epic novels spanning many characters and prolonged time periods, explanations of complex philosophical ideas, descriptions of elaborate processes, or precise articulation of unobvious scientific theories.

Verbal language has its limitations, too. For example, in some cases where words leave off, the descriptive powers of “mathematical language” may still go further, extending to such concepts as explaining the behavior and properties of quantum particles, and how such particles that defy our intuitive perceptions of “real” may give rise to everything we perceive as real, theorizing spaces having dimensions beyond just height, width, depth, and time, even universes having laws of physics that may be different from those found in the one we inhabit: things that no amount of words can fully relate. “To those who do not know mathematics,” wrote Richard Feynman, “it is difficult to get across a real feeling as to the beauty, the deepest beauty, of nature.”

For photography to tell stories in the sense of communicating information based on a shared understanding, these stories must consist largely of elements having known and predictable connotations arousing predictable perceptions and reactions in a majority of viewers. As such, the best examples of “visual storytelling” I know of generally fall within one type of storytelling: reportage—journalistic accounts testifying to the existence of certain things or the occurrence of certain events the photographer has witnessed. As a creative artist, I find this to be a draconian and unnecessary limitation to impose on my work—draconian because my goal in photography is not to report on the visible world; unnecessary because, when I do wish to tell stories, I have at my disposal not only visual language but also verbal language, which—whether alone or with the aid of photographs—can tell such stories more effectively.

The reason photographs fall so far short of words in telling stories is that visual language is by nature and necessity ambiguous. Beyond very simplistic expression, people’s perception of what a photograph “means,” or the story it may tell, relies on subjective factors as much as on objective, known, universal associations. Simply put, there isn’t—and cannot be—a “dictionary” for visual language. While some visual elements may be perceived without ambiguity, others may assume meanings related to cultural symbols (e.g., the color of one’s flag), one’s experiences, philosophy, or phobias (e.g., the image of a snake may arouse fear or disgust in some, fascination in others). While verbal language is not entirely free of such subjective perceptions, it is far less susceptible to them.

Those who attended my classes may recall this summary slide used in my discussion of visual language, explaining how it differs from verbal language:

Aside from the inescapable ambiguity inherent in visual expression (which increases with the complexity of what is being expressed), the second point—regarding how our brains make meaning from visual information vs. verbal expressions—is crucial. Our visual processing depends on momentary impressions—gut reactions—formed by our brains, which take in a large amount of information from our sensory organs and must distill salient information from it as quickly as possible. Most of this information is discarded—filtered out—according to so-called “rules of Gestalt” before it ever reaches our consciousness. This is very different from the way our brains process verbal information (in specialized structures such as Wernicke’s area and related areas), a process that considers a much larger portion of the information conveyed in words, their order, punctuation, possible meanings in different contexts, etc.

Photographs may certainly enhance stories—make stories more relatable and visceral—but photographs cannot tell stories, at least not ones with any significant degree of depth or complexity, without prompting viewers to go beyond the photographs: to fill in information from other sources and media. Also, photographs may be created without any intent to tell stories, but to create powerful impressions, to arouse emotions, to give rise to aesthetic experiences not intended to communicate any information. To suggest that all photographers are storytellers without further qualification is like saying that all writers are reporters or authors of cookbooks. Photography can tell stories, but it’s not limited to telling stories. It is for each photographer to choose how to employ their medium for their own purposes and according to their own motivations, whatever they may be.

Published on March 08, 2025 14:23

February 21, 2025

On Seeing


Accustom yourself every morning to look for a moment at the sky and suddenly you will be aware of the air around you, the scent of morning freshness that is bestowed on you between sleep and labor. You will find every day that the gable of every house has its own particular look, its own special lighting. Pay it some heed and you will have for the rest of the day a remnant of satisfaction and a touch of coexistence with nature. Gradually and without effort the eye trains itself to transmit many small delights, to contemplate nature and the city streets, to appreciate the inexhaustible fun of daily life. From there on to the fully trained artistic eye is the smaller half of the journey; the principal thing is the beginning, the opening of the eyes.


—Hermann Hesse


My fondest childhood memories are of times I spent roaming the seemingly endless fields around my home rapt in fascination with critters, flowers, interesting rocks, and the occasional relic of past cultures, almost always alone, sometimes with my dog. This was long before the age of mobile phones. I did not own a camera then. Back then, it never occurred to me to make any record of my finds nor to share my experiences with others—not in images, not in writing, not even telling anyone what I saw and felt. The point of being outside was not to bring anything back or to let anyone know what I was up to. The point was to experience—to see, to explore, and to be fascinated by—sensations and discoveries.

Little did I know then how profoundly these early experiences would shape my life in the years to follow in two important ways: they taught me the great pleasure of doing certain things for the sheer sake of doing them, and they rewarded me with a trove of beautiful memories to carry with me along the journey of life. They gave me the skills and sensibilities to find solace, sanctuary, and healing in natural places, both in good times and in times when life was otherwise hard to bear. And, when I was not able to do so in body, they gave me a rich inner world of beauty and memories I could relive at will and could take with me anywhere I went. In a larger sense, they also gave me the absolute knowledge that, even when the future seemed hopeless, I had already lived a deeply meaningful life and could feel grateful for all the wonder and beauty I was fortunate to have seen and felt, which seems to me the most worthwhile state one may aspire to, and without which life—no matter how otherwise pleasant or materially abundant—may seem pointless.

Photographers, more than other people, seem to relish and glorify seeing, sometimes even characterizing their discipline as “the art of seeing.” Still, in my experience, despite so many making such statements, few appreciate seeing in earnest as a source of pleasure and meaning, separate from and independent of “capturing” some of what was seen. So many times, I have heard photographers referring to times spent even in the most sublime of places without making any photographs as “wasted.”

The thought occurred to me recently when out in the desert peering through my binoculars at nothing in particular; just scanning and admiring the vast view and geography before me. So often when I’m out, I stop to point my binoculars at almost anything of aesthetic interest—critters going about their business, backlit grasses, flowers, the sides of mountains, insects covered in pollen just a few feet away—and become so fascinated that for a period whose duration I’m not even conscious of, I forget all else. I lose myself in the wonder of seeing, with no other goal or expectation other than to savor details, colors, movement. For the life of me, I can think of no less appropriate a word to refer to such times than “wasted.”

Very few photographers I know truly savor and appreciate seeing for its own sake, not even as an important and enjoyable precursor to making photographs. One such photographer that comes to mind is Walker Evans. “Leaving aside the mysteries and the inequities of human talent, brains, taste, and reputations,” Evans wrote, “the matter of art in photography may come down to this: it is the capture and projection of the delights of seeing; it is the defining of observation full and felt.” Another such photographer was Henry Peach Robinson. “Art,” he wrote, “is the result, in the first place, of seeing rightly, and, in the second place, of feeling rightly, about what is seen.” Yet another was Minor White who often wrote such quips as, “Watching the way the current moves a blade of grass—sometimes I’ve seen that happen and it has just turned me inside out,” and “I seek out places where it can happen more readily, such as deserts or mountains or solitary areas, or by myself with a seashell, and while I’m there get into states of mind where I’m more open than usual. I’m waiting, I’m listening. I go to those places and get myself ready through meditation. Through being quiet and willing to wait, I can begin to see the inner man and the essence of the subject in front of me.”

Of course, most people don’t share the circumstances of my formative years nor the personality traits that predisposed me to spend so much of my time in solitary communion with nature, watching, listening, smelling, allowing my thoughts to wander, most often for no other reason than to deepen and enrich the quality of my living moments. Most people, alas, are conditioned from a young age to strive for some kind of result to justify their investment of time: something they can share with others; something to prove that their time was not “wasted” in “idleness.” This is a shame.

In my writings and teachings over the years, I have always stressed the rewards of seeing for its own sake—in terms of the quality and meaning of one’s life experiences and memories—and that these are well worth prioritizing well above any photographs or qualities of photographs, and even to the exclusion of making photographs if the act of photographing may distract from or prevent prolonged immersion in a deeper emotional state. After so many years, I regret to say that this advice—perhaps the most important I have had to give as an educator—has also been the one that most of my students (even those I was able to convince) have failed to put into practice. This is, of course, through no fault of their own: it is much easier to form good habits where none yet exist (as I was fortunate to do by mere coincidence) than to unlearn bad ones, which may take great discipline and a lot of practice and may be rife with frustration. And yet, I remain convinced that—hard as it is—developing the capacity to appreciate, to revel in, and to find reward in seeing for its own sake is one of the most worthwhile investments a person can make in themselves, even if they never touch a camera.

Published on February 21, 2025 04:58