An Ambiguous Utopia
This is the subtitle of Ursula Le Guin's The Dispossessed.
It's stuck with me because, apart from my being a major fan of her writing in general, it's rare to see an author be explicit about the ambiguity that so often exists in far-future settings that seem to be utopias.
But that ambiguity is all over the place. My own soon-to-be-published novels are, in some ways, set in one, and I'll briefly discuss my setting at the end. But first, I want to give examples from one of my other favourite sci-fi settings, one that its author has described like this:
CNN: Would you like to live in the Culture?
Iain M. Banks: Good grief yes, heck, yeah, oh it's my secular heaven.... Yes, I would, absolutely. Again it comes down to wish fulfillment. I haven't done a study and taken lots of replies across a cross-section of humanity to find out what would be their personal utopia. It's mine, I thought of it, and I'm going home with it - absolutely, it's great.
Iain M. Banks' Culture series on the surface seems like a near-perfect utopia.
Sure, there are issues around the fringes: how Contact and Special Circumstances deal with the civilizations surrounding them, much the same way Star Trek has to deal with surrounding species that are sometimes a threat, sometimes need help, and where working out the right response is a problem in itself.
But The Culture itself represents a very specific view of the future that I argue makes it an ambiguous utopia.
The first thing that comes to mind is that The Culture represents a society that has explicitly rejected "subliming", a sort of ascension to a state beyond material bodies. They've embraced a certain level of development and then gone "here but no further".
They're effectively luddites.
This is easy to ignore because they're so much further advanced than us, but these are people who have decided to fundamentally put the brakes on advancement.
But it's not just subliming.
We learn of the existence of virtual reality technology sufficient to simulate whole brains and place them in virtual environments, yet The Culture keeps building vast, wasteful physical structures instead of trivially simulating environments that could be far more dramatic, with far less impact on the surrounding world.
Surely this is fine - they're all happy and content, right?
To an extent, but we also learn that most Culture citizens choose a lifespan of only a few centuries.
Of course this can be argued to be a necessity, or they'd run out of resources.
And there is that ambiguity.
We could argue that maybe they genuinely don't want to live any longer anyway, but isn't that itself an indication of ambiguity? Their world is not interesting enough for people to crave going on living.
Note that I'm not arguing that these are flaws in Banks' writing - they're part of what makes this world interesting.
Writing about a perfect utopia and making it interesting is hard.
Indeed, even with the issues I raise, most of Banks' stories happen on the "fringes", where The Culture interfaces with the rest of the universe, much the same way we learn frustratingly little about the internals of the "stable" parts of the Federation in Star Trek, and mostly see its fringes.
When I started writing my own novels, I started describing something that on the surface is pretty damn amazing for a sci-fi geek who likes big space stations and big fancy spaceships (spoiler: it has both).
But let me describe some of the ambiguous aspects of my own world, and relate it to Banks'.
The Singularity

The first thing I want to point out is the lack of a technological singularity.
The idea of a point in time where technological growth becomes so fast that we can't make meaningful predictions past it is one I consider likely, given the advances I see in my day job at a venture capital firm today.
Both The Culture and my own series reject that notion. Why would I do that if I believe it's likely to happen? Well, the problem from a writer's point of view is that the changes are unforeseeable: any attempt at creating a world around a singularity would be incredibly challenging to write in the first place, and far worse to make relatable.
For me that means my books will only include relatively limited artificial intelligences, because it seems unreasonable to me that we would develop a general artificial intelligence ("general" here meaning one that can reason in a manner comparable to a human), like the Minds in The Culture, without shortly afterwards reaching a singularity.
The reason for this is simple: if you develop a general artificial intelligence, it can be copied. Which means you can teach it all about how to build an artificial intelligence, copy it however many times you can afford (computing resources could be costly), and set it to work 24/7 without breaks on reducing its own cost and improving itself.
Many would object to taking the risk of doing that, but it only takes one person in possession of a copy of a general artificial intelligence willing to do so before it's out of our hands.
Progress might well be slow at first, but you have to consider how rapidly it would compound.
Let's say you identify a 1% improvement every month. Even though finding further improvements might get harder, the intelligence of the minds working on it would also improve each time.
In a year that compounds to roughly a 12.7% improvement. After 10 years you'd be at about 3.3x where you started; after 20 years, about 10.9x.
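Those figures are just compound growth. A minimal Python sketch (nothing more than 1.01 raised to the number of months) to show where they come from:

```python
# Compound effect of a 1% improvement per month.
def compounded(rate_per_month: float, months: int) -> float:
    return (1 + rate_per_month) ** months

for years in (1, 10, 20):
    factor = compounded(0.01, years * 12)
    print(f"{years:>2} yrs: {factor:.2f}x the starting capability")

# Prints:
#  1 yrs: 1.13x the starting capability
# 10 yrs: 3.30x the starting capability
# 20 yrs: 10.89x the starting capability
```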
But let's say you take that pool of 1% improvers and double it (or split the initial pool in two - it'll soon catch up, bear with me).
Half work on improving intelligence. The other half works on improving efficiency, be it of software or hardware, so that for the same cost, each month you can afford 1% more capacity.
Where before you had 1000 "minds", after 10 years you'd have ~3300 minds.
You'll note this growth rate is far below the historical growth rate of computer performance.
Yet, while improvements in minds do not translate linearly into speeding up completion of a project, it's reasonable to assume that a 3.3x increase in the number of minds would increase the rate of improvement in intelligence, and that the increase in intelligence would in turn increase the rate of efficiency improvements.
So we'd expect this to compound far faster, and accelerate off into a level we realistically can't predict.
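To make that feedback loop concrete, here's a toy simulation - my own illustrative model with made-up parameters, not anything from Banks or from my setting. Each pool contributes a base 1% gain per month, amplified by the other pool's progress so far:

```python
# Toy model of the feedback loop: more capacity speeds up intelligence
# research, and smarter minds make capacity cheaper. All parameters are
# invented purely for illustration.
def months_until(threshold: float, base_rate: float = 0.01) -> int:
    intelligence = 1.0  # relative capability per mind
    capacity = 1.0      # relative number of minds you can afford
    month = 0
    while intelligence < threshold:
        intelligence *= 1 + base_rate * capacity  # more minds -> faster research
        capacity *= 1 + base_rate * intelligence  # smarter minds -> cheaper capacity
        month += 1
    return month

for threshold in (10, 100, 1000):
    print(f"{threshold:>4}x capability reached after ~{months_until(threshold)} months")
```

Under these invented parameters the model runs away within a decade or so of simulated time; the exact numbers don't matter, only the shape of the curve does.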
As a result, to me, the development of super-intelligent AIs seems to lead necessarily to a technological singularity.
Of course, it is possible that the AIs will behave like Banks' Minds and decide not to really change things all that much - part of the challenge of a technological singularity, after all, is that we can't predict what will happen next.
But it feels unlikely (even though it makes for great reading - I love the Minds in The Culture), and because I've wanted a "harder" style, I've avoided them in my setting. In doing so, though, something needs to have gone wrong in the past that has basically brought AI development to a standstill, in very specific ways, to fit with what we know works today.
I haven't quite decided how to resolve that yet.
Virtual Reality

The second point is that it is relatively reasonable to assume that virtual reality technology, coupled with AI technology, will at some point get us to a level where we can simulate brains in an environment that can be shaped however we want.
Banks introduces that, but most of The Culture still lives outside it. Or do they? We can't really tell.
This to me is a challenge, because on one hand I could see people rejecting it; on the other hand, such a world would provide wonders beyond imagining.
For my world, I've similarly stunted VR, because I didn't just want to handwave it away as something people had opted out of. Coupled with the AI restrictions above, this is easy to explain: my world doesn't have the tech to simulate a full, conscious mind, so VR is closer to, say, Ready Player One's suits - a diversion, but not a replacement for having a body.
Lifespans

One of the things that to me makes The Culture feel like an ambiguous utopia is lifespans. Never mind uploading minds or "subliming": in the Culture, Banks makes it clear that citizens for the most part choose not to live past a few centuries.
On one hand that does seem utopian - people live as long as they want.
On the other hand, to me this inherently feels strange: a society as advanced as The Culture that has not found a way to make people want to live longer?
It's a complicated issue. Maybe people just feel "ready" at some point? Yet other species do not - they sublime, or have digital afterlives where they keep on living. The Culture is full of quitters!
At the same time, it is framed as a choice. Of course, those same people could go out and seek whatever it is that makes other people want to keep living. So why don't they?
Part of the challenge, presumably, is that if you want to posit a society where people live nearly forever, you need to write something very different. You'll be writing about a society with few to no children relative to people far older, for example. Interestingly, there is a tendency in fiction - with a few exceptions - for societies with near-immortality to become passive, contemplative and so on.
Would that be true? I don't know. We have some counter-examples, where sheer boredom makes such characters act crazy, as in Michael Moorcock's The Dancers at the End of Time, an amazing and hilarious time travel cycle.
The eyes of the beholder

What makes most such ambiguous utopias utopian tends to be our vantage point.
Certainly that is the case for my setting for Galaxy Bound:
Technology allows for amazing space travel and augmentation of body and mind, and you can travel and see incredible things.
It's easy to get dazzled. I want to try to include descriptions that make it seem like fun.
But on the other hand, the edges are frayed. Some I've mentioned: AI has clearly hit a dead end. The singularity hasn't come. People still have jobs rather than just gallivanting around the universe because they feel like it.
Part of the point is to dazzle with enough adventure and exciting tech that at least the geeks among us are prepared to ignore that these are people who keep putting themselves in crazy, dangerous situations and live in uncomfortable quarters on a cramped converted freighter - and still kind of want to be there.
My setting tends much less towards an actual utopia than The Culture, and that is on purpose. Partly, I'll admit, out of fear that I'd struggle to create tension and conflict in a world like Banks'.
But at the same time, even just writing that, I'm reminded of Banks writing about the Idiran War and casually describing more than 850 billion casualties.
In the setting of The Culture, this is a tiny little proportion of people. And yet someone could write stories set in those decades of war for a whole career and just scratch the surface.
Ultimately, what makes a utopia is in the eye of the beholder: what we focus on.