Vidar Hokstad's Blog

December 10, 2022

Mana Winds

Read the full post on my own site here

We sail the mana winds into interstellar space. We surf the bursting magic of the stars. ...

Read the full post on my own site here

Published on December 10, 2022 16:00

July 1, 2021

Galaxy Bound #2 Out now

Galaxy Bound (Sovereign Earth, #2) by Vidar Hokstad

The Centauri Gate has opened, and humanity has taken its first steps into the unknown.

When an Earth freighter is destroyed in Centauri space, Zara Ortega and the crew of the Black Rain are sent to investigate, accompanied by an Earth military officer and a Centauri diplomat.

It’s apparent something is wrong, and when they chase down the leads, they end up facing off against heavily armed warships that should not be there.

Soon they are in another desperate race to survive. They seek shelter in a Centauri colony near Proxima. But those following them mount a brutal assault, and they pay a high price to flee.

They need to get word out, or it means war.
Published on July 01, 2021 23:46

March 13, 2021

The Universal Logic Bomb

Read the full post on my own site here

"We don't know at all either the spatial or temporal extent of whatever we actually live in, if we exist as more than shadows of simulated characters. Maybe the entirety of existence is just this moment in your mind replayed over and over."

I was used to the professor suddenly blurting out things I didn't understand, but this was extra odd, and against my better judgment I replied:

"What do you mean?"

"Have you heard about the Simulation Argument?" He said.

I shook my head.

"It's an argument from statistics that if a certain set of condition holds true, then we most likely live in a simulation."

"You mean... we'd be characters in some computer game?"

"Pretty much, yes. The idea that we can't trust our senses is very old. After all, if you can't trust your senses, how would you determine anything about reality?"

...

Read the full post on my own site here

Published on March 13, 2021 03:56

March 12, 2021

Window

Read the full post on my own site here

Tove often pondered what reality really was. Maybe it was true that we are all in a simulation?

So when a window opened in front of her, hanging in thin air, it oddly didn't bother her so much as it made her think "about time".

...

Read the full post on my own site here

Published on March 12, 2021 16:00

January 26, 2021

Interviewed by MasonSpeaks

You can find an interview with me here. I had a very nice, interesting conversation with Mason. Check it out, and like/subscribe.
Published on January 26, 2021 09:55

December 16, 2020

" Amazing! I totally love this!"

First review of The Year Before the End on US Amazon:

5 stars

"I wasn't sure about this but decided to give it a chance as I love first contact stories and real hard sci-fi. And yes, it was way better than so many I have tried, so I have to recommend it."

Read the rest
Published on December 16, 2020 15:53

December 14, 2020

Interview

Published on December 14, 2020 14:18

November 7, 2020

Nesting

Read the full post on my own site here



Professor Greene spread his wings while he brushed six fingers over his facial hair. I did not dare disturb him while he was thinking, and waited anxiously for his feedback.

"You're right, this does pose a problem. The simulation will grind to a halt if they keep this up."

We looked at the footage of the simulated people in their simulated lab as they were coming to terms with the simulated complexities of building a simulated simulation just like the simulation they were in.

The problem was obvious to both of us. All our careful optimization worked through layers on layers of "dirty hacks". We cared about observing and learning from their behaviour, and so we did not need every little physical detail of the simulation to be precise.

The simulation carefully tracked what they observed, and discarded all state we could tell they could not keep track of. This again allowed us to massively prune what we simulated accurately vs. what we replaced with crude approximations.

Weather, for example, is so inherently complex that we could generate randomly perturbed patterns from crude, cheap-to-compute models that matched their expectations of complexity. We only needed to fill in the detail in narrow zones around simulated weather stations, and to a lesser extent around simulated people.

"As far as I can tell their simulation logs enough data that there's no obvious way for us to just synthesize predictions. If something looks out of place, they'll try to trace it, and they'll spot discrepancies."

Professor Greene had started pacing.

"It'll ruin all our work! It's already a massive resource drain to simulate their damn computers, but this is on another scale entirely. It'll slow the simulation down orders of magnitude."

We simulated their computers as much as possible by "lifting". We translated their programs into code that could run on our computers.

All of this was heavily sandboxed and firewalled, of course - we didn't want to risk an "escape": anything that would let their code detect details of something outside their simulated reality.

We'd "tap" the inputs of their computers, send them to the translated programs running on ours, and feed the outputs back. To them it looked like their computers did the work, but we just simulated the power drain and heat they'd cause, and the people running them were none the wiser.

Their own computers didn't actually do anything unless they hooked up diagnostics equipment. In those rare cases we'd let them work, while they observed. The system automatically detected this. It was quite a clever bit of code.

But simulating the kind of large-scale simulation they were setting up... that was another thing entirely.

"You don't think they suspect?" Greene asked.

"Well, they have come up with a quite reasonable understanding of simulation," I told him. "But it's still seen as fringe. We've checked their communications. They're not doing this to test if they're themselves in a simulation in any way. They're motivations are pretty much the same as ours at this point. They want to understand consciousness, just like us."

"Whatever their motivations, we need to figure out a workaround."

Greene put on his glasses.

"We'll continue tomorrow. It's late."

He went outside, and I could see him fly off towards his nest through the window.

I wasn't ready to give up yet, and spent the evening probing and prodding our simulation to see what I could come up with.

I don't know what time I fell asleep. Only that it was bright outside by the time Professor Greene tapped me on the shoulder and woke me up.

"I think I have a solution," was the first thing I said to him.

It was not clear to me when it had come to me. It might have been the previous evening as I was going through our code. It might have come to me in a dream.

But it was a logical extension to our growing pile of hacks.

"Do tell," he said, with a smile.

"The problem is they're logging extensive data about everything, right? Full traces?"

"Yes, that's the biggest problem. The complexity of faking the data is one thing, but cross-correlating it and fixing all their logs and all the dependent data, and even their memories if any of them remember details would be almost as complex as allowing them to run their simulation itself."

"But we do the same thing! We have extensive logs as well. And since the simulated world is based on ours, everything is a relatively close match. After all, we want to understand our world."

"Are you suggesting?"

He paused.

"You're suggesting we feed them our logs?"

"Exactly! We just perturb them a bit to prevent it getting too recognizable."

He thought it over and gave it his approval.

It was an elegant solution, I thought. We'd train one of our detectors to recognise attempts at running a large-scale universe simulation. It would alert us if they tried running anything unusual, but assuming the simulation matched the parameters of our own, it'd add a patch - bypassing the slow, low-level computer simulation, and even the faster dynamic translator, and instead start feeding them traces from our own recordings of our own simulation of them.

They'd look at their data, and think it was their simulation, but instead they were looking at their real lives.

There were many details to figure out, but we'd caught them early - they still had lots of coding to do. And of course we could arbitrarily slow down the simulation if necessary.

We were apprehensive when they first switched their simulation on, but also certain that they'd write off any discrepancies in the results as bugs, and give us a chance to address them.

We were right.

"Congratulations," professor Greene said as we were watching them in their simulated lab, watching the output from our/their simulation, and celebrating our/their success. The scene was so familiar. Just how we had celebrated when we first got everything running.

He held out his hand, and I grabbed it. He rarely gave praise, so when he did, it meant a lot.

I watched them long after he had flown home, before I too flew to my nest.

We observed their simulation exercise less and less often over the coming weeks and months. The initial urgency was over, and we had a big world to monitor, and research papers to write about what we were learning.

One morning the alarm went.

The one we had set to trigger if something abnormal was happening with their simulation project.

When Professor Greene joined me, I had already determined the cause.

"Our latest speedups... They mean the simulation will be a bottleneck."

We'd had a whole team spend months on new hacks to accelerate the next version of the simulator, so it'd now on average run at about twice "real time" - a standard 26-hour day would be simulated in a little under 13 hours.

Some ingenious engineering, coupled with a grant for more computing power, meant we hoped to eventually get it down to less than 3 hours per day.

But in our excitement we'd forgotten their simulation.

"They're rapidly catching up to our logs. When we run out, we have to slow everything down or let their simulation actually run whenever it overtakes. Which will also slow everything down."

"We can't do that. We'd have all kinds of questions about why we spent that time and money only to be unable to speed things up. It defeats the entire purpose of the new grants."

Professor Greene rarely shouted like this.

He paced for a bit. Then he brought up some notes and looked them over.

"But we might have a solution. The new Eraser module."

"You mean? But it'll waste a lot of effort too."

"You're right, but their simulation project isn't important to us. We'll need to verify, but I think the impact on their society will be minor enough to be acceptable."

We agreed to meet again a couple of days later, when we'd had a chance to work through the implications.

"It all checks out," I told him. "We can delete the lab and the lead scientists with relatively small impact."

The Eraser module was a new, last-resort cleanup tool. It'd trace other people's memories and observations of someone in the simulation and "smudge" them into a generic memory of non-specific people, making the simulated person we deleted disappear not just from the simulation, but from the memories of the other simulacra.

They'd have just a vague recollection of someone they'd be unable to place.

We'd erase the whole lab, wipe their simulation, and hope it'd be a long time before someone else would come up with the idea. It'd save us a whole lot of trouble. But they'd be gone.

I'd not paid attention to what they were up to in there for a while, but it felt a bit weird. After all I'd spent a lot of time watching these simulations and their lives. I told myself it was okay. After all they were not real.

"Let's do it," professor Greene said.

I typed a command and watched the ripple through the simulation as the lab disappeared and with it professor Browne and his assistant, right in the middle of dealing with the same problem we had been. Reduced to some data in our logs.

---

"Let's do it," professor Reede said.

I typed a command and watched the ripple through the simulation as the lab disappeared and with it professor Greene and his assistant, right in the middle of dealing with the same problem we had been. Reduced to some data in our logs.

---

"Let's do it," professor Jaune said.

...

...

Read the full post on my own site here

Published on November 07, 2020 16:00

October 4, 2020

Tell, don't show

Read the full post on my own site here



From the department of "oh, no, he's avoiding his editing again"

I know a few of you - authors, English literature students, and a few others - might have gotten close to choking on something when you saw "tell, don't show" since you've been told the opposite over and over.

Or maybe you're angrily firing up Twitter just from the title (come at me!)

Ironically, I've shown you, rather than told you about, one of the oldest tricks in the book: make a controversial statement to pull you in, only to dial it back in the actual post.

Yeah, sorry, clickbait (I considered, but couldn't bring myself to use, a title along the lines of "13 weird reasons you should tell, not show, number 7 will amaze you" - merely typing that out here made me feel dirty).

But I'm serious. Somewhat at least.

I got to thinking about this Sunday morning, in bed, because I was trying to figure out how to wrap up editing of The Year Before the End, and particularly how I was procrastinating like an absolute champ last week.

...

Read the full post on my own site here

Published on October 04, 2020 16:00

September 28, 2020

Causal Boom

Read the full post on my own site here



(this was sent to my mailing list first; sign up below if you want more like this, along with other updates)

"Light in a vacuum moves at the speed of propagation of causality," he said.

I looked back at him with a blank expression. I had no idea what that meant.

We were walking past the accident site towards the tents that had been housing the investigation.

...

Read the full post on my own site here

Published on September 28, 2020 16:00