"We then scrapped it and started over again, many times, until we had a finished scenario that we thought was plausible."
. . .
"For about three months, [AI model] expands around humans, tiling the prairies and icecaps with factories and solar panels. Eventually it finds the remaining humans too much of an impediment: in mid-2030, the AI releases a dozen quiet-spreading biological weapons in major cities, lets them silently infect almost everyone, then triggers them with a chemical spray. Most are dead within hours; the few survivors (e.g. preppers in bunkers, sailors on submarines) are mopped up by drones. Robots scan the victims' brains, placing copies in memory for future study or removal."
lol ok. Then when this doesn't happen, the authors'll be like, "Oh well, of course this didn't happen; you took our recommendations into account and avoided the doomsday scenario."