What’s the worst-case scenario?
An artificial superintelligence could theoretically escape any control system and trigger a slew of Hollywood nightmare scenarios, ranging from the collapse of the entire banking system to global nuclear war.
An AGI intent on harming us could behave like a computer virus, taking over online markets; like a mental virus, spreading ideas that corrupt our ability to live in peace; or even like a biological virus, rapidly wiping out much of the population.
An AGI might also lack malice entirely and wipe us out by accident, just as we might drive over a flower bed without noticing. The possibilities for extinction at the hands of an unchecked AGI are endless.
However, in this book, we will steer away from speculating on what an evil machine might do. Instead, we’ll focus on what we know best—human nature—and imagine its interactions with AGI.
As Hanlon’s razor wisely advises, we should not “attribute to malice that which is adequately explained by stupidity.” Our focus, then, should be on preventing stupidity.