An existential risk is anything that could cause the extinction of humanity. A catastrophic risk is one that could cause our near-extinction, or destroy human civilization as we know it. Existential and catastrophic risks are often divided into human-induced risks and natural phenomena.
Natural phenomena include large asteroids hitting the Earth, and solar flares or coronal mass ejections intense enough to knock out our power grid, destabilize nuclear cooling stations, and ultimately render the biosphere uninhabitable and acidify the oceans.
Although natural risks are important to look at, Daniel Schmachtenberger is more concerned about human induced existential risks.
These include man-made climate change, which can cause agricultural failure, mass migration, and ultimately state failure and war. They also include the risks from nuclear weapons and from exponentially growing technologies like AI, biotechnology, and nanotechnology.
Daniel Schmachtenberger claims that an underlying problem behind many man-made existential risks is our collective loss of the ability to make sense of the world. With the rise of fake news, scientific research funded by corporate interests, and technologies evolving faster than we can keep up with, it’s getting harder and harder to understand what is true and what is important.
“Exponential tech increases our ability to affect things, but not the quality of our choice.”
Designing Post-Capitalist Systems
We have a system that runs on competitive advantage, both in national markets and at the global level. This system is based on win-lose game theory, which incentivizes actions that give people and entities a competitive advantage, rather than actions that benefit humanity as a whole. This leads to pharmaceutical companies wasting time and talent repeating research (or not doing it at all if it’s non-patentable) while people die, and to the world holding enough nuclear weapons to wipe out the planet more than 10 times over.
Far from seeing this as hopeless, Daniel reminds us that most of what we think of as “flawed human nature” is actually just social conditioning. And conditioning can be changed, or redirected with properly placed incentives.
The fundamental question Daniel is trying to answer is this: how do we design a system within which all incentives are properly aligned, and we start valuing the things around us for their systemic rather than differential value?
“We have a system that attracts, incents and rewards pathological behavior.”
In This Episode of Future Thinkers Podcast:
- The loss of sense making
- Why do people spread intentional disinformation?
- What is the value of a tree?
- What do we do with the shitty jobs?
- Human nature
- We’re the tentacles of the universe!
- I don’t exist outside the context
- The power and wisdom of Gods
- Are we entering a new age?
“Shit, the whole evolutionary process resulted in me.”
“If you’re scaling towards the power of gods, then you have to have the wisdom and the love of gods, or you’ll self destruct.”
“The real existential risk is a loss of the ability to make sense of the world around us: what is worth doing, and what the likely effects of things will be.”
“We do have an innate impulse towards agency, towards self actualization. Within a win-lose game structure, that will look like a competitive impulse. But within a win-win structure, that will look like the desire to go beyond my previous capacity.”
Mentions and Resources:
- The Grey Goo Scenario
- Capitalism is a paperclip maximizer
- Barbara Marx Hubbard
- I am, because of you: Further reading on Ubuntu
- The Choice is Ours (documentary)
- Zeitgeist Addendum (documentary)
- Paradise or Oblivion (documentary)
- Resource Based Economy
- Future of Life Institute
- Centre for the Study of Existential Risk