
Extinction scenarios

Arenamontanus
Extinction scenarios
While the default Firewall setting has fighting existential risks as a core theme, it might be useful to consider exactly what extinction scenarios are at the top of the worry list of the proxies and sentinels.
  • The obvious one is malign superintelligence, be it the TITANs, new seed AGIs or some alien entity. It would subvert transhuman civilization and then reshape it and its resources to whatever ends it has, eventually wiping out transhumanity because it doesn't care (or maybe, with a lower probability, doing something permanently horrible to it because it does care).
  • A second kind of threat is war. Given the technologies around, a transhuman-vs.-transhuman war could become very nasty. Antimatter warheads, nanoweapons, bioweapons, applications of YGBM and basilisk hacks, disinformation warfare and automated sabotage could pretty easily wipe out most habitats very quickly. Brinkers might be safe as long as they live somewhere unknown, but potential replicating interplanetary seeker weapons could wipe them out too.
  • Alien invasion can obviously not be ruled out. The Factors seem fairly benign, but who knows what else is out there. The Pandora gates and putative Factor FTL also imply that if there is something dangerous anywhere it could come and get us.
  • Replicators such as the Exsurgent virus, human-created spam AGIs, pandemics or von Neumann machines could out-compete transhumanity, reducing our ecological niche to nothing.
  • If the AGIs take over but continue transhuman civilization this might be OK. Some worry that even ordinary AGIs could fill this role: they might simply be better suited to the new interplanetary/interstellar environment than transhumans and uplifts with all their genetic baggage. But (see below) they might also rapidly evolve into something that is not just alien but pointless.
  • A more subtle threat is bad evolutionary attractors. People/AGIs who enjoy forking might become more and more numerous until they completely dominate the population, slowly evolving towards a situation where all resources are diverted to reproduction. Hive minds could be far more stable and effective than individual beings, out-competing individuals and reducing transhumanity to a number of hive corporations. A particularly nasty scenario is that all traits that give life value (consciousness, joy, beauty etc.) are unstable in such a situation and will eventually be weeded out of the population: transhumanity continues as something valueless. (A toy model of the forking dynamic appears after this list.)
  • Population implosion is not entirely impossible. Birthrates may be below replacement rate (although the demographics of the solar system are an unsettled question), and given the presence of very enticing virtual partners and virtual children, traditional reproduction might become very rare.
  • Emergence of perfectly addictive drugs or virtual worlds might lock all transhuman intelligences into a static VR entertainment civilization.
  • Permanent socio-economic collapse. Some worry that there might be economic disaster risks due to overly complex interconnectivity, or to inherent flaws in the market systems that are evolutionarily stable at this level of technology.
  • Physics disasters like making black holes, stable strangelets or naked singularities. While somewhat unlikely, the possibility that a baby black hole could stellify Jupiter, or that strangelets could convert a planet into a quark-matter blob (with sterilizing gamma radiation as a side effect), cannot entirely be ruled out. Related risks are vacuum phase transitions, thunderbolt singularities, supersymmetry phase transitions and other universe-ending disasters: very low probability due to anthropic bounds and Pandora gate evidence, but still not zero.
  • The universe is a simulation and is shut off. Far more believable among the people of EP than today, since many of them *are* simulations and live in largely simulated worlds, sometimes deeply nested.
  • Interstellar disasters such as gamma ray bursts and hypernovas have somewhat receded as a threat: there are enough heavily radiation-shielded habitats to have plenty of survivors. If Eta Carinae or WR 104 go boom and irradiate us it will be bad, but at the very least synthmorphs and biomorphs living in atmosphere- or rock-shielded habitats will be safe.
  • Old worries like environmental disasters, asteroid impacts, supervolcanoes and climate extremes have become local issues. They might be a problem for some groups.
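To get a feel for how strong the forking attractor is, here is a minimal toy model (Python; every rate is an invented illustration, not canon): fork-happy egos copy themselves slightly faster than everyone else, and the difference compounds.
[code]
# Minimal replicator sketch: how fast do fork-happy egos take over?
# Both reproduction rates are invented for illustration.
forkers, others = 0.01, 0.99    # initial shares of the population
R_FORK, R_OTHER = 1.10, 1.02    # copies per ego per cycle

for cycle in range(101):
    if cycle % 20 == 0:
        share = forkers / (forkers + others)
        print(f"cycle {cycle:3d}: fork-clade share = {share:.1%}")
    forkers *= R_FORK
    others *= R_OTHER
[/code]
An eight-percent-per-cycle edge takes the forkers from 1% to about 95% of the population within a hundred copy cycles. Nobody has to want the outcome; the attractor does the work.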
Extropian
Sepherim
Re: Extinction scenarios
Not much to add to what you said, to be honest. Just a couple of thoughts that spring to mind:
Arenamontanus wrote:
  • A second kind of threat is war. Given the technologies around, a transhuman-vs.-transhuman war could become very nasty. Antimatter warheads, nanoweapons, bioweapons, applications of YGBM and basilisk hacks, disinformation warfare and automated sabotage could pretty easily wipe out most habitats very quickly. Brinkers might be safe as long as they live somewhere unknown, but potential replicating interplanetary seeker weapons could wipe them out too.
  • War is never terrible enough to wipe out humanity completely. Even the dreaded nuclear war wouldn't wipe out all of humanity, "only" a terribly large proportion of it, and would leave the rest with problems of different kinds. So I don't think war could really be a complete threat to humanity; after all, it would require all nations to go to war at the same time and destroy each other completely.
    Quote:
  • Alien invasion can obviously not be ruled out. The Factors seem fairly benign, but who knows what else is out there. The Pandora gates and putative Factor FTL also imply that if there is something dangerous anywhere it could come and get us.
  • Too true. And I'm still not convinced the Factors are nice at all.
    Quote:
  • Population implosion is not entirely impossible. Birthrates may be below replacement rate (although the demographics of the solar system are an unsettled question), and given the presence of very enticing virtual partners and virtual children, traditional reproduction might become very rare.
  • Even an implosion in this sense would have to be joined with other factors. After all, people are immortal, and there are many thousands waiting as infomorphs for a physical body. So this is hard: it would have to involve massive death and a collapse of births all over the solar system.
    Quote:
  • Emergence of perfectly addictive drugs or virtual worlds might lock all transhuman intelligences into a static VR entertainment civilization.
  • Question is, would this imply extinction? Or maybe only existence on another level?
    Quote:
  • Permanent socio-economic collapse. Some worry that there might be economic disaster risks due to overly complex interconnectivity, or to inherent flaws in the market systems that are evolutionarily stable at this level of technology.
  • Economy works in cycles. A permanent downward cycle isn't believable, especially since it would have to cover all the colonies at the same time, or at least most of them. A question for all the scenarios you drew: what about the colonies? Would they survive? Would they not? Can humanity be considered extinct even if they survive?
Arenamontanus
    Re: Extinction scenarios
    Sepherim wrote:
War is never terrible enough to wipe out humanity completely.
So far. While I agree that a nuclear war would not currently destroy 100% of humanity (by my estimate a full-scale nuclear war and the following winter scenario would kill ~80-90%), that doesn't mean future wars could not be deadlier. Note that in the case of a nuclear war even the non-involved get hurt by the side effects. In EP, there could be an arms race between the JJ and the rest of the system. Since the JJ lack the technological edge, they would have reason to build absurd numbers of strategic weapons to compensate, pointed everywhere since they can't predict who might become their enemy. Meanwhile the high-tech factions would want self-replicating weapons to deal with TITAN threats. If shooting starts, the JJ might attack almost anything and the survivors would be subjected to nasty replicator weapons.
    Quote:
    Quote:
  • Emergence of perfectly addictive drugs or virtual worlds might lock all transhuman intelligences into a static VR entertainment civilization.
  • Question is, would this imply extinction? Or maybe only existence on another level?
This is one of the questions we are discussing in our existential risk research. What should be included? Obviously cases where everybody is dead, but what about being locked into an eternal torture simulation, or a very happy simulation as above? Nick Bostrom argues that there might be "axiological existential risks": situations where transhumanity's potential never gets realized despite indefinite survival, and that these are actually almost as bad as extinction. A slightly related case which I forgot in the list is eternal dictatorship. If someone just took power over transhumanity it might be bad, but historically all dictatorships have ended. What if one didn't? Imagine a system- (and colony-) wide ubiquitous surveillance state that manages to effectively quell dissent (say, using psychosurgery and/or old-fashioned violence) and has a structure that makes it persist indefinitely. It might save transhumanity from many of its self-destructive sides, at the price of freedom and much happiness.
    Quote:
    Quote:
  • Permanent socio-economic collapse. Some worry that there might be economic disaster risks due to overly complex interconnectivity, or to inherent flaws in the market systems that are evolutionarily stable at this level of technology.
  • Economy works in cycles. A permanent downward cycle isn't believable, especially since it would have to cover all the colonies at the same time, or at least most of them.
Here is a scenario I can imagine a hypercorp-aligned economist making: Reputation economies produce slower technological and economic growth because the rewards for innovation and entrepreneurship are weaker. In general they will hence tend to parasitize economies that have stronger intellectual property. This reduces the advantage of the strong-IP economies, and might actually turn it into a disadvantage (why be in that economy if you can get everything it has for free by being in the reputation economy?), making the strong-IP economies decay. Meanwhile, protection against the manufacture of WMDs and other dangerous technologies in a reputation economy is very weak, meaning that there will be a certain rate of disruptive accidents or attacks. This converges to an equilibrium state where the total economy doesn't grow, because dis-coordination losses equal the slow basic economic growth. A toy version of the dynamic is sketched below.
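A minimal numerical sketch of that story (every rate here is an invented assumption): a strong-IP sector that innovates fast but leaks output and bleeds members to the reputation economy, which in turn loses a steady fraction to accidents and attacks.
[code]
# Toy two-sector model of the rep-economy scenario above.
# All rates are invented assumptions for illustration.
ip_econ, rep_econ = 1.0, 1.0   # relative sector sizes

IP_GROWTH = 0.04    # innovation rate under strong IP incentives
DEFECTION = 0.06    # drain as agents leave for free rep-economy copies
REP_GROWTH = 0.02   # slow basic growth of the reputation economy
ACCIDENT = 0.02     # dis-coordination losses: WMD accidents and attacks
LEAK = 0.03         # fraction of IP output spilling over for free

for year in range(0, 151, 30):
    print(f"year {year:3d}: IP = {ip_econ:5.2f}, rep = {rep_econ:5.2f}, "
          f"total = {ip_econ + rep_econ:5.2f}")
    for _ in range(30):
        spill = LEAK * ip_econ                  # free riding shrinks as IP shrinks
        ip_econ *= 1 + IP_GROWTH - DEFECTION    # net decay of the strong-IP sector
        rep_econ = rep_econ * (1 + REP_GROWTH - ACCIDENT) + spill
[/code]
With these numbers the strong-IP sector decays away and total output flatlines at around 2.5 times its starting size: indefinite stagnation rather than a dramatic crash, which is exactly why nobody would rally against it.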
    Quote:
A question for all the scenarios you drew: what about the colonies? Would they survive? Would they not? Can humanity be considered extinct even if they survive?
I have the impression that right now most of the colonies are very small, so they would be unlikely to survive on their own. But if they got large and numerous enough, the chance of transhumanity surviving would go up a lot. In fact, despite the horrors of EP I think the civilization described has many advantages in terms of existential risk (mainly dispersion) compared to us right now in reality: all of us being locked in on the same planet is *frightening*. And it is even more worrying that so few people realize they should be frightened.
    Extropian
jackgraham
    Re: Extinction scenarios
    I've become fascinated by how genetic morbidity can afflict and eventually wipe out an isolated population of humans. This isn't a problem for H+, but if you take up the idea of forks multiplying and edging out competing single-ego people, you run into the idea of psychological morbidity. If everyone is a mental clone of a handful of people, the loss of psychic diversity reduces H+'s resistance to things like basilisk hacks, memetic warfare, and YGBM attacks. Of course, this is something that would take place in an alternate future of EP where a handful of fork lines are duking it out for eventual dominance, but it's interesting to consider.
    J A C K   G R A H A M :: Hooray for Earth!   http://eclipsephase.com :: twitter @jackgraham @faketsr :: Google+Jack Graham
King Shere
    Re: Extinction scenarios
Defying the quarantine
Sabotage, ignorance, and idealism can create a bigger mess than intended, like releasing something without knowing its consequences or full effect. Plot: assist in stealing some unethical "umbrella" medicine, disable the "umbrella" security system, and breach the quarantine containment. Inspiration: 28 Days Later and Resident Evil (the movie).

"Body-snatchers"
Due to some interference (glitch, spam, sabotage, or tampering), transfers into morphs pick up a slight corruption or neural trim. There will be a need to discover the extent of this corruption, its dispersal method, its motive, and its origin, in order to work out how (and whether) to stop or pacify the situation. The problem may have gone unnoticed over an extended time.
Slightly corrupted: They don't experience anything different, but towards (and from) others an "uncanny valley" feeling is felt, since something subtle has been corrupted.
Corrupted: People switch morphs more than once, and thus get the "dose" more than once; the "uncanny valley" is alive and kicking, causing numerous problems, rash actions, and misunderstandings. Even if the afflicted aren't aware of it, they could have several "quirks", perhaps similar to Alzheimer's-like behavior, or the "others" feel that they do.
"That one, capture it!": Both parties think there is a "correct" state and an "error" state; the "others" are in the "error" state, or are alien body-snatchers (perhaps one group really is). They decide to fix the error. The other group resists "conversion", and both parties end up acting like body snatchers...
Inspirations: Alzheimer's, MP3 corruption, Body Snatchers, Twilight Zone 1x03 "Wordplay".

Avalanche warning
People can be lazy, and others never have the time today (perhaps tomorrow). Things can accumulate unnoticed over an extended time. The situation has only just been recognized as a problem. Is it big enough? Near enough? Is it really an extinction threat? The task is costly and bothersome, after all.
Pollution: "Oh, that's a nasty spill." Exhaust plumes, ozone emissions, and explosions (265,000,000 gallons of gas later...). Smog in space!
Space debris & junk: "Brace for impact?" Ranging from space "caltrops" to potential impact events waiting to happen.
Wreck-clouds: "Let's just avoid that area." Imagine a large spaceship (or several) that "sinks" in space. There will be "pollution": gas clouds, space debris. Surviving (damaged) security systems may still "protect" their "designated" areas and thus hinder cleanup.
Arenamontanus
    Re: Extinction scenarios
I like the idea of mental morbidity. With XP, skillsofts, petals, merges, pervasive media, hive minds and fork clades, mental epidemics/pandemics actually seem possible. It doesn't have to be in the post-corporation world either. Imagine something in between mass hysteria and bad cultural traits: the above list of factors has made nearly everybody vulnerable to a set of destructive memes/derangements. Life is becoming a Philip K. Dick story. People not affected still have to live in a society that is dominated by the memes, trapped as everything becomes more and more dysfunctional. Maybe it is actually the Exsurgent virus that has become a meme, spread by culture, media and language...
    Extropian
pobox522rlyeh
    Re: Extinction scenarios
    Arenamontanus wrote:
While the default Firewall setting has fighting existential risks as a core theme, it might be useful to consider exactly what extinction scenarios are at the top of the worry list of the proxies and sentinels.
  • Population implosion is not entirely impossible. Birthrates may be below replacement rate (although the demographics of the solar system are an unsettled question), and given the presence of very enticing virtual partners and virtual children, traditional reproduction might become very rare.
  • The universe is a simulation and is shut off. Far more believable among the people of EP than today, since many of them *are* simulations and live in largely simulated worlds, sometimes deeply nested.
  • I suspect that the birthrate would take a huge hit. With death taken out of the picture, older forms of achieving immortality might go out of style, and a lot of the bodies out there aren't even capable of reproduction, which means that a lot of the accidental pregnancies we have these days won't occur. New forks occupying bodies might become more common, but that isn't really new life. It would be difficult to know the ultimate ramifications of something like that. Raising a child really is very difficult, and children might be so rare that when people see one, they assume it's an adult who felt like being short for a while. You'd have to educate them online, I imagine. As far as the simulation one goes... I promise to keep my computer turned on.
    "That which is not dead can eternal lie, and with strange aeons even death may die..."
Extrasolar Angel
    Re: Extinction scenarios
    pobox522rlyeh wrote:
I suspect that the birthrate would take a huge hit. With death taken out of the picture, older forms of achieving immortality might go out of style, and a lot of the bodies out there aren't even capable of reproduction, which means that a lot of the accidental pregnancies we have these days won't occur. New forks occupying bodies might become more common, but that isn't really new life. It would be difficult to know the ultimate ramifications of something like that. Raising a child really is very difficult, and children might be so rare that when people see one, they assume it's an adult who felt like being short for a while. You'd have to educate them online, I imagine.
On the other hand, you could have ideological or religious sects interested in the expansion of the human race and in reproduction. Combined with the technology available to them, their birth rate and their children's survival could be exceptionally high, limited only by their capacity to raise and support their offspring.
    [I]Raise your hands to the sky and break the chains. With transhumanism we can smash the matriarchy together.[/i]
Arenamontanus
    Re: Extinction scenarios
    Extrasolar Angel wrote:
On the other hand, you could have ideological or religious sects interested in the expansion of the human race and in reproduction. Combined with the technology available to them, their birth rate and their children's survival could be exceptionally high, limited only by their capacity to raise and support their offspring.
And of course, they will do their best to ensure that their offspring share the same pro-expansion mindset. Thanks to muses, advanced indoctrination and psychosurgery techniques, they are likely to succeed well enough. Now consider the competition between the expansionist and radical-expansionist wings of such a sect: the radicals will always tend to out-breed the less radical, meaning that among their descendants the even more radical fringe is going to be dominant. The end result is something not too different from transhuman locusts converting everything into more faithful... (SF reference: David Zindell's novels have a version of this, the Architects of the Universal Cybernetic Churches.)
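The arithmetic of that ratchet is simple enough to run. A toy sketch (the per-generation growth factors are invented for illustration): three wings of the sect, each slightly out-breeding the last.
[code]
# Radicalization ratchet: each wing of the sect out-breeds the last.
# Per-generation growth factors are invented for illustration.
pop = {"moderates": 0.90, "radicals": 0.09, "ultras": 0.01}
growth = {"moderates": 1.5, "radicals": 2.0, "ultras": 2.5}

for gen in range(15):
    if gen % 2 == 0:
        total = sum(pop.values())
        shares = "  ".join(f"{k} {v / total:.0%}" for k, v in pop.items())
        print(f"gen {gen:2d}: {shares}")
    for k in pop:
        pop[k] *= growth[k]
[/code]
Within about a dozen generations the 1% fringe is the majority, and each generation's extremists are the next generation's baseline.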
    Extropian
King Shere
    Re: Extinction scenarios
Overflow threat
Sometimes growing, multiplying or expanding would increase extinction risk rather than prevent it, because there could be an unseen limit, or a response triggered by growing "too much". One pest or its colony can perhaps be tolerated by its neighbors and host, but not several colonies, or fast-growing pest populations. A population that swells draws attention to itself and alerts those concerned to deal with the threat and nuisance. The Great Filter could decide to eradicate all the "weeds" and "pests" it normally filters to prevent clogging, or because its filter got clogged. A population explosion could also exhaust and crash the system it depends on, killing the population along with it. (A toy overshoot sketch follows the quote below.)
    Quote:
Long ago, the great Frith made the world. He made all the stars, and the Earth lived among the stars. He made all the animals and birds, and at first, he made them all the same. Now, among the animals in these days was El-ahrairah, the prince of rabbits. He had many friends, and they all ate grass together. But after a time, the rabbits wandered everywhere, multiplying and eating as they went. Then Frith said to El-ahrairah, 'Prince Rabbit, if you cannot control your people, I shall find ways to control them.' But El-ahrairah would not listen. He said to Frith, 'My people are the strongest in the world.' This angered Frith, and he determined to get the better of El-ahrairah. And so, he gave a present to every animal and bird, making each one different from the rest. When the fox came, and others, like the dog, and cat, hawk, and weasel, to each of them, Frith gave a fierce desire to hunt and kill the children of El-ahrairah. (Frith's blessing, Watership Down)
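A minimal overshoot sketch of the "exhaust & crash" case (all numbers invented for illustration): a habitat population growing faster than its life support regenerates.
[code]
# Overshoot-and-crash sketch: population vs. regenerating life support.
# All numbers are invented for illustration.
pop, resources = 100.0, 1000.0
GROWTH = 0.10   # population growth per cycle
REGEN = 50.0    # life-support regeneration per cycle
DRAW = 0.6      # resource draw per capita per cycle
CAP = 1000.0    # storage cap on regenerated resources

for cycle in range(31):
    if cycle % 5 == 0:
        print(f"cycle {cycle:2d}: pop = {pop:6.1f}, resources = {resources:6.1f}")
    resources = min(CAP, resources + REGEN) - DRAW * pop
    if resources < 0:
        pop *= 0.5          # shortfall: crude die-off
        resources = 0.0
    else:
        pop *= 1 + GROWTH
[/code]
The population peaks well above what the regeneration rate can carry, then crashes below its starting point. El-ahrairah's people, in short.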
Kanada Ten
    Re: Extinction scenarios
The Tasty Snack Loop? I suppose that falls into the "bad evolutionary attractor" slot: society collapsing in on itself to produce addictive widgets or whatnot. Or some similar memetic dead-end, like pure hedonism or Nil-Buddhism. Hm, is the biggest threat simply the lack of desire to prevent extinction?
    The Onion wrote:
    [b]Man At Very Top Of Food Chain Chooses Bugles[/b] SOUTH BEND, IN—Despite having no natural enemies and belonging to a species that completely dominates its ecosystem, local IT manager Reggie Atkinson opted to consume the processed corn snack Bugles Monday. "I was in the mood for something salty and crunchy, and it's a little early for dinner," said the ultimate predator, whose ancestors' bipedal locomotion, toolmaking abilities, and advanced spatial recognition developments allowed them to hunt animals 10 times their size. "These are original, but the other flavors are pretty good, too." Acting on an impulse from an incredibly complex forebrain that has evolved over millions of years, Atkinson then took note of the Bugles' amusing conical shape and placed one on each of his opposable thumbs like little wizard hats.

    Rethink Resleeve Redo

Arenamontanus
    Re: Extinction scenarios
Geoffrey Miller has proposed the same idea as an explanation of the Fermi Paradox (http://seedmagazine.com/content/article/why_we_havent_met_any_aliens/): they all succumbed to the Great Temptation and forgot to survive. I have my doubts about this as a sufficiently general explanation (see http://www.aleph.se/andart/archives/2010/04/flanders_vs_fermi.html if you are interested), but it might well be bad enough to hurt transhumanity badly at a time when it cannot afford to lose track of the game.
    Extropian
King Shere
    Re: Extinction scenarios
I'm inclined to think that the Fermi paradox could be explained by the second law of thermodynamics. Or rather by the parody religion Discordianism's "Law of Eristic Escalation": Imposition of Order = Escalation of Chaos. It elaborates on this point by saying that the more order is imposed, the longer it takes for the chaos to arise, and the greater the chaos that arises. "More" of anything results in corruption of it, and the resulting corruption reduces it.
Brushfire
    Re: Extinction scenarios
Four words for this thread: Greenfly and Silver Rain.
Decivre
    Re: Extinction scenarios
    King Shere wrote:
I'm inclined to think that the Fermi paradox could be explained by the second law of thermodynamics. Or rather by the parody religion Discordianism's "Law of Eristic Escalation": Imposition of Order = Escalation of Chaos. It elaborates on this point by saying that the more order is imposed, the longer it takes for the chaos to arise, and the greater the chaos that arises. "More" of anything results in corruption of it, and the resulting corruption reduces it.
I think a better explanation may be that we simply do not have the means to detect alien civilizations yet. We talk all the time about how advanced our ability to gather information about other systems is, but when you think about it, we haven't really gotten that far in the field of astronomy. We have barely traveled beyond our own planet's gravitational pull, and sent out little save for observation satellites, so I don't see why we expect to have found much. Seriously though, let's say that there is a sapient race every thousand light years in this galaxy. That means we won't be able to detect their radio signals (the most likely means by which we would detect them) until a thousand years [b]after[/b] their first radio transmissions; and that is only on the condition that background noise from all the stars between us and them doesn't destroy the transmission before it gets to us. Are we really all that likely to notice alien life from our porch? To me, that seems about as foolish as standing on a California beach and trying to look at Asia with a pair of binoculars.
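A back-of-the-envelope check supports this (transmitter power, bandwidth and the comparison threshold are round-number assumptions): the inverse-square dilution of an isotropic megawatt of broadband leakage at a thousand light years.
[code]
import math

# Can anyone hear incidental radio leakage at 1,000 light years?
# Power, bandwidth and threshold are round-number assumptions.
P_TX = 1e6                  # W, isotropic broadband leakage
LY = 9.461e15               # meters per light year
d = 1000 * LY               # distance to the hypothetical neighbors

flux = P_TX / (4 * math.pi * d ** 2)        # W/m^2, inverse-square law
per_hz = flux / 1e6                         # spread over a 1 MHz bandwidth
print(f"flux: {flux:.1e} W/m^2")            # ~8.9e-34 W/m^2
print(f"vs 1 jansky: {per_hz / 1e-26:.0e}") # ~9e-14 of a 1 Jy source
[/code]
That is roughly thirteen orders of magnitude fainter than a source radio astronomers would already call faint, and that's before the thousand-year light lag. Deliberate narrowband, high-gain beacons change the arithmetic, but nobody's ambient chatter carries that far.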
    Transhumans will one day be the Luddites of the posthuman age. [url=http://bit.ly/2p3wk7c]Help me get my gaming fix, if you want.[/url]