
Exhuman questions

Arenamontanus
Exhuman questions
Typical coffee break question at our institute: What probability of dying would you accept if you would become an immensely powerful and enlightened posthuman if you survived?

The fun thing is that even among people I know who are fairly radical transhumanists, the majority accept at most somewhere around 10%, even though they believe that a posthuman life could be *amazingly* much better than our current life. However, there are a few who take their estimates seriously and would accept something like a 99% chance of dying, since the rewards outweigh the risk so much. In EP, exhumans are likely making the same estimation, and accepting enormous risks for the shot at becoming gods. Of course, with backups you can even try again and again if you fail - so what if a hundred copies of you die in agony or are driven permanently mad, if one of you can transcend all human limitations?

A related question: what risk of wiping out mankind is acceptable for learning something interesting in your favourite field? From a friend who had been arguing with him about AGI-related existential risks, I learned that a world-famous AI researcher agreed that even if he believed there was a 10% chance of his research leading to the end of the world, he would still do it. Not the one chance in a billion (or less) that LHC physicists seem to accept as an existential risk limit for when to be concerned, but a 10% chance. Maybe the researcher was not taking his numbers seriously, but it is still intriguing.
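For the curious, the arithmetic the 99% crowd is doing looks roughly like this quick Python sketch (the 1000x multiplier is a made-up assumption for illustration, not a number anyone at the institute actually gave):

[code]
# Back-of-the-envelope expected-value comparison: stay human vs. attempt the jump.
# All numbers here are illustrative assumptions, not survey data.

def expected_value(p_death, posthuman_multiplier, human_value=1.0):
    """Expected value of attempting the procedure, counting death as worth zero."""
    p_success = 1.0 - p_death
    return p_success * posthuman_multiplier * human_value

# If you believe posthuman life is "amazingly" better - say 1000x a human life -
# then even a 99% chance of dying still beats staying put, on paper:
for p_death in (0.10, 0.50, 0.99):
    ev = expected_value(p_death, posthuman_multiplier=1000)
    print(f"p(death)={p_death:.2f}  expected value={ev:8.1f}  (staying human = 1.0)")
[/code]

Of course, this is exactly the kind of naive expected-utility arithmetic that quietly treats dying as merely worth zero, which is why the 10% people and the 99% people keep talking past each other.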
Extropian
Draconis
Re: Exhuman questions
I'd accept a 10% chance, but I lean towards the radical side. And no, I don't need to be a god; I'd accept merely better than human.

[img]http://boxall.no-ip.org/img/infected_userbar.jpg[/img]
[img]http://boxall.no-ip.org/img/exh_userbar.jpg[/img]

"Do not ask who I am and do not ask me to remain the same" - Michel Foucault

Axel the Chimeric
Re: Exhuman questions
This is all an issue of balances. Risk versus benefit is a big question here. Risking a 99% chance of death for a 1% chance of, say, having fewer cavities in my lifetime is not a good trade. I'm certain that's not the level of improvement you had in mind, but I don't really know what level you had in mind.

Frankly, if you offered me a chance to, say, inhabit a potentially immortal Remade for a 10% risk of death, I'd honestly not be sure even then, especially if the potential cause of the risk is nebulous. If you offered me a chance to be a law-of-physics-defying superintelligence, well... that's something that may make me reconsider the odds.

As for what risk is acceptable for destroying mankind in exchange for learning something... I'm not really sure. Below 0.005% seems to be the acceptable benchmark for most scientists. I doubt this world-famous AI researcher would really pursue this course of action if he thought the actual risk was that high (whether it is or isn't makes no difference).
root
Re: Exhuman questions
root@Exhuman questions [hr]
Arenamontanus wrote:
A related question: what risk of wiping out mankind is acceptable for learning something interesting in your favourite field? From a friend who had been arguing with him about AGI-related existential risks, I learned that a world-famous AI researcher agreed that even if he believed there was a 10% chance of his research leading to the end of the world, he would still do it. Not the one chance in a billion (or less) that LHC physicists seem to accept as an existential risk limit for when to be concerned, but a 10% chance. Maybe the researcher was not taking his numbers seriously, but it is still intriguing.
I asked a few engineering students this question, which was sort of interesting. The younger students (early twenties) went the route of "I would never knowingly put anyone in harm's way," and then died a little inside when I pointed out that as engineers pretty much anything we make can cause harm if it fails, or if it gets repurposed by someone more violent. The other answer I got rejected the possibility of total apocalypse, but so long as even 10% of the race would survive, the number given went up to 90%, depending on the value of the proposed discovery. For this limited population of engineering students without any position in relevant fields, the answer is that most of us would build the Cylons without anything resembling hesitation. Sorry.
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
icekatze
Re: Exhuman questions
hi hi It's like Pascal's Wager all over again. Is there an option to wait until you are about to die from natural causes and then try the switch?
Extrasolar Angel
Re: Exhuman questions
The idea of backups makes the question of dying somewhat irrelevant, in my opinion (of course, if one of your personalities becomes a psychopathic uber-monster hunting down the other yous, then you are in trouble). As to humanity - there are a lot of people who won't care about humanity's fate; they exist even now. Personal growth, or the success of their own ideological faction, would mean more to them - I guess this is one pool of humans that exhumans come from. The intentions might not be evil - let's say a researcher works on stopping a potential TITAN invasion and tries to augment his abilities by rapidly expanding his neural network, until his form and essence become completely alien to the humans he wanted to protect.
[i]Raise your hands to the sky and break the chains. With transhumanism we can smash the matriarchy together.[/i]
Axel the Chimeric
Re: Exhuman questions
Extrasolar Angel wrote:
The idea of backups makes the question of dying somewhat irrelevant, in my opinion (of course, if one of your personalities becomes a psychopathic uber-monster hunting down the other yous, then you are in trouble).
I'd argue that a backup hardly makes the question of dying irrelevant. If I die, there's no proof that the backup is in any way me; the best hope is recovery via cortical stack, and even then, that's shaky at best. It's the old teleporter-clone-murder problem. If you wanted to test the procedure on a copy of yourself, that's another story. Not sure how moral it is, though.
nezumi.hebereke
Re: Exhuman questions
I don't think the question makes sense. After all, how often are you given the choice 'cake or death'? It is almost always a question of degree. How much of an advantage are you willing to risk your life for, and how much risk are you willing to accept? Teflon kneecaps have less than a 1% chance of death (and a higher chance of other medical issues), but may give you the benefit of never having weak knees in your old age. A brain implant has a much higher rate of death and injury, but the pay-off is better - perfect memory, and freedom from mental degradation.

On this scale, I would tend to take the moderate route. I don't want godhood. But I do want an edge, and I'm willing to run risks to get it.

Similarly with research. I wouldn't risk the world to find a new species of butterfly. I would risk the world to save the world, though. Your engineers aren't risking the world for their creations, only the lives of those people who are most likely to benefit from those inventions. Although this could just be my bias as a risk analyst :P
root
Re: Exhuman questions
root@Exhuman questions [hr] At a guess, those engineers are willing to let the world burn because I asked them that question during finals week. This is a group of people that sit around cranking out the math on whether group suicide would be better than studying. The results on that study are unclear.
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
Axel the Chimeric
Re: Exhuman questions
I wonder how many people would let the world burn if offered god-like posthumanhood... My bet is most of them.
root
Re: Exhuman questions
root@Exhuman questions [hr]
Axel the Chimeric wrote:
I wonder how many people would let the world burn if offered god-like posthumanhood... My bet is most of them.
I never said I was a good person.
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
root
Re: Exhuman questions
root@Exhuman questions [hr] Oh, this is cute. Nothing could possibly go wrong with that. [EDIT] I don't know if I should be horrified or excited. I could build Muse00. Should I do that? [EDIT#2] Sarah Connor doesn't really exist, does she? That's just a scary story they tell to nutty engineering students to keep them in line, right?
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
Axel the Chimeric
Re: Exhuman questions
She exists, she's just currently a German R&B singer.
Rhyx
Re: Exhuman questions
I thought she just hung out at the Tech Noir with '80s hair and shoulder pads.
fafromnice
Re: Exhuman questions
Everybody will gladly kill everybody else for personal benefit. Why? We do it every day... look at your shoes; some kid in Malaysia or China probably made them. If we can get an edge over some people (normally poor people), we will do anything.

For those who (like me) think that having god-like power will give you the power to change things, like killing everyone who doesn't share your political or economic ideas - that's bullshit. You end up doing exactly the same thing your enemies do; at best it's a dictatorship. Hitler killed so many people we don't even count anymore, but medicine gained two steps ahead. Albert Einstein invented relativity, and we made the A-bomb. Griffith invented a new way of telling a story, and we made propaganda movies. Etc.

Yes, we live in a world like that... keep smiling, it could be worse.

What do you mean, a butterfly caused this? How can a butterfly cause an environmental system overload on the other side of a 10,000-ego habitat?

root
Re: Exhuman questions
root@Exhuman questions [hr] It's true, it could be worse. The world might not have a neat website like this where we work over transhuman tropes and come up with answers that we hope - by the great-machine-god-not-yet-born - get listened to by the crazy savages who work in the fields that may lead to exhumans. Most of us still consider the Singularity something of an embarrassment to discuss aloud in professional circles, but we are there. I've had professors slip up and mention it, and fellow students go on rants about downloadable skills, and about how neat it would be to study in a hivemind. Fuck you, world, we are coming, and there is no soul in our eyes. [EDIT] My manufacturer promises that the firmware reboot on the sense of humor module is scheduled sometime between ten and six. Then they told me that my call is important to them, and would I please hold?
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
flatpointer
Re: Exhuman questions
"What probability of dying would you accept if you would become an immensely powerful and enlightened posthuman if you survived?" I take it you and your colleagues aren't just assuming the chances are static, eh? This to me is the central issue when talking about such an idea - if we can make you Super Awesome Posthuman with a 90% chance of Irreversible Fatal Human Smear, right now, then in 10 years of research won't the chances improve a decent clip? Spend a few decades on it and the chances get better and better? I think this would be an important factor. Of course, this assumes that someone doesn't make it to the posthuman stage before you and somehow lower or eliminate your chances of doing so. But why would a posthuman want to be so evil? (knocks on wood)
Arenamontanus
Re: Exhuman questions
Yup, those gliding probabilities really complicate decision-making. They are even more annoying in xrisk discussions: what is the probability of AGIs wiping out humanity? That depends on the probability of a nuclear war or a bioweapon before full AGI is developed - so if there is a big risk from an early threat, late threats have their risks reduced. So talking about probabilities of late threats requires saying things like "the risk of X conditioned on not Y, Z or W happening before".

In the exhuman case you can try using discounting: the value of the future is reduced by 5% per year into the future (or at some other suitable rate). But:

1) humans do not normally think in terms of proper discounting (we do so-called hyperbolic discounting, which is either irrational or an adaptation to the real world),
2) go sufficiently far into the future and everything is discounted to zero (the value of something in 100 years is just 0.006 of its current value), which means you don't care about the long term,
3) the discount rate is often based on the expected risk of randomly dying, so if you are willing to go exhuman you should have a huge discount rate, and
4) hypothetical posthuman states can keep getting better, so that in ten years' time you would want to go for an even more radical state that is a million times better, and ten years later the same thing happens...

In short, decision-making is hard. Especially under uncertainty. We do it every day, but usually the stakes are small.
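A quick Python sketch to illustrate points 1 and 2 (the 5% rate is the one above; the hyperbolic steepness parameter is an arbitrary pick):

[code]
# Exponential vs. hyperbolic discounting of one util delivered `years` from now.
# The 5%/year rate comes from the post above; the hyperbolic k is arbitrary.

def exponential_discount(years, rate=0.05):
    return (1.0 - rate) ** years

def hyperbolic_discount(years, k=0.05):
    return 1.0 / (1.0 + k * years)

for years in (1, 10, 50, 100):
    print(f"{years:3d} yr  exponential={exponential_discount(years):.4f}  "
          f"hyperbolic={hyperbolic_discount(years):.4f}")
[/code]

Exponential discounting at 5% leaves the hundred-year future worth about 0.006 of the present, while the hyperbolic curve still keeps it at about a sixth - which is roughly why gut feeling and "proper" discounting keep disagreeing.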
Extropian
root
Re: Exhuman questions
root@Exhuman questions [hr] And the numbers get worse when subjective emotional states get added in. For instance, my likelihood of pressing the doomsday button has dropped from 90% to well below 10% in the last few days just from a reduction in stress levels. Given how easy it is to drive someone to that particular level of burnout, I wonder if there are more scenarios where the outcome is terrible because the people making decisions just don't give a damn anymore.
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]