Typical coffee break question at our institute:
What probability of dying would you accept if, in the event you survived, you would become an immensely powerful and enlightened posthuman?
The fun thing is that even among people I know who are fairly radical transhumanists, the majority accept at most somewhere around 10%, even though they believe that a posthuman life could be *amazingly* much better than our current life. However, there are a few who take their own estimates seriously and would accept something like a 99% chance of dying, since the rewards outweigh the risk so much.
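To make that concrete (with made-up numbers): if you value a posthuman existence at, say, 1,000 times a baseline human life, then a gamble with a 99% chance of death and a 1% chance of transcendence has an expected value of 0.01 x 1000 = 10, versus 1 for staying as you are. Under that kind of estimate the bet looks obviously worth taking, provided you actually believe your own numbers.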
In EP, exhumans are likely making the same estimation, and accepting enormous risks for a shot at becoming gods. Of course, with backups you can even try again and again if you fail - so what if a hundred copies of you die in agony or are driven permanently mad, if one of you can transcend all human limitations?
A related question: what risk of wiping out mankind is acceptable for learning something interesting in your favourite field?
From a friend who had been arguing with him about AGI-related existential risks, I learned that a world-famous AI researcher agreed that even if he believed there was a 10% chance his research would lead to the end of the world, he would still do it. Not the one chance in a billion (or less) that LHC physicists seem to accept as an existential-risk limit for when to be concerned, but a 10% chance. Maybe the researcher was not taking his numbers seriously, but it is still intriguing.
—

[img]http://boxall.no-ip.org/img/infected_userbar.jpg[/img]
[img]http://boxall.no-ip.org/img/exh_userbar.jpg[/img]
"Do not ask who I am and do not ask me to remain the same" - Michel Foucault
[hr] I asked a few engineering students this question, which was sort of interesting. The younger students (early 20s) went the route of "I would never knowingly put anyone in harm's way," and they died a little inside when I pointed out that as engineers pretty much anything we make can cause harm if it fails, or if it gets repurposed by someone more violent. The other answer I got rejected the possibility of total apocalypse, but if even 10% of the race would survive, the acceptable risk given was up to 90%, depending on the value of the proposed discovery. For this limited population of engineering students without any position in relevant fields, the answer is that most of us would build the Cylons without anything resembling hesitation. Sorry.
[hr] At a guess, those engineers are willing to let the world burn because I asked them that question during finals week. This is a group of people who sit around cranking out the math on whether group suicide would be better than studying. The results of that study are unclear.
[hr] I never said I was a good person.
[hr] Oh, this is cute. Nothing could possibly go wrong with that. [EDIT] I don't know if I should be horrified or excited. I could build Muse00. Should I do that? [EDIT#2] Sarah Connor doesn't really exist, does she? That's just a scary story they tell to nutty engineering students to keep them in line, right?
[img]http://boxall.no-ip.org/img/A_Rep.jpg[/img] 2 [img]http://boxall.no-ip.org/img/R_Rep.jpg[/img] 7 [img]http://boxall.no-ip.org/img/C_Rep.jpg[/img] 2
[img]http://i.imgur.com/qtBZ9.jpg[/img]
[img]http://i.imgur.com/AT25J.jpg[/img]
What do you mean a butterfly caused this? How can a butterfly cause an environmental system overload on the other side of a 10,000-ego habitat?
[hr] It's true, it could be worse. The world might not have a neat website like this where we work over transhuman tropes and come up with answers that we hope (pray to the great-machine-god-not-yet-born) get listened to by the crazy savages who work in the fields that may lead to exhumans. Most of us still consider the Singularity to be something of an embarrassment to discuss aloud in professional circles, but we are there. I've had professors slip up and mention it, and fellow students go on rants about downloadable skills, and about how neat it would be to study in a hivemind. Fuck you world, we are coming, and there is no soul in our eyes. [EDIT] My manufacturer promises that the firmware reboot on the sense of humor module is scheduled sometime between ten and six. Then they told me that my call is important to them, and would I please hold?
[hr] And the numbers get worse when subjective emotional states get added in. For instance, my likelihood of pressing the doomsday button has dropped from 90% to well below 10% in the last few days just from a reduction in stress levels. Given how easy it is to drive someone to that particular level of burnout, I wonder if there are more scenarios where the outcome is terrible because the people making decisions just don't give a damn anymore.