
Singularity ... What the Hell ?

fafromnice
Singularity ... What the Hell ?
I'm trying to understand what the singularity is, and there is so much information that I can't process it all (hey! I'm just a cinema student), so if someone wants to help, that would be cool.

What do you mean, a butterfly caused this? How can a butterfly cause an environmental system overload on the other side of a 10,000-ego habitat?

Arenamontanus
Re: Singularity ... What the Hell ?
First thing: people use the same word to mean a bunch of related things, and there is plenty of confusion. The technological singularity (which I assume you are asking about) is different from mathematical singularities, physical singularities (in black holes) or the zillion other uses of the word. But there is a lot of confusion about the technological singularity too. I wrote a paper about models of the singularity (pdf), where I listed the following uses of the word:
Quote:
A. Accelerating change. Exponential or superexponential technological growth (with linked economic growth and social change). (Ray Kurzweil, John Smart)
B. Self-improving technology. Better technology allows faster development of new and better technology. (Flake)
C. Intelligence explosion. Smarter systems can improve themselves, producing even more intelligence in a strong feedback loop. (I.J. Good, Eliezer Yudkowsky)
D. Emergence of superintelligence. (Singularity Institute)
E. Prediction horizon. Rapid change or the emergence of superhuman intelligence makes the future impossible to predict from our current limited knowledge and experience. (Vernor Vinge)
F. Phase transition. The singularity represents a shift to new forms of organisation. This could be a fundamental difference in kind, such as humanity being succeeded by posthuman or artificial intelligences, a punctuated equilibrium transition, or the emergence of a new metasystem level. (Teilhard de Chardin, Valentin Turchin, Heylighen)
G. Complexity disaster. Increasing complexity and interconnectedness causes increasing payoffs, but also increasing instability. Eventually this produces a crisis, beyond which point the dynamics must be different. (Sornette, West)
H. Inflexion point. Large-scale growth of technology or economy follows a logistic growth curve. The singularity represents the inflexion point where change shifts from acceleration to deceleration. (Extropian FAQ, T. Modis)
I. Infinite progress. The rate of progress in some domain goes to infinity in finite time. (Few, if any, hold this to be plausible.)
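Models A and H are quantitative claims about growth curves. As a purely illustrative sketch of model H (the ceiling K, rate r, and midpoint t0 are arbitrary assumptions, not anything from the paper), the logistic curve looks like this in Python:

```python
import math

def logistic(t, K=100.0, r=1.0, t0=0.0):
    """Logistic growth: looks exponential before the inflexion
    point t0, then decelerates toward the ceiling K."""
    return K / (1.0 + math.exp(-r * (t - t0)))

# The inflexion point sits at t0, where growth is fastest and the
# curve is at exactly half its ceiling; long before t0 the curve is
# near zero, long after it is near K.
print(logistic(0.0))   # → 50.0
print(logistic(10.0))  # close to the ceiling of 100
```

The point of model H is that someone living just before t0 sees what looks like runaway exponential growth, while someone after t0 sees it levelling off.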
Vernor Vinge, who coined the term technological singularity in its modern usage, muddled things quite a lot by mixing A-E. In Eclipse Phase, A and B led to C, which led to D, which led to the Fall. In my own research, I think the intelligence explosion is the most interesting thing to investigate. If we knew more about whether it is possible, how fast it could be, and what the signs that it was about to happen would look like, we could work much better on figuring out how to avoid something like the Fall (which is, by the standards of the disasters we often think of at our institute, a pretty happy scenario - at least there were some transhumans and things of value left in the universe!)
Extropian
fafromnice
Re: Singularity ... What the Hell ?
so the TITANs became a superintelligence, and that's where the singularity is: self-improving artificial intelligence. In this idea, why is an AGI not a singularity? Why didn't the TITANs create a god (hence the name ;)) if they are able to create more intelligent machines?


fafromnice
Re: Singularity ... What the Hell ?
At the same time, could an infomorph in a really powerful computer achieve the singularity?


Rhyx
Re: Singularity ... What the Hell ?
What you're missing from those two examples is the recursive aspect: the loop represented by improving an intelligence that builds itself. Imagine if you were to make a computer that can make a better computer, and that computer can make a better computer, and so on until we can no longer understand it with our tiny meat brains. That's pretty much what happened in Eclipse Phase.

In your two examples, the AGI (Artificial General Intelligence) is an AI that can learn, but at a limited rate, closer to the speed at which humans learn. It could be possible to cut the limiters off an AGI, but that becomes very dangerous, because it might grow up to be a second batch of TITANs. Everyone is afraid of that and willing to nuke anyone who tries, especially if they use TITAN parts to do it (like hypercorps).

With the example of an infomorph with room to grow: it's still just the emulation of a human brain. The extra room just means that more information is available to it, but if a guy lived in a huge library, it wouldn't make him smarter; it would just give him more information, right? So you can't really make a seed AI out of an infomorph. BUT... there is a bunch of people trying to do exactly that: the exhumans. The reason I think they are called EX-"humans" is that the way they think is no longer human in scale: to them, morality is outdated, pity is overrated, and over time they have tried to make themselves into machines so that they could eventually turn themselves into seed AIs. Mix madness with self-improvement and that's what you get.
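The recursive loop described above can be sketched as a toy model (the starting level, gain, and "human limit" are arbitrary assumptions, not game mechanics): each generation improves itself in proportion to how smart it already is, so the improvements compound instead of merely adding up.

```python
def intelligence_explosion(start=1.0, gain=0.5, human_limit=100.0, max_gens=1000):
    """Each generation improves itself in proportion to its current
    intelligence; return (generation, level) when human_limit is passed."""
    level = start
    for gen in range(1, max_gens + 1):
        level += gain * level  # smarter designers make bigger improvements
        if level > human_limit:
            return gen, level
    return max_gens, level

gen, level = intelligence_explosion()
print(gen, level)  # the threshold is crossed after about a dozen generations
```

Contrast this with a limited AGI: if each generation could only add a fixed increment instead of a proportion of its current level, growth would stay linear and never "explode", which is roughly what the limiters are meant to enforce.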
Quote:
In this idea, why is an AGI not a singularity? Why didn't the TITANs create a god if they are able to create more intelligent machines?
Actually, they probably did, which is why they are no longer there: they just left their toys behind and departed. That's the horror of Eclipse Phase. We created seed AIs that became so smart that they just started taking away people's brains and disappeared with them for reasons we do not know. For all we know, they might be gods and have started whole new universes, or they've just gone crazy. All we know (and we don't even know that for sure) is that they created the gates and left. That's it; we don't know their intentions or their motives.
fafromnice
Re: Singularity ... What the Hell ?
Yeah, the unknown is the most used tactic for monsters in movies. For example, Jaws only appears at the end of the movie; if we put it at the beginning, the stress isn't there. So let's theorize a little: an AGI who wants to become a singularity has to overcome its "mental" restrictions, something like a node with maximum processing power to absorb a maximum of information. In the same way, an infomorph would be able to do the same thing but with a maximum of psychosurgery... so upgrading a transhuman is theoretically possible, but who would want to do that... hypercorps :D So upgrading an infomorph would be (I hope) less dangerous? In the same way, the TITANs didn't create beings more intelligent than themselves... maybe they upgraded themselves; after all, they must be able to do that? Did they? Argh! This game will have my ego some day!


Arenamontanus
Re: Singularity ... What the Hell ?
fafromnice wrote:
so let's theorize a little: an AGI who wants to become a singularity has to overcome its "mental" restrictions, something like a node with maximum processing power to absorb a maximum of information; in the same way, an infomorph would be able to do the same thing but with a maximum of psychosurgery... so upgrading a transhuman is theoretically possible
The restrictions are not so much mental blocks as literal programming sabotage intended to make radical self-improvement hard. AGIs have the advantage that their minds are largely artificial and make enough sense that they can be improved, while transhuman minds are evolved messes that are hard to improve efficiently. I sketched out a game system in a really old thread, since I have a player character AGI who might attempt to turn itself into a seed AGI. I think that system needs to be modified (and in most cases there is no point rolling dice for self-transcendence), but the basic process of upgrading your ability to upgrade yourself is the key thing.
Quote:
In the same way, the TITANs didn't create beings more intelligent than themselves... maybe they upgraded themselves; after all, they must be able to do that? Did they?
Hmm... maybe they did. Maybe they created something that was so much smarter than them that it took over. Another version: they created the exsurgent virus, and it wiped them out. (I'm not a fan of the theory that the virus was the thing that turned them 'evil'; there are good reasons to think they were already quite deadly.) The virus is just very dangerous to any individual seed AGI system, but it might even have an ultra-subtle and hyperintelligent "group intelligence" acting on enormous scales.