
Building Ego from scratch

fafromnice
Building Ego from scratch
So I was watching Caprica, the prequel to Battlestar Galactica, and I started wondering: is it possible to (re)create an ego from scratch? In Caprica they don't scan the ego from the brain; an algorithm rebuilds it from all the online data it can find about you on the net. The idea seems fairly impractical to me (and borderline stupid), but it made me wonder. First, let's say you are resleeved and you find out you've been dead for a week, a month or a year, BUT we have all the data (trail) on you we can find (let's face it, we live in a panopticon, so...). Because we have this data, we can recreate memories and most of your thought patterns for that period of time, so that coping with having been dead and with the lost time will be less stressful for you. But if we can do that, where can we go, and what is the limit? Wouldn't an algorithm like that close the gap between an AGI and a "real" ego, to the point where your ego becomes a "machine"? Am I crazy?

What do you mean, a butterfly caused this? How can a butterfly cause an environmental system overload on the other side of a 10,000-ego habitat?

GenUGenics
My assumption has been that
My assumption has been that AGIs are either scratch-built egos or are conceptually indistinguishable from them. In my campaign, AGIs are intended to simulate the mind/persona of some personality from the past who is no longer present, built to broadly model some celebrity or genius or comedian or schmuck who was lost or otherwise unobtainable; Ronald Reagan or Josef Mengele, for example. I can't see what other purpose AGIs serve in a milieu filled with quite competent AIs and transhuman minds, except as experiments in consciousness.
fafromnice
AGIs are created to be
AGIs are created to be like humans, but some of the books state that some AGIs are built with new thought patterns (personally, I like the AGI that will stop at nothing to own the most paperclips in the universe).


fafromnice
In a way, it comes down to how
In a way, it comes down to how the AGI's programmers influence its personality. If I want to build a Margaret Thatcher AGI and I hate Margaret Thatcher, I will program the AGI with racist, sexist, warmongering thought patterns. The other way around, if I like Thatcher, it's possible I will program her/it with a motherly view of her people. But we're getting away from the initial question :P


ShadowDragon8685
Creating an ego from scratch?
Creating an ego from scratch? That's called either "growing an AGI" or "doing a Lost Generation." And making an AGI (which I think you're all misunderstanding here) is almost always an act of [i]raising[/i] rather than programming. If you're trying to recreate a person from, say, profiles and databases and such, you're not going to be able to do it. What you [i]might[/i] manage, though, especially if they lifelogged or made extensive personal XP libraries available, is to make an AI which is a reasonable facsimile thereof. The more data is available, the better the facsimile (and the less like the amusement-park version) will be, but it's still only going to be a facsimile.
Skype and AIM names: Exactly the same as my forum name. [url=http://tinyurl.com/mfcapss]My EP Character Questionnaire[/url] [url=http://tinyurl.com/lbpsb93]Thread for my Questionnaire[/url] [url=http://tinyurl.com/obu5adp]The Five Orange Pips[/url]
GenUGenics
fafromnice wrote: If I want to
fafromnice wrote:
If I want to build a Margaret Thatcher AGI and I hate Margaret Thatcher, I will program the AGI with racist, sexist, warmongering thought patterns. The other way around, if I like Thatcher, it's possible I will program her/it with a motherly view of her people.
Yes; but let's assume there's some kind of competitive ethic about building simulative personalities, centered on accurately representing that persona—otherwise, what's the point? A phony product would be perceived as shoddy and widely ridiculed. Thinking about this, there are perhaps two (probably more) independent and therefore complementary ways to approach accurately representing a generated personality:
1. Assuming a digitized mind carries some sort of pattern—let's call this a Rorschach—independent teams could create their own concept of the personality and compare Rorschachs. Patterns that are largely duplicative can be considered accurate to the personality, having been derived through independent means; patterns that are mismatched can be considered inaccurate in some way.
2. Interviews and interactions with others who knew this personality in life, perhaps themselves artificially generated to superb accuracy. Perhaps even MT-A and MT-B could have a chat and see if the other seems "off" somehow.
It seems like, starting with biographical data, you might fairly accurately simulate a personality through a reiterative process.
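A minimal sketch of how that first approach might look in practice, assuming each team's "Rorschach" can be boiled down to a numeric feature vector (the function names, threshold, and sample vectors below are purely illustrative):
[code]
import math

def cosine_similarity(a, b):
    """Similarity between two equal-length pattern vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def cross_check_rorschachs(rorschachs, threshold=0.95):
    """Compare every pair of independently derived Rorschachs.

    Pairs that clear the (hypothetical) threshold are treated as accurate to
    the persona, since they were derived through independent means; mismatched
    pairs are flagged as suspect.
    """
    teams = list(rorschachs)
    matches, mismatches = [], []
    for i in range(len(teams)):
        for j in range(i + 1, len(teams)):
            sim = cosine_similarity(rorschachs[teams[i]], rorschachs[teams[j]])
            (matches if sim >= threshold else mismatches).append((teams[i], teams[j], sim))
    return matches, mismatches

# Illustrative usage: three teams model the same persona independently.
patterns = {
    "team_a": [0.9, 0.1, 0.7, 0.3],
    "team_b": [0.88, 0.12, 0.69, 0.31],
    "team_c": [0.2, 0.9, 0.1, 0.8],
}
agree, disagree = cross_check_rorschachs(patterns)
[/code]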
fafromnice
Ethics could be really
Ethics could be really different in a Jovian habitat and an anarchist one. In those two habitats we could find two different MTs who would be utterly different and each be perceived as the "true one". And if we could create/grow a facsimile so real that it/she becomes indistinguishable from the real one, in that case we could make/grow a Lost one, right?


GenUGenics
fafromnice wrote: Ethics could
fafromnice wrote:
Ethics could be really different in a Jovian habitat and an anarchist one. In those two habitats we could find two different MTs who would be utterly different and each be perceived as the "true one".
And yet I still think you could brand a model as "Accurate to 99.9999%®" and provide a methodology that comports to standard modeling. This is what we do with public opinion polls, which are certainly subject to partisan hackery; the better the methodology, the less it is perceived as hackery. Some polls are more celebrated than others. Some hypercorps would have an investment and a brand staked on the accuracy of their generated personas, assuming there's a market: the Swiss watchmakers of persona AGIs. Iterative Model #3: artificially and independently generate a currently living ego and compare it to the original as a control. If it proves highly accurate, the generative model is sound.
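Iterative Model #3, in the same toy terms as above, might look like scoring the generated copy against a scan of the living original used as ground truth (the trait vectors, tolerance, and branding threshold are all invented for illustration):
[code]
# Toy check of "Iterative Model #3": generate a persona of someone who is
# still alive, then compare it to the living original as a control.

def trait_accuracy(generated, control, tolerance=0.05):
    """Fraction of traits on which the generated persona matches the control."""
    hits = sum(1 for g, c in zip(generated, control) if abs(g - c) <= tolerance)
    return hits / len(control)

living_control = [0.9, 0.1, 0.7, 0.3]      # scan of the living original
generated_copy = [0.89, 0.11, 0.72, 0.30]  # output of the generative model

if trait_accuracy(generated_copy, living_control) >= 0.999999:
    print("Generative model sound; the 'Accurate to 99.9999%' branding holds")
else:
    print("Generative model needs another iteration")
[/code]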
fafromnice
Yes, I presume. In that case a
Yes, I presume. In that case a "real one" will only be the "one" in the eyes of the population, depending on what it thinks about that particular MT, no?


GenUGenics
fafromnice wrote: Yes, I
fafromnice wrote:
Yes, I presume. In that case a "real one" will only be the "one" in the eyes of the population, depending on what it thinks about that particular MT, no?
True. And in that accuracy you have created value in whatever it was you were attempting to extract from the generated personality, whether genius or clownishness. We've used political figures as an example, but I think AGI comedians might be more in demand. Interestingly, we're pushing at the boundary between what is facsimile and what is genuine, the very heart of the apprehension about sleeving and farcasting, and of the EP milieu's confidence in the accurate duplication of "a soul."
fafromnice
Could be the start of a game
Could be the start of a game, something like The Astronaut's Wife :P "Dear Firewall, something has been odd about my husband since his last resleeve. Could you look into that? Thank you."


GenUGenics
Heh. IMEP, I've introduced a
Heh. IMEP, I've introduced a very competent AGI named John Glenn who seems to know a vaguely uncomfortable amount of detail about the Factors, with whom he has apparently spent an unaccountably large amount of time. He seems to be as much an authority on them as anyone, and they—strangely, mysteriously—on him. Some persona AGIs, I suspect, would prefer, once generated, to be deleted or stored. Einstein-99%, I imagine, would chafe at learning God really does play at dice; and I'm not convinced his particular genius can be modeled, and he would know (or suspect) that and despair. Certain personas like Kurzweil-99% or Hawking-99% I imagine would revel in it. I suppose, ethically, any infolife that despaired of its condition and desired instead to be stored should be granted that.
fafromnice
A lot of stress could result
A lot of stress could result from discovering you are in fact an AGI persona of an important dead person (a serial Einstein-99% killer?), and some interesting questions appear. Do I have the right to be called Einstein? Am I Einstein? Some psychosurgeon will get the worst headache in all of transhumanity over this (and I want to know what all the fuss is about your John Glenn AGI :D )


GenUGenics
fafromnice wrote: and some
fafromnice wrote:
and some interesting questions appear. Do I have the right to be called Einstein? Am I Einstein?
This is at the heart of the other existential questions of identity and self that EP raises about sleeving and farcasting and forking, etc. (and why I love it so). I hadn't really thought about this before, but it really makes AGI an interesting control group for other assumptions about authenticity and facsimile. If you could scratch-build an ego ex nihilo with some assurance of accuracy and comparatives, it really lends assurance of authenticity when you have an ego that can be directly copied—"if we're this good with hacks, imagine what we can do with the real item..."
Quote:
(and I want to know what all the fuss is about your John Glenn AGI :D )
That would be telling ;-)
fafromnice
It would create a new kind of
It would create a new kind of immortality, the kind where, if you are remembered, you will probably be AGIsed and so live "forever". A family could pay to AGIse a loved one who died in the Fall or had no backup available.


GenUGenics
I should add here that IMEP I
I should add here that IMEP I've limited these sorts of high-accuracy persona AGIs to require some sort of extant brain, in some sort of condition that can be scanned and "improved" through a (re)generative process. Einstein's brain, for example, is in a pickle jar on a shelf somewhere... and I make the assumption that its contents, while degraded and corrupt, might be recovered to some accuracy. There are a number of corpsicles whose brains have been flash frozen, and I imagine these might be the easiest to recover from corruption; I also imagine these might even be the test models for more sophisticated sorts of recoveries.
The oldest of these cryonic subjects was born in the 1890s, and it is intriguing to think some transhuman minds in EP might have been around for a very long time. A lot of these folks counted among the earliest champions of transhumanism and technological immortality, and I think EP should pay tribute to them by imagining them still around in one form or another, but that's just my opinion. I've fancied some folks like Josef Mengele would be rather fussily inclined toward the preservation of their brains, and I imagine the nascent Project Ozma was sneaking brain material from certain luminaries at least as far back as the Kennedy era (whatever happened to his brain after Dallas, anyway?).
There was a certain Austrian scientist, a baron, who experimented with brain matter animated by electricity. He was born in 1770 and his brain was frozen, and therefore preserved, in the Arctic. I imagine his might be the oldest transhuman mind operating on a server somewhere. Happy Halloween!
fafromnice
Some way to reinvent a
Some way to reinvent a transhuman Frankenstein, or an exsurgent strain that forces the infected to absorb parts of other transhumans' egos through the infected's mouth. Ooooouuuuh, spooky!


GenUGenics
I like the idea that there
I like the idea that there are some mental entities lurking around that are just as terrifying as the TITANs, and maybe even a bit more malevolent. Anders Sandberg has posted ideas along these lines—minds forked and merged and reforked and remerged again and again until they make the Ultimates seem like conservative pussycats by comparison. Scary!
fafromnice
With all this forking and
With all this forking and merging, it is probable that new mental illnesses will appear, illnesses we've never seen before: something like an aggressive sense that the person in front of you inhabits YOUR morph, a morph delusion of some sort. We're pretty far from the initial question :P


fafromnice
ShadowDragon8685 wrote:
ShadowDragon8685 wrote:
That's called either "growing an AGI" or "doing a Lost Generation." And making an AGI (which I think you're all misunderstanding here) is almost always an act of [i]raising[/i] rather than programming.
It's my understanding that you cannot force a child to be what you want, so how could a hypercorp AGI maker grow a fighter AGI or an astronomer one? If ShadowDragon is still reading us :P


ShadowDragon8685
fafromnice wrote
fafromnice wrote:
ShadowDragon8685 wrote:
That's called either "growing an AGI" or "doing a Lost Generation." And making an AGI (which I think you're all misunderstanding here) is almost always an act of [i]raising[/i] rather than programming.
It's my understanding that you cannot force a child to be what you want, so how could a hypercorp AGI maker grow a fighter AGI or an astronomer one? If ShadowDragon is still reading us :P
You can't force a child to be what you want, but you can weight the odds, if you have the resources. Someone whose family has a history of aviators, who receives extensive schooling, who has a certain amount of pressure on them to enter the field of aviation, etc., will have a much higher likelihood of going into aviation than into, say, construction.
On the other hand, you can certainly program an AI to be whatever you want. Remember: AGIs and AIs are not the same thing. An AGI is a fully-fledged ego: sapient, with its own goals, desires, motivations, etc. To some degree they can be controlled, but those who are controlled in such manners are likely to resent it. AGIs are marked by having (the potential to have) a Moxie score, and by having regular skill caps.
An AI is a non-sapient computer program, which is programmed to perform as required. They can, under certain circumstances, evolve into AGIs, but that will almost never actually happen. AIs are still capable of significant sophistication, and have skill caps of 40, but they cannot have a Moxie score unless they're a beta or lower-grade fork of an actual ego, in which case they can have a Moxie score of 1.
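For what it's worth, those distinctions can be condensed into a tiny data sketch; everything below (field names, the 80 cap used as a stand-in for "regular ego caps", the sample minds) is just an illustrative paraphrase of the post above, not a rules quote:
[code]
from dataclasses import dataclass

@dataclass
class DigitalMind:
    name: str
    sapient: bool    # AGIs are full egos; plain AIs are not
    skill_cap: int   # AGIs use regular ego caps; plain AIs cap at 40
    moxie: int       # 0 for plain AIs; a beta or lower-grade fork may have 1

def can_have_moxie(mind: DigitalMind, is_fork_of_ego: bool = False) -> bool:
    """Moxie belongs to sapient egos, or to low-grade forks of an actual ego."""
    return mind.sapient or is_fork_of_ego

astronomer_agi = DigitalMind("astronomer AGI", sapient=True, skill_cap=80, moxie=3)
maintenance_ai = DigitalMind("maintenance AI", sapient=False, skill_cap=40, moxie=0)
beta_fork = DigitalMind("beta fork of an ego", sapient=False, skill_cap=40, moxie=1)

assert can_have_moxie(astronomer_agi)
assert not can_have_moxie(maintenance_ai)
assert can_have_moxie(beta_fork, is_fork_of_ego=True)
[/code]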
fafromnice
If I understand correctly, (some)
If I understand correctly, (some) hypercorps are growing AGIs to be sold. It seems to me that's a pretty huge risk to your investment if you're not able to assure the quality of an astronomer AGI you just grew. I understand that you could get a good "batch" and fork it indefinitely, but until that "good batch" arrives you are waiting for a lucky strike, and luck and business don't mix well.


ShadowDragon8685
fafromnice wrote: If I
fafromnice wrote:
If I understand correctly, (some) hypercorps are growing AGIs to be sold. It seems to me that's a pretty huge risk to your investment if you're not able to assure the quality of an astronomer AGI you just grew. I understand that you could get a good "batch" and fork it indefinitely, but until that "good batch" arrives you are waiting for a lucky strike, and luck and business don't mix well.
With 60x acceleration, you can crank through batches pretty quickly, especially if you have psychosurgeons (preferably AGI psychosurgeons themselves, though that sounds like inviting trouble) to add the significant power of psychosurgery to your AGI growpramming weighting.
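To put a rough number on "pretty quickly": at 60x, subjective rearing time compresses by a factor of sixty, so even a long upbringing fits into a few real months (the 18-subjective-year figure below is just an illustrative assumption, not a rule):
[code]
# Back-of-the-envelope arithmetic for 60x time-accelerated AGI rearing.
acceleration = 60                # subjective seconds per real second
subjective_years_needed = 18     # assumed rearing time for one batch (illustrative)

real_days = subjective_years_needed * 365 / acceleration
print(f"{subjective_years_needed} subjective years ~ {real_days:.0f} real days")
# -> 18 subjective years ~ 110 real days, i.e. roughly three batches per real year
[/code]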
fafromnice
Isn't the maximum acceleration
Isn't the maximum acceleration 3x?
ORCACommander
6x for human/transhuman egos
6x for human/transhuman egos; AGIs and seed AIs can handle 60x. Apart from permanent death, why bother with facsimile AI egos when you can literally keep someone around forever after a 1-2 hour brain scan?
ShadowDragon8685
I think it's 6x for egos
I think it's 6x for egos running on a meat brain and 60x for a pure infomorph, regardless of whether it's an AGI or a transhuman ego in an infomorph? Have to double-check.
kindalas
I'm 95% sure that anything
I'm 95% sure that anything running on a powerful enough server can hit 60x. And I'll point out that Mental Speed, even though it only gives two extra complex actions, allows for reading at 10x the normal rate, so even in a meat box, speed can be achieved.
I am a Moderator of this Forum [color=red]My mod voice is red.[/color] The Eclipse Phase Character sheet is downloadable here: [url=http://sites.google.com/site/eclipsephases/home/cabinet] Get it here![/url]
ThatWhichNeverWas
Brainwashing for fun and profit.
kindalas wrote:
And I'll point out that Mental Speed, even though it only gives two extra complex actions, allows for reading at 10x the normal rate, so even in a meat box, speed can be achieved.
That's a good point. A really good point. In fact, is 60x acceleration simply having Speed 4 and Mental Speed? Because that would answer a lot of questions.
The way I picture growing an AGI is this: first you start off with an AGI "newborn", completely devoid of all knowledge and skills beyond the very basics required for it to remain coherent. This newborn is then imparted with the skills and aptitudes it will require for its purpose using psychosurgery, which can be assumed to be permanent as it occurs under ideal conditions (rules-wise, we can assume the equivalent of a Slave Eidolon in a Lockbox, modified so the inhabitant has Malleable Mind level 2). Installing software upgrades at this point would also make sense, and it provides an interesting thematic reason for AGIs not wanting to sleeve into biological brains, because not all of their "mind" comes with them.
After this, the AGI is placed in multiple simulspace environments conducive to the desired skillset; one set to be an astronomer may be "sleeved" into a ship lost in space and have to use its knowledge to find its way home, whilst a combat AGI would be placed in various combat arenas. Less scrupulous programmers may use evolutionary forking to increase effectiveness, making multiple forks compete in these environments and only retaining/reintegrating the most successful, but for those who consider AGIs and forks to be people this is essentially mass murder.
Even if you can't define an AGI's personality, giving it the tools and abilities to excel in the desired field, plus a conducive environment, means you can guide it along the desired path quite effectively.
Back to the OT, you could edit an AGI to think it were a specific person by psychosurgically altering their personality and implanting artificial memories. It's just seriously messed up.
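That "evolutionary forking" idea reads like a straightforward selection loop. A rough sketch under stated assumptions (the scoring function, mutation step, and generation counts are made-up stand-ins, not anything from the books):
[code]
import copy
import random

def run_simulspace(fork):
    """Stand-in for evaluating a fork in its training environment."""
    return fork["aptitude"] + random.gauss(0, 1)

def tweak(fork):
    """Stand-in for the small psychosurgical adjustments between generations."""
    child = copy.deepcopy(fork)
    child["aptitude"] += random.gauss(0, 0.2)
    return child

def evolve_agi(newborn, generations=10, forks_per_generation=8):
    best = newborn
    for _ in range(generations):
        candidates = [tweak(best) for _ in range(forks_per_generation)]
        # Only the top-scoring fork is retained/reintegrated; the rest are
        # deleted, which is exactly the ethical problem flagged above.
        best = max(candidates, key=run_simulspace)
    return best

trained = evolve_agi({"aptitude": 0.0})
[/code]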
In the past we've had to compensate for weaknesses, finding quick solutions that only benefit a few. But what if we never need to feel weak or morally conflicted again?
ShadowDragon8685
ThatWhichNeverWas wrote:Back
ThatWhichNeverWas wrote:
Back to the OT, you could edit an AGI to think it were a specific person by psychosurgically altering their personality and implanting artificial memories. It's just seriously messed up.
Yep. It would be better to do this with a dumb AI, if you wanted to skin an AI to act like an historical personality. A lot of people's muses are probably like that.
fafromnice
It seems that the idea behind
It seems that the idea behind growing/building an AGI changes with every two or three people :P I was under the impression that an AGI is an AI so advanced that it became sapient, and that there are two types of AGI, those that were grown and those that were built, each with its advantages and disadvantages, etc., etc. It seems I was off base :P
