
Potential ex-human concept, but what is the downside?

thezombiekat
Potential ex-human concept, but what is the downside?
Reading the EP2 book, I came across the Memory Lock on page 320.

Quote:
Memory Lock: When activated, this implant prevents your sensory input from being stored in your long-term memory, tagged by mnemonics, or recorded/transmitted by your mesh inserts or other means. It also temporarily blocks cortical stack backups. You retain short-term memories, but for no more than a few minutes. This implant is often a requirement of personal aides, consultants, couriers, and underlings of powerful people who require confidentiality or deniability.

This means the process of storing long-term memories is understood well enough to interrupt without affecting short-term memory, personality, or reasoning. In effect, we have the address of the long-term memory system. So what is to stop us from calling it from multiple processes, like a central database?

My idea was to have multiple forks of a single ego all reading and writing long-term memories to the same ego. Some pseudocode to explain:

[code]
// primary_ego
function personality() {
  // do stuff
  read_Lmem();
  write_Lmem();
}
function read_Lmem() { /* fetch from local long-term memory */ }
function write_Lmem() { /* commit to local long-term memory */ }

// fork_1
function personality() {
  // do stuff
  primary_ego.read_Lmem();
  primary_ego.write_Lmem();
}
[/code]

This would appear as multiple forks of the same ego behaving independently, except that they become aware of what the others are doing as long-term memories are written, generally a couple of minutes after the fact. In effect, you can be in multiple places doing multiple things without a significant problem of cross-fork awareness.

From a game point of view, this cannot be allowed to work: it breaks the action economy and removes most of the downsides of forking. I was, however, thinking it has potential as an exhuman strategy if I can come up with suitable downsides to explain why only exhumans would use it.
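To make that a bit more concrete, here is a slightly fuller sketch of the idea as one shared store plus fork processes. All the names here (MemoryStore, EgoFork, consolidate) are made up for illustration, and the couple-of-minutes write lag is reduced to an explicit consolidate step:

[code]
// Sketch only: several fork processes sharing one long-term memory store.
type Memory = { author: string; content: string; writtenAt: number };

class MemoryStore {
  private longTerm: Memory[] = [];

  write(mem: Memory): void {
    this.longTerm.push(mem);
  }

  read(since: number): Memory[] {
    return this.longTerm.filter(m => m.writtenAt >= since);
  }
}

class EgoFork {
  private shortTerm: string[] = []; // private buffer, lost if never consolidated

  constructor(private name: string, private store: MemoryStore) {}

  experience(event: string): void {
    this.shortTerm.push(event);
  }

  // In-fiction this lags a couple of minutes behind the experience.
  consolidate(now: number): void {
    for (const content of this.shortTerm) {
      this.store.write({ author: this.name, content, writtenAt: now });
    }
    this.shortTerm = [];
  }

  recall(since: number): Memory[] {
    return this.store.read(since); // includes the other forks' memories
  }
}

const store = new MemoryStore();
const alpha = new EgoFork("alpha", store);
const beta = new EgoFork("beta", store);

alpha.experience("negotiated with the broker");
beta.experience("cased the habitat airlock");
alpha.consolidate(1);
beta.consolidate(2);

// A couple of "minutes" later, each fork knows what the other was doing.
console.log(alpha.recall(0).map(m => `${m.author}: ${m.content}`));
[/code]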
syberwasp
Nice idea
I could see this being an issue from a purely functional perspective. The primary ego has a potential problem with sorting the memories. Say it's a prime and 4 delta egos: that is 5 sets of memories of all of last week. The human mind is not set up to handle that sort of continuity; you would almost have to have a sort of split personality to manage it. I could also see paranoia developing, with the prime ego getting all the dark thoughts of the other egos, all slowly diverging from him unless they are also getting updated too. Also the stress, as every upload would be a psychosurgery/integration x 4. If the primary ego was an ego-morph and the beta egos were in custom morphs that had time stamps and maybe an intentional visual chromatic shift, then dealing with memories is just a matter of taking the time to think about it. Of course, this is just sleep-deprived me; maybe none of that makes any sense...
Poe
thezombiekat
5 sets of memories from last
5 sets of memories from last week is not a problem. Memories don't have timestamps; they are cross-referenced with other memories that were active in the mind at the same time, including contemporaneous events and memories that were triggered by the event (similar memories, memories that informed behaviour and opinion).

That said, I could see things getting confused if it isn't done well. Memories from different forks could be flagged as related simply because they were written together, and timestamps on morphs wouldn't help unless you had mnemonics. Memory is a funny thing: you don't remember all the details, you fill in what you would expect when you recall.

Worrying about divergence would not be an issue. After all, delta egos (I like that) don't ever write to their own long-term memory; hell, they don't read from it either, so why even bother including a copy of it? There is also no reason for delta egos to be persistent or reintegrated. Their long-term memories are all stored together, so deleting one is like going to sleep at the end of a day: all the memories and psychological effects of that day are still with you.
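To show what I mean by things getting confused, here's a toy example, assuming memories are stored as records linked by association rather than by timestamp (the record layout is made up for illustration). If the store naively cross-links everything committed in the same batch, two forks' unrelated experiences end up referencing each other as if they were one stream:

[code]
// Illustrative only: associative memory records with no timestamps.
type MemoryRecord = {
  id: number;
  content: string;
  associations: number[]; // ids of memories "active" when this one was written
};

const store: MemoryRecord[] = [];

// Naive batch commit: everything written in the same consolidation window
// gets cross-linked, even if it came from different forks doing unrelated things.
function commitBatch(contents: string[]): void {
  const batch: MemoryRecord[] = contents.map((content, i) => ({
    id: store.length + i,
    content,
    associations: [],
  }));
  for (const record of batch) {
    record.associations = batch.filter(b => b.id !== record.id).map(b => b.id);
  }
  store.push(...batch);
}

// Fork A was haggling on Mars, fork B was hiding on Luna, but because their
// consolidation windows overlapped, the memories now reference each other.
commitBatch([
  "fork A: haggled over the price of a used synthmorph",
  "fork B: hid from a patrol in a maintenance shaft",
]);

console.log(store); // each record lists the other as an association
[/code]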
syberwasp
running off assumptions
Ok, so I was running off assumptions, so please forgive me. But then again, as this is fictional, assumptions have to be made for the sake of drama. You are right, memory is a funny, complex thing. I thought you were looking for reasons it wouldn't be able to work...
Poe
BlckKnght
I think there are only a few
I think there are only a few issues with this idea, either in game-mechanics terms or in in-game terms. That is, it shouldn't be forbidden, just hard to pull off without complications and consequences.

The first challenge for pulling it off would come from whatever system you have integrating the long-term memories from several forks into a single coherent long-term memory. This is similar to merging forks more completely, which has established mechanics in the book. The only difference is that you'd be doing it continually with small updates, not in large chunks with more data. I suspect that for game purposes you'd probably need to ignore most of that difference. Rather than making merging tests (with Medicine: Psychosurgery, perhaps from your muse) every few minutes, you'd make a test a few times a day (maybe once per active copy of the ego) to get the overall effect of the memory integration over the course of the day.

For minimally divergent forks (which yours would be, since you're all sharing long-term memories), the consequences for a failed merge test are not at all severe, only a bit of stress. But that's still the limiting factor on your scheme. Getting rid of stress is hard, and if you're making [i]a lot[/i] of merging tests, you'll fail a few here and there even if they're all fairly easy. So if you keep at it, the stress will accumulate until you eventually go insane. Maybe for an exhuman NPC that's acceptable (or even desirable)! A normal transhuman might manage that sort of scheme to a very limited degree (e.g. one extra fork, or taking time off for therapy periodically), going all in only after they become Hardened to alienation stress (which has its own downsides).

I'd also say that continually integrating your forks may not be a good idea for a secret agent either. If one fork gets captured by an enemy, they can be interrogated and give up the information about what all the other forks are currently doing. Similarly, all the free forks will suffer from the stress being faced by the captured fork (who may be getting tortured or mindhacked). Indeed, just the fact that you'll experience stressful situations from multiple different perspectives at once might make your mental health precarious, even if the memory merging process works flawlessly. If things go badly, you might end up with the memories of [i]multiple[/i] morphs' deaths, rather than just one at a time.

You might also have synchronization issues if you ever travel somewhere mesh connectivity is not always consistent. If one fork can't transmit because it is being jammed (or because it has gone silent to avoid detection), what happens to their memories? Does the collective ego just forget that stuff? That seems like an invitation to extra Lack, and the stress and other complications that could come with it. Just what did your fork commit you to in that hour when the mesh was down in the neighboring dome? With the traditional Memory Lock, forgetting is the whole point, so you don't have as much confusion about the situation.

Another major challenge will be dealing with the rest of transhumanity, who might think such a scheme is unethical for a variety of reasons. Many polities tightly restrict the legal rights of forks, so getting approval for the scheme might be expensive or even impossible to do legitimately. In most of the inner system, for instance, you'd need to buy copylock and autoerase apps for all the forks, and it might take some persuasion for the authorities not to require each of your forks to be shut down and re-forked every 48 hours, even if there's no divergence thanks to the shared memories. An autonomist polity might object (quite reasonably) to a single ego using many morphs at the same time when there are not enough of them to sleeve all the infugees in the system. And individual people you meet may treat you very differently if they know you're some weird collective-consciousness experiment rather than a more traditional member of transhumanity. Of course, some groups would welcome you, which could make things complicated if they don't get along with your existing friends (e.g. no Hypercorp agent really wants the approval of the Scum).

And will there really be no divergence between the forks? Maybe the forked egos will end up diverging anyway due to persistent endocrine responses or something, even if they have all the same long-term memories (transhumanity's understanding of mind states remains imperfect). This could lead to some interesting stories. Maybe one of the forks falls in love, but the rest of the shared mind doesn't really feel the same way, not having fully experienced the same biochemical attraction. Maybe one fork wants out of the scheme, but can't make any plans to escape without all the others knowing. It could be dramatic!
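To put a rough number on the stress problem, here's a toy simulation of the scheme. The figures (one merge test per fork per day, 90% pass chance, 1 SV per failure) are illustrative assumptions for the sake of the argument, not rules from the book:

[code]
// Toy model only: repeated merge tests with a small failure chance.
// None of these numbers are actual EP2 rules.
function simulate(days: number, forks: number, passChance: number, svPerFail: number): number {
  let stress = 0;
  for (let day = 0; day < days; day++) {
    for (let fork = 0; fork < forks; fork++) {
      if (Math.random() > passChance) {
        stress += svPerFail; // a failed integration test inflicts some stress
      }
    }
  }
  return stress;
}

// One merge test per fork per day, 90% pass chance, 1 SV per failure.
const samples = Array.from({ length: 5 }, () => simulate(90, 4, 0.9, 1));
console.log("SV accumulated over ~3 months with 4 forks:", samples);
// Expected value is 90 * 4 * 0.1 = 36 SV; without constant treatment,
// that keeps piling up even though every individual test is easy.
[/code]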
ICU2
BlckKnght wrote:
BlckKnght wrote:
Another major challenge will be dealing with the rest of transhumanity, who might think such a scheme is unethical for a variety of reasons.
I think anybody who is knowingly starting down an exhuman path is probably aware that the bulk of transhumanity is not going to be very accepting of them.
thezombiekat
I'd also say that
Quote:
I'd also say that continually integrating your forks may not be a good idea for a secret agent either. If one fork gets captured by an enemy, they can be interrogated and give up the information about what all the other forks are currently doing.
Not a problem, due to a different restriction I see: the system won't be very tolerant of lag. The long-term memory reads need to be very low latency, so I can't see it working with any more distribution than an infomorph running in distributed mode. Personally, I would run the forks on connected hardware: multiple cyberbrains in one morph, or all on the same infomorph server.
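Roughly, I imagine each long-term read being gated on latency, something like this (the 5 ms threshold and the function names are invented for illustration). A fork that can't reach the shared store in time just runs on short-term memory, like a temporary Memory Lock:

[code]
// Sketch only: latency-gated long-term memory read. Threshold is invented.
const MAX_READ_LATENCY_MS = 5; // recall has to feel instantaneous

async function readLmem(fetchFromStore: () => Promise<string[]>): Promise<string[] | null> {
  // Race the remote read against a deadline; a slow read is treated as a miss.
  const deadline = new Promise<null>(resolve =>
    setTimeout(() => resolve(null), MAX_READ_LATENCY_MS)
  );
  return Promise.race([fetchFromStore(), deadline]);
}

// A fork on the same infomorph server answers in well under 5 ms; a fork a
// light-second away (hundreds of ms round trip) fails the check every time.
readLmem(async () => ["shared memory 1", "shared memory 2"]).then(result => {
  console.log(result ?? "long-term recall unavailable, running on short-term only");
});
[/code]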