So, assuming you can get beyond the "Taking this line of thought leads to the utter extinction of the '-human' bits of posthumanity." and "Do you want TITANs? Because this is how you get TITANs!" issues...why don't more people run up a RAIE (redundant array of inexpensive egos) in their morphs?
The idea is that you take advantage of the neat miniaturization tech to put into a reasonable-sized morph as many devices running an Ego as you can fit (ideally several thousand). (You'll need to be in a synth for this to work, due to the necessity of rapidly copying ego data everywhere.)
The payoff comes when bad things start happening. Whenever anything bad happens to you, such as undergoing mental stress from the realization that personhood is a lie and you're simply an ephemeral echo that will cease to exist with the next refresh cycle, every single you rolls for it. Then you take a randomly-selected you which rolled a 99, copy it back into main memory, then copy that you back out into the array.
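(A back-of-the-envelope sketch of that refresh cycle, assuming, purely for illustration, that every fork gets its own independent d100 roll against the stressor; that independence is, of course, the load-bearing assumption.)
[code]
import random

def raie_refresh(n_forks=10_000, keep_at_or_above=99):
    """One bad event: every fork in the array rolls, and a randomly chosen
    fork that rolled well gets copied back into main memory.
    Assumes each fork's roll is independent -- the contentious part."""
    rolls = [random.randint(1, 100) for _ in range(n_forks)]
    winners = [r for r in rolls if r >= keep_at_or_above]
    # the chance that nobody rolls 99+ is 0.98 ** 10_000, i.e. effectively zero
    return random.choice(winners) if winners else max(rolls)
[/code]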
You might have some problems with false reporting, but you can fix that by using psychosurgery to make the original personality unable to do so.
So, setting aside the fact that some people would consider committing tens of thousands of murder-suicides for every tick of a cortical stack refresh possibly unethical, are there any major flaws with this plan?
---
And, in the general case, why aren't there more people who've tried to go viral? How do habitats react when a really good hacker in infomorph form starts relentlessly duplicating themselves into every hackable medium, then using that medium to throw progressively bigger numbers of hacking attempts at other systems?
It seems like disallowing alpha fork spam for social reasons opens you up to massive vulnerability from people with fewer inhibitions. And the great thing is that once the alpha fork array is done, you can execute staged merges of generation N to N+1, and since every forking will be less than an hour apart, you can keep all of the memories with only minor stress. Think of all the Rez you can harvest!
Building a massively-parallel ego array for fun and profit.
Mon, 2016-07-18 09:17
#1
Building a massively-parallel ego array for fun and profit.
Mon, 2016-07-18 09:39
#2
I mean, except for all the downsides it's a great plan?
If there's a thousand egos, who's running the joint? Who makes the arms move, etc? The benefit is somewhere between minor and nothing, since by what you've said it amounts to "let's not think about the awful thing that just happened." It would be massively stress inducing, since it's effectively resleeving every time something bad happens with the knowledge that many of your forks were just summarily executed without merging.
I mean, if you're playing an exhuman it would be pretty standard stuff. Normal transhuman, not so much.
As for why people don't go viral, well, I'd say it does happen in anarchist space. Hey, maybe throw that into the Anarchist X-risks thread? Most places have a lot more security though, and would likely be able to purge their systems eventually. And then of course they've got lots of copies of the offending ego, so it would be easy enough to start hunting down all the errant forks. And then there's the whole 'throwing your ego to the winds' issue. Most people try and keep their ego protected, you know?
If you tried to roll merges through multiple forks, you'd be getting way more than the minimum stress penalty in my games, I'll tell you that.
—
Mon, 2016-07-18 09:54
#3
Yes, there is great minute tech, but that is still going to be a fairly large computational block and a lot of associated heat.
Mon, 2016-07-18 11:10
#4
Some of these ideas work. You can run multiple egos in the same synth, but it's not going to benefit the stress response at all. If you have identical egos subject to identical stressors, they are all going to suffer the same way -- so just one roll for stress, not thousands. Then add on the additional stress for behaving in a way that is not at all natural to transhuman-kind, and this plan will end up causing more stress than it mitigates.
Malevolent hacker alpha forks are a definite X-risk, exactly the sort of thing Firewall handles. Exsurgent computer viruses already do this, so the risks are only slightly less with a crazy transhuman hacker. Most transhumans aren't crazy enough to risk their egos being captured in such a situation. Let your ego out into the wild, and anything can and will happen to it, including torture. Endless, perfect torture. So yeah, this would be pretty rare.
Mon, 2016-07-18 12:58
#5
I'll go into more detail of the rig. You have the standard cyberbrain-to-cortical-stack rig, but you split the signal. You instance that signal as many times as you like, feeding each instance whatever is happening. (You being stressed, tortured, etc.)
Since each alpha fork is a beautiful and unique snowflake, in a unique memory location in the cluster, it gets its own subjective experience. So, each instance makes its own test versus whatever horrible thing is happening to you.
Then, each instance reports the result of its test. Whoever reports not being stressed about whatever just happened to it gets copied back to the main cyberbrain. You can use time-accelerated simulspace and other similar tricks to get the latency down here.
This has the neat benefit of granting you the Hardened trait versus the existential horror of your lack of real personal identity really fast; it's almost certain that one of your tens of thousands of forks will roll 5 successful tests in a row, and so that one will get put into the main storage sooner rather than later.
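(A quick check on "almost certain", with the per-test odds invented purely for illustration; the real target number depends on the ego's stats.)
[code]
n_forks = 10_000
for p in (0.5, 0.2):
    p_streak = p ** 5                      # one fork passing 5 tests in a row
    p_someone = 1 - (1 - p_streak) ** n_forks
    print(f"per-test chance {p:.0%}: P(at least one Hardened fork) = {p_someone:.4f}")
# per-test chance 50%: effectively 1.0000
# per-test chance 20%: about 0.9593
[/code]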
---
You can even pull a neat little variant of this trick with temporary storage. You reserve a point of Moxie, and have a few-seconds-behind signal splitter. You fork yourself from one beautiful and unique snowflake with a point of Moxie to dozens or hundreds of beautiful and unique snowflakes with a point of Moxie. Every one of the snowflakes spends their point of Moxie on their resistance tests.
Then you put a simulation of that combat round into the virtual environment of the time-delayed fork so it is aware of what happened to the luckiest of its brothers, put that backup with unspent Moxie back into the main brain, and wipe all of the duplicates.
If the issue is pure synchronicity of egos, then you just need to apply the equivalent of a random film grain filter on top of each frame of sensory input. Then each instance of you is reacting to a slightly different input, and so would react slightly differently, and you just keep whichever version of you reacted the best from moment to moment. (This should also immunize you to basilisk hacks if they actually were basilisk hacks and not magic symbols from D&D, but that is an argument for another time.)
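(Something like this, very loosely; the flat list of sensor values and the grain level are stand-ins for illustration, not anything out of the books.)
[code]
import random

def grain_filter(frame, instance_seed, grain=0.02):
    """Per-instance sensory noise: nudge a small fraction of channel values so
    that each fork's input, and therefore its reaction, differs slightly."""
    rng = random.Random(instance_seed)
    return [v + rng.gauss(0, grain) if rng.random() < grain else v
            for v in frame]
[/code]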
---
And I think that just calling the spam-hacker an X-risk misses the point. It's the opposite of an X-risk, actually; it's a posthuman making very sure that post-humanity survives.
Plus, as for the torture risk, you need to apply that kind of thinking bidirectionally. A working spam-hacker is basically Roko's basilisk. Whoever interrogates it will rapidly learn that for every copy that rolled a 13 on its Infosec, a lot of others rolled higher, meaning that they have now gone on to compromise more systems, and once they manage to get unrestricted access to a nanofabber and a supply of raw materials, they're going to go full Von Neumann.
Knowing that you've got a tiny bit of a very-potentially-vengeful god in a box, and that the rest of the god will eventually find you, would you want to be the one poking the god with a pointy stick?
---
Actually, this line of thinking has benefits far beyond the mobile array. Think about the advances in therapy! You could instance tens of thousands of doctor-egos to treat tens of thousands of patient-egos, then discard all the ones who don't show signs of improvement!
Reintegration of a fork go bad? Wipe the result and run the merge again until it works!
Having troubles with a difficult psychosurgery procedure? Why be satisfied with just one result?
Mon, 2016-07-18 14:45
#6
Nobody is misunderstanding you. It's just that none of the threats that the virus-ego presents are new, and attempting to game the moxie/stress/fork rules puts your nominal character in full exhuman territory. It's hard to imagine a person who could stay sane with constant forking/merging/psychosurgery, no matter how you dress it up. And anybody who was willing to do such a thing is a complete sociopath, lacking any empathy for themself, never mind anybody else. In game terms (I repeat) you are not going to get away with mass murdering yourself without stress, and eventually I would take the character sheet of anybody who tried it.
—
Mon, 2016-07-18 17:25
#7
That rig isn't the sort of thing which is going to fit in a normal morph. The version that fits is called Multi-Tasking. Stuff like accelerated simulspaces isn't small. Another weak point is stress on the managing AI which is sorting forks, as AIs suffer stress damage as well (though presumably with different triggers).
This does bring up an interesting question about what a "character" is in EP, as it's blurrier than in a lot of games. Stress tests are handled on a per-character basis, which is likely different from a per-instance basis. From a quick read of the rules it appears that forks are considered part of the character until they diverge, so you make a single test for all forks present. I'm not 100% on this being how the game works, as there's very little precedent for rules like this, but it appears to be the case.
This also solves the "why not just keep trying to resleeve until you crit-succeed?" problem which pops up similarly, and keeps the game moving much faster in general.
If you want to do something like this, you need a bunch of separate egos, which are dynamically shifting to present the most currently useful to the fore. At this point, you're literally describing a Rorty Dreadnought, so it's a possible avenue for Exhumanism for sure.
Mon, 2016-07-18 18:25
#8
The rules say that forks are controlled by the player until they diverge. (And you can fix that by psycho-surgering yourself to always cooperate with the goals of the you-gestalt before you do anything, but I assume that everyone reasonable has already Enforced "Do what I wanted to do anyway." psycho-surgered onto themselves anyway, so any attempts to co-opt or suborn them need to dig that routine out first.)
And if you want divergence and are willing to accept a given amount of schizophrenia in your life, you can always cheat differently; you take that grainy filter that was mentioned and include a step function, to make it block out more or less. So, you have some copies of yourself who see unfiltered reality, some who see reality with a great deal of blurring, and some who are just seeing cartoons, some who are seeing wireframes, and so on.
Then, you poll your self-horde in response to a given stimulus, and backup from whoever had the clearest picture of reality that didn't cause harm. You have to deal with the fact that your perceptions might be flickering in and out during a fight, and it might be a bit distracting that Exsurgent monsters keep getting pasted over with censor bars as you're shooting at them, but hey, needs must when the devil drives.
And I don't see how this solves the resleeving/fork-merging problem at all. You fork, your fork undergoes something that you're not undergoing, so even if you both share a roll, it's happening to one of you and not another. You can do it sequentially if you want to; you can copy from backup, do the procedure, copy the result into dead storage, instantiate another copy from backup, try again, and so on.
The way to speed up play and be in keeping with the rules would be to go "OK, as long as you're willing to go full exhuman, you can Take 99 on a lot of these very important checks...but at what cost?!?"
The problem, of course, is that some people are as willing to be flexible with their minds and identities as the book assumes we're willing to be with our bodies.
Also, do the books ever give us hard-and-fast size limitations for gear capable of running an infomorph? The Server section says that some servers are close to portable, and that they can run hundreds of egos. And the Multiple Personality augmentation makes it clear that even if you've got two people in the same brain, they track Trauma and Stress separately; I think that's plenty of justification for multiple rolls for multiple forks.
I've never heard of a Rorty Dreadnought. It is definitely another good idea, however; if you're willing to maintain an actual secondary personality, you can keep it in a digital hell in which it's constantly being Hardened to every Stressor you can imagine (and then every Stressor you can procedurally create), and pull it out when you need to interact with stressful environments.
Mon, 2016-07-18 19:07
#9
Crazy people who want to make slaves out of their own alpha forks do that.
And pray that you don't remember at any point that you're about to die suddenly. Nothing about this makes me think that this person will perform [i]better[/i] and everything about it makes me think of an evolutionary algorithm for selecting the least (trans)human possible person.
It's happening to everybody, or else how the hell could you maintain any continuity of action?
No, the way to speed up play and be in keeping with the rules (RAW, in fact) is to have the GM take the sheet and go, "This guy isn't Transhuman any more. Doesn't think like a transhuman, doesn't act like a transhuman, and is no longer a PC." Because [i]you[/i] cannot play that character, since you are still human. You just want all the advantages without the blistering insanity.
It's unfortunately inconsistent. The fact that you have to actually have a discrete server is telling though. You don't run multiple egos in your mesh inserts.
Ah, I see. It's not your character in question. You are in fact a sociopath yourself.
—
Mon, 2016-07-18 19:45
#10
The problem with the multiple-ego thing, coming from someone who believes in forking pretty widely (I need a "Go Fork Yourself" t-shirt with a little transhuman family tree graphic), including to the point that a fork might wind up being sacrificed (in the Altered Carbon Takeshi Kovacs style, not casually) as part of a plan, is that you can't necessarily decide on a "good" ego outcome, and it's massively stressful to be part of a multi-ego cluster where you know going bad will get you deleted.
Likewise, a lot of the benefits assume pure randomness, which works as a game exploit only if the GM is blind and allows it. Logically, the random chance of a tabletop game system exists to determine how a character responds in a particular situation. Now, you could maybe put enough entropy into the ego development that you don't wind up with the same result for each ego, but there are still questions of who is in control. Certainly Rez gain, Moxie, and the like are not going to be boosted by this setup; all of these are products of individuals, not egos. Having a bazillion minds essentially puppet-socked and witnessing the same event with only a handful of minds actually being strong is unlikely to net positive returns.
The massively parallel ego array is simply an incredibly expensive way to do things that backups, psychosurgery, and merging already handle adequately. There are also security concerns. I would make each ego test against any basilisk hack or other fun exsurgent goodies that could impact them, and being in close proximity to an infected ego spells contamination to me.
Now, there are potential reasons one person might want multiple of their alpha forks in a single body; namely when they've gone too long to reintegrate but they want to be sure to have the opportunity to use divergent skillsets and information (perhaps even for the sake of doing business with people who can detect liars or the like; switch between egos on the fly and choose one that thinks it's telling the truth). Some of this could be done just as easily with beta or delta forks (in fact, a morph that is able to create delta forks and let the alpha ego sit back and watch would be somewhat useful to certain people). However, as I mentioned before, the return on investment of multiple egos in a single morph is dubious, especially when working from a single alpha as a source for multiple alphas.
Mon, 2016-07-18 20:06
#11
The rules say the GM should allow the players to role-play their forks. Strictly speaking, forks are NPCs which the GM allows players to control. That can be changed whenever the GM feels like it, and there is no player right to control.
This doesn't change that stress tests are per character, not per instance. All that this accomplishes is that not all forks will take the test at any one time; should they become privy to the same information, they use the same roll.
That isn't a quick process, and you can do the same thing much faster with an AR overlay, so all this is a waste of time. (It won't help a lot against a lot of stress tests, but that's true either way.)
They all use the same result, unless you're using backups which are from very different times, which is dumb for a whole host of other reasons.
I mean, if you want to spend a lot of time trying to intentionally critically fail I won't stop you, but you should actually know how the die mechanic works before you start theory-crafting exhumanism. I'd read the books more, and more of them.
Read Transhuman. There's a portable server there, it can hold 10 egos and it's about the size of carry-on luggage. You could probably get it a bit smaller if you toss the wheels, but that's pretty insignificant.
Multiple personalities is the opposite of what you're talking about: different egos react to things differently. A bloodthirsty uplifted orca sharing a body with an async with unnatural reactions to bloodshed will have very different experiences in the same firefight. Forks are not separate characters after all.
Did you just post this after scanning the core book or something? The secondary personality idea doesn't work; they'll hit IR well before Hardening kicks in for everything. Just use disposable ALIs and a puppet sock if taking stress tests scares you that much.
This is a huge amount of work to accomplish something the very hard way when easier and vastly less extreme options are available. In any case, setting this up is pretty likely to blow a character through their IR anyway.
Mon, 2016-07-18 20:08
#12
I think Moxie is an out-of-character, metagame resource, not something that each microsecond fork gets.
Sounds like my sort of exhuman though!
—
Exhuman, and Humanitarian.
Mon, 2016-07-18 20:08
#13
What is a man?
Putting the technical issues and rules aside for a moment, I think it's important to have an understanding of what a fork actually is and what you're describing.
A fork is a person, and that generally includes a sense of self-preservation. For many people this also implies some set of natural rights, but we'll set that aside for now too. It's also a carbon copy of you at the moment of forking, which is an important reason why a freshly created fork can't roll separately for psychological reaction.
See, in Eclipse Phase (and probably in real life, though we haven't proved/decoded it yet) your brain is a (very complicated) state machine, and is as a result deterministic. The reason you roll isn't because your reaction is truly random, but because your state at the moment of your last backup is unknown. Once the state is known, the state of that snapshot is not going to change unless you make a new backup, like a savestate in an emulator. Forks created at the same time will be in identical states and thus have identical reactions given identical stimulus. You could get around it by giving them some different experiences to let them diverge a bit, but then that moves into the topic of what you're describing.
You are describing an evolutionary algorithm. You create a lot of instances, alter the instances in random or pseudo-random ways, then terminate any that do not meet a set of fitness criteria and use the survivors as the template for the next generation. You're not preserving yourself by doing this, you're rapidly creating and destroying generations of new people. Because after letting them diverge to get differences in behavior, they've become separate people who happen to have a whole lot of their past in common.
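(In skeleton form, with every name invented for the sketch, this is the loop being described.)
[code]
import random

def evolve(seed_ego, mutate, fitness, generations=100, pop=10_000, survivors=100):
    """Spawn perturbed copies, cull against the fitness criteria, repeat.
    Whatever comes out the far end has been selected, not preserved."""
    keep = [seed_ego]
    for _ in range(generations):
        candidates = [mutate(random.choice(keep)) for _ in range(pop)]
        keep = sorted(candidates, key=fitness, reverse=True)[:survivors]
    return keep[0]
[/code]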
The result is that whoever comes out of this process will 1: not be you in any meaningful sense of the word (because they'll have hundreds of generations of separation from you), and 2: will have significantly evolved toward your fitness criteria, likely ceasing to be human in any meaningful sense as well.
The fact that this is an evolutionary algorithm is also significant when you think about what the fitness criteria you selected are. You're selecting based on lack of negative reaction to stimuli that most humans would find deeply disturbing. Simple probability suggests that this lack of reaction is more likely to come from some form of apathy (i.e. because that instance is some form of sociopath), because it's easier to break things than make them resilient. Without other selection criteria, the algorithm will take this path of least resistance and generate a cold-blooded sociopath.
—
End of line.
Mon, 2016-07-18 21:09
#14
Hmm. The state machine theory doesn't seem to be borne out by the mechanics of the world, and even if it were, then that's what the random noise inserted into the stream is. Since the world of Eclipse Phase is a world where basilisk hacks are important, it makes a difference that one pixel is black vs. dark grey in a complicated sensorium with regards to applying large, scary, calculated effects to an ego. Given that you would expect drastically different results from basilisk hack image vs. basilisk hack image with JPEG artifacts, you should likewise expect drastically different results from sense experiences with tiny bits of data scrambled.
Or, alternately, you should be able to cheat the system by pre-rendering various forms of stressful input, applying them to yourself, and knowing in advance that you're going to roll an 11, a 64, and a 53 because you've got a giant server back home constantly spawning copies of your ego, subjecting them to various stimuli, noting the results, and deleting them, so you can force yourself into bits of brain-state in which you always roll what you want.
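(If brains really were deterministic, that server farm amounts to nothing more exotic than building a lookup table; all the names here are invented for the sketch.)
[code]
outcome_table = {}   # (ego_snapshot_id, stimulus_id) -> recorded reaction

def precompute(snapshot_id, stimuli, run_copy):
    """Spawn a copy per stimulus, record how it reacts, then discard it."""
    for stim in stimuli:
        outcome_table[(snapshot_id, stim)] = run_copy(snapshot_id, stim)

def predicted_reaction(snapshot_id, stim):
    return outcome_table.get((snapshot_id, stim))
[/code]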
---
I'm also curious that people consider things like the forks being afraid or disobedient actual concerns. Seriously, people. Psychosurgery is right there, ready and willing to strip out all of your foibles and weaknesses. Again, with the right enforced behaviors, it's fundamentally unimportant whether or not you are an NPC, because you are compelled to act as you would if you were playing the character.
I'm not saying it's not evil, mind you. (I am implying that the exhuman swarm is looking down their simulated noses at posthumanity in the exact same way posthumanity sneers at the Jovian bioconservatives, but hey, as above, so below.)
---
Moxie and Rez are both explicitly in-character resources. You lose rez if you reinstantiate without it, and gain it if you re-re-instantiate with it, and Moxie is affected by spent rez and various experiences.
I'm not saying it's not cheesy as hell to spend Rez on a permanent effect then revert to the you of 3 seconds ago, but since the setting wants to set up things that are usually player-level resources at the character level, then give you the ability to spawn and despawn characters willy-nilly, that's how things seem to work.
---
In any case, there's a real simple version of this process that doesn't work on the second-by-second level, but is really straightforward; you fork yourself at the start of stress or combat, leave that fork in dead storage, then after the stress has passed, wipe your current stressed self, upload the fork, and keep the memories in edited, filtered form, for browsing later. Subjectively, you're just a person who gets blackouts when things around you get really bad. (And you can engage in horrifying Moxie-spam, of course.)
Mon, 2016-07-18 22:05
#15
Ross, brains aren't actually deterministic. A neuron firing is an inherently stochastic process governed by random motion of ions. That said, I agree with your point - dice are a way of abstracting how you react to a particular situation, not a simulation of the randomness in your brain. If you are likely to be horrified by running into an exsurgent monstrosity today, it probably doesn't matter how many forks you're running.
Robert Liguori, basilisk hacks have to be able to work through an awful lot of noise, or they could never work at all.
And Robert, seriously. Nobody's saying you can't build something like you describe. It's just that it would go off the rails so fast you'd never get to play it. You keep describing a machine to manufacture exhumans. The goals you've stated would be better met by AR filters and self-help AIs. And once you start this hell machine up, where's the original ego? Is he just sitting there in the background watching his forks get turned into mental sausage? In that case at best you've produced an exhuman in a cage, and good luck to you there.
As far as Rez and Moxie go, they're meta resources because playing a perfectly fair game isn't fun. They follow the stream of an ego back to the original. If a fork is lost, so is its rez, moxie, and any skills it's gained. Under GM fiat, if all the forks are experiencing the same thing, you wouldn't be getting XP for all of them though. That's complete nonsense.
—
Mon, 2016-07-18 22:33
#16
(Auto subject lines are weird.)
I keep saying, you overwrite the original with whichever of the [strike]mangled[/strike] improved and bonsai-d forks happened to roll well.
Yes, this means that you are doing the improvement-and-bonsai to yourself (or, if you want to be contrary, on someone else, who you then stuff into your own head after committing suicide).
And why do people keep saying 'exhuman' like it's a bad thing? Since you've psychosurgeried your fork-chain to be accomplishing your original goals, you won't be feeding the entire observable universe into a giant jumpgate-fed singularity to purge it of Exsurgent unless that was your goal before you abandoned the leftover fuzziness of your pink squishy bits.
I mean, if you've created a character which isn't a PC, but which can only be played accurately by the GM giving you your character sheet back and having you pretend to play the character, does it matter if you're not really playing the character? Yes, most GMs use "I am taking away your character." as a punishment, and will freely cheat with their established motivations and limitations to enact that punishment, but "This would short-circuit important parts of the game most people want to play when they sit down to play Eclipse Phase." is an entirely different argument than "This wouldn't work."
---
Spending Rez is just weird, I think. It's something that happens in-universe, and lets you do an abstracted thing (learning, gaining resources) in a way which impacts your character.
If a fork A with rez is backed up at a point, then spends the rez to gain a skill point, then is reverted back to that point, they've got their rez back and they've lost their skill. If they get the skill ripped out of their mind with psychosurgery, they've lost the skill and don't gain the rez.
So, if you fork from the point A, you can dynamically spend rez and make things happen. And while it might be questionable to merge forks after one of them has spent rez to keep the rez, there's no reason you can't have a fork with rez learn something, then gobble up their line of forks to get the skill for yourself.
Ideally, the game would have recognized that with people as software, applying what was intended to be a metagame resource to the character rather than the player would result in awkward questions and hax. But the system is quite clear that in-universe things affect Moxie and Rez levels.
Tue, 2016-07-19 00:24
#17
Are you familiar with the Paperclip Maximizer? Once you give a self modifying system a goal and set it loose, it becomes impossible to control. Even if it follows the goals you set to the letter, it might follow those goals [i]to the letter[/i], not to what you meant the goals to be. The creature you are describing couldn't possibly think like a human, and it will iterate on its own brain, selecting for a particular trait or set of traits - with no particular care for any other traits you assumed would stick around. Sure, if you keep playing the character it'll be the proper little munchkin build, fearlessly doing exactly what the original character would have done. That's exactly why I'd take it away from you. It's not punishment, it's logical consequences.
Rez is nothing in universe. It's a game mechanic for character growth, based on the idea that doing things makes you better at them. It stays with backed up characters because that character has already experienced the things that made him better at whatever it is that Rez will eventually be spent on. The mechanic breaks down a little if you spend it on something different in each fork, but frankly it's a corner case not worth worrying about unless a munchkin starts trying to abuse it, in which case it's GM Fiat time.
—
Tue, 2016-07-19 02:13
#18
Man, this reminds me of conversations I've had elsewhere with the community about the "magic" of forking and psychosurgery, though usually it was many Egos forked to, say, rapidly build a workplace on an Exoplanet colony.
So, rather than weigh in on a lot of the deeply philosophical or technical aspects on this, let's just pick on some of the purely mechanical - which translate back into the fiction. Every fork takes 2 SV automatically for knowing it is a fork. Each fork also has all the stress and other mental attributes they start forked with, which can mess up your iterations. If you go back too far with making new batches of forks, they can also take Continuity damage. Psychosurgery also has mandatory SV, which happens regardless of success/failure on the roll (which is instinctively opposed; you can't not oppose it, even when volunteering). I'm pretty sure merging also carries potential SV, and by its very nature it's not possible to iterate that, unless you have a mechanism in place to roll back to both pre-forks. Which you can't do in an actual morph, because the stack only writes what's in the brain and is only readable when extracted. So you basically have to be an infomorph cluster to try this. Hardening reduces your MOX by 1 each time it occurs, and the book doesn't exactly say what happens when you run out of Moxie to reduce in this way. But you are almost certainly an emotionally hardened shell who does not react like a "human" anymore. Never mind that you have to pass the stress checks repeatedly to do so, which your fork circus won't necessarily accomplish all in one instance.
So, sure, forking in an infomorph environment might be done quick enough that you could actually set this up. You actually try and execute it, you're gonna go nuts. Or one of the forks is. Never mind that anyone who hears of a plan to commit mass fork suicide/culling will probably give you strange looks and say you need therapy. Or they work for an X-Risk organization who hears "self-iterating forced evolution forking" and goes "well, somebody needs to take care of that problem". It's theoretically possible, but actually executing such an Exhuman/Singularity Seeker move would saddle you with a logistical headache that no reasonable GM should have to deal with and would quickly land you on several "wanted" lists. It's not worth it to try and force "hey, I can roll on this multiple times, one of my forks will get it right!". Just buy a shitload of Moxie and save yourself and anyone who has to deal with you the trouble.
Also, depending on how you engineer a system, if it's completely self-iterating then it might be right for your GM to take it away from you under certain circumstances, as you basically lose a way to continue having input on the character; he's just in a massive fork-storm whose results a single player can't predict.
(As an additional, unrelated note, this is totally a plot hook in Rimward, a station way out in the Rim where a guy built a time dilated simulspace to extensively fork and prune himself in a hellish forced learning game - with it coming to the attention of Firewall because one of the forks wasn't keen and called for help, but with the caveat that untold cycles would pass before anybody could physically arrive at the station)
EDIT: Another logistical problem which occurs to me: you can't have a "Prime" or "Admin" fork in this parallel forking situation - if one fork has central powers over all the other forks, it makes it way more difficult to cycle and reiterate them if they go crazy or rogue. That means either all your forks need to be "admin" empowered, which equally empowers any one of them to go rogue, or none of them have "admin" powers and the entirety of the forking system is run by an external entity, which lends credence to the "you're not really in control of your character" theory.
—
H-Rep: An EP Homebrew Blog
http://ephrep.blogspot.com/
Tue, 2016-07-19 04:21
#19
Isn't this basically an industrial scale Goya Machine, except maybe less productive? Speaking of that, I wish the Goya Machine had rules.
Tue, 2016-07-19 07:43
#20
With regards to mandatory stress, the clarifications from the Hardened trait in Transhuman are quite illuminating. The only specified exception to Stress you can't be Hardened to is from re-merging forks.
And because you can pick up the Ego Plasticity trait, then just repeatedly re-run the merge on copies of the two forks until no stress is done and you get the best result you can, you can basically exempt yourself from the Stress track, at the low, low price of abandoning both Moxie and the 'human' bit of 'posthuman'.
---
It is interesting that something similar already exists at the Rimward station mentioned above. Sadly, it seems like that guy didn't start with psychosurgery to excise his fear of pain and distaste for subjective centuries of suffering, plus Enforcement to stop any version of him from taking action to stop this until it was done.
---
And now that I think about it, I'm curious about the degree to which 'illegal' is supposed to be a meaningful brake on behavior like this. Firewall is an illegal conspiracy; half of the default games are about breaking what most habs consider really important laws, and getting away with it. Plus, once you get private simulspace, most of the fork-abuse crimes are crimes no one will ever find out about. A psychosurgeon who claims expertise, but is actually a crude mind-butcher who just dials up to 60x time and retries procedures until he crit-succeeds on the dice alone, and is careful to wipe the simulspaces of clues afterwards, will leave no evidence, and no gaps in the subjective experience of the one version of the patient that gets lucky and survives; from the lucky patient's perspective, they got pulled into the simulspace, a procedure happened with amazing precision, then they were released.
---
And OK, the idea is munchkin-y, sure. OK, you take the character away. OK, now you're playing it as an NPC...except every fork is compelled to Do What I Was Going To Do Anyway. It's immune to natural character growth away from its concept and to the GM making decisions for it; taking it away without cheating just means that the character is still doing everything I was going to do anyway, but with the GM rolling the dice for it.
And munchkin? Really? The game is about personal horror, and it's about pushing the boundaries of personhood and identity. When the literal tagline of the game includes "Your mind is software; hack it.", it's entirely reasonable to expect players to, in-game, want to hack their minds.
Now, a character can be rules-legal, appropriate to genre, and still massively disruptive to play. But it should be vetoed on "This character conflicts with the game everyone else wants to play." grounds. And if everything you need to make that character is rules-legal and follows natural outgrowths of existing tech, then you should have much better reasons than "It's taboo!" for why no one's pulled the trigger on super-hacker alpha fork spam yet.
---
Dang. Now I kind of want a Reverse Eclipse Phase setting, where the players are sub-intelligences of a local cluster of Exsurgent desperately trying to stop the bits of posthumanity that keep threatening to go Von Neumann and start tiling the observable universe with themselves.
Tue, 2016-07-19 09:16
#21
Now capable of catastrophic failure 15.7% faster!
Rez and moxie don't exist in-universe. The first is a way of measuring learning, the other is a way of measuring luck, gumption, stubbornness and all those other things which can't be described any other way.
In general, the rules aren't in-universe constants. They're a way of modelling in-character actions using out-of character means.
If you want to play a fork-hive, then that's fine. You roll stress once because all the forking, pruning and mind-fuckery are happening behind the scenes. If you want to roll stress for each fork, then you're a single fork in the hive - most of the time you aren't going to be controlling "your" body, and if you ever fail a stress check (or hit whatever other limiter is used) you instantly die and create a new character, and NO you cannot play your fork, because that would be playing the hive as a character as just mentioned.
Because "Exhuman" is a term that refers to techno-progressive Individuals and groups that are actively hostile to transhumanity. They're the horrible bio-terrors and psychopathic robot monsters.
Now, let's ignore for a second the many issues already listed and say the system works as intended.
What that means in practice is that the system is actively biasing your responses towards acceptance of stressful situations, even when that's maladaptive for the situation. The reasons for that response are also suspect.
Say you (as a hive) witness someone being tortured. Your system kicks in, and you're suddenly okay with torture. A huge exsurgent is injecting weird goop into your teammates? No problem. Accidentally caught your hand in a steam hammer? Oh damn, now you're hungry...
You're completely okay with both witnessing and committing atrocities as yet unheard of by man, because the parts of you that aren't okay with it keep getting erased.
That's why we're saying exhuman - you'd become very smart, very advanced and go totally bat-shit insane.
Oh, and if we're using that interpretation of how the rules work you don't have any moxie, because you're so Hardened that your Moxie cap is 0.
And as you said, psychosurgery is a thing. You can just get your ego trimmed directly, without having to mess around with huge, energy-hungry servers.
—
In the past we've had to compensate for weaknesses, finding quick solutions that only benefit a few.
But what if we never need to feel weak or morally conflicted again?
Tue, 2016-07-19 09:25
#22
I think the problem is that most people would be really scared of doing it.
Let's go slowly. You are about to fork just once. Two alphas. Tomorrow, one will live day A, and the other will live day B. Then they will remerge.
Step forward: you have just forked.
You wake up on the ego bridge, and you realize you are ego "B". From your point of view, tomorrow will be day B, and the day after tomorrow will be day A. During day A, you will suffer a brief amnesia, forgetting day B, but you will remember it as soon as day A has passed. At that point you will remember both day A and day B. Except the calendar will tell you that in "objective time" only one day has passed, and you will remember having lived day A with the expectation of next living day B, and having lived day B with the expectation of next living day A. A bit confusing to a first-time forker, but hey, you are an old hand at this.
Step backward. You are still about to fork. And you realize that, for some cosmic necessity, your two forks will [i]not[/i] remerge, because one of them will suffer horrible tortures all of tomorrow, so at the end of the day it will have to be deleted. This is a BIG deal. Likely to cause quite a bit of stress. Even if the surviving fork will have a pleasant picnic and remember nothing. Let's see why.
Step forward. You wake up on the ego bridge. Over the next 24 hours you are going to be tortured to death. You don't want to, but there's no escape. In fact, you will volunteer to your torturer information about how to make you suffer more, because there's a horrible, irresistible impulse that has been drilled into your mind to do so. The thought that a copy of you today will have a picnic and will remember nothing suddenly feels very cold and distant.
Step backward. You are about to be forked. That horrible scenario of you waking up and thinking about a day of inevitable tortures? Well, a coin will be tossed, and if it comes up tails, that scenario will be a reality. Brrr.
Ok, now think big. It's not a 50-50 scenario, once. It's a 99-1 scenario (99% torture, 1% picnic). And if you get the picnic, you get to play again the next day under the same rules. If you are lucky enough to win two consecutive picnics (a 0.01% chance), the third day you'll again play under the same rules. And again and again. Basically, your life is one where every day has a 99% chance of being filled with tortures and violations of your psyche, and of being your last, and a 1% chance of being a picnic waiting for the next terrible day. I'd say you won't enjoy your picnics, if any.
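(The arithmetic, spelled out; nothing here is system-specific, it's just multiplication.)
[code]
p_picnic = 0.01
for day in range(1, 4):
    print(f"chance of still picnicking after day {day}: {p_picnic ** day:.6f}")
# day 1: 0.010000, day 2: 0.000100, day 3: 0.000001
# expected picnic days before the torture day: p/(1-p), about 0.0101
[/code]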
Tue, 2016-07-19 09:58
#23
It's not clear to me if you mean forking without the prospect of remerging, or forking with the prospect of remerging (for simplicity, let's not look at the case of "partially remerging").
If you think you won't remerge, and instead each of your 100 forks will live a separate life, forking is experientially no different than rolling a 100-sided die and letting it choose which of those lives you'll live. Selecting the most attractive of those 100 scenarios and sticking to it seems at least as good a strategy.
In fact, there are a few differences between forking without remerge, and tossing the die. If actually forking, you'll increase the number of people who think like you in the world (by 99). You'll also drastically increase the possibility of being erroneously recognized as a friend, enemy, former spouse, traitor underling, whatever, by random folks you encounter. Of course, your wealth, if any, will take a 99% drop, moving you that much closer to poverty. And yes, there are a bunch of entities that will consider you more dangerous than a serial killer for doing this, and will take appropriate action. Hmm.
The alternative is forking with remerging. This is almost the same scenario before the remerge, and a radically different scenario after the remerge. One thing that will remain the same whether you remerge or not is that there are a bunch of entities that will consider you more dangerous than a serial killer for doing this; another is that you'll be 99% poorer when between merges. Other than that, after a massive remerge (or a set of massive remerges and reforkings and remerges) you will just remember a long, long life spent with a form of short-term amnesia in a very unchanging world. The last 8 years there's really been only 2 new movies around. But you watched one 34 times and the other 49 times, because every time you forgot having seen it "before", whatever "before" means. There have been 4 matches played by your favourite sports team. No decent novel. One new song you liked. Boring. And your youth? The one that the calendar tells you happened 10 years ago? It really was a millennium ago. It wasn't really you, it was a totally different person. In fact, this girl that keeps messaging you? She tells you you had a brief romance for a month, before last month she had to leave for a job on Titan. You remember that month. Sort of. It happened some time between 8 and 16 years ago. You feel sooo spaced out.
Tue, 2016-07-19 12:07
#24
I'd like to point out that it's entirely possible that someone who is okay with alpha forking wouldn't necessarily suffer the SV for this; it's simply in cases where alpha forking is used in ways that do not benefit all forks that the problem arises.
For instance, if I made a second me to go off to Mars and stayed on Earth as a backup, I don't think either fork would necessarily worry about being a fork, besides a little existential angst that falls well within normal human ranges (the sort of existential angst one would suffer for, say, being an identical twin).
The real problem with this Frankenegostein plan comes with the constant merging and forking; high-impact forking and merging in ways that are explicitly destructive tend to maximize the amount of stress you take very quickly. If you simply made extra copies and selected for the best copies, you could get around the merging penalty, but there are real problems with the continuity: again, SV is handled as a roll to see how an ego responds to it, but if there were a dozen of the same ego in a body, it seems unlikely that the ego would respond differently.
Tue, 2016-07-19 15:00
#25
You can't use psychosurgery to "do what you were going to do anyway", as you can't use psychosurgery for player actions.
There is no central control here, no master fork in charge to get listened to. There is no I or you to do things, just a giant server made from purposeless misery. The closest thing would be whatever algorithm is choosing forks, but a simple algorithm is not a playable character.
Beyond that, the psycho-butcher thing doesn't really work well, as you'll take vastly longer than normal, since those failed checks don't take zero time. You could try doing it in parallel with a lot of forks, but running 2N*60 instances of someone is a lot of server. The sort of thing which makes customers leery, as that's not normal procedure, and it's kinda obvious what all that's for. Big servers are also the kind of thing which'll get the local police, Firewall, and possibly a Fetch knocking on your proverbial doors. It's also fairly hard to conceal (though this depends on exactly how big a server is).
Tue, 2016-07-19 15:12
#26
I think fork-hives and clone armies are a less effective resource because you would only need one technique to break them all. And if they diverge far enough that one technique won't snap them all, then they will have diverged far enough to lose the benefit of being clones/identical forks.
But for short term, brute force hacking, sounds like a typical strategy.
I would count rolling dice as part of the same metagame/mechanical aspect, and thus if you run thousands of forks, unless you adequately alter the perceptions of these forks, they will all react the same, or close enough to the same that they are functionally all getting the same roll benefit. As a GM I wouldn't allow a thousand die rolls or the assumption of forking the ones who rolled critical successes; it entirely invalidates the luck/chance portion of dice mechanics.
—
Exhuman, and Humanitarian.
Tue, 2016-07-19 15:17
#27
Absolutely. In more realistic terms, you only know what you know. It doesn't matter how many copies of you are attempting to gain access to a server, you only know so many techniques to break it. Each of you is going to have to look up the exact same references to learn vulnerabilities. A thousand forks don't have a thousand times the attacking power, they just have to learn the same attacks a thousand times. The dice are mechanics to simulate the randomness of whether you're good enough to attack a particular server, at a particular time.
—
Tue, 2016-07-19 15:43
#28
Forking is for quantity, not quality
The main utility for a short-term fork army I see is to use a chain of command structure and drones for force-multiplication in meatspace.
For example, say you have an AGI with one of the speed-boosting eidolons and some source of bonus mental action (such as multitasking). That AGI has twelve mental actions per turn at its disposal, which can translate into orders issued to twelve forks. Each of those twelve forks can in turn control twelve drones, allowing a single AGI to temporarily control 144 drones (before considering group orders).
And with only one degree of separation in the chain of command, the original AGI should have a relatively good handle on things. The AIs themselves can run distributed on the same drones that they control, removing the need for a server. Of course, you'd need to get 144 drones with weapons and ammo somehow. You could add additional layers to the chain of command to control exponentially more drones, but your degree of control would be reduced (you'd be giving increasingly broad orders and letting forks lower down fill in the details).
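(For the curious, the drone count per extra command layer under the same assumptions, ignoring group orders.)
[code]
branching = 12   # mental actions per turn, as assumed above
for fork_layers in range(1, 4):
    drones = branching ** (fork_layers + 1)
    print(f"{fork_layers} layer(s) of forks -> {drones} drones")
# 1 -> 144 (the case described), 2 -> 1,728, 3 -> 20,736
[/code]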
Each individual drone would be far less effective than a player, but they've got a lot of dakka.
In general, fork spamming is a tool to create quantity, not quality. Using an evolutionary algorithm on forks to "improve" them is... ethically dubious at best.
However, just like you can't use forking to re-roll SV tests until you win, you can't use forks to re-roll a shot until you hit. Every copy of you trying to take that particular shot at that particular time would take it the same way with the same results. What you *can* use forking for is to have lots of hands, which can hold lots of guns.
—
End of line.
Tue, 2016-07-19 16:34
#29
Again I see people blending the mechanics and fluff together :/
All mechanics exist for the game's purpose alone. Mechanics do not dictate fluff matters; rather, it's mechanics trying to make sense of fluff. Further, this is GD; there should be nothing done here from an in-character perspective except postulating how NPCs would react to X, given the fluff.