Survival of the Soul

Lilith
I'm inclined to think donating alphas to Firewall is a moot point, since given their control of their sentinels' backups they're probably liable to spin off alpha forks on their own as needed. For the greater good, of course.
Alkahest
Smokeskin wrote:
Yeah well, my enjoyment of sex is just some evolved trait and really nothing but some hormones and electrical activity in my brain. It has no real meaning. That doesn't mean I'm going to ignore it, or that I could even if I would. The same thing with my sense of identity and survival instinct. I'm not just going to dump those because some philosopher decides that everything is meaningless. Everything is obviously meaningless, except for the meaning we inject into the world, by that haphazard jumble of traits and preferences that evolution and chance bundled up in those fancy brains of ours. What is the next thing you want to remove? Morality? Ambition? Curiosity? Valuing truth over lies?
I believe you're missing my point. I'm not saying that identity is "meaningless", I'm saying that it doesn't exist separate from things that actually exist, like memories, personality and so forth. Happiness exists. Pleasure exists. Memories exist. Love exists. Why do they exist? Because they are, as you say, "nothing but some hormones and electrical activity in my brain". That makes them very important. Identity (separate from memories and personality) does not exist as hormones and electrical activity in the brain. Your sense of identity exists, but that's not the same thing as identity. And your sense of identity would still exist in a fork, if you think that's valuable. You seem to believe that we have a choice between accepting reality and living functional, happy, moral human lives. I believe the two are intrinsically connected.

So my questions stand: a) What evidence do you have that a "you", a "consciousness" or an "identity" exists separate from your memories, personality traits, emotions, etcetera? b) Why are you afraid of losing this "you", "consciousness" or "identity"?

I have seen this topic debated so many times, and every time people have the same arguments about what "they" would and wouldn't see/experience/etcetera. It's just frustrating to see people have long discussions without defining their terms. So until someone can define and prove the existence of a self separate from all the things that would be copied in a fork, I'm just going to replace every instance of the words "identity", "I" or "you" in such a context with "an invisible gremlin". (You don't need to get cute and point out that I use the word "I", too. In my case, you can replace the word "I" with "the set of memories and other mental phenomena the system formulating this sentence is currently accessing".)
President of PETE: People for the Ethical Treatment of Exhumans.
Smokeskin
Alkahest wrote:
Smokeskin wrote:
Yeah well, my enjoyment of sex is just some evolved trait and really nothing but some hormones and electrical activity in my brain. It has no real meaning. That doesn't mean I'm going to ignore it, or that I could even if I would. The same thing with my sense of identity and survival instinct. I'm not just going to dump those because some philosopher decides that everything is meaningless. Everything is obviously meaningless, except for the meaning we inject into the world, by that haphazard jumble of traits and preferences that evolution and chance bundled up in those fancy brains of ours. What is the next thing you want to remove? Morality? Ambition? Curiosity? Valuing truth over lies?
I believe you're missing my point. I'm not saying that identity is "meaningless", I'm saying that it doesn't exist separate from things that actually exist, like memories, personality and so forth. Happiness exists. Pleasure exists. Memories exist. Love exists. Why do they exist? Because they are, as you say, "nothing but some hormones and electrical activity in my brain". That makes them very important. Identity (separate from memories and personality) does not exist as hormones and electrical activity in the brain. Your sense of identity exists, but that's not the same thing as identity.
When I say 'identity' I mean what you call 'sense of identity'.
Quote:
And your sense of identity would still exist in a fork, if you think that's valuable.
I don't. It is my sense of identity that's important, just as I enjoy my own pleasurable experiences and not those of my fork.
Quote:
You seem to believe that we have a choice between accepting reality and living functional, happy, moral human lives. I believe the two are intrinsically connected.
I don't believe that. What I said was that reality doesn't care about our happiness, just as little as it cares about our identity and consciousness. Many arguments for uploading and egocasting refer to the reality of the processes underlying our consciousness and use them to undermine identity and continuity of consciousness, and my point was that this line of thought eradicates any human values and experiences, which shows that the argument is wrong (unless you're a depressed, apathetic nihilist).
Quote:
So my questions stand: a) What evidence do you have that a "you", a "consciousness" or an "identity" exists separate from your memories, personality traits, emotions, etcetera?
I don't believe it exists separately. What I do believe is that none of those things can be scanned or uploaded with any sort of relevance to my experience.
Quote:
b) Why are you afraid of losing this "you", "consciousness" or "identity"?
Mostly survival instinct.
Quote:
I have seen this topic debated so many times, and every time people have the same arguments about what "they" would and wouldn't see/experience/etcetera. It's just frustrating to see people have long discussions without defining their terms. So until someone can define and prove the existence of a self separate from all the things that would be copied in a fork, I'm just going to replace every instance of the words "identity", "I" or "you" in such a context with "an invisible gremlin".
So we make a fork of you. Your fork gets everything you dream of. You are imprisoned under appalling conditions, tortured regularly and prevented from committing suicide, suffering for decades. What is your quality of life? What is your fork's quality of life? What sort of sacrifice would you make to trade places with your fork? Does this thought experiment not amply demonstrate how 'you' are not copied over to your fork?
Ilmarinen
Let's imagine a different experiment. Someone knocks you out and uses Star Trek technology to make an identical copy of you. You wake up an hour later staring at someone who looks just like you. Can you tell if you're the original or if you are the copy and if so, how?
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
OneTrikPony
Has no one here read Richard K. Morgan's Altered Carbon? I get VV's point of view, because that book is what first introduced me to the concept of forking and egocasting. Anyone who has read it will understand this argument immediately. Without spoiling too much: the plot brings the protagonist to a point where he's had to alpha fork and multiple sleeve to get the job done. Merging isn't an option, and neither is long-term multiplicity, so beforehand the two instances of the same guy, barely a few hours old, sit down and decide who's going to live.
Ilmarinen wrote:
Let's imagine a different experiment. Someone knocks you out and uses Star Trek technology to make an identical copy of you. You wake up an hour later staring at someone who looks just like you. Can you tell if you're the original or if you are the copy and if so, how?
'Copies' is not an applicable term. They are the same person. But in support of VV, let's append that scenario to the Altered Carbon scenario. Only one of the egos is going to be allowed to continue consciousness. How do they decide which one will live? What if you had to 'shoot your own dog', so to speak?

Say the technical situation prohibits continuity of consciousness during an egocast (a common situation, due to light-speed lag, hardware issues, or immigration quarantine). When you get where you're going, the originating ego confirms that you *are* you to the best of his knowledge; you then have to press a button terminating his consciousness and wiping all copies from the originating system. Did you just commit suicide? For me the answer is yes, but I do it all the time. If I decide to immigrate from Luna to Titan, in a real sense I'm making the decision to kill the guy who lived on Luna. I suppose I'd go to the egocasting facility and sign documents that allow the tech to terminate my consciousness and delete all my files on Luna. I decided to end my life on Luna, but I'm also continuing my life on Titan. That's not a lot different than a decision to take a job or propose marriage or drive drunk. I make decisions all the time that kill the guy I might have been next year. NBD
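If it helps to see that button-press spelled out, the whole handshake fits in a few lines of toy Python (every name here is my own invention, not any real egocasting spec):

[code]
# Toy sketch of the confirm-then-terminate handshake described above.
import hashlib

def fingerprint(ego: bytes) -> str:
    return hashlib.sha256(ego).hexdigest()

def egocast(servers: dict, src: str, dst: str) -> None:
    # 1. Transmit a bit-identical copy; there is no continuity of
    #    consciousness in transit (light-speed lag, quarantine, etc.).
    servers[dst] = bytes(servers[src])
    # 2. The arrived ego is confirmed against the original.
    assert fingerprint(servers[dst]) == fingerprint(servers[src])
    # 3. Only then is the button pressed: the originating instance is
    #    terminated and every copy on the origin side is wiped.
    del servers[src]

servers = {"Luna": b"<OneTrikPony's ego>"}
egocast(servers, src="Luna", dst="Titan")
assert "Luna" not in servers   # the guy who lived on Luna is gone
assert "Titan" in servers      # ...and his life continues on Titan
[/code]

Step 3 is the suicide question: nothing about steps 1 and 2 requires the deletion, it's just policy.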

Mea Culpa: My mode of speech can make others feel uninvited to argue or participate. This is the EXACT opposite of what I intend when I post.

jhfurnish
Smokeskin wrote:
So we make a fork of you. Your fork gets everything you dream of. You are imprisoned under appalling conditions, tortured regularly and prevented from committing suicide, suffering for decades. What is your quality of life? What is your fork's quality of life? What sort of sacrifice would you make to trade places with your fork? Does this thought experiment not amply demonstrate how 'you' are not copied over to your fork?
Orson Scott Card wrote a short story about exactly this situation (using cloning): 'Fat Farm'. I won't spoil it for you. Go read that one.
Alkahest
Smokeskin wrote:
When I say 'identity' I mean what you call 'sense of identity'.
Your sense of identity is information. That information would be copied into a fork. Thus, your sense of identity would not be lost. I'm having a hard time understanding the problem here.
Smokeskin wrote:
I don't. It is my sense of identity that's important, just as I enjoy my own pleasurable experiences and not those of my fork.
But as said, the sense of identity would still exist. You're not actually arguing about whether your sense of identity would exist or not, you're assuming the existence of this "identity" I don't believe in and combining it with your sense of identity. I'm arguing that SoI (sense of identity) would exist, you're arguing that it would not, since SoI+IG (invisible gremlin) would not exist.
Smokeskin wrote:
I don't believe that. What I said was that reality doesn't care about our happiness, just as little as it cares about our identity and consciousness. Many arguments for uploading and egocasting refer to the reality of the processes underlying our consciousness and use them to undermine identity and continuity of consciousness, and my point was that this line of thought eradicates any human values and experiences, which shows that the argument is wrong (unless you're a depressed, apathetic nihilist).
Of course reality doesn't "care" about anything, it would be weird if it did. But happiness exists, which means that we can care about it. I'm arguing that "identity" or "continuity of consciousness" doesn't exist, which means that it's irrational to care about it, just as I think it's irrational to care about the will of God since I'm an atheist. That doesn't make me a depressed apathetic nihilist.
Smokeskin wrote:
I don't believe it exists separately. What I do believe is that none of those things can be scanned or uploaded with any sort of relevance to my experience.
How about this: We're no longer allowed to use the words "I", "me" or "my". I will instead use the words "I1", "me1" and "my1". The definition of "I1" is: "The collection of memories, personality traits and other mental phenomena available to the cognitive systems producing this sentence". If you're not content with that definition, you're going to have to define I2 in some other way and use that instead.
Smokeskin wrote:
Mostly survival instinct.
If I1 see a stick on the path when I1'm taking a walk, my1 instinct kicks in and tells me1 "Oh shit, a snake!". Then the rest of my1 brain tells me1: "No idiot, it's just a stick."
Smokeskin wrote:
So we make a fork of you. Your fork gets everything you dream of. You are imprisoned under appaling conditions, tortured regularly and prevented from committing suicide, suffering for decades. What is your quality of life? What is your fork's quality of life? What sort of sacrifice would you make to trade places with your fork? Does this thought experiment not amply demonstrate how 'you' are not copied over to your fork?
Well, strictly speaking I1 would probably not say that I1 would exist in this thought experiment, since there are different experiences separating me1 from these two minds. They are however two minds very similar to me1. One has good quality of life, one has bad. Since they are so similar to me1, I1 can imagine roughly how they would react, and I1 feel really bad for the tortured one. But your question is nonsensical, since I1 would either not exist at all or exist in both minds, depending on how broadly we are defining I1. This is a bit like talking to a religious person who insists that I'm an atheist because I hate God. No, I don't believe in God, and I don't believe in some free-floating personal identity completely divorced from mental phenomena that can actually be shown to exist.
President of PETE: People for the Ethical Treatment of Exhumans.
Smokeskin
Alkahest wrote:
Smokeskin wrote:
When I say 'identity' I mean what you call 'sense of identity'.
Your sense of identity is information. That information would be copied into a fork. Thus, your sense of identity would not be lost. I'm having a hard time understanding the problem here.
Smokeskin wrote:
I don't. It is my sense of identity that's important, just as I enjoy my own pleasurable experiences and not those of my fork.
But as said, the sense of identity would still exist. You're not actually arguing about whether your sense of identity would exist or not, you're assuming the existence of this "identity" I don't believe in and combining it with your sense of identity. I'm arguing that SoI (sense of identity) would exist, you're arguing that it would not, since SoI+IG (invisible gremlin) would not exist.
Smokeskin wrote:
I don't believe that. What I said was that reality doesn't care about our happiness, just as little as it cares about our identity and consciousness. Many arguments for uploading and egocasting refer to the reality of the processes underlying our consciousness and use them to undermine identity and continuity of consciousness, and my point was that this line of thought eradicates any human values and experiences, which shows that the argument is wrong (unless you're a depressed, apathetic nihilist).
Of course reality doesn't "care" about anything, it would be weird if it did. But happiness exists, which means that we can care about it. I'm arguing that "identity" or "continuity of consciousness" doesn't exist, which means that it's irrational to care about it, just as I think it's irrational to care about the will of God since I'm an atheist. That doesn't make me a depressed apathetic nihilist.
This is the essential part. You believe that happiness exists, but not sense of identity or continuity of consciousness. That strikes me as completely inconsistent. They're all mental phenomena. Why does one exist and not the other?
Quote:
Smokeskin wrote:
I don't believe it exists separately. What I do believe is that none of those things can be scanned or uploaded with any sort of relevance to my experience.
How about this: We're no longer allowed to use the words "I", "me" or "my". I will instead use the words "I1", "me1" and "my1". The definition of "I1" is: "The collection of memories, personality traits and other mental phenomena available to the cognitive systems producing this sentence". If you're not content with that definition, you're going to have to define I2 in some other way and use that instead.
Smokeskin wrote:
Mostly survival instinct.
If I1 see a stick on the path when I1'm taking a walk, my1 instinct kicks in and tells me1 "Oh shit, a snake!". Then the rest of my1 brain tells me1: "No idiot, it's just a stick."
But when you care about your future happiness, the rest of your brain doesn't think "No idiot, there's no continuity of consciousness so you won't ever experience it"?
Quote:
Smokeskin wrote:
So we make a fork of you. Your fork gets everything you dream of. You are imprisoned under appalling conditions, tortured regularly and prevented from committing suicide, suffering for decades. What is your quality of life? What is your fork's quality of life? What sort of sacrifice would you make to trade places with your fork? Does this thought experiment not amply demonstrate how 'you' are not copied over to your fork?
Well, strictly speaking I1 would probably not say that I1 would exist in this thought experiment, since there are different experiences separating me1 from these two minds. They are however two minds very similar to me1. One has good quality of life, one has bad. Since they are so similar to me1, I1 can imagine roughly how they would react, and I1 feel really bad for the tortured one. But your question is nonsensical, since I1 would either not exist at all or exist in both minds, depending on how broadly we are defining I1.
So what happens to forks doesn't matter to you? And you'd willingly egocast, which is the same as killing yourself and letting a fork continue. That seems to imply you don't care what happens to yourself. Is that the case?
Quote:
This is a bit like talking to a religious person who insists that I'm an atheist because I hate God. No, I don't believe in God, and I don't believe in some free-floating personal identity completely divorced from mental phenomena that can actually be shown to exist.
No. You don't believe in identity or continuity of consciousness, yet you go about your life acting like you care about what happens to your future self. There's a huge gap between your behavior and your belief.
Alkahest
Smokeskin wrote:
This is the essential part. You believe that happiness exists, but not sense of identity or continuity of consciousness. That strikes me as completely inconsistent. They're all mental phenomena. Why does one exist and not the other?
A happy person and an unhappy person behave in different ways. A happy person and an unhappy person provide different information consistent with the evolutionary purpose of happiness. A happy person and an unhappy person have different neural activity. Happiness can be manipulated with various drugs. And so on and so forth. Can you say the same about "continuity of consciousness", whatever that means? Also, I have already said that your sense of identity would still exist in a fork, so obviously I believe in sense of identity. But sense of identity and identity are two different things, much like a hallucination of an angel and an angel are two different things.
Smokeskin wrote:
But when you care about your future happiness, the rest of your brain doesn't think "No idiot, there's no continuity of consciousness so you won't ever experience it"?
As said, depends on the definition of "you". I don't need to be 100% the same person as a future version of me to wish that person happiness, just as I don't have to be the same person as people I care about to wish them happiness.
Smokeskin wrote:
So what happens to forks doesn't matter to you? And you'd willingly egocast, which is the same as killing yourself and letting a fork continue. That seems to imply you don't care what happens to yourself. Is that the case?
You know, we should extend that rule about not using the words "I", "me" and "my" to words like "you" and "yourself". By saying "you'd willingly egocast, which is the same as killing yourself and letting a fork continue" you have already presupposed the existence of the entity/concept/thing under discussion.
Smokeskin wrote:
No. You don't believe in identity or continuity of consciousness, yet you go about your life acting like you care about what happens to your future self. There's a huge gap between your behavior and your belief.
As I have already said, whether I1 am the same person as "my future self" depends entirely on how wide a definition of "me" we are using. Alkahest-now and Alkahest-tomorrow are per definition not identical, which rules out the possibility that we have the same identity. We are however very, very similar. So we might as well define ourselves as the same person. But that's just words. The underlying reality doesn't change depending on what definition of "me" I have decided to use today. And whether Alkahest-tomorrow is me or another person is completely irrelevant, since I try to base my behavior on reality, not definitions that only exist in someone's imagination. I1 care about Alkahest-tomorrow for the same reason I1 care about myself, since Alkahest-today and Alkahest-tomorrow both harbor the beliefs, motivations, memories, emotions and so forth that I1 value. Whether we carry around the same metaphysical phantasm of identity is rather unimportant, since I don't care about metaphysics.

Since I seem to have problems communicating my beliefs to you, I will explain it all in a short argument:
P1: There is no reason to believe in the existence of unfalsifiable phenomena.
P2: Continuity of consciousness is unfalsifiable.
P3: All rational values should relate to phenomena one believes exist.
C: It's irrational to value continuity of consciousness.
Which step do you disagree with?
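In case the structure isn't obvious, here is the argument formalized (my own shorthand: U(x) for "x is unfalsifiable", B(x) for "belief in x is justified", V(x) for "valuing x is rational", c for continuity of consciousness):

[code]
\begin{align*}
  &\text{P1: } \forall x\,(U(x) \rightarrow \lnot B(x)) \\
  &\text{P2: } U(c) \\
  &\text{P3: } \forall x\,(V(x) \rightarrow B(x)) \\
  &\text{P1 and P2 give } \lnot B(c); \text{ contraposing P3 gives } \lnot B(c) \rightarrow \lnot V(c) \\
  &\text{C: } \lnot V(c)
\end{align*}
[/code]

The logic is valid, so to reject the conclusion you have to reject a premise.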
President of PETE: People for the Ethical Treatment of Exhumans.
Smokeskin
A word on terminology: I'm not going to switch the meanings of words like "you" and "I" because you choose to redefine them and insist on numbering them. "You" means you, the body you're in and what it does. Your fork is a copy of you. These are the common usages of the words, and sticking to that makes communicating with language much easier. You can easily describe what you mean using these words. "I care about the average utility of me and my forks" for example. And there is a big difference between you and your forks (if you had any). I'm talking to you. However similar your thought processes might be, you're not telepathically linked to your forks. You don't feel what they feel or control what they do. You feel what you feel and control what you do, and I'm talking to you and you're answering. What I am interested in is your behavior.
Alkahest wrote:
Smokeskin wrote:
No. You don't believe in identity or continuity of consciousness, yet you go about your life acting like you care about what happens to your future self. There's a huge gap between your behavior and your belief.
As I have already said, whether I1 am the same person as "my future self" depends entirely on how wide a definition of "me" we are using. Alkahest-now and Alkahest-tomorrow are per definition not identical, which rules out the possibility that we have the same identity. We are however very, very similar. So we might as well define ourselves as the same person. But that's just words. The underlying reality doesn't change depending on what definition of "me" I have decided to use today. And whether Alkahest-tomorrow is me or another person is completely irrelevant, since I try to base my behavior on reality, not definitions that only exist in someone's imagination. [b]I1 care about Alkahest-tomorrow for the same reason I1 care about myself, since Alkahest-today and Alkahest-tomorrow both harbor the beliefs, motivations, memories, emotions and so forth that I1 value.[/b] Whether we carry around the same metaphysical phantasm of identity is rather unimportant, since I don't care about metaphysics.
Look at the part I bolded in your text. With that in mind, let us go back to the thought experiment I presented earlier. We make a fork of you. Your fork will get everything you dream of. You will be imprisoned and tortured for decades with no option for suicide. You and your fork obviously both harbor the same beliefs, motivations, memories, emotions and so forth, and this is the yardstick you apparently use for determining whose experiences you care about. So I ask again. You are given the option to switch places with your fork if you make a sacrifice. How big a sacrifice would you make? By your logic, you wouldn't want to make any sacrifice; after all, that only hurts the combined utility of you and your fork. Does this seem realistic? After 24 hours of torture, you are given the option to switch again in return for a sacrifice. Are you claiming that you're just going to continue accepting the pain because rationally you know that your fork is out there enjoying life?
Alkahest wrote:
Since I seem to have problems communicating my beliefs to you, I will explain it all in a short argument:
P1: There is no reason to believe in the existence of unfalsifiable phenomena.
P2: Continuity of consciousness is unfalsifiable.
P3: All rational values should relate to phenomena one believes exist.
C: It's irrational to value continuity of consciousness.
Which step do you disagree with?
P2 of course.
Alkahest
Smokeskin wrote:
A word on terminology: I'm not going to switch the meanings of words like "you" and "I" because you choose to redefine them and insist on numbering them.
I'm just trying to be precise. I'm trying to talk about reality rather than the fictional world of folk psychology and sloppy language use, using advances in cognitive science and the philosophy of mind rather than someone's gut feelings and a tumorous growth on the Germanic language branch. I'll try to see if I can understand your definition.
Smokeskin wrote:
"You" means you,
Redundant.
Smokeskin wrote:
the body you're in
If we define "me" as my current physical body, of course an uploaded mind based on my brain isn't me. Trivially true, not what we're talking about.
Smokeskin wrote:
and what it does.
I think we have already agreed that there is no difference in the behavior between the original person and the uploaded mind. So we have now established that A is A, A is not-B and A is B. See why I want to use more precise definitions?
Smokeskin wrote:
Your fork is a copy of you. These are the common usages of the words, and sticking to that makes communicating with language much easier.
When you're challenging the assumptions made by folk psychology, using the language of folk psychology isn't much help. Language isn't reality, and to describe reality properly we have to modify language for specific purposes. I really didn't expect the use of stipulative definitions to be so controversial.
Smokeskin wrote:
You can easily describe what you mean using these words. "I care about the average utility of me and my forks" for example.
If that is what I intended to communicate, I would say that. It's not.
Smokeskin wrote:
And there is a big difference between you and your forks (if you had any). I'm talking to you. However similar your thought processes might be, you're not telepathically linked to your forks. You don't feel what they feel or control what they do. You feel what you feel and control what you do, and I'm talking to you and you're answering. What I am interested in is your behavior.
I don't think anyone has claimed that two forks would experience the same thing. I'm not the one believing in unfalsifiable mental phenomena here.
Smokeskin wrote:
Look at the part I bolded in your text. With that in mind, let us go back to the thought experiment I presented earlier. We make a fork of you. Your fork will get everything you dream of. You will be imprisoned and tortured for decades with no option for suicide. You and your fork obviously both harbor the same beliefs, motivations, memories, emotions and so forth, and this is the yardstick you apparently use for determining whose experiences you care about. So I ask again. You are given the option to switch places with your fork if you make a sacrifice. How big a sacrifice would you make? By your logic, you wouldn't want to make any sacrifice; after all, that only hurts the combined utility of you and your fork. Does this seem realistic? After 24 hours of torture, you are given the option to switch again in return for a sacrifice. Are you claiming that you're just going to continue accepting the pain because rationally you know that your fork is out there enjoying life?
You are still insisting that one of the forks would be me and that the other would not be me. My claim is that both would be me, or that neither would be me, depending on how broad a definition of "me" one decides to use. I can't answer your question if I can't accept the basic premise of it.
Smokeskin wrote:
Alkahest wrote:
Since I seem to have problems communicating my beliefs to you, I will explain it all in a short argument:
P1: There is no reason to believe in the existence of unfalsifiable phenomena.
P2: Continuity of consciousness is unfalsifiable.
P3: All rational values should relate to phenomena one believes exist.
C: It's irrational to value continuity of consciousness.
Which step do you disagree with?
P2 of course.
Right, now we're getting somewhere. So how do you suggest one could test the existence of continuity of consciousness?
President of PETE: People for the Ethical Treatment of Exhumans.
Smokeskin
Alkahest wrote:
Smokeskin wrote:
A word on terminology: I'm not going to switch the meanings of words like "you" and "I" because you choose to redefine them and insist on numbering them.
I'm just trying to be precise. I'm trying to talk about reality rather than the fictional world of folk psychology and sloppy language use, using advances in cognitive science and the philosophy of mind rather than someone's gut feelings and a tumorous growth on the Germanic language branch. I'll try to see if I can understand your definition.
Smokeskin wrote:
"You" means you,
Redundant.
Only because you chopped up the sentence.
Quote:
Smokeskin wrote:
the body you're in
If we define "me" as my current physical body, of course an uploaded mind based on my brain isn't me. Trivially true, not what we're talking about.
If we upload you into a machine and don't destroy your body, you will still be in your body, experience sensations in the body, your thought processes will be in the brain, you will be able to move the body. You will still be there. In the machine there is a fork of you (assume a running fork). It will have its own thought processes, experience different sensations, make its own decisions. While you might care greatly about each other's fate, there is no telepathic link between you and your fork. Because your sensations, thoughts, and actions are unique to you and are completely independent from your fork, there is a distinction between them. No matter how you define the words, you won't experience what your fork experiences or decide its actions.
Quote:
Smokeskin wrote:
and what it does.
I think we have already agreed that there is no difference in the behavior between the original person and the uploaded mind.
We don't agree on that. If you have to choose whether the torture should continue for you or for your fork, I expect you will prefer it to happen to your fork rather than you, since that will free you from the pain.
Quote:
So we have now established that A is A, A is not-B and A is B. See why I want to use more precise definitions?
No, it seems quite clear that your newspeak is obscuring the fact that you and your fork won't share experiences.
Quote:
Smokeskin wrote:
Your fork is a copy of you. These are the common usages of the words, and sticking to that makes communicating with language much easier.
When you're challenging the assumptions made by folk psychology, using the language of folk psychology isn't much help. Language isn't reality, and to describe reality properly we have to modify language for specific purposes. I really didn't expect the use of stipulative definitions to be so controversial.
Smokeskin wrote:
You can easily describe what you mean using these words. "I care about the average utility of me and my forks" for example.
If that is what I intended to communicate, I would say that. It's not.
Then please say what you mean. I honestly have no idea what you think. Sometimes it sounds like you expect a telepathic link between all copies of you, but I don't think you mean that. At other times it seems like you think you're able to make decisions completely unaffected by your experiences (like sustained torture), which seems very unrealistic; this may be what you actually think, but you've been dodging the question so far. I would honestly like to know what you think. And if you're not going by average utility (and you certainly seem to want to include your forks' utilities without weighing your own higher), then what does your utility function look like? The sum of their utilities? So you would want to make as many forks as possible in most cases?
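To spell out why the aggregation rule matters, here's a toy example (my numbers, obviously arbitrary):

[code]
# Each fork lives an equally decent life.
fork_utilities = [10, 10, 10]

average = sum(fork_utilities) / len(fork_utilities)  # 10.0: forking adds nothing
total = sum(fork_utilities)                          # 30: every fork is a gain

# Under average utility, spinning up forks is neutral; under total
# utility, you should fork as often as resources allow. The two rules
# prescribe very different behavior, hence the question.
[/code]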
Quote:
Smokeskin wrote:
And there is a big difference between you and your forks (if you had any). I'm talking to you. However similar your thought processes might be, you're not telepathically linked to your forks. You don't feel what they feel or control what they do. You feel what you feel and control what you do, and I'm talking to you and you're answering. What I am interested in is your behavior.
I don't think anyone has claimed that two forks would experience the same thing. I'm not the one believing in unfalsifiable mental phenomena here.
Smokeskin wrote:
Look at the part I bolded in your text. With that in mind, let us go back to the thought experiment I presented earlier. We make a fork of you. Your fork will get everything you dream of. You will be imprisoned and tortured for decades with no option for suicide. You and your fork obviously both harbor the same beliefs, motivations, memories, emotions and so forth, and this is the yardstick you apparently use for determining whose experiences you care about. So I ask again. You are given the option to switch places with your fork if you make a sacrifice. How big a sacrifice would you make? By your logic, you wouldn't want to make any sacrifice; after all, that only hurts the combined utility of you and your fork. Does this seem realistic? After 24 hours of torture, you are given the option to switch again in return for a sacrifice. Are you claiming that you're just going to continue accepting the pain because rationally you know that your fork is out there enjoying life?
You are still insisting that one of the forks would be me and that the other would not be me. My claim is that both would be me, or that neither would be me, depending on how broad a definition of "me" one decides to use. I can't answer your question if I can't accept the basic premise of it.
But you (remember, that means your body and brain) are the one experiencing the pain and the one that has to make the decision. If you decide to swap, the pain stops for you. You can't dodge this by saying "my fork is also me". It is not, in terms of actual experiences and decisions. Could you endure the pain forever?
Quote:
Smokeskin wrote:
Alkahest wrote:
Since I seem to have problems communicating my beliefs to you, I will explain it all in a short argument:
P1: There is no reason to believe in the existence of unfalsifiable phenomena.
P2: Continuity of consciousness is unfalsifiable.
P3: All rational values should relate to phenomena one believes exist.
C: It's irrational to value continuity of consciousness.
Which step do you disagree with?
P2 of course.
Right, now we're getting somewhere. So how do you suggest one could test the existence of continuity of consciousness?
First, as long as my brain is functioning properly, my consciousness will be there in the future. And it has been there in the past. This sets it apart from beliefs like "my soul will live on in heaven". Second, I experience a retrospective continuity. Third, read the thought experiment above. Look at people's behavior. This establishes the forward link. It is no less ephemeral than other qualia. I can easily imagine an AI or mentally ill human without continuity of consciousness. One who felt no connection to past states, or didn't care about its future state because it felt no connection to it.
Alkahest
Smokeskin wrote:
If we upload you into a machine and don't destroy your body, you will still be in your body, experience sensations in the body, your thought processes will be in the brain, you will be able to move the body. You will still be there. In the machine there is a fork of you (assume a running fork). It will have its own thought processes, experience different sensations, make its own decisions. While you might care greatly about each other's fate, there is no telepathic link between you and your fork.
Once again, I'm fairly certain that no-one has actually claimed that. I certainly haven't. I would appreciate it if we could abandon that particular discussion, since it appears to be based not on what I have said but on an interpretation that has little to do with my actual arguments.
Smokeskin wrote:
Because your sensations, thoughts, and actions are unique to you and are completely independent from your fork, there is a distinction between them. No matter how you define the words, you won't experience what your fork experiences or decide its actions.
The brain-mind and the computer-mind are completely independent from each other, yes. I have not claimed otherwise. What you have yet to show is that the brain-mind would share a mental quality with the current me that the computer-mind wouldn't. Using my definition of "me", both minds are equally me, because they share all my important mental qualities. I would like us to figure out what you believe the brain-mind would share with me that the computer-mind wouldn't.
Smokeskin wrote:
We don't agree on that. If you have to choose whether the torture should continue for you or for your fork, I expect you will prefer it to happen to your fork rather than you, since that will free you from the pain.
I said that the behavior of the two minds would be the same. Are you saying that the brain-mind would behave differently from the computer-mind in this situation, that the brain-mind would want to switch while the computer-mind wouldn't? If that isn't what you're saying, I fail to see the relevance to what we were talking about, which was "I think we have already agreed that there is no difference in the behavior between the original person and the uploaded mind".
Smokeskin wrote:
No, it seems quite clear that your newspeak is obscuring the fact that you and your fork won't share experiences.
I doubt you can find a single quote from me where I have claimed that the two minds would share experiences.
Smokeskin wrote:
Then please say what you mean. I honestly have no idea what you think. Sometimes it sounds like you expect a telepathic link between all copies of you, but I don't think you mean that. At other times it seems like you think you're able to make decisions completely unaffected by your experiences (like sustained torture), which seems very unrealistic; this may be what you actually think, but you've been dodging the question so far.
See above.
Smokeskin wrote:
I would honestly like to know what you think. And if you're not going by average utility (and you certainly seem to want to include your forks' utilities without weighing your own higher), then what does your utility function look like? The sum of their utilities? So you would want to make as many forks as possible in most cases?
What we are discussing is the existence or non-existence of continuity of consciousness, and whether an uploaded mind is the same as the original mind. I would appreciate it if we could go back to the original question.
Smokeskin wrote:
But you (remember, that means your body and brain) are the one experiencing the pain and the one that has to make the decision. If you decide to swap, the pain stops for you. You can't dodge this by saying "my fork is also me". It is not, in terms of actual experiences and decisions.
If you define "you" and "your body", of course the fork isn't me. Let's call your definition of me "me2". The brain-mind is me1. The computer-mind is me1. The brain-mind is me2. The computer-mind is not me2. But as said, the fact that an uploaded mind does not have the same body as the original mind is a trivial truth with little to do with what we're talking about. What we are discussing is if the brain-mind and the current me would share a certain mental quality that the computer-mind and the current me would not share.
Smokeskin wrote:
Could you endure the pain forever?
I imagine that neither me would want to endure the pain. Rationally, both the brain-mind and the computer-mind would probably not wish the other any harm, but egoistic instinct would most likely drive both the brain-mind and the computer-mind to wanting to switch. Remind me again how the fact that you can probably torture me into acting against my own moral compass implies that a certain theory in the field of cognitive science is correct? It's a bit like saying that your love for your family doesn't exist just because you can probably be tortured to the point where you are willing to kill them to make it stop. Your love is not strong enough to withstand millions of years of evolutionary instincts, just like my rationality is not strong enough. Do you think that the brain-mind would act selfishly in this situation while the computer-mind would willingly accept the torture? If not, the fact that you think they would both act in the same way seems to be more proof of my position than of your position.
Smokeskin wrote:
First, as long as my brain is functioning properly, my consciousness will be there in the future. And it has been there in the past. This sets it apart from beliefs like "my soul will live on in heaven".
Consciousness is not the same thing as continuity of consciousness. Consciousness would exist in a fork.
Smokeskin wrote:
Second, I experience a retrospective continuity.
That's called "memories". Memories are not the same thing as continuity of consciousness. Memories would exist in a fork.
Smokeskin wrote:
Third, read the thought experiment above. Look at people's behavior. This establishes the forward link.
How people act in your thought experiment is at best a proof of sense of identity/sense of "continuity of consciousness". I have already explained that I believe that sense of identity exists. But an illusion is not reality. A person who hallucinates an angel is not proof of angels. The fact that everyone sees square A as darker than square B in the famous checker shadow illusion is not proof that it actually is. Our brain fools us all the time.
Smokeskin wrote:
I can easily imagine an AI or mentally ill human without continuity of consciousness. One who felt no connection to past states, or didn't care about its future state because it felt no connection to it.
Once again, illusion is not reality. If you by "continuity of consciousness" mean "sense of identity" or something similar, you must explain why a fork would not have a sense of identity. If all mental processes are copied, that sense would be as present in the fork as in the original brain.
President of PETE: People for the Ethical Treatment of Exhumans.
Ilmarinen
Smokeskin wrote:
Second, I experience a retrospective continuity.
Do you understand that any fork you created would experience this also? I asked a question earlier: Let's imagine a different experiment. Someone knocks you out and uses Star Trek technology to make an identical copy of you. You wake up an hour later staring at someone who looks just like you. Can you tell if you're the original or if you are the copy and if so, how? I'd like an answer on that.
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
Smokeskin
Alkahest wrote:
Smokeskin wrote:
If we upload you into a machine and don't destroy your body, you will still be in your body, experience sensations in the body, your thought processes will be in the brain, you will be able to move the body. You will still be there. In the machine there is a fork of you (assume a running fork). It will have its own thought processes, experience different sensations, make its own decisions. While you might care greatly about each other's fate, there is no telepathic link between you and your fork.
Once again, I'm fairly certain that no-one has actually claimed that. I certainly haven't. I would appreciate if we could abandon that particular discussion since it appears to be based not on what I have said but an interpretation that has little to do with my actual arguments.
But this is the core of identity. You are not your fork.
Quote:
Smokeskin wrote:
Because your sensations, thoughts, and actions are unique to you and are completely independent from your fork, there is a distinction between them. No matter how you define the words, you won't experience what your fork experiences or decide its actions.
The brain-mind and the computer-mind are completely independent from each other, yes. I have not claimed otherwise. What you have yet to show is that the brain-mind would share a mental quality with the current me that the computer-mind wouldn't. Using my definition of "me", both minds are equally me, because they share all my important mental qualities. I would like us to figure out what you believe the brain-mind would share with me that the computer-mind wouldn't.
Brain-mind shares experiences and decisions with you. Computer-mind does not. You have confirmed this with your response to the torture experiment.
Quote:
Smokeskin wrote:
We don't agree on that. If you have to choose whether the torture should continue for you or for your fork, I expect you will prefer it to happen to your fork rather than you, since that will free you from the pain.
I said that the behavior of the two minds would be the same. Are you saying that the brain-mind would behave differently from the computer-mind in this situation, that the brain-mind would want to switch while the computer-mind wouldn't? If that isn't what you're saying, I fail to see the relevance to what we were talking about, which was "I think we have already agreed that there is no difference in the behavior between the original person and the uploaded mind".
I am saying that when we place the two minds in different circumstances but with full knowledge of the other, they will want different things to happen. Defining both as "you" when they not only have different experiences but also have different priorities and goals, and make different decisions, seems pointless. As per the torture experiment, you want to switch with your fork but your fork does not want to switch, because you don't want to experience the pain any longer and it matters less to you that your fork will experience it.
Quote:
Smokeskin wrote:
I would honestly like to know what you think. And if you're not going by average utility (and you certainly seem to want to include your forks' utilities without weighing your own higher), then what does your utility function look like? The sum of their utilities? So you would want to make as many forks as possible in most cases?
What we are discussing is the existence or non-existence of continuity of consciousness, and whether an uploaded mind is the same as the original mind. I would appreciate it if we could go back to the original question.
But it is highly relevant. That the uploaded mind and the original mind have different utility functions (mainly that the uploaded mind attaches much greater value to its own experiences, and likewise for the original mind) is a very big issue, and it defines their identities as separate. How can they be the same if they want different things?
Quote:
Smokeskin wrote:
But you (remember, that means your body and brain) are the one experiencing the pain and the one that has to make the decision. If you decide to swap, the pain stops for you. You can't dodge this by saying "my fork is also me". It is not, in terms of actual experiences and decisions.
If you define "you" and "your body", of course the fork isn't me. Let's call your definition of me "me2". The brain-mind is me1. The computer-mind is me1. The brain-mind is me2. The computer-mind is not me2. But as said, the fact that an uploaded mind does not have the same body as the original mind is a trivial truth with little to do with what we're talking about. What we are discussing is if the brain-mind and the current me would share a certain mental quality that the computer-mind and the current me would not share.
I have established several differences between you and your fork. Different experiences, different utility functions, different decisions. They sense different things, want different things, do different things.
Quote:
Smokeskin wrote:
Could you endure the pain forever?
I imagine that neither me would want to endure the pain. Rationally, both the brain-mind and the computer-mind would probably not wish the other any harm, but egoistic instinct would most likely drive both the brain-mind and the computer-mind to wanting to switch. Remind me again how the fact that you can probably torture me into acting against my own moral compass implies that a certain theory in the field of cognitive science is correct?
It establishes the differences in your identities. You only have a relationship to your fork; it is not you. Things that happen to you are much more important to you than the same things happening to your fork.
Quote:
It's a bit like saying that your love for your family doesn't exist just because you can probably be tortured to the point where you are willing to kill them to make it stop. Your love is not strong enough to withstand millions of years of evolutionary instincts, just like my rationality is not strong enough.
Absolutely not. If you said you loved your fork like your family, I wouldn't object (though I doubt it, since we aren't wired for that emotional connection and I'm far from certain that people would even like their forks). You're claiming there is no separate identity, which is precisely what this experiment shows not to be the case.
Quote:
Do you think that the brain-mind would act selfishly in this situation while the computer-mind would willingly accept the torture? If not, the fact that you think they would both act in the same way seems to be more proof of my position than of your position.
No, I believe they will both value their own experiences more (or in some cases both be self-sacrificial, but in any case they will find their decisions at odds with each other). Each has their own identity, and this makes them want different things.
Quote:
Smokeskin wrote:
First, as long as my brain is functioning properly, my consciousness will be there in the future. And it has been there in the past. This sets it apart from beliefs like "my soul will live on in heaven".
Consciousness is not the same thing as continuity of consciousness. Consciousness would exist in a fork.
Of course. A fork has continuity of consciousness too.
Quote:
Smokeskin wrote:
Second, I experience a retrospective continuity.
That's called "memories". Memories are not the same thing as continuity of consciousness. Memories would exist in a fork.
Unless we tricked the fork, however, it would know that it did not have continuity of consciousness back to before the forking.
Quote:
Smokeskin wrote:
Third, read the thought experiment above. Look at people's behavior. This establishes the forward link.
How people act in your thought experiment is at best a proof of sense of identity/sense of "continuity of consciousness". I have already explained that I believe that sense of identity exists. But an illusion is not reality. A person who hallucinates an angel is not proof of angels. The fact that everyone sees square A as darker than square B in the famous checker shadow illusion is not proof that it actually is. Our brain fools us all the time.
Angels are not the same as a belief in angels, but happiness is the same as a sense of happiness. You're confusing qualia with external events.
Quote:
Smokeskin wrote:
I can easily imagine an AI or mentally ill human without continuity of consciousness. One who felt no connection to past states, or didn't care about its future state because it felt no connection to it.
Once again, illusion is not reality. If you by "continuity of consciousness" mean "sense of identity" or something similar, you must explain why a fork would not have a sense of identity. If all mental processes are copied, that sense would be as present in the fork as in the original brain.
Forks have an identity of course. But their identity is different than yours.
Ilmarinen
Smokeskin wrote:
Forks have an identity of course. But their identity is different than yours.
Okay. I think I know where the problem is now: you think someone is arguing that a fork of you is the same as your present self. We're not. We're arguing that it is the same as your past self, and by extension any forks you make in the future are your future selves just as your original brain/body is. It's like having a dotted line, and at some point, instead of putting down one dot, you put down two, and then they go off in separate directions. There are now two different dotted lines, but they both come from the same dotted line. The split - the 'fork' - is the place where they diverge. But they're both equally connected to their shared past.

Again, if this isn't true, do you think that if an original and a fork lacked any physical evidence of who was who, one of them would [i]feel[/i] like a fork?
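If the dotted-line picture helps as code, here's a toy version (my own analogy, nothing official):

[code]
# Two lines diverge at the fork, but share every dot before it.
past = ["childhood", "first job", "moved to Luna"]

original = list(past)   # both start as copies of the same history
fork = list(past)

original.append("stayed on Luna")   # after the split they diverge...
fork.append("egocast to Titan")

# ...but they are equally connected to the shared past:
assert original[:3] == fork[:3] == past
[/code]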
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
Smokeskin
Ilmarinen wrote:
Smokeskin wrote:
Forks have an identity of course. But their identity is different than yours.
Okay. I think I know where the problem is now: you think someone is arguing that a fork of you is the same as your present self. We're not. We're arguing that it is the same as your past self, and by extension any forks you make in the future are your future selves just as your original brain/body is. It's like having a dotted line, and at some point, instead of putting down one dot, you put down two, and then they go off in separate directions. There are now two different dotted lines, but they both come from the same dotted line. The split - the 'fork' - is the place where they diverge. But they're both equally connected to their shared past.
Alkahest is arguing that we don't have continuity of consciousness and that forks don't have an individual identity. There is also the big difference that, from my point of view, egocasting equals death and being restored from backup doesn't resurrect you (though of course your loved ones would only suffer an abstract loss). If you are OK with uploading, egocasting and backup, then you need to account for how these copies are you.
Quote:
Again, if this isn't true, do you think that if an original and a fork lacked any physical evidence of who was who, one of them would [i]feel[/i] like a fork?
Of course not. There's no difference between the original and the fork. Even when a fork knows it's a fork, it doesn't feel any different (aside from stuff like existential qualms, which some people might have).
Smokeskin
And just to be clear - I don't think there's anything unrealistic about egocasting. People willingly kill themselves for a variety of reasons - suicide bombers believing they'll go to heaven, cultists thinking they'll go to the alien UFO, to save loved ones, to preserve their honor after a failure, etc. - and they do it with different degrees of calm. That people in EP do it, though not easily, since they often suffer Stress damage, content that a copy of their ego lives on somewhere, seems reasonable enough. But these people are very different from me in their willingness to accept personal death, like a samurai committing hara-kiri after failure.
jhfurnish jhfurnish's picture
Religions and philosophies within the EP world...
I'm curious about what religions and philosophies within the game setting have to say on the issue. I expect that religionists amongst the bioconservatives and flats will be the natural objectors to the whole sleeving and forking process, whereas any that are Eastern-based will have the least objections. I'm most curious to know if AGIs and egos who prefer to be infomorphs develop a 'homebrew' religious system.
Alkahest Alkahest's picture
Smokeskin wrote:But this is
Smokeskin wrote:
But this is the core of identity. You are not your fork.
I regret to inform you that repetition of a claim does not constitute proof of that claim.
Smokeskin wrote:
Brain-mind shares experiences and decisions with you. Computer-mind does not. You have confirmed this with your response to the torture experiment.
So by saying that the brain-mind and the computer-mind would behave in exactly the same way I have in fact confirmed your theory that the brain-mind shares a mental phenomenon with me that the computer-mind lacks. That's either an argument with an underlying logic far too subtle for me to understand or a complete non sequitur.
Smokeskin wrote:
I am saying that when we place the two minds in different circumstances but with full knowledge of the other, they will want different things to happen. Defining both as "you" when they not only have different experiences but have different priorities, goals and make different decisions seems pointless. As per the torture experiment, you want to switch with your fork but your fork does not want to switch because you don't want to experience the pain any longer and it matters less to you that your fork will experience it.
I will try to explain my position more clearly. The current me would equally become the brain-mind and the computer-mind. However, that does not mean that the brain-mind and the computer-mind are necessarily the same person. They gradually become different persons, just as I gradually become a different person from who I was earlier in my life by simply living. To simplify further: A is X. B is X and Y. C is X and Z. So the next question is naturally: how different can a fork of me be before I think I would lose something valuable by dying? I don't think there's any simple rule to use. Each night when I fall asleep I experience things that will not stick in my memory, and I don't think my life is unbearable as a result. If I were to lose an entire day, I would be slightly hesitant. A week, it would be a bit of a bother. A month? Yeah, starting to feel like I have quite a bit to lose. But it's not black and white, it's gradual. I'm declining to respond to some of your points since they all seem to be founded on the mistaken belief that I think that the brain-mind and the computer-mind would have the same identity. If you would like to explore some of those questions further, please ask me again.
Smokeskin wrote:
Of course. A fork has continuity of consciousness too.
I said consciousness, not continuity of consciousness. I believe in consciousness. I don't believe in continuity of consciousness.
Smokeskin wrote:
unless we tricked the fork, it would however know that it did not have continuity of consciousness to before the forking.
If we forked me, a person who does not believe in continuity of consciousness, do you think the fork would believe that it has a continuity of consciousness but not one that it shares with the current me? Of course not. All you're saying is that people who share your beliefs would spawn forks that share your beliefs - which, once again, seems to be more proof of my position than your position. People making a claim about their own cognitive systems is not adequate proof that their theory is correct.
Smokeskin wrote:
Angels are not the same as a belief in angels, but happiness is the same as a sense of happiness. You're confusing qualia with external events.
I have, as I have already explained, not claimed that sense of identity doesn't exist. People feel that they have a continuity of consciousness and thus share an identity with a person they remember themselves being. The thing is, if we create a perfect clone of you, that clone would have the same memories as you as well as the same sense of identity. You would experience continuity of consciousness and sense of identity with the person you remember yourself being. Your clone would experience continuity of consciousness and sense of identity with the person he remembers himself being. So how is your introspection any proof of the reality of identity and continuity of consciousness?
Smokeskin wrote:
Forks have an identity of course. But their identity is different than yours.
I think we need to agree to some common terms. Right now, we often appear to talk past one another. How about this: Alkahest-now is the current me, the one you're talking to. Alkahest-brain is a future me that has undergone a non-destructive uploading. Alkahest-computer is the spawn of that uploading process. Alkahest-now will eventually become Alkahest-brain and Alkahest-computer. Alkahest-brain and Alkahest-computer are initially identical but will eventually become more different from each other. No-one has an "identity" as you use the word since I consider that to be a completely meaningless term.
President of PETE: People for the Ethical Treatment of Exhumans.
Alkahest Alkahest's picture
Smokeskin, I would also like
Smokeskin, I would also like to ask you another question: Why are you afraid of death? I can answer why I am afraid of death: Death means all memories precious to me will be lost forever. Death means I can't take care of the people I love. Death means there will be no-one around to argue for my beliefs. Death means that the personality traits and emotions that characterize me will be gone. And so on and so forth. I'm not afraid of losing my identity or my continuity of consciousness because I don't believe I have either of those things. I can't be afraid of losing what I don't believe I have in the first place. As I understand it, you dislike death because it means that your identity or continuity of consciousness will be gone forever. Am I mistaken?
President of PETE: People for the Ethical Treatment of Exhumans.
Smokeskin Smokeskin's picture
Alkahest wrote:Smokeskin, I
Alkahest wrote:
Smokeskin, I would also like to ask you another question: Why are you afraid of death? I can answer why I am afraid of death: Death means all memories precious to me will be lost forever. Death means I can't take care of the people I love. Death means there will be no-one around to argue for my beliefs. Death means that the personality traits and emotions that characterize me will be gone. And so on and so forth. I'm not afraid of losing my identity or my continuity of consciousness because I don't believe I have either of those things. I can't be afraid of losing what I don't believe I have in the first place. As I understand it, you dislike death because it means that your identity or continuity of consciousness will be gone forever. Am I mistaken?
Yes you are mistaken. I am afraid of death because of my survival instinct. It is something biologically wired in my brain. It makes me place a very high value on living. Your reasons for fear of death seem to be merely a list of additional consequences of your death, and all of them are external. You don't value continued existence and future experiences in themselves?
Smokeskin Smokeskin's picture
Alkahest wrote:Smokeskin
Alkahest wrote:
Smokeskin wrote:
Forks have an identity of course. But their identity is different than yours.
I think we need to agree to some common terms. Right now, we often appear to talk past one another. How about this: Alkahest-now is the current me, the one you're talking to. Alkahest-brain is a future me that has undergone a non-destructive uploading. Alkahest-computer is the spawn of that uploading process. Alkahest-now will eventually become Alkahest-brain and Alkahest-computer. Alkahest-brain and Alkahest-computer are initially identical but will eventually become more different from each other. No-one has an "identity" as you use the word since I consider that to be a completely meaningless term.
Ok. See next section for my definition of identity.
Quote:
Smokeskin wrote:
But this is the core of identity. You are not your fork.
I regret to inform you that repetition of a claim does not constitute proof of that claim.
It is not merely a repetition of my claim. We established that Alkahest-brain has a different utility function and a separate consciousness from Alkahest-computer. This difference is what identity is, which is why I feel you're dodging the issue when you refuse to talk about utility functions. Expected utility is what guides behavior, and with different behavior I don't see how you can claim non-separate identity. We know that if tortured, A-brain would switch the burden to A-computer. I assume that A-brain would likewise refuse to do a lot of work while letting A-computer spend all the money without contributing. Yet you're saying that A-brain finds it perfectly reasonable to [i]die[/i] as long as A-computer lives on.
Quote:
Smokeskin wrote:
Brain-mind shares experiences and decisions with you. Computer-mind does not. You have confirmed this with your response to the torture experiment.
So by saying that the brain-mind and the computer-mind would behave in exactly the same way I have in fact confirmed your theory that the brain-mind shares a mental phenomenon with me that the computer-mind lacks. That's either an argument with an underlying logic far too subtle for me to understand or a complete non sequitur.
It is not subtle at all; the problem is that you're switching circumstances too. A-brain prefers A-computer to be tortured. A-computer prefers A-brain to be tortured. When each values their own future sensations much more than a fork's, I don't see how they can also claim non-continuity of consciousness. Wouldn't that be contradictory? Why does A-brain care? Or let us make it even more clear. Alkahest gets tortured. He can take it no more. He will soon be uploaded and he has two options: 1 - A-brain's torture will continue. A-computer will be free. 2 - A-brain will be tortured for a day more and then be set free. A-computer will continue to be tortured. What will Alkahest choose?
Quote:
Smokeskin wrote:
I am saying that when we place the two minds in different circumstances but with full knowledge of the other, they will want different things to happen. Defining both as "you" when they not only have different experiences but have different priorities, goals and make different decisions seems pointless. As per the torture experiment, you want to switch with your fork but your fork does not want to switch because you don't want to experience the pain any longer and it matters less to you that your fork will experience it.
I will try to explain my position more clearly. The current me would equally become the brain-mind and the computer-mind. However, that does not mean that the brain-mind and the computer-mind are necessarily the same person. They gradually become different persons, just as I gradually become a different person from who I was earlier in my life by simply living. To simplify further: A is X. B is X and Y. C is X and Z.
I understand that is your belief, but I do not share that view.
Quote:
Smokeskin wrote:
Of course. A fork has continuity of consciousness too.
I said consciousness, not continuity of consciousness. I believe in consciousness. I don't believe in continuity of consciousness.
Smokeskin wrote:
unless we tricked the fork, it would however know that it did not have continuity of consciousness to before the forking.
If we forked me, a person who does not believe in continuity of consciousness, do you think the fork would believe that it has a continuity of consciousness but not one that it shares with the current me? Of course not. All you're saying is that people who share your beliefs would spawn forks that share your beliefs - which, once again, seems to be more proof of my position than your position. People making a claim about their own cognitive systems is not adequate proof that their theory is correct.
In the two above responses I was clarifying my own position. I suppose you asked to understand it better or try to expose internal inconsistencies. I'm not convinced you don't have continuity just because you don't believe in it.
Quote:
Smokeskin wrote:
Angels are not the same as a belief in angels, but happiness is the same as a sense of happiness. You're confusing qualia with external events.
I have, as I have already explained, not claimed that sense of identity doesn't exist. People feel that they have a continuity of consciousness and thus share an identity with a person they remember themselves being. The thing is, if we create a perfect clone of you, that clone would have the same memories as you as well as the same sense of identity.
We disagree here. Each fork has their own sense of identity linked to their own consciousness.
Quote:
You would experience continuity of consciousness and sense of identity with the person you remember yourself being. Your clone would experience continuity of consciousness and sense of identity with the person he remembers himself being. So how is your introspection any proof of the reality of identity and continuity of consciousness?
Well they're qualia. They're introspectively apparent, they're reflected in behavior, they're consistent across a wide range of thought experiments, and not in conflict with reality. By contrast, a belief in non-continuity of consciousness fails on several counts, like for example the torture fork thought experiment.
jhfurnish jhfurnish's picture
Continuity
Some very smart, very spiritual people have taken the 'irrational' stance that there is indeed continuity. Some of the debate is - in what form? Buddhists seem to believe that everyone returns to a 'database', submerging back into a whole. Not so bad. Others believe in an individual identity. Others believe that people reincarnate in the form of repeating patterns occurring in the universe. It's essentially impossible to know for sure, of course, until it happens. Will science catch up to this? There are a few in the field who do try.
Alkahest Alkahest's picture
Smokeskin wrote:Yes you are
Smokeskin wrote:
Yes you are mistaken. I am afraid of death because of my survival instinct. It is something biologically wired in my brain. It makes me place a very high value on living.
I'm not sure a "death" where all my mental phenomena live on can even be called a death. It's certainly not the kind of death our evolution has made us afraid of, since they didn't have uploading technology on the African savanna. But once again, I remind you that language is not reality. Using a more narrow definition of death ("the destruction of the mental phenomena that characterize an active person", a person who is destructively uploaded does not die, but clearly you're still scared of uploading. Why? How do you define death in this context?
Smokeskin wrote:
Your reasons for fear of death seem to be merely a list of additional consequences of your death, and all of them are external.
If all mental phenomena are external, I wonder what you view as internal.
Smokeskin wrote:
You don't value continued existence and future experiences in themselves?
I'm not sure how many times I have to remind you of this, but I don't believe in continuity of consciousness. I believe that as an uploaded mind, I1 would experience the future just as well as I1 would as a brain.
President of PETE: People for the Ethical Treatment of Exhumans.
Alkahest Alkahest's picture
Smokeskin wrote:It is not
Smokeskin wrote:
It is not merely a repetition of my claim. We established that Alkahest-brain has a different utility function and a separate consciousness from Alkahest-computer.
We have not established that Alkahest-now has a different utility function and a separate consciousness from Alkahest-computer. Any more than Alkahest-yesterday has a different utility function and a separate consciousness from Alkahest-now, that is. We have also established that you can force a person to act against his moral compass by torturing him. I'm sure the field of psychology will be thrilled by this exciting new discovery.
Smokeskin wrote:
This difference is what identity is, which is why I feel you're dodging the issue when you refuse to talk about utility functions. Expected utility is what guides behavior, and with different behavior I don't see how you can claim non-separate identity.
This conversation would be much smoother if you actually read what I wrote: "The current me would equally become the brain-mind and the computer-mind. However, that does not mean that the brain-mind and the computer-mind are necessarily the same person. They gradually become different persons, just as I gradually become a different person from who I was earlier in my life by simply living."
Smokeskin wrote:
We know that if tortured, A-brain would switch the burden to A-computer. I assume that A-brain would likewise refuse to do a lot of work while letting A-computer spend all the money without contributing. Yet you're saying that A-brain finds it perfectly reasonable to [i]die[/i] as long as A-computer lives on.
Sure. As said, that depends on how much unique experience A-brain has. I think a day's worth of memories is entirely acceptable to lose. Now, if you separate the two for years and ask A-brain to kill himself, that's an entirely different thing. But to cut to the chase: Would I put a gun to my face and pull the trigger if I was convinced that my mind had just been backed up on a hard drive? Yes, I would. There, I saved us several days' worth of pointless thought experiments.
Smokeskin wrote:
It is not subtle at all; the problem is that you're switching circumstances too. A-brain prefers A-computer to be tortured. A-computer prefers A-brain to be tortured. When each values their own future sensations much more than a fork's, I don't see how they can also claim non-continuity of consciousness. Wouldn't that be contradictory? Why does A-brain care?
Well, according to my belief, forks gradually become more different persons as time goes on. If I value myself1, I should value the future me1 based on myself1 more than my fork, since that person is more similar to me than my fork. Of course, how different I am from my fork varies depending on how old the split is.
Smokeskin wrote:
Or let us make it even more clear. Alkahest gets tortured. He can take it no more. He will soon be uploaded and he has two options: 1 - A-brain's torture will continue. A-computer will be free. 2 - A-brain will be tortured for a day more and then be set free. A-computer will continue to be tortured. What will Alkahest choose?
If he isn't already uploaded, both A-brain and A-computer would be equally him, so it wouldn't make a difference. He would flip a coin.
Smokeskin wrote:
In the two above responses I was clarifying my own position. I suppose you asked to understand it better or try to expose internal inconsistencies. I'm not convinced you don't have continuity just because you don't believe in it.
You're changing the subject. You said "unless we tricked the fork, it would however know that it did not have continuity of consciousness to before the forking". But that's only true if the fork actually believed in continuity of consciousness. It's not proof of continuity of consciousness any more than a Christian's belief in God is proof of God.
Smokeskin wrote:
We disagree here. Each fork has their own sense of identity linked to their own consciousness.
And how do you attempt to prove this? If all memories are copied perfectly, why wouldn't the sense of identity be copied as well? Why would some mental phenomena be copied and not others?
Smokeskin wrote:
Well they're qualia. They're introspectively apparent, they're reflected in behavior, they're consistent across a wide range of thought experiments, and not in conflict with reality.
That's not an answer to my question. In fact, it's another non sequitur. Read this again: "You would experience continuity of consciousness and sense of identity with the person you remember yourself being. Your clone would experience continuity of consciousness and sense of identity with the person he remembers himself being. So how is your introspection any proof of the reality of identity and continuity of consciousness?" My point is that a proof of the existence of continuity of consciousness which derives from subjective introspection has to take into account the fact that a mind you would say does not share continuity of consciousness with you experiences the same continuity of consciousness as you do. If the clone's experience of continuity of consciousness with what you consider to be a previous instance of you but not him is not proof of that continuity of consciousness, why is your experience of continuity of consciousness with what you consider to be a previous instance of you but not him proof of that continuity of consciousness?
Smokeskin wrote:
By contrast, a belief in non-continuity of consciousness fails on several counts, like for example the torture fork thought experiment.
A large part of your argument now seems to rest on that thought experiment. There's a big problem with that: I have already said that A-brain and A-computer are different persons.
President of PETE: People for the Ethical Treatment of Exhumans.
Alkahest Alkahest's picture
jhfurnish wrote:Some very
jhfurnish wrote:
Some very smart, very spiritual people have taken the 'irrational' stance that there is indeed continuity. Some of the debate is - in what form? Buddhists seem to believe that everyone returns to a 'database', submerging back into a whole. Not so bad. Others believe in an individual identity. Others believe that people reincarnate in the form of repeating patterns occurring in the universe. It's essentially impossible to know for sure, of course, until it happens. Will science catch up to this? There are a few in the field who do try.
Those are generally known as "crackpots". I'm still curious about that "energy field" you were talking about earlier, but to be completely honest I think a scientist who argues for its existence probably falls within that category. Anyway, it seems that there are three types of beliefs about this subject: 1: The belief that continuity of consciousness exists and survives beyond the death of the physical body. 2: The belief that continuity of consciousness exists but doesn't survive beyond the death of the physical body. 3: The belief that continuity of consciousness doesn't exist.
President of PETE: People for the Ethical Treatment of Exhumans.
Smokeskin Smokeskin's picture
Alkahest wrote:Smokeskin
Alkahest wrote:
Smokeskin wrote:
It is not merely a repetition of my claim. We established that Alkahest-brain has a different utility function and a separate consciousness from Alkahest-computer.
We have not established that Alkahest-now has a different utility function and a separate consciousness from Alkahest-computer. Any more than Alkahest-yesterday has a different utility function and a separate consciousness from Alkahest-now, that is. We have also established that you can force a person to act against his moral compass by torturing him. I'm sure the field of psychology will be thrilled by this exciting new discovery.
I have previously interpreted your claim that forks don't have separate identities as them not having separate utility functions (as this is how I understand identity). This is what the torture experiment looked at, and if that was your belief it exposed an actual inconsistency and not just the triviality of selfish behavior. However, this seems to not be the case, so the point is moot. I'm uncertain about what you mean by forks lacking identity, though; more on that below.
Quote:
Smokeskin wrote:
This difference is what identity is, which is why I feel you're dodging the issue when you refuse to talk about utility functions. Expected utility is what guides behavior, and with different behavior I don't see how you can claim non-separate identity.
This conversation would be much smoother if you actually read what I wrote: "The current me would equally become the brain-mind and the computer-mind. However, that does not mean that the brain-mind and the computer-mind are necessarily the same person. They gradually become different persons, just as I gradually become a different person from who I was earlier in my life by simply living." [...] But to cut to the chase: Would I put a gun to my face and pull the trigger if I was convinced that my mind had just been backed up on a hard drive? Yes, I would. There, I saved us several days' worth of pointless thought experiments.
Smokeskin wrote:
It is not subtle at all; the problem is that you're switching circumstances too. A-brain prefers A-computer to be tortured. A-computer prefers A-brain to be tortured. When each values their own future sensations much more than a fork's, I don't see how they can also claim non-continuity of consciousness. Wouldn't that be contradictory? Why does A-brain care?
Well, according to my belief, forks gradually become more different persons as time goes on. If I value myself1, I should value the future me1 based on myself1 more than my fork, since that person is more similar to me than my fork. Of course, how different I am from my fork varies depending on how old the split is.
Smokeskin wrote:
Or let us make it even more clear. Alkahest gets tortured. He can take it no more. He will soon be uploaded and he has two options: 1 - A-brain's torture will continue. A-computer will be free. 2 - A-brain will be tortured for a day more and then be set free. A-computer will continue to be tortured. What will Alkahest choose?
If he isn't already uploaded, both A-brain and A-computer would be equally him, so it wouldn't make a difference. He would flip a coin.
Smokeskin wrote:
In the two above responses I was clarifying my own position. I suppose you asked to understand it better or try to expose internal inconsistencies. I'm not convinced you don't have continuity just because you don't believe in it.
You're changing the subject. You said "unless we tricked the fork, it would however know that it did not have continuity of consciousness to before the forking". But that's only true if the fork actually believed in continuity of consciousness.
I'm not changing the subject. When you ask me something, I describe my position, not yours. From my POV you experience the qualia of continuity but don't believe in it. I believe your qualia is right, except for the creation of forks where the continuity before creation is an illusion - much in the same way that I believe our sight corresponds to photons hitting our eyes, unless some brain implant created an equivalent illusion. Your non-belief I consider false, just like I'd assume you were wrong if you claimed we were living in the Matrix and our sight was illusory.
Quote:
Smokeskin wrote:
We disagree here. Each fork has their own sense of identity linked to their own consciousness.
And how do you attempt to prove this? If all memories are copied perfectly, why wouldn't the sense of identity be copied as well? Why would some mental phenomena be copied and not others?
This is where I don't understand what you mean. We agree that forks are essentially different persons. Their utility functions are linked to themselves (and to their future, and in your case also their future forks) and only weakly to the other forks. I believe all mental phenomena are copied, but the sense of identity re-links to the new instance anyway, just like a pointer to 0x084A actually points somewhere else if you copy the computer's state to another machine. Then why don't you think they have different identities or senses of identity? Sure, they all think they're Alkahest, but Alkahest-brain doesn't think he is Alkahest-computer. At best the other one is a dated backup.
Quote:
Smokeskin wrote:
Well they're qualia. They're introspectively apparent, they're reflected in behavior, they're consistent across a wide range of thought experiments, and not in conflict with reality.
That's not an answer to my question. In fact, it's another non sequitur. I will ask again: You would experience continuity of consciousness and sense of identity with the person you remember yourself being. Your clone would experience continuity of consciousness and sense of identity with the person he remembers himself being. So how is your introspection any proof of the reality of identity and continuity of consciousness?
You have no proof of, say, the reality of consciousness either. You only have the qualia to go on. I choose to accept my qualia. I can't explain them and no one else can either (yet). Taking them more or less at face value seems the only reasonable solution. That goes for my consciousness as well as its continuity. You choose to reject major elements of your qualia. To me, disregarding observed qualia is much like disregarding physical observations. Not entirely, because we know some qualia are in conflict with reality; but lacking physical counter-evidence, disregarding some qualia but not others seems strange. By what principle do you pick and choose among qualia? When you disregard the obvious target of your survival instinct, continuity of consciousness, how did you decide on your memories and such to be the features you want to survive? Why not say your genes, "after all that's the natural way that evolution works"? (I don't agree with the quoted statement btw, just providing what seems like an equally arbitrary definition of "me".)
Alkahest Alkahest's picture
Smokeskin wrote:I'm not
Smokeskin wrote:
I'm not changing the subject. When you ask me something, I describe my position, not yours. From my POV you experience the qualia of continuity but don't believe in it.
First of all, I believe "qualia" is a completely meaningless term with no extension. Second of all, I don't experience continuity. I experience memories. Some people choose to interpret these memories as proof of the soul. I don't. Do you have any proof that this continuity is experienced as anything other than memories? Third of all, you are still ignoring what you actually said. You said that "unless we tricked the fork, it would however know that it did not have continuity of consciousness to before the forking". Look at the actual words. How would the fork know that it did not have continuity of consciousness to before the forking if it doesn't believe in continuity of consciousness?
Smokeskin wrote:
I believe your qualia is right, except for the creation of forks where the continuity before creation is an illusion - much in the same way that I believe our sight corresponds to photons hitting our eyes, unless some brain implant created an equivalent illusion. Your non-belief I consider false, just like I'd assume you were wrong if you claimed we were living in the Matrix and our sight was illusory.
That's a lot of claims without any proof. You have decided that continuity of consciousness exists but that it's only an illusion when it comes to uploaded minds. And what is your argument for these claims? "Because qualia". Your "qualia" is right, the "qualia" of forks are wrong, because one is reality and the other is illusion. I assume I'm supposed to take that on faith.
Smokeskin wrote:
This is where I don't understand what you mean. We agree that forks are essentially different persons. Their utility functions are linked to themselves (and to their future, and in your case also their future forks) and only weakly to the other forks.
Well, assuming they are egoists.
Smokeskin wrote:
I believe all mental phenomena are copied, but the sense of identity re-links to the new instance anyway, just like a pointer to 0x084A actually points somewhere else if you copy the computer's state to another machine.
I'm not sure what you mean. In this situation, the entire content of the computer has been copied, not just the pointer.
Smokeskin wrote:
Then why don't you think they have different identities or senses of identity? Sure, they all think they're Alkahest, but Alkahest-brain doesn't think he is Alkahest-computer. At best the other one is a dated backup.
If they have the same memories, they have the same sense of identity. If they have different memories (and they will develop different memories from the moment they fork), they will have different senses of identity. But "identity" in itself is a meaningless concept.
Smokeskin wrote:
You have no proof of, say, the reality of consciousness either. You only have the qualia to go on.
Of course I have proof of the reality of consciousness! The various cognitive systems constituting consciousness are clearly visible in an fMRI. A conscious person and an unconscious person behave differently. And so on and so forth.
Smokeskin wrote:
I choose to accept my qualia. I can't explain them and no one else can either (yet). Taking them more or less at face value seems the only reasonable solution. That goes for my consciousness as well as its continuity.
"When anyone tells me, that he saw a dead man restored to life, I immediately consider with myself, whether it be more probable, that this person should either deceive or be deceived, or that the fact, which he relates, should really have happened. I weigh the one miracle against the other; and according to the superiority, which I discover, I pronounce my decision, and always reject the greater miracle." Granted, continuity of consciousness and dead men walking may seem like two completely different things, but they both fly in the face of scientific evidence. Our senses, both internal and external, deceive us every day. Why are you so adamant about blindly accepting your intuition about this one illusion?
Smokeskin wrote:
You choose to reject major elements of your qualia.
Well, personally I would rather describe it as rejecting the very concept of qualia, but sure.
Smokeskin wrote:
To me, disregarding observed qualia is much like disregarding physical observations. Not entirely, because we know some qualia are in conflict with reality; but lacking physical counter-evidence, disregarding some qualia but not others seems strange. By what principle do you pick and choose among qualia?
Since I don't believe "qualia" is a meaningful term I don't pick and choose among them, but I do have a principle relevant to this discussion: Always assume the existence of the fewest possible phenomena necessary to explain reality. Nothing about human behavior requires the existence of continuity of consciousness; the existence of memories suffices. If I have to choose between believing in memories and continuity of consciousness and simply believing in memories, the first option has to explain more than the second. It doesn't. So I don't.
Smokeskin wrote:
When you disregard the obvious target of your survival instinct, continuity of consciousness,
The obvious targets of my survival instinct are my testicles. Evolution cares more about making babies than various mental illusions.
Smokeskin wrote:
how did you decide on your memories and such to be the features you want to survive? Why not say your genes, "after all that's the natural way that evolution works"? (I don't agree with the quoted statement btw, just providing what seems like an equally arbitrary definition of "me".)
All value axioms are fundamentally non-rational. (Not irrational, just non-rational.) To bring up Hume again, you can't derive an ought from an is. But to act rationally on your values in the real world, what you value must exist in reality. Continuity of consciousness doesn't exist, and to value it would be completely fruitless. Valuing the non-existent is the same as complete nihilism. Memories exist, and valuing them can actually lead somewhere.
President of PETE: People for the Ethical Treatment of Exhumans.
Ilmarinen Ilmarinen's picture
Smokeskin wrote:
Smokeskin wrote:
Yes you are mistaken. I am afraid of death because of my survival instinct. It is something biologically wired in my brain. It makes me place a very high value on living.
Okay, let's just go with this. Egocasting and restoring from backup don't trip my survival instinct because they don't [i]feel[/i] like death - they feel like teleporting and reloading the last save in a video game respectively.
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
Smokeskin Smokeskin's picture
Ilmarinen wrote:Smokeskin
Ilmarinen wrote:
Smokeskin wrote:
Yes you are mistaken. I am afraid of death because of my survival instinct. It is something biologically wired in my brain. It makes me place a very high value on living.
Okay, let's just go with this. Egocasting and restoring from backup don't trip my survival instinct because they don't [i]feel[/i] like death - they feel like teleporting and reloading the last save in a video game respectively.
Not correct. Egocasting for the sender feels like having your brain wiped. Alkahest is ready for egocasting and as he said, if his brain had just been scanned he would have no problems shooting himself in the brain. That's what egocasting entails. If you're not willing to do that, you shouldn't egocast. @Alkahest: Unfortunately I don't have time to respond atm. I will get back later today or tomorrow.
Ilmarinen Ilmarinen's picture
Smokeskin wrote:Ilmarinen
Smokeskin wrote:
Ilmarinen wrote:
Smokeskin wrote:
Yes you are mistaken. I am afraid of death because of my survival instinct. It is something biologically wired in my brain. It makes me place a very high value on living.
Okay, let's just go with this. Egocasting and restoring from backup don't trip my survival instinct because they don't [i]feel[/i] like death - they feel like teleporting and reloading the last save in a video game respectively.
Not correct. Egocasting for the sender feels like having your brain wiped. Alkahest is ready for egocasting and as he said, if his brain had just been scanned he would have no problems shooting himself in the brain. That's what egocasting entails. If you're not willing to do that, you shouldn't egocast.
I prefer to think of it as taking a nap and then having a pre-set automated system shoot me in the brain. Again, I'm fine with it because as far as I'm concerned the egocast is 'future me', the backup is 'past me' and any forks are 'separate people who might become part of me if we get together quickly enough'. Unless they're forks I haven't made yet but am planning to make in the future, in which case they're 'future mes'. I've expressed my reasons for taking those stances before, but since you've stated you mostly care about people's behavior, there you go - these are the definitions influencing my behavior.
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
Smokeskin Smokeskin's picture
Ilmarinen wrote:Smokeskin
Ilmarinen wrote:
Smokeskin wrote:
Ilmarinen wrote:
Smokeskin wrote:
Yes you are mistaken. I am afraid of death because of my survival instinct. It is something biologically wired in my brain. It makes me place a very high value on living.
Okay, let's just go with this. Egocasting and restoring from backup don't trip my survival instinct because they don't [i]feel[/i] like death - they feel like teleporting and reloading the last save in a video game respectively.
Not correct. Egocasting for the sender feels like having your brain wiped. Alkahest is ready for egocasting and as he said, if his brain had just been scanned he would have no problems shooting himself in the brain. That's what egocasting entails. If you're not willing to do that, you shouldn't egocast.
I prefer to think of it as taking a nap and then having a pre-set automated system shoot me in the brain.
That's more or less what would happen, yes.
Ilmarinen Ilmarinen's picture
Smokeskin wrote:
Smokeskin wrote:
That's more or less what would happen, yes.
Again: provided I have the egocast going out and a backup saved somewhere this doesn't feel like death to me. I can continue posting on the theory of consciousness and whatnot, but the point is that going into the facility I wouldn't feel like 'a copy' of me would arrive at the destination - I would feel like it [i]was[/i] me.
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
OpenInsurgency OpenInsurgency's picture
Speaking for myself, I'd have
Speaking for myself, I'd have a hard time shooting myself even after having had my brain scanned. I'm not sure survival instinct can (without scientific intervention) be overridden so easily. The problem is a matter of prior associations - I associate the gun with killing and death, whereas I would hypothetically associate an egocaster with convenient transportation.

A similar effect can be seen with current methods of transportation. In general, people readily drive, or are driven, to their workplaces without fear of injury or death because they associate automobiles with convenient transport - unless they have experienced an event which has added additional associations, such as a car accident. In contrast, people typically have greater trepidation when boarding an aeroplane, as many associate aeroplanes with tragic accidents thanks to the media. Of course, statistically they are less likely to be injured or die on a plane than in a car, but nevertheless prior associations affect their feelings and decision-making. Thus, I don't think you could write them out completely.

But getting back to the primary topic of the thread, my fundamental problem with the idea of egocasting and whatnot is that my consciousness (let's call it ME v1) would not be sustained by the process. I have no belief in any afterlife and would call myself a materialist - thus, I don't hold any belief in a persistent "soul". So, if egocasting involves my mind being copied and sent as electronic data, with the source material being deleted, ME v1 ceases to exist. In my mind, the copy (despite being entirely identical) is a different thing, ME v2; it is a wholly new consciousness, although it possesses exactly all the features the last one did. It's a hard idea to bring across in words, admittedly. And if I lived in the world of Eclipse Phase, I doubt I'd be egocasting anytime soon.
Ilmarinen Ilmarinen's picture
OpenInsurgency wrote: In my
OpenInsurgency wrote:
In my mind, the copy (despite being entirely identical) is a different thing, ME v2; it is a wholly new consciousness, although it possesses exactly all the features the last one did.
Can you explain this distinction or explain why it matters?
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
OpenInsurgency OpenInsurgency's picture
I'll give it my best shot.
I'll give it my best shot. I shall attempt to answer both of your queries - firstly, "Can you explain the distinction". I would analogize my point of view thus.

Say that my father had died, but fortunately I had backed up his entire personality and memories. These were downloaded into a body that was an exact replica of my father. To all intents and purposes, this resurrected dad is the same being as that which died. However, in my mind there would still be a distinction. This resurrected father would remember being there at my birth - but he himself was not. My now dead father was. I would draw the distinction here as being one of physical presence. This resurrected father has mental presence for certain events, but this instance of him was not actually there.

In my mind, it would be the same with egocasting. If you egocast, you would not perceive your mind being wiped, followed by a period of nothing, succeeded by an awakening at the other end. As soon as that brain is wiped, ME v1 ceases to be. ME v2 takes over, being an exact copy who is indistinguishable from the old version.

As far as why this matters, it only matters to ME v1. Who, if it were me, would quite like to continue existing. To everybody else, the brain that was scanned and wiped and the new copy that was transported are one and the same.

I hope I have explained it better, but a great amount of my thinking lies in "gut instinct", for lack of a better term. Egocasting is not something I would do, but resleeving over an ego bridge, which acts as a literal transfer while the subject is conscious, that I would be happy with.
jackgraham jackgraham's picture
Consciousness, Roger Ebert, and Memetic Immortality
Glad you guys are having fun with this topic. :) Usually, I skip over threads with the word "soul" in them (they give me the religious heebie-jeebies), but this one is pretty damned interesting.

When I was working on the section about resleeving/backup/egocasting, I had a pretty interesting conversation with a friend who at the time was a psych grad student. I told him about how I was positing a technology that maintained continuity of consciousness. He's one of these "consciousness is an epiphenomenon" guys and was annoyed by my assertion that if the ego is aware throughout the process, the person who comes out on the other end is the same person. His point was "consciousness isn't anything special," i.e., if nanites are taking your brain apart bit by bit and recreating it in a new body, you're still dying, even if your consciousness -- which, by his thinking, is an illusion -- experiences moving from one body to another with no lost time or loss of awareness. Your consciousness isn't your "self," he felt; your body is.

So this is where I confess that I'm as uncomfortable with this stuff as some of you. I'm by & large a materialist, although I still classify myself as an agnostic rather than a straight up atheist. I'd really like it if there were an afterlife, but I ain't banking on it -- nor am I into restricting the possibilities of my materialist life because of what might lie beyond. But lacking any belief in an afterlife beyond, "Eh, that'd be nice," I still want a robust definition of self, and I'll be honest with you guys -- I haven't found one. When I wrote the resleeving rules for EP, I was very committed to the idea of consciousness as being of paramount importance, but my philosophical journey in the intervening six years has put some miles on that starting point.

Anyhoo, I'll tell you something that I do find very attractive and am just now starting to give some serious thought to. A couple of days ago, Roger Ebert passed away. He was someone I admired greatly, and I felt the need to write an obit of my own. One of the best things I read amid all of the coverage of his passing, though, was a piece that he wrote himself, a couple of years back, when he was grappling with the implications of his illness. It's titled [b]I Do Not Fear Death[/b], and it's the most excellent materialist take on the passage from this world to oblivion that I've ever read.

What I extrapolated from Ebert's piece was this: In some sense, you are your memetic footprint, first and foremost. Never mind about soul, genetic continuity, or squishy notions of what constitutes the self. Regardless of whether any of these things are real or important, the imprint you leave on the noösphere [i]is[/i] real & important.

How does this apply to resleeving/backup/egocasting, and to transhumanism generally? The transhumans of EP might well live in a future that's adopted this philosophy. For them, the question of whether you "die" when you resleeve might be, philosophically, bafflingly primitive, in the same way that we scratch our heads and wonder wtf Thales or Pythagoras were thinking. Maybe they're satisfied that when they move from one body to the next, their memetic force is uninterrupted. Perhaps, for them, that's enough. I'm living in their Dark Ages, though, so what do I know? :)
J A C K   G R A H A M :: Hooray for Earth!   http://eclipsephase.com :: twitter @jackgraham @faketsr :: Google+Jack Graham
Myrmidont Myrmidont's picture
A probably-not amusing story.
A probably-not amusing story. Also probably beside the point, now.

I go to an egocasting facility for a business trip from Mars to Extropia. I egocast, but they don't put me to sleep yet. I wonder if I* am there yet. I* wake up in a new room, and wonder what I experienced. I am not I*.

I am somehow able to break out of the egocasting facility after transmitting, because I don't want to die. I shoot an egocast techie on the way out, and I am now a fugitive.

I* am going about my business on Extropia. When I* am almost done, I* leave an alpha fork, i, to finish business, since that's legal on Extropia and someone's messaged me about a problem back on Mars. I* get egocasted back to Mars, to help the police with their investigation, and I** wake up again on Mars. The techs on Extropia kill I*.

I am on the run, and eventually, thanks to the memories and a behavioural profile provided by I**, the police catch me and shoot me when I resist arrest. I** wonder why I would have done such a thing, and then i* get transmitted back to me, and I** am then aware of the conclusion of the business I* went to Extropia for, with a clean law enforcement record since I** wasn't the one that committed the crime. He's dead. But, because i am data, i only sent a copy back to Mars. i then buy a body in Extropia, join a Firewall team, and fight crime.

I, I*, I**, i and i* would all claim to be the same identity, since they all remember existing for all of their previous 'life'. However, by the end of the story, none of them could claim to be the same [u]person[/u] as the I that started this story, not even the original I, since even I ended this story with different experiences and knowledge than what I started with. By the same token, I**, i and i* all have memories of the business conclusions on Extropia, and I** has memories of being both at Extropia and on Mars for a small period of time.
[@-rep +0|c-rep +0|f-rep +0|g-rep +0|i-rep +0|r-rep +0|x-rep +0] [img]http://boxall.no-ip.org/img/theeye_fanzine_userbar.jpg[/img] [img]http://boxall.no-ip.org/img/reints_userbar.jpg[/img]
Decivre Decivre's picture
OpenInsurgency wrote:I'll
OpenInsurgency wrote:
I'll give it my best shot. I shall attempt to answer both of your queries - firstly, "Can you explain the distinction". I would analogize my point of view thus. Say that my father had died, but fortunately I had backed up his entire personality and memories. These were downloaded into a body that was an exact replica of my father. To all intents and purposes, this resurrected dad is the same being as that which died. However, in my mind there would still be a distinction. This resurrected father would remember being there at my birth - but he himself was not. My now dead father was. I would draw the distinction here as being one of physical presence. This resurrected father has mental presence for certain events, but this instance of him was not actually there.
Depends on how we look at it. We are in many ways not the same person we were yesterday. My father isn't the same man today that he was a decade ago. Experiences, both blessed and tragic, have changed him for better or worse. So, do you consider scarring experiences to be something that kills the "you" that exists now, creating a new "you"? If not, then why do these experiences not count as death while reinstantiating does (considering that you are significantly less changed after reinstantiation than after a traumatic or pivotal event)?
OpenInsurgency wrote:
In my mind, it would be the same with egocasting. If you egocast, you would not perceive your mind being wiped, followed by a period of nothing, succeeded by an awakening at the other end. As soon as that brain is wiped, ME v1 ceases to be. ME v2 takes over, being an exact copy who is indistinguishable from the old version. As far as why this matters, it only matters to ME v1. Who, if it were me, would quite like to continue existing. To everybody else, the brain that was scanned and wiped and the new copy that was transported are one and the same.
If you acknowledge they are one and the same, why tack on the seemingly arbitrary labels of "v1" and "v2"? Furthermore, v1 continues to exist... you have just tacked on the moniker of "v2" to him afterwards. You can argue that v1 ceased to exist if you consider v1 to be the hardware you were running on rather than the software that is your mind, but then you couldn't make the argument that v1 would like to continue to exist... since v1 has no mind to "like" anything anymore. It is hardware with no software with which to think, feel, or like.
OpenInsurgency wrote:
I hope I have explained it better, but a great amount of my thinking lies in "gut instinct", for lack of a better term. Egocasting is not something I would do, but resleeving over an ego bridge, which acts as a literal transfer while the subject is conscious, that I would be happy with.
Let's talk about the ego bridge. An ego bridge works by dismantling the physical brainstate and writing it to a virtual brainstate. The mind can be kept conscious the whole time, never really noticing to any real sense when the brainstate ceases to be organic and becomes digital. To you, this is considered an acceptable means of transfer, one that feels as though it were a smooth transition.

But I have a thought problem for you to ponder. Imagine you have three ego bridges. Two empty bodies sit in the second and third bridge, while you sit in the first. You lay down and begin the transfer process as your mind is moved from your brain to the ego bridge in digital form, at which point your mind is transitioned to another ego bridge and gradually placed within another sleeve. As you sit up in the second bridge (or third, it really doesn't matter), you look to see that the sleeve in the other bridge does the same. It too has your ego, and retained consciousness throughout the entire process. The bridge transmitted to both bridges simultaneously, and your ego(s) felt no loss of continuity in either direction. Which one is you?
Transhumans will one day be the Luddites of the posthuman age. [url=http://bit.ly/2p3wk7c]Help me get my gaming fix, if you want.[/url]
Gorkamorka Gorkamorka's picture
Myrmidont wrote:
Myrmidont wrote:
*snip* I, I*, I**, i and i* would all claim to be the same identity, *snip*
Just want to pipe in and say that I like that story. It somehow has a touch of personal horror. Almost Lovecraftian, but without the cosmic horror.
Alkahest Alkahest's picture
I think a problem with these
I think a problem with these kinds of discussions is that people often fall back on their intuitions to justify their beliefs, and our intuitions are biased in favor of folk psychology and other pseudoscientific ideas. I prefer to look at the available facts and trust logic and evidence rather than my gut feelings. I would like to ask Smokeskin, OpenInsurgency and anyone else who is uncomfortable with the idea of mind uploading because they fear the loss of continuity of consciousness three questions: 1: What phenomena can only be explained if we assume the existence of continuity of consciousness? 2: If you can't think of an answer to the above question, what reasons do you have to believe that continuity of consciousness exists? 3: If you can't think of an answer to that question either, why are you afraid of losing something you have no reason to believe exists?
President of PETE: People for the Ethical Treatment of Exhumans.
Smokeskin Smokeskin's picture
Alkahest wrote:Smokeskin
Alkahest wrote:
Smokeskin wrote:
I'm not changing the subject. When you ask me something, I describe my position, not yours. From my POV you experience the qualia of continuity but don't believe in it.
First of all, I believe "qualia" is a completely meaningless term with no extension.
Of course it is not a meaningless term. You experience qualia like everyone else. You can believe it is an illusion, or an emergent property of certain types of information processing in matter, or signs of a soul, or whatever, but you can't deny it is there. The term is far from meaningless, and if you're trying to ignore the most basic observations every single human makes constantly, you're just being wilfully ignorant. Just because you don't understand it doesn't mean you can just handwave it.
Quote:
Second of all, I don't experience continuity. I experience memories. Some people choose to interpret these memories as proof of the soul. I don't. Do you have any proof that this continuity is experienced as anything other than memories?
You could say that about any experience and emotion. Do we really experience them, or do we just have memories of them? I don't believe it makes much difference in this context though.
Quote:
Third of all, you are still ignoring what you actually said. You said that "unless we tricked the fork, it would however know that it did not have continuity of consciousness to before the forking". Look at the actual words. How would the fork know that it did not have continuity of consciousness to before the forking if it doesn't believe in continuity of consciousness?
You're splitting hairs. You know what continuity means. You might not believe in it, but you understand the concept well enough that you can see when it would be there and when it wouldn't. What you're saying is equivalent to claiming "but how could I possibly use Newtonian mechanics to calculate the trajectories of objects moving at near-c speeds when I know relativity theory?"
Quote:
Smokeskin wrote:
I believe your qualia is right, except for the creation of forks where the continuity before creation is an illusion - much in the same way that I believe our sight corresponds to photons hitting our eyes, unless some brain implant created an equivalent illusion. Your non-belief I consider false, just like I'd assume you were wrong if you claimed we were living in the Matrix and our sight was illusory.
That's a lot of claims without any proof. You have decided that continuity of consciousness exists but that it's only an illusion when it comes to uploaded minds. And what is your argument for these claims? "Because qualia". Your "qualia" is right, the "qualia" of forks are wrong, because one is reality and the other is illusion. I assume I'm supposed to take that on faith.
No, you're not supposed to take it on faith. Qualia can be illusory, and (short of trickery, living in the Matrix, etc.) we have ways of determining when they are false. If you see an orange, you would reasonably assume that photons were reflecting off an actual orange. However, if I demonstrated to you that the image in your brain was generated by direct stimulation from wires embedded in your visual cortex, you would experience the orange but know it was a false experience. It is the same with continuity. If you were a fork you'd still experience continuity, but you would know that at the moment of forking, it was actually broken.
Quote:
Smokeskin wrote:
This where I don't understand what you mean. We agree that forks are essentially different persons. Their utility functions are linked to themselves (and to their future, and in your case also their future forks) and only weakly to the other forks.
Well, assuming they are egoists.
No. Even unselfish forks would have different utility functions. Take sticking to an exercise routine, for example - you'd force your forks to stick to it while you yourself succumbed to the temptation of watching a movie instead.
Quote:
Smokeskin wrote:
I believe all mental phenomena are copied, but this happens anyway, just like a pointer to 0x084A actually points somewhere else if you copy the computer's state to another machine.
I'm not sure what you mean. In this situation, the entire content of the computer has been copied, not just the pointer.
The two pointers still point to two different memory locations existing in separate places.
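To make that pointer analogy concrete, here's a minimal sketch in Python (the Ego class and its fields are made up purely for illustration; this is my loose rendering of the point, not anything canonical):
[code]
import copy

class Ego:
    def __init__(self, memories):
        self.memories = memories  # plays the role of the pointer's target

original = Ego(["first kiss", "graduation day"])
fork = copy.deepcopy(original)  # copy the whole "machine state"

# Identical content on both sides of the copy...
print(original.memories == fork.memories)          # True

# ...but each ego's internal reference now resolves to its own,
# physically distinct storage - the "pointer" on the new machine
# no longer refers to the old machine's memory.
print(original.memories is fork.memories)          # False
print(id(original.memories) == id(fork.memories))  # False
[/code]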
Quote:
Smokeskin wrote:
Then why don't you think they have different identities or sense of identities? Sure, they all think they're Alkahest, but Alkahest-brain doesn't think he is Alkahest-computer. At best the other one is a dated backup.
If they have the same memories, they have the same sense of identity. If they have different memories (and they will develop different memories from the moment they fork), they will have different senses of identity. But "identity" in itself is a meaningless concept.
Two forks are different people, as you said. There are properties that are different, like their coordinates in physical space (whether that be electric charge in computer bits or brain matter). This gives them different identities.
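Extending the same illustrative sketch (again, the Fork class is invented for this post): two forks can be equal by content while remaining distinct objects at distinct coordinates.
[code]
class Fork:
    def __init__(self, name, memories):
        self.name = name
        self.memories = memories

    def __eq__(self, other):
        # "Same sense of identity": equality judged by mental content only.
        return self.name == other.name and self.memories == other.memories

alpha = Fork("Alkahest", ["childhood", "first backup"])
beta = Fork("Alkahest", ["childhood", "first backup"])

print(alpha == beta)  # True  - same content, same sense of identity
print(alpha is beta)  # False - two occupants of different locations
[/code]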
Quote:
Smokeskin wrote:
You have no proof of, say, the reality of consciousness either. You only have the qualia to go on.
Of course I have proof of the reality of consciousness! The various cognitive systems constituting consciousness are clearly visible in an fMRI. A conscious person and an unconscious person behave differently. And so on and so forth.
If you didn't have the qualia of consciousness and people's descriptions of it to correlate with certain brain states, you would have no idea what those brain scans meant.
Quote:
Smokeskin wrote:
I choose to accept my qualia. I can't explain them and no one else can either (yet). Taking them more or less at face value seems the only reasonable solution. That goes for my consciousness as well as its continuity.
"When anyone tells me, that he saw a dead man restored to life, I immediately consider with myself, whether it be more probable, that this person should either deceive or be deceived, or that the fact, which he relates, should really have happened. I weigh the one miracle against the other; and according to the superiority, which I discover, I pronounce my decision, and always reject the greater miracle." Granted, continuity of consciousness and dead men walking may seem like two completely different things, but they both fly in the face of scientific evidence. Our senses, both internal and external, deceive us every day. Why are you so adamant about blindly accepting your intuition about this one illusion?
For the same reason that happiness matters to me. We have probed the world around us and found that some qualia match well with reality, and others don't. Others yet, like happiness and consciousness, certainly seem to reflect activity in our brains, but they carry special meaning to us. Denying some of these (like continuity) and accepting others (like happiness) seems completely arbitrary. Denying all of them is also not a feasible solution - life without happiness and motivation is unacceptable. So you have to accept all of them, don't you? Otherwise, explain to me why you think happiness is valid but continuity is not. And continuity of consciousness does not fly in the face of science. The matter and energy in your brain transition from one state to the next according to the laws of physics.
Quote:
Smokeskin wrote:
You choose to reject major elements of your qualia.
Well, personally I would rather describe it as rejecting the very concept of qualia, but sure.
Smokeskin wrote:
To me, disregarding observed qualia is much like disregarding physical observations. Not entirely, because we know some qualia are in conflict with reality, but lacking physical counter-evidence, disregarding some qualia but not others seems strange. By what principle do you pick and choose among qualia?
Since I don't believe "qualia" is a meaningful term, I don't pick and choose among them, but I do have a principle relevant to this discussion: Always assume the existence of the fewest possible phenomena necessary to explain reality. Nothing about human behavior requires the existence of continuity of consciousness; the existence of memories suffices. If I have to choose between believing in memories and continuity of consciousness and simply believing in memories, the first option has to explain more than the second. It doesn't. So I don't.
So you're disregarding observations because they don't fit your hypothesis. That's very irrational imo.
Quote:
Smokeskin wrote:
When you disregard the obvious target of your survival instinct, continuity of consciousness,
The obvious targets of my survival instinct are my testicles. Evolution cares more about making babies than various mental illusions.
Survival instinct is obviously not linked to your testicles. People gladly give up their testicles to live. It isn't even close. People give up their testicles to avoid even a risk of death. Do you really feel differently? And evolution doesn't care about anything, and we're not evolutionary fitness optimizers anyway. Our instincts and desires have generally given us certain evolutionary advantages, of course, as there has been selection pressure on many of them, but that is a very different thing. And I care about my instincts and desires rather than evolution.
Quote:
Smokeskin wrote:
how did you decide on your memories and such to be the features you want to survive? Why not say your genes, "after all that's the natural way that evolution works"? (I don't agree with the quoted statement btw, just providing what seems like an equally arbitrary definition of "me".)
All value axioms are fundamentally non-rational. (Not irrational, just non-rational.) To bring up Hume again, you can't derive an ought from an is. But to act rationally on your values in the real world, what you value must exist in reality. Continuity of consciousness doesn't exist, and to value it would be completely fruitless.
You need to provide an argument for your claim that continuity as qualia is false.
Quote:
Valuing the non-existent is the same as complete nihilism.
No, but that's a different discussion.
Quote:
Memories exist, and valuing them can actually lead somewhere.
Why value them? You're back to arbitrarily picking your values. And I don't value my memories as such. I value some of them. Others I'd prefer to get rid of. Most I don't care that much about. And as research has demonstrated, our memories are few, highly inaccurate, change substantially over time, and many are just plain made up. For example, listen to this: http://www.ted.com/talks/scott_fraser_the_problem_with_eyewitness_testim... You've picked quite an ephemeral thing as "you". If your memories are generally inaccurate, changed and reconstructed on the fly every time you recall them and then stored in the new version, doesn't it seem that the "controller" that generates and changes the memories is more central? Or something else? Both in terms of qualia and science, continuity of consciousness seems to be on a MUCH surer footing than memories as what constitutes "you".
Ilmarinen Ilmarinen's picture
Smokeskin wrote:
Smokeskin wrote:
When you disregard the obvious target of your survival instinct, continuity of consciousness
You may need to stop projecting things on others. Continuity of consciousness doesn't trigger [i]my[/i] survival instinct at all. If you told me that I would stop existing for ten seconds before getting reconstituted, I honestly would not care.
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
Smokeskin Smokeskin's picture
Alkahest wrote:I think a
Alkahest wrote:
I think a problem with these kinds of discussions is that people often fall back on their intuitions to justify their beliefs, and our intuitions are biased in favor of folk psychology and other pseudoscientific ideas. I prefer to look at the available facts and trust logic and evidence rather than my gut feelings. I would like to ask Smokeskin, OpenInsurgency and anyone else who is uncomfortable with the idea of mind uploading because they fear the loss of continuity of consciousness three questions: 1: What phenomena can only be explained if we assume the existence of continuity of consciousness?
The qualia of continuity of consciousness.
Smokeskin Smokeskin's picture
Ilmarinen wrote:Smokeskin
Ilmarinen wrote:
Smokeskin wrote:
When you disregard the obvious target of your survival instinct, continuity of consciousness
You may need to stop projecting things on others. Continuity of consciousness doesn't trigger [i]my[/i] survival instinct at all. If you told me that I would stop existing for ten seconds before getting reconstituted, I honestly would not care.
But the continuity of your consciousness is the obvious target. That you no longer feel that way is something you have arrived at through a substantial intellectual effort, is it not? And the interesting part is the one you didn't quote or answer. After you decided that continuity wasn't relevant, how did you determine what was relevant?
Decivre Decivre's picture
Smokeskin wrote:The qualia of
Smokeskin wrote:
The qualia of continuity of consciousness.
But are you sure that qualia would not be retained if your consciousness were transferred, especially if it happened without your explicit knowledge? If you were to enter a virtual reality scenario and exit it, and unbeknownst to you your mind were copied and transferred to an identical body in an identical-looking location, would this qualia be lost?
Transhumans will one day be the Luddites of the posthuman age. [url=http://bit.ly/2p3wk7c]Help me get my gaming fix, if you want.[/url]
Smokeskin Smokeskin's picture
I still don't enter into
I still don't enter into discussions with you, Decivre.
Ilmarinen Ilmarinen's picture
Smokeskin wrote:
Smokeskin wrote:
But the continuity of your consciousness is the obvious target. That you no longer feel that way is something you have arrived at through a substantial intellectual effort, is it not?
Not really. My instinctive reaction to people expressing distaste for Star Trek transporters was to decide they were whiners.
Smokeskin wrote:
And the interesting part is the one you didn't quote or answer. After you decided that continuity wasn't relevant, how did you determine what was relevant?
Obviously I care about myself - the 'I' that sees through my eyes, hears through my ears, thinks my thoughts. Unlike you, I don't believe this 'I' is something that needs to be continuous, nor that an interruption wipes it out. It is a purely physical phenomenon generated by the present state of mind/brain. Therefore what matters to me is the sum total of my memories, feelings, thoughts, and experiences. Obviously this something is very ephemeral and is changed all the time - which is fine. I don't need my 'I' to be static. Long story short, I think of myself as something like a video game character. I don't have a problem with having my save file transferred to another computer; I have a slight problem with having to restore from my most recent save; I have a great big problem with having my character profile deleted. I only hope to find a save point, and fast.
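To make the save-file analogy concrete, here is a minimal sketch (the SaveFile class is purely hypothetical, not any real engine's API):
[code]
import copy

class SaveFile:
    # Illustrative stand-in for "the sum total of my memories,
    # feelings, thoughts, and experiences".
    def __init__(self, memories):
        self.memories = memories

profile = SaveFile(["level 1", "level 2"])

# Transferring the save file to another computer: no problem at all.
on_new_machine = copy.deepcopy(profile)

# Restoring from the most recent save: a slight problem -
# everything after the save point is lost.
checkpoint = copy.deepcopy(profile)
profile.memories.append("level 3")
profile = checkpoint        # "level 3" is gone

# Deleting the character profile: the great big problem.
del profile                 # nothing left to restore from
[/code]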
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
Smokeskin Smokeskin's picture
Ilmarinen wrote:Smokeskin
Ilmarinen wrote:
Smokeskin wrote:
But the continuity of your consciousness is the obvious target. That you no longer feel that way is something you have arrived at through a substantial intellectual effort, is it not?
Not really. My instinctive reaction to people expressing distaste with Star Trek transporters was to decide they were whiners.
That doesn't demonstrate anything, and could just as easily be a failure to understand what happened. Most people I've talked to never even considered that one dies and another is created, and when you point it out, it puts them in doubt. The person being teleported experiences death, just like with an egocast, and as we went over earlier, you didn't seem to consider that. The experience is very different for the two. And are you really trying to tell me that when faced with death, there's a natural part of you that goes "I'm only afraid if there's not a backup of me"?
Quote:
Smokeskin wrote:
And the interesting part is the one you didn't quote or answer. After you decided that continuity wasn't relevant, how did you determine what was relevant?
Obviously I care about myself - the 'I' that sees through my eyes, hears through my ears, thinks my thoughts. Unlike you, I don't believe this 'I' is something that needs to be continuous, nor that an interruption wipes it out.
Why do you believe that? How do you explain that 'you' will be transferred? If we merely created the fork and didn't destroy your present body, 'you' would still be in your body and another consciousness would be in the fork. They start out the same, but they clearly don't share any consciousness; there is no transfer of any sort. You're making a huge metaphysical leap with your belief that 'you' will be transferred, you seem to have no evidence to back it up, and a thought experiment as simple as a fork demonstrates that 'your' consciousness won't be transferred.
Quote:
It is a purely physical phenomenon generated by the present state of mind/brain. Therefore what matters to me is the sum total of my memories, feelings, thoughts, and experiences.
That's a pretty big "therefore", and you don't provide any sort of argument for why you picked those features instead of, say, your DNA. So is it arbitrarily chosen?
