
Survival of the Soul

Ilmarinen
Smokeskin wrote:
That doesn't demonstrate anything and could just as easily be a failure to understand what happened.
I understand what's happening quite clearly. You get taken apart and then you get put back together. As long as all the bits get put back together in the right order, there is no practical difference between having that done to you and traveling the more conventional way. I understand the ramifications. They don't concern me.
Smokeskin wrote:
And are you really trying to tell me that when faced with death, there's a natural part of you that goes "I'm only afraid if there's not a backup of me?"
I feel like my behavior would get more risky if a backup of myself were made, yes. Not [i]much[/i] more, but still.
Smokeskin wrote:
Why do you believe that? How do you explain that 'you' will be transferred?
As I explained before, I don't actually believe there is a continuous 'me' to be transferred in the first place. The 'me' I described is being generated at runtime. Transfer the system state and I would re-emerge.
Smokeskin wrote:
If we merely created the fork and didn't destroy your present body, 'you' would still be in your body and another consciousness would be in the fork.
Yes and no. You could just as easily state that 'I' would be in the fork and another consciousness would inhabit my original body. The two consciousnesses are the same consciousness before the fork, increasingly divergent after the fork, and finally the same again after the merge.
Smokeskin wrote:
[b]They start out the same[/b], but they clearly don't share any consciousness, there is no transfer of any sort.
This right here is key. They do start out the same. If consciousness has a physical presence, that's included under 'the same'. If not, then it's irrelevant.
Smokeskin wrote:
You're making a huge metaphysical leap with your belief that 'you' will be transferred, you seem to have no evidence to back it up, and a thought experiment as simple as a fork demonstrates that 'your' consciousness won't be transferred.
Again, I refer you to my thought experiment instead. If a fork can't tell that it's a fork without being told then it can't have lost anything important during the forking process. If there was some vital continuity that was interrupted, then why isn't the fork able to sense it? Why doesn't it feel like a different person on its own? Find a way for a fork and an original to tell who's who without outside help or physical clues and I might agree that the original has something the fork does not.
Smokeskin wrote:
That's a pretty big "therefore" and you don't provide any sort of argument for why you picked those features instead of, say, your DNA. So it is arbitrarily chosen?
My DNA defines only my body. My body doesn't matter to me. My mind does. As to being arbitrary - it might be? All I know is that the thought of going to sleep doesn't freak me out in the slightest and correspondingly the thought of being absent from reality for ten seconds or so doesn't either. At the end of the day if I can be assured that there is a person who thinks he's me and he has my mind as changed by time and experience...well, that's the closest I can find to defining 'me' in the future. Given that in Eclipse Phase there is literally technology that would allow a completely different person to inhabit my brain, I find the idea that being me is somehow tied to the brain difficult to swallow.
[------------/Nation States/-----------] [-----/Representative Democracy/-----] [--------/Regulated Capitalism/--------]
Smokeskin
Ilmarinen wrote:
Smokeskin wrote:
That doesn't demonstrate anything and could just as easily be a failure to understand what happened.
I understand what's happening quite clearly. You get taken apart and then you get put back together.
No. We could do the "you get put back together" part alone, and there'd be two humans with separate consciousnesses. "You" would only be one of them.
Quote:
As long as all the bits get put back together in the right order, there is no practical difference between having that done to you and traveling the more conventional way. I understand the ramifications. They don't concern me.
If you were scientifically honest, you'd admit you don't understand consciousness. You haven't solved the hard problem. So you don't understand the ramifications. You have a belief that you won't die as long as a copy is made of you.
Quote:
Smokeskin wrote:
And are you really trying to tell me that when faced with death, there's a natural part of you that goes "I'm only afraid if there's not a backup of me?"
I feel like my behavior would get more risky if a backup of myself were made, yes. Not [i]much[/i] more, but still.
Exactly. There are your natural instincts, and then there are your beliefs modifying them a little.
Quote:
Smokeskin wrote:
Why do you believe that? How do you explain that 'you' will be transferred?
As I explained before, I don't actually believe there is a continuous 'me' to be transferred in the first place. The 'me' I described is being generated at runtime. Transfer the system state and I would re-emerge.
But that's a belief. And you still seem to avoid the question of why you re-emerge in the copy if the original is destroyed but not if the original is preserved. It sounds very metaphysical that you're able to find the most suitable vessel like that tbh.
Quote:
Smokeskin wrote:
If we merely created the fork and didn't destroy your present body, 'you' would still be in your body and another consciousness would be in the fork.
Yes and no. You could just as easily state that 'I' would be in the fork and another consciousness would inhabit my original body. The two consciousnesses are the same consciousness before the fork, increasingly divergent after the fork, and finally the same again after the merge.
The idea that they're "increasingly divergent" is putting it mildly. From the very first instant they're extremely different in that they don't share their consciousness.
Quote:
Smokeskin wrote:
[b]They start out the same[/b], but they clearly don't share any consciousness, there is no transfer of any sort.
This right here is key. They do start out the same. If consciousness has a physical presence, that's included under 'the same'. If not, then it's irrelevant.
Their state is identical when we start out, but they are physically in two different places. Consciousness is of course ultimately a physical phenomenon, and since they are in physically different places, they are two separate consciousnesses.
Quote:
Smokeskin wrote:
You're making a huge metaphysical leap with your belief that 'you' will be transferred, you seem to have no evidence to back it up, and a thought experiment as simple as a fork demonstrates that 'your' consciousness won't be transferred.
Again, I refer you to my thought experiment instead. If a fork can't tell that it's a fork without being told then it can't have lost anything important during the forking process. If there was some vital continuity that was interrupted, then why isn't the fork able to sense it? Why doesn't it feel like a different person on its own?
You might as well ask why direct brain stimulation can fool the senses. We are simply not equipped with senses so acute that they can't be fooled.
Quote:
Find a way for a fork and an original to tell who's who without outside help or physical clues and I might agree that the original has something the fork does not.
Your belief holds when we're not allowed full information, yes. However when full information is available, the situation is extremely easy to resolve. Why do you think your belief requires us to remain ignorant of the facts?
Quote:
Smokeskin wrote:
That's a pretty big "therefore" and you don't provide any sort of argument for why you picked those features instead of, say, your DNA. So it is arbitrarily chosen?
My DNA defines only my body. My body doesn't matter to me. My mind does. As to being arbitrary - it might be?
Yes, it might be. And you'd gamble your life, maybe for as little upside as egocasting somewhere or getting a different body, based on an arbitrary definition? I get people who have values that matter to them more than their own life. They accept death. I really don't get people who use philosophical guesswork to convince themselves that they don't die along with the cessation of their consciousness.
Quote:
All I know is that the thought of going to sleep doesn't freak me out in the slightest and correspondingly the thought of being absent from reality for ten seconds or so doesn't either. At the end of the day if I can be assured that there is a person who thinks he's me and he has my mind as changed by time and experience...well, that's the closest I can find to defining 'me' in the future.
When you sleep, your brain continues to tick along. The idea that there's no continuity between your brain states during sleep is frankly silly.
Quote:
Given that in Eclipse Phase there is literally technology that would allow a completely different person to inhabit my brain, I find the idea that being me is somehow tied to the brain difficult to swallow.
Why? Aside from some practical difficulties, why shouldn't we be able to scramble a brain, destroying the mind that was there, then put everything back together in a way that would let a different consciousness emerge?
Ilmarinen
I can't tell if you're deliberately misrepresenting my statements or misunderstanding them this badly, so I'll explain it again: The consciousness that was me one second before the fork is destroyed irrevocably by the passage of that one second and the movement of electrons across my brain. It's over and done with no matter what happens next. The only connection it has to any past version of itself is memories. The only connection it has to future versions of itself is expectations.

This consciousness has no particular connection to any of the atoms in the brain. If you replaced one atom with an identical atom, the consciousness generated in the next moment wouldn't change. If you replaced all the atoms with identical atoms, the consciousness still wouldn't change. Only the pattern in which the atoms are arranged determines the consciousness that is generated.

When a fork is made, an exact copy of this pattern is created. This includes the sense of having been the original. The original has this sense. The copy also has this sense. When the fork is brought online, a brand new consciousness is generated. This isn't relevant, because a brand new consciousness is generated in the original too. The one that was there a second before is over. Gone. Kaput. It had its moment, recorded the observations of the senses, thought an infinitesimal part of a complete thought and then disappeared, to be replaced by another one.

So I think you're the one who's assigning metaphysical properties to things. You think that there is some property of originalness that transfers from old atoms to the new when they touch each other. You think that your consciousness is anything other than the product of a brain examining itself, that the 'you' behind your eyes is anything other than a collection of signals.
Smokeskin
Ilmarinen wrote:
I can't tell if you're deliberately misrepresenting my statements or misunderstanding them this badly, so I'll explain it again: The consciousness that was me one second before the fork is destroyed irrevocably by the passage of that one second and the movement of electrons across my brain. It's over and done with no matter what happens next. The only connection it has to any past version of itself are memories. The only connection it has to future versions of itself are expectations. This consciousness has no particular connection to any of the atoms in the brain. If you replaced one atom with an identical atom the consciousness generated in the next moment wouldn't change. If you replaced all atoms with identical atoms, the consciousness still wouldn't change. Only the pattern in which the atoms are arranged determines the consciousness that is generated. When a fork is made an exact copy of this pattern is created. This includes the sense of having been the original. The original has this sense. The copy also has this sense. When the fork is brought online a brand new consciousness is generated. This isn't relevant because a brand new consciousness is generated in the original too. The one that was there a second before is over. Gone. Kaput. It had its moment, recorded the observations of the senses, thought an infinitesimal part of a complete thought and then disappeared to be replaced by another one.
Are you claiming that you know this? That science has determined that this is the way consciousness works, that it is repeatedly destroyed?
Quote:
So I think you're the one who's assigning metaphysical properties to things. You think that there is some property of originalness that transfers from old atoms to the new when they touch each other. You think that your consciousness is anything other than the product of a brain examining itself, that the you behind your eyes is anything other than a collection of signals.
Of course I don't think there's anything but the physical going on. I just don't see any reason to believe that my consciousness is destroyed every second. I don't see any reason to deny qualia that are not contradicted by evidence. And failing solid evidence I'm not going to risk my life on some idea.
Alkahest
Smokeskin wrote:
Of course it is not a meaningless term. You experience qualia like everyone else. You can believe it is an illusion, or an emergent property of certain types of information processing in matter, or signs of a soul, or whatever, but you can't deny it is there.
Sure I can. Just because something is counterintuitive doesn't mean that it's false, if the facts back it up. I reject the idea that we are supposed to conduct experiments, fight our intuitions and see beyond the immediately apparent when it comes to fields like physics, but that we are supposed to accept our unchallenged perceptions when it comes to analyzing consciousness.
Smokeskin wrote:
The term is far from meaningless, and if you're trying to ignore the most basic observations every single human makes constantly, you're just being wilfully ignorant. Just because you don't understand it doesn't mean you just handwave it.
Humans tend to grant the "ineffable" far too much respect. I see no reason to believe in something that can't be measured, can't be compared to anything else, is not necessary to explain anything about reality, and which serves no purpose whatsoever other than to legitimize the individual's authority over consciousness. Here's an old but good paper by Daniel Dennett on the subject. While I don't really appreciate the "throw links to long articles at people instead of making your own arguments"-school of discussion, I'm afraid that the qualia-debate might eclipse the original subject unless I attempt to cut it somewhat short.
Smokeskin wrote:
You could say that about any experience and emotion. Do we really experience them, or do we just have memories of them? I don't believe it makes much difference in this context though.
Cognitive and affective processes are aided by memories, but they are still separate (although of course interlinked) systems. Emotions leave clear, identifiable footprints in our brains, can the same be said about "continuity of consciousness"?
Smokeskin wrote:
You're splitting words. You know what continuity means. You might not believe in it, but you understand the concept well enough that you can see when it would be there and when it wouldn't. What you're saying is equivalent to claiming "but how could I possibly use Newtonian mechanics to calculate trajectories of objects moving at near-c speeds when I know relativity theory?"
What you're attempting to do is to show that the theory of relativity is false, so forgive me for not accepting your argument unquestioningly. If I told you that all your invisible gremlins fell out of your hair every time you showered you would "know" that the invisible gremlins you had yesterday are gone today if you showered this morning. But I'm not really any closer to proving the existence of invisible gremlins, am I?
Smokeskin wrote:
No, you're not supposed to take it on faith. Qualia can be illusory, and (short of trickery, living in the Matrix, etc.) we have ways of determining them as false.
Since qualia are by definition ineffable I don't really think we have any way of determining if they are "false" or not. Actually, I don't think it even makes sense to talk about qualia as false.
Smokeskin wrote:
If you see an orange, you would reasonably assume that photons were reflecting off an actual orange. However, if I demonstrated to you that the image in your brain was generated by direct stimulation from wires embedded in your visual cortex, you would experience the orange but know it was a false experience. It is the same with continuity. If you were a fork you'd still experience continuity, but you would know that at the moment of forking, it was actually broken.
This argument of yours brings up an interesting point. The perception of an orange can be false or correct depending on data gathered outside the perception itself. The perception of continuity can be false or correct depending on... what, exactly? Your entire argument, as I understand it, rests on the assumption that perception of continuity of consciousness is sufficient proof of the reality of continuity of consciousness. My argument, that the existence of continuity of consciousness can not be verified independently of your perception of it and that we therefore have no reason to believe it exists, seems to fit your thoughts when it comes to oranges. If I see an orange but all the experiments in the world fail to detect the presence of an orange, I rightly regard my perception of the orange as a misapprehension - and so would you. But if you perceive continuity of consciousness when nothing except your own perception can verify its presence, you still cling to the belief that it's more real than my imaginary orange. Why?
Smokeskin wrote:
No. Even unselfish forks would have different utility functions. Take for example sticking to an exercise routine - you'd force your forks to do so while you succumbed to the temptation of watching a movie instead for example.
A perfectly altruistic hedonistic utilitarian who forked would spawn two minds with the same utility function: to maximize happiness. Forks do not necessarily have different utility functions.
Smokeskin wrote:
The two pointers still point to two different registers existing in separate places.
For all practical purposes, the registers are the same. If I copy a text file I'm working on to two different folders, I don't anguish over which one is the "real" text file before I keep writing in it. (Of course, the moment the two files contain different information I care about the difference.)
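The text-file analogy can be made concrete with a short sketch (purely illustrative, not anything from the thread; the dictionary contents are made up). Two copies are indistinguishable by value at the moment of copying, yet remain distinct objects, and the distinction only starts to matter once their states diverge:

```python
import copy

# An "original" mind-state, modeled as plain data.
original = {"name": "Ilmarinen", "memories": ["first day of school"]}

# Forking: an exact, independent copy of the whole structure.
fork = copy.deepcopy(original)

assert original == fork       # identical by value at the moment of forking
assert original is not fork   # but two distinct objects (two "registers")

# Divergence: the fork accumulates a new memory of its own.
fork["memories"].append("waking in a new morph")

assert original != fork       # only now does the difference matter
```

Python's `==` compares contents while `is` compares object identity, which mirrors the distinction both posters are circling: same pattern, different physical locations.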
Smokeskin wrote:
Two forks are different people, as you said. There are properties that are different, like for example their coordinates in physical space (whether that be electric charge in computer bits or brain matter). This gives them different identity.
I don't assign value to my coordinates in physical space before I upload, why should I do it afterwards? The reason I don't want to use "identity" is that it's conflated with too many different ideas with only the faintest of connections to each other, from the collection of cognitive and affective systems constituting a mind to the nature of individuality to continuity of consciousness to a body's molecular makeup. It's a term in search of a purpose. "Continuity of consciousness" is already enough of a headache to define and discuss, I don't see the need to drag another fuzzy concept into the discussion when other words work just as well.
Smokeskin wrote:
If you didn't have the qualia of consciousness and people's description of it and were able to correlate that to certain brain states, you would have no idea what those brain scans meant.
I don't really see how people's description of the ineffable would be helpful, considering the fact that "ineffable" means "impossible to describe".
Smokeskin wrote:
For the same reason that happiness matters to me. We have probed the world around us and found that some qualia match well with reality, and others don't. Others yet, like happiness and consciousness, certainly seem to reflect activity in our brains, but it carries special meaning to us. Denying some of these (like continuity) and accepting others (like happiness) seems completely arbitrary. Denying all of them is also not a feasible solution - life without happiness and motivation is unacceptable. So you have to accept all of them, don't you? Otherwise, explain to me why you think happiness is valid but continuity is not.
First of all, even if we accept the existence of qualia (which I don't), it's ridiculous to talk about the "qualia" of continuity. The redness of red, the pain of a broken arm and so on can be described as these singular atoms of subjective experience, but continuity? How can something which necessitates experience over time be broken down into these hypothetical atomic nuggets? Secondly, it's rather simple. The existence of happiness can be independently confirmed using various experiments, as I have already explained. Happiness has an impact on the real, physical world, and the difference between happiness and non-happiness can be tested in a myriad different ways. Happiness is a mental phenomenon, and as such could be uploaded using the technology we are discussing. None of this is true of "continuity of consciousness". I'm surprised that you still seem to be perplexed by the fact that complete absence of evidence leads me to disbelieve the existence of something.
Smokeskin wrote:
And continuity of consciousness does not fly in the face of science. The matter and energy in your brain transitions from one state to the next according to the laws of physics.
If you define "continuity of consciousness" as "the matter and energy constituting your current physical brain", it's pretty obvious that continuity of consciousness is not copied during an uploading process. That's a trivial truth, and not remotely related to what we are discussing.
Smokeskin wrote:
So you're disregarding observations because they don't fit your hypothesis. That's very irrational imo.
I will repeat what I wrote: "Always assume the existence of the fewest possible phenomena necessary to explain reality. Nothing about human behavior requires the existence of continuity of consciousness; the existence of memories suffices. If I have to choose between believing in memories and continuity of consciousness and simply believing in memories, the first option has to explain more than the second. It doesn't. So I don't." Now, please identify: 1: The observations that I am disregarding. 2: The hypothesis.
Smokeskin wrote:
Survival instinct is obviously not linked to your testicles. People gladly give up their testicles to live. It isn't even close. People give up their testicles to avoid even risk of death. Do you really feel differently?
I didn't say that evolution did a particularly good job. People kill themselves to protect their children, people chop off their testicles to live. We're not particularly rational animals. Which is why I try to not listen to my intuitions.
Smokeskin wrote:
And evolution doesn't care about anything, and we're not evolutionary fitness optimizers anyway. Our instincts and desires have generally given us certain evolutionary advantages of course, as there has been selection pressure on many of them, but that is a very different thing. And I care about my instincts and desires rather than evolution.
I find it a bit strange that you believe the "obvious" target of our survival instinct to be continuity of consciousness when nothing even similar to that concept was conceived earlier than the 17th century, as far as I know. Only a small minority of historical humanity (and probably current humanity, as well) have cared about this strange concept that you think is so important to our species. Belief in the soul is a far more popular hypothesis, and yet you don't seem to believe in that particular metaphysical spook.

What is obvious is that our instincts have rather little to do with our reality today or in the future. We are afraid of death because death means no more babies, and the kinds of death we care about are the kinds that involve us being hit with rocks or eaten by tigers or withering away from disease. Our instincts have not prepared us for uploading technology, so why should we listen to our instincts when we contemplate the implications of uploading?

More to the point, why should we all listen to your instincts? If other people say that their instincts tell them that there's nothing scary about uploading, you say that they should think further and that their instincts are unreliable since they don't understand that they should be scared of uploading. But when I say that your instincts are unreliable and that you should think further, you hide behind your instincts and claim that they are the ultimate authority and that further thinking is unnecessary. You can't have it both ways. Either accept a person's instincts and intuitions as the final word on whether or not uploading is anything to be afraid of, or accept that we have to go beyond our gut feelings if we ever want to reach the truth.
Smokeskin wrote:
You need to provide an argument for your claim that continuity as qualia is false.
I'm not sure if it's an answer to your question, but my reasoning about this whole qualia debate goes something like this:
1: Qualia don't exist.
2: Even if qualia exist, it makes no sense to talk about "continuity" as qualia.
3: Even if it makes sense to talk about "continuity" as qualia, the mere experience of continuity is not sufficient proof of the existence of continuity.
4: Even if the mere experience of continuity is sufficient proof of the existence of continuity, the experience of continuity is present in an uploaded mind.
But since the entire rest of this discussion is dedicated to this debate, I don't really see a need to expand this answer further and create yet another quotation tree.
Smokeskin wrote:
Why value them? You're back to arbitrarily picking your values.
Yes? I think I covered that pretty clearly when I said that all value axioms are fundamentally non-rational.
Smokeskin wrote:
And I don't value my memories as such. I value some of them. Others I'd prefer to get rid of. Most I don't care that much about. And as research has demonstrated, our memories are few, highly inaccurate, change substantially over time, and many are just plain made up. For example, listen to this: http://www.ted.com/talks/scott_fraser_the_problem_with_eyewitness_testim...
No argument here. I don't value all my memories, just some of them.
Smokeskin wrote:
You've picked quite an ephemeral thing as "you". If your memories are generally inaccurate, changed and reconstructed on the fly every time we recall them and then stored in the new version, doesn't it seem that the "controller" that generates and changes the memories is more central? Or something else?
Well, as far as I know, there is no controller. Contrary to popular belief, there is no homunculus sitting in our brain pulling levers.
Smokeskin wrote:
Both in terms of qualia and science, continuity of consciousness seems to be on a MUCH surer footing than memories as what constitutes "you".
You seem to misunderstand me. I don't value my memories (and other mental phenomena) because they constitute "me" and I value "myself", I value my memories and other mental phenomena on their own. I see no need to create a superfluous layer of "identity" on top of the things I value.
President of PETE: People for the Ethical Treatment of Exhumans.
Alkahest
Smokeskin wrote:
The qualia of continuity of consciousness.
Leaving aside all the problems the term "qualia" brings with it, I think the main problem with your current argument is that you rely entirely on your subjective experiences to prove the existence of continuity of consciousness, but at the same time you have to say that experiences are not sufficient proof of continuity of consciousness. (Otherwise, a fork's experience would invalidate your attempt to prove that continuity of consciousness can't be copied.) It appears that what you are trying to say is that continuity of consciousness is the combination of the experience of consciousness and the continuity of the physical body. But the continuity of the physical body is not a mental phenomenon, and therefore continuity of consciousness (using the definition I just gave) is not a mental phenomenon.
Smokeskin
Alkahest wrote:
Smokeskin wrote:
The qualia of continuity of consciousness.
Leaving aside all the problems the term "qualia" brings with it, I think the main problem with your current argument is that you rely entirely on your subjective experiences to prove the existence of continuity of consciousness, but at the same time you have to say that experiences are not sufficient proof of continuity of consciousness. (Otherwise, a fork's experience would invalidate your attempt to prove that continuity of consciousness can't be copied.)
We've been over this like 5 times now. No, a fork's experience does not invalidate continuity, just like the ability to fool our senses with direct brain stimulation does not invalidate everything we perceive, or the ability to implant or alter memories does not invalidate all memories, or the ability to falsify scientific findings does not invalidate all science.
Quote:
It appears that what you are trying to say is that continuity of consciousness is the combination of the experience of consciousness and the continuity of the physical body. But the continuity of the physical body is not a mental phenomenon, and therefore continuity of consciousness (using the definition I just gave) is not a mental phenomenon.
Consciousness and mental phenomena are not separate from the physical. I'm not a dualist. EDIT: Oh, there's another post. Need time to read that Dennett article and your post.
Alkahest
Smokeskin wrote:
We've been over this like 5 times now. No, a fork's experience does not invalidate continuity, just like the ability to fool our senses with direct brain stimulation does not invalidate everything we perceive, or the ability to implant or alter memories does not invalidate all memories, or the ability to falsify scientific findings does not invalidate all science.
No, what invalidates continuity of consciousness is the complete lack of evidence for its existence apart from unverifiable testimony of the ineffable. ("Testimony of the Ineffable" being a good, if oxymoronic, name for a band.)
Smokeskin wrote:
Consciousness and mental phenomena are not separate from the physical. I'm not a dualist.
I haven't claimed that mental phenomena are separate from the physical. But if your entire claim is that an upload does not have the same physical body as the original mind, we could have ended this discussion a long time ago. It's trivially true. What I'm attempting to find out is if there are any mental phenomena which can't be copied, if the uploaded mind would lack something compared to the original mind.
Smokeskin wrote:
EDIT: Oh there's another post. Need time to read that Dennett article and your post.
Take your time, I'm sure you have far better things to do than to debate the philosophical implications of hypothetical technologies in a fictional setting.
President of PETE: People for the Ethical Treatment of Exhumans.
King Shere
Smokeskin wrote:
Consciousness and mental phenomena are not separate from the physical. I'm not a dualist.
Alkahest wrote:
I haven't claimed that mental phenomena are separate from the physical. But if your entire claim is that an upload does not have the same physical body as the original mind, we could have ended this discussion a long time ago. It's trivially true. What I'm attempting to find out is if there are any mental phenomena which can't be copied, if the uploaded mind would lack something compared to the original mind.
[b]Scenario 1: There exists a mental phenomenon that can't be copied.[/b] I (regardless of instance) would prefer to use travel methods that retain the mental phenomenon, even if that means discomfort and a higher price. Being a pragmatist, I would then consider the uploads a form of offspring, and thus a sufficient way, and in many senses a better alternative, than "messy" sexual reproduction (which produces offspring that need tutoring).
[b]Scenario 2: I falsely believe the mental phenomenon is copied.[/b] Ignorance is bliss, and I might wrongly consider the other instances to be me and use the quickest method of travel that is affordable. Though being a pragmatist, I would more likely not trust the information as given, and would instead consider the other instances of me as offspring. So see scenario 1.
[b]Scenario 3: The mental phenomenon is copied.[/b] Same as scenario 2.
[b]Scenario 4: Certain methods copy the phenomenon, others don't.[/b] Same as scenario 1 or 2.
Smokeskin wrote:
EDIT: Oh there's another post. Need time to read that Dennett article and your post.
Alkahest wrote:
Take your time, I'm sure you have far better things to do than to debate the philosophical implications of hypothetical technologies in a fictional setting.
If scenario 1 is correct, it's not as hypothetical as one might think, as it can have medical ramifications IRL. For example, sedatives: deep sedation near death (the brain shuts down, has a period of inactivity, and is later rebooted) might, in my opinion, in fact terminate the "uncopyable" mental process, with an offspring born instead. Another query would be: what parts of the body (if "essential" parts exist) can be replaced, neglected, or shut down without jeopardizing, ending, or altering the "uncopyable" mental phenomenon?
Smokeskin
Alkahest wrote:
Smokeskin wrote:
We've been over this like 5 times now. No, a fork's experience does not invalidate continuity, just like the ability to fool our senses with direct brain stimulation does not invalidate everything we perceive, or the ability to implant or alter memories does not invalidate all memories, or the ability to falsify scientific findings does not invalidate all science.
No, what invalidates continuity of consciousness is the complete lack of evidence for its existence apart from unverifiable testimony of the ineffable.
Well, that's another argument (contained in your other post which I'll reply to). But could we settle the issue that there is not an internal inconsistency with forks experiencing continuity of consciousness? Just because we can fool the brain that doesn't make everything in it false.
Quote:
Smokeskin wrote:
Consciousness and mental phenomena are not separate from the physical. I'm not a dualist.
I haven't claimed that mental phenomena are separate from the physical. But if your entire claim is that an upload does not have the same physical body as the original mind, we could have ended this discussion a long time ago. It's trivially true. What I'm attempting to find out is if there are any mental phenomena which can't be copied, if the uploaded mind would lack something compared to the original mind.
It isn't the entire claim, but it is an issue that keeps popping up. You keep claiming the fork and the original are the same, when they're clearly not. They have different physical bodies (whether that "body" is digital, biological or whatnot), and since consciousness is entirely rooted in the physical, their consciousnesses are also different. The real issue is "at the instant before you are forked, is it meaningful to consider yourself intimately connected to the future consciousness emerging from your original body and not so much to the future consciousness of the fork?" I say yes, you say no. This is the issue, and what we should be discussing.
Quote:
Smokeskin wrote:
EDIT: Oh there's another post. Need time to read that Dennett article and your post.
Take your time, I'm sure you have far better things to do than to debate the philosophical implications of hypothetical technologies in a fictional setting.
I wasn't implying that I had some sort of duty to read and reply, it was just so you would know that I wasn't dodging your other points. I'm personally annoyed when discussions derail because the other side nitpicks minor issues and avoids the central ones.
Smokeskin
Alkahest wrote:
Smokeskin wrote:
Of course it is not a meaningless term. You experience qualia like everyone else. You can believe it is an illusion, or an emergent property of certain types of information processing in matter, or signs of a soul, or whatever, but you can't deny it is there.
Sure I can. Just because something is counterintuitive doesn't mean that it's false, if the facts back it up. I reject the idea that we are supposed to conduct experiments, fight our intuitions and see beyond the immediately apparent when it comes to fields like physics, but that we are supposed to accept our unchallenged perceptions when it comes to analyzing consciousness.
I'm not saying you should accept your perceptions unchallenged. I'm saying you can't deny that they're there. You experience qualia. That's a fact. As I said, you can believe it is illusory, but you can't deny that you experience them.

Furthermore, we don't understand consciousness or qualia. We have no idea why there is this internal experience. We can scientifically describe things like "happiness" mechanically - these nerves send signals like this, that cause those cells to release hormones, that stimulates these neurons to change like that, and that makes the brain react differently like this in the future. I'm not saying we know each step in detail, or that we understand the full complexity, but I feel pretty confident that there's nothing surprising in there for neuroscientists if they were to map it fully. Sort of like how I can look at a computer game and not understand everything that goes on under the hood in the machine, but I'm confident that everything is due to the functioning of components I could understand. But we don't understand, and are nowhere near understanding, why I feel like I'm experiencing happiness. How does consciousness arise?

And now comes the really interesting part. Our goals, our morals and ethics, our desires. I know of no scientific way of finding any sort of basis for these. We can more or less measure the difference between happiness and misery in a multitude of ways, but why should we care about whether one or the other happens? As you say, this choice is axiomatic (and just to be clear, I think "choice" is fundamentally a physical process). I hope that one day we'll understand consciousness and this discussion we're having will be settled (or be like the "discussion" between creationists and evolutionists). But until then, we'll have to bootstrap this part of our belief system. There are things I value greatly - survival, happiness, morality, etc. That's just how they feel to me.
Some feelings are contradictory, and I try to prioritize as best I can. I assume we agree on this. I assume that we also agree that we feel like we have continuity of consciousness.

And this is the part that baffles me regarding your belief. If I try to knock down continuity of consciousness (because we don't have proof of it or whatever), then everything else that matters falls down with it. Consciousness, why does that matter? Happiness? Morality? I simply can't conceive of a consistent bootstrap process that excludes continuity but includes the other things. If continuity doesn't matter, why does anything matter? Happiness and misery are both just electrochemical processes in the brain; if it weren't for our qualia, couldn't we value one as much as the other? There are even people who seem to value other people's misery over happiness, while we rarely see people who don't experience continuity.

Finally, this question is tied to something very fundamental: survival. If I'm right, you'd be killing yourself through a simple error in philosophical speculation (and the track record for speculation is pretty poor). As a variant of Pascal's Wager, this one is actually solid.

I won't pretend that I have final answers on this. I don't. When I looked into cryopreservation I factored in a metaphysical risk of actually dying and remaining dead even though my body was revived in the future. But to discount this part of my qualia, when qualia is obviously there yet so poorly understood, and I place so great value on other parts of my qualia (like happiness), just seems inconsistent.
Quote:
Smokeskin wrote:
The term is far from meaningless, and if you're trying to ignore the most basic observations every single human makes constantly, you're just being wilfully ignorant. Just because you don't understand it doesn't mean you can just handwave it away.
Humans tend to grant the "ineffable" far too much respect. I see no reason to believe in something that can't be measured, can't be compared to anything else, is not necessary to explain anything about reality, and which serves no purpose whatsoever other than to legitimize the individual's authority over consciousness. Here's an old but good paper by Daniel Dennett on the subject. While I don't really appreciate the "throw links to long articles at people instead of making your own arguments"-school of discussion, I'm afraid that the qualia-debate might eclipse the original subject unless I attempt to cut it somewhat short.
Compared to my understanding of qualia, Dennett's paper is a straw man. I don't consider qualia as anything but the experience of being me and having perceptions and feelings, and I don't see any reason to believe that they have independent existence or that they're anything but an emergent property of the physical processes in my brain. I can't imagine qualia without such underlying processes, and I can't imagine such processes without qualia.
Quote:
Smokeskin wrote:
You could say that about any experience and emotion. Do we really experience them, or do we just have memories of them? I don't believe it makes much difference in this context though.
Cognitive and affective processes are aided by memories, but they are still separate (although of course interlinked) systems. Emotions leave clear, identifiable footprints in our brains; can the same be said about "continuity of consciousness"?
I can't imagine that I'd feel continuity without that feeling originating from processes in the brain.
Quote:
Smokeskin wrote:
You're splitting words. You know what continuity means. You might not believe in it, but you understand the concept well enough that you can see when it would be there and when it wouldn't. What you're saying is equivalent to claiming "but how could I possibly be able to use Newtonian mechanics to calculate trajectories of objects moving at near-c speeds when I know relativity theory?"
What you're attempting to do is to show that the theory of relativity is false, so forgive me for not accepting your argument unquestioningly.
Yes, but I'm making the attempt in a world without evidence for relativity, and the problem is that you were trying to show internal inconsistencies in Newtonian mechanics by using relativity. When you want to demonstrate that my model is internally inconsistent, you have to use my model. You can't use your own.
Quote:
Smokeskin wrote:
No, you're not supposed to take it on faith. Qualia can be illusory, and (short of trickery, living in the Matrix, etc.) we have ways of determining them as false.
Since qualia are by definition ineffable I don't really think we have any way of determining if they are "false" or not. Actually, I don't think it even makes sense to talk about qualia as false.
I think you're relying on Dennett's straw man definition here with your "ineffable". The qualia of seeing an orange is effable. And we have various ways of determining if an orange is actually where we see it, which would determine whether the qualia was false. In my example it was even extremely simple to tell it was false, because we had wires in your head that generated the image.
Quote:
Smokeskin wrote:
If you see an orange, you would reasonably assume that photons were reflecting off an actual orange. However, if I demonstrated to you that the image in your brain was generated by direct stimulation from wires embedded in your visual cortex, you would experience the orange but know it was a false experience. It is the same with continuity. If you were a fork you'd still experience continuity, but you would know that at the moment of forking, it was actually broken.
This argument of yours brings up an interesting point. The perception of an orange can be false or correct depending on data gathered outside the perception itself. The perception of continuity can be false or correct depending on... what, exactly? Your entire argument, as I understand it, rests on the assumption that perception of continuity of consciousness is sufficient proof of the reality of continuity of consciousness. My argument, that the existence of continuity of consciousness can not be verified independently of your perception of it and that we therefore have no reason to believe it exists, seems to fit your thoughts when it comes to oranges. If I see an orange but all the experiments in the world fail to detect the presence of an orange, I rightly regard my perception of the orange as a misapprehension - and so would you. But if you perceive continuity of consciousness when nothing except your own perception can verify its presence, you still cling to the belief that it's more real than my imaginary orange. Why?
The real problem here is that we don't understand qualia, and that some qualia are wholly internal. I can't check for why happiness, morality or consciousness matters to me in the real world either.
Quote:
Smokeskin wrote:
No. Even unselfish forks would have different utility functions. Take for example sticking to an exercise routine - you'd force your forks to do so while you succumbed to the temptation of watching a movie instead for example.
A perfectly altruistic hedonistic utilitarian who forked would spawn two minds with the same utility function: to maximize happiness. Forks do not necessarily have different utility functions.
I wouldn't expect such an entity to care about their continuity, no, but humans are not perfectly altruistic hedonistic utilitarians. If we created an AI with such features, it wouldn't mind dying.
Quote:
Smokeskin wrote:
For the same reason that happiness matters to me. We have probed the world around us and found that some qualia match well with reality, and others don't. Others yet, like happiness and consciousness, certainly seem to reflect activity in our brains, but it carries special meaning to us. Denying some of these (like continuity) and accepting others (like happiness) seems completely arbitrary. Denying all of them is also not a feasible solution - life without happiness and motivation is unacceptable. So you have to accept all of them, don't you? Otherwise, explain to me why you think happiness is valid but continuity is not.
First of all, even if we accept the existence of qualia (which I don't), it's ridiculous to talk about the "qualia" of continuity. The redness of red, the pain of a broken arm and so on can be described as these singular atoms of subjective experience, but continuity? How can something which necessitates experience over time be broken down into these hypothetical atomic nuggets? Secondly, it's rather simple. The existence of happiness can be independently confirmed using various experiments, as I have already explained. Happiness has an impact on the real, physical world, and the difference between happiness and non-happiness can be tested in a myriad different ways. Happiness is a mental phenomenon, and as such could be uploaded using the technology we are discussing. None of this is true of "continuity of consciousness". I'm surprised that you still seem to be perplexed by the fact that complete absence of evidence leads me to disbelieve the existence of something.
I'm sure that the source of the qualia of continuity can be confirmed too (the alternative is that we live in a dualistic universe), just as for happiness.
Quote:
Smokeskin wrote:
And continuity of consciousness does not fly in the face of science. The matter and energy in your brain transitions from one state to the next according to the laws of physics.
If you define "continuity of consciousness" as "the matter and energy constituting your current physical brain", it's pretty obvious that continuity of consciousness is not copied during an uploading process. That's a trivial truth, and not remotely related to what we are discussing.
Of course it is what we're discussing. The qualia of continuity seems to me to relate to "the matter and energy constituting your current physical brain", just like my sense of fairness relates to what happens to other people. One thing is the feeling, another is what it relates to.
Quote:
Smokeskin wrote:
So you're disregarding observations because they don't fit your hypothesis. That's very irrational imo.
I will repeat what I wrote: "Always assume the existence of the fewest possible phenomena necessary to explain reality. Nothing about human behavior requires the existence of continuity of consciousness; the existence of memories suffices. If I have to choose between believing in memories and continuity of consciousness and simply believing in memories, the first option has to explain more than the second. It doesn't. So I don't." Now, please identify: 1: The observations that I am disregarding. 2: The hypothesis.
1: Your qualia (continuity of consciousness). 2: That we don't have continuity of consciousness.
Quote:
Smokeskin wrote:
Survival instinct is obviously not linked to your testicles. People gladly give up their testicles to live. It isn't even close. People give up their testicles to avoid even risk of death. Do you really feel differently?
I didn't say that evolution did a particularly good job. People kill themselves to protect their children, people chop off their testicles to live. We're not particularly rational animals. Which is why I try to not listen to my intuitions.
What about your intuitions about happiness and ethical behavior?
Quote:
Smokeskin wrote:
And evolution doesn't care about anything, and we're not evolutionary fitness optimizers anyway. Our instincts and desires have generally given us certain evolutionary advantages of course, as there has been selection pressure on many of them, but that is a very different thing. And I care about my instincts and desires rather than evolution.
I find it a bit strange that you believe the "obvious" target of our survival instinct to be continuity of consciousness when nothing even similar to that concept was conceived earlier than the 17th century, as far as I know. Only a small minority of historical humanity (and probably current humanity, as well) have cared about this strange concept that you think is so important to our species. Belief in the soul is a far more popular hypothesis, and yet you don't seem to believe in that particular metaphysical spook.
The concept may be new, but the underlying intuition seems to have been there all along. Lots of people today have never considered the concept, but they seem to grasp it when it is explained to them. Honestly, I don't see caring about your soul as different from caring about your continuity. Belief in souls seems to be the same thing, just combined with the (most likely mistaken) idea that our consciousness continues to exist after our body perishes. It seems that for all practical intents and purposes you also care about the continuity of your consciousness, and only when/if uploading becomes possible would you stop caring.
Quote:
What is obvious is that our instincts have rather little to do with our reality today or in the future. We are afraid of death because death means no more babies, and the kinds of death we care about are the kinds that involve us being hit with rocks or eaten by tigers or withering away from disease. Our instincts have not prepared us for uploading technology, so why should we listen to our instincts when we contemplate the implications of uploading?
If psychosurgery becomes possible, why should we care about happiness? Why not just edit that away, or at least remove its connection to inefficient things like fun and compassion? It would be the competitive thing to do. It's the question of axiomatic values - they come from our instincts. I am not comfortable stripping away meaning from our existence. Nick Bostrom has an excellent paper on it: http://www.nickbostrom.com/fut/evolution.html
Quote:
More to the point, why should we all listen to your instincts? If other people say that their instincts tell them that there's nothing scary about uploading, you say that they should think further and that their instincts are unreliable since they don't understand that they should be scared of uploading. But when I say that your instincts are unreliable and that you should think further, you hide behind your instincts and claim that they are the ultimate authority and that further thinking is unnecessary. You can't have it both ways. Either accept a person's instincts and intuitions as the final word on whether or not uploading is anything to be afraid of, or accept that we have to go beyond our gut feelings if we ever want to reach the truth.
The main difference is that you claim to experience continuity yourself, but you believe it to be illusory (or you have a memory of continuity but believe it to be a false memory).
Quote:
Smokeskin wrote:
You need to provide an argument for your claim that continuity as qualia is false.
I'm not sure if it's an answer to your question, but my reasoning about this whole qualia-debate goes something like this: 1: Qualia doesn't exist.
That's begging the question, and you're ignoring that we all experience qualia.
Quote:
Smokeskin wrote:
And I don't value my memories as such. I value some of them. Others I'd prefer to get rid of. Most I don't care that much about. And as research has demonstrated, our memories are few, highly inaccurate, change substantially over time, and many are just plain made up. For example, listen to this: http://www.ted.com/talks/scott_fraser_the_problem_with_eyewitness_testim...
No argument here. I don't value all my memories, just some of them.
It isn't just what you value; it's also that we remember very little, a lot of what we remember is wrong, our memories change substantially over time, and many are just plain made up. That's what the video is about.
Quote:
Smokeskin wrote:
You've picked quite an ephemeral thing as "you". If your memories are generally inaccurate, changed and reconstructed on the fly every time we recall them, and then stored in the new version, doesn't it seem that the "controller" that generates and changes the memories is more central? Or something else?
Well, as far as I know, there is no controller. Contrary to popular belief, there is no homunculus sitting in our brain pulling levers.
The brain and the processes in it ARE the controller.
Quote:
Smokeskin wrote:
Both in terms of qualia and science, continuity of consciousness seem to be on a MUCH surer footing than memories as what constitutes "you".
You seem to misunderstand me. I don't value my memories (and other mental phenomena) because they constitute "me" and I value "myself", I value my memories and other mental phenomena on their own. I see no need to create a superfluous layer of "identity" on top of the things I value.
That seems to be a reasonable consequence of not believing in qualia.
Alkahest
Smokeskin wrote:
Well, that's another argument (contained in your other post which I'll reply to). But could we settle the issue that there is not an internal inconsistency with forks experiencing continuity of consciousness? Just because we can fool the brain that doesn't make everything in it false.
What I've been trying to explain is that when your sole proof for the existence of continuity of consciousness is the experience of continuity of consciousness, disregarding someone else's experience as not being a result of "true" continuity of consciousness leads to your proof resting on shaky ground. In the orange example, we can decide which experience is true and which is false based on proof other than the experiences themselves. To differentiate between two identical experiences, you need something other than the experiences themselves.
Smokeskin wrote:
It isn't the entire claim, but it is an issue that keeps popping up. You keep claiming the fork and the original are the same, when they're clearly not. They have different physical bodies (whether that "body" is digital, biological or whatnot), and since consciousness is entirely rooted in the physical, their consciousnesses are also different.
And if I transmit a text file from one computer to another, they are "different", sure. Except in every way that matters to me. The emergent informational content is what I care about, not the individual molecules hosting the information.
Smokeskin wrote:
The real issue is "at the instant before you are forked, is it meaningful to consider yourself intimately connected to the future consciousness emerging from your original body and not so much to the future consciousness of the fork?" I say yes, you say no. This is the issue, and what we should be discussing.
What do you mean by "intimately connected"? You seem to be grasping after some kind of vaguely defined thing that connects a certain configuration of matter you have come to think of as yourself to similar configurations of matter in the future and the past. To me it seems rather simple. The future you and the past you are different from the current you in form, molecular makeup and informational content. There is no single "you" that extends in all directions in time, there are only similar systems that you care about. I just want to extend that to caring about similar systems elsewhere in space as well as time.
Alkahest
Smokeskin wrote:
I'm not saying you should accept your perceptions unchallenged. I'm saying you can't deny that they're there. You experience qualia. That's a fact. As I said, you can believe it is illusory, but you can't deny that you experience them.
I have experiences, yes. But nothing about those experiences require the concept of qualia to explain.
Smokeskin wrote:
Furthermore, we don't understand consciousness or qualia. We have no idea why there is this internal experience. We can scientifically describe things like "happiness" mechanically - these nerves send signals like this, that cause those cells to release hormones, that stimulates these neurons to change like that, and that makes the brain react differently like this in the future. I'm not saying we know each step in detail, or that we understand the full complexity, but I feel pretty confident that there's nothing surprising in there for neuroscientists if they were to map it fully. Sort of like how I can look at a computer game and not understand everything that goes on under the hood in the machine, but I'm confident that everything is due to the functioning of components I could understand. But we don't understand, and are nowhere near understanding, why I feel like I'm experiencing happiness. How does consciousness arise?
Careful, you're starting to sound like a dualist. If we understand the systems and processes that constitute consciousness and experience, we understand consciousness and experience because there is nothing else there but those systems and processes. The "hard problem" of consciousness is easily solved by realizing that there is no problem there in the first place.
Smokeskin wrote:
And now comes the really interesting part. Our goals, our morals and ethics, our desires. I know of no scientific way of finding any sort of basis for these. We can more or less measure the difference between happiness and misery in a multitude of ways, but why should we care about whether one or the other happens? As you say, this choice is axiomatic (and just to be clear, I think "choice" is fundamentally a physical process). I hope that one day we'll understand consciousness and this discussion we're having will be settled (or be like the "discussion" between creationists and evolutionists). But until then, we'll have to bootstrap this part of our belief system. There are things I value greatly - survival, happiness, morality, etc. That's just how they feel to me. Some feelings are contradictory, and I try to prioritize as best I can. I assume we agree on this.
We should distinguish between our internal values and morality itself. I don't need a reason to act in a way I define as moral, and I don't need that morality to exist either in my intuitions or in the rest of the world. I'm fully aware that morality is a completely fictional concept with no extension outside the belief systems that channel our actions in ways those systems define as rational. Hume's law and all that.
Smokeskin wrote:
I assume that we also agree that we feel like we have continuity of consciousness.
Actually, we don't. I don't feel like I have continuity of consciousness, I feel like I have memories - short-term and long-term. You choose to interpret these memories as continuity of consciousness, some people interpret them as the "soul", and some, like me, interpret them as simply memories. I realize and accept that you experience continuity of consciousness, but you don't seem to realize and accept that I don't.
Smokeskin wrote:
And this is the part that baffles me regarding your belief. If I try to knock down continuity of consciousness (because we don't have proof of it or whatever), then everything else that matters falls down with it. Consciousness, why does that matter? Happiness? Morality? I simply can't conceive of a consistent bootstrap process that excludes continuity but includes the other things.
Well, consciousness and happiness are mental phenomena that we can prove exist while morality is a system which defines rationality. Your claim is that continuity of consciousness is a mental phenomenon, and that is the claim which I disagree with.
Smokeskin wrote:
If continuity doesn't matter, why does anything matter? Happiness and misery are both just electrochemical processes in the brain, if it wasn't for our qualia couldn't we value one as much as the other? There are even people who seem to value other people's misery over happiness, while we rarely see people who don't experience continuity.
Nothing does matter. But we can choose to act in accordance with moral systems nonetheless. Acting as if things that do not exist actually exist, however, leads to irrational behavior.
Smokeskin wrote:
Finally, this question is tied to something very fundamental: survival. If I'm right, you'd be killing yourself through a simple error in philosophical speculation (and the track record for speculation is pretty poor). As a variant of Pascal's Wager, this one is actually solid.
Sure I'd be killing myself, using the definition of "killing myself" evolution has prepared us for - the death of our bodies. A body which is eaten by a lion and a body which is destructively uploaded are equally dead. My mind is nothing but a happy byproduct of the processes keeping my body alive. The thing is, I don't care about my body. I care about the informational content of my brain. I'm in rebellion against my nature, favoring the tool over the goal.
Smokeskin wrote:
I won't pretend that I have final answers on this. I don't. When I looked into cryopreservation I factored in a metaphysical risk of actually dying and remaining dead even though my body was revived in the future.
In my experience, factoring in metaphysical anything is a waste of time, considering that metaphysics is more or less the study of stuff that doesn't exist.
Smokeskin wrote:
But to discount this part of my qualia, when qualia is obviously there yet so poorly understood, and I place such great value on other parts of my qualia (like happiness), it just seems inconsistent.
If you agree that the "qualia" of happiness can be copied, why do you think that the "qualia" of continuity of consciousness can't be copied? That seems inconsistent.
Smokeskin wrote:
Compared to my understanding of qualia, Dennett's paper is a straw man. I don't consider qualia as anything but the experience of being me and having perceptions and feelings, and I don't see any reason to believe that they have independent existence or that they're anything but an emergent property of the physical processes in my brain. I can't imagine qualia without such underlying processes, and I can't imagine such processes without qualia.
Aren't you just using "qualia" as a synonym for "experiences", then?
Smokeskin wrote:
I can't imagine that I'd feel continuity without that feeling originating from processes in the brain.
Of course not, but how can we identify these processes in the brain if they aren't processes that can be uploaded? My problem with this is that you seem to believe that this feeling originates from processes in the brain, but that it can't be copied. It's like it exists on an ephemeral plane between matter and non-matter.
Smokeskin wrote:
Yes, but I'm making the attempt in a world without evidence for relativity, and the problem is that you were trying to show internal inconsistencies in Newtonian mechanics by using relativity. When you want to demonstrate that my model is internally inconsistent, you have to use my model. You can't use your own.
I gladly admit that a fork which shares all your beliefs about continuity of consciousness shares your belief that it does not have the same continuity of consciousness as a person you don't think it shares continuity of consciousness with. But that's hardly an argument for the existence of continuity of consciousness.
Smokeskin wrote:
I think you're relying on Dennett's straw man definition here with your "ineffable". The qualia of seeing an orange is effable.
No it's not. If you believe in qualia, you should believe that it's theoretically possible for you to look at an orange and see it as orange while another person looks at it and sees what you would describe as green. However, you can never figure out that you don't experience the same qualia, since he calls your green "orange". As a person who doesn't believe in qualia, I don't think the above scenario is theoretically possible, so I don't have a problem with it.
Smokeskin wrote:
And we have various ways of determining if an orange is actually where we see it, which would determine if the qualia was false. In my example it was even extremely simple to tell it was false because we had wires in your head that generated the image.
So, how do we determine if the continuity of consciousness-"qualia" is false or not?
Smokeskin wrote:
The real problem here is that we don't understand qualia, and that some qualia are wholly internal. I can't check for why happiness, morality or consciousness matters to me in the real world either.
I didn't ask why continuity of consciousness matters to you, I asked why you believe in it when you have no external proof of its existence, when you don't grant that same courtesy to an orange. To repeat: "If I see an orange but all the experiments in the world fail to detect the presence of an orange, I rightly regard my perception of the orange as a misapprehension - and so would you. But if you perceive continuity of consciousness when nothing except your own perception can verify its presence, you still cling to the belief that it's more real than my imaginary orange. Why?"
Smokeskin wrote:
I wouldn't expect such an entity to care about their continuity, no, but humans are not perfectly altruistic hedonistic utilitarians. If we created an AI with such features, it wouldn't mind dying.
I just wanted to show that forks don't necessarily have different utility functions.
Smokeskin wrote:
I'm sure that the source of the qualia of continuity can be confirmed too (the alternative is that we live in a dualistic universe), just as for happiness.
If the source can be confirmed, the source can be uploaded. Problem solved?
Smokeskin wrote:
Of course it is what we're discussing. The qualia of continuity seems to me to relate to "the matter and energy constituting your current physical brain", just like my sense of fairness relates to what happens to other people. One thing is the feeling, another is what it relates to.
Do you see anyone here arguing that the upload is made out of the same matter and energy as the original brain? No. If that's how you define the "continuity" you care about, the upload obviously doesn't have that. The reason I want you to define your terms clearly is that this discussion gets very confusing otherwise.
Smokeskin wrote:
1: Your qualia (continuity of consciousness). 2: That we don't have continuity of consciousness.
1: Nothing in the world requires qualia to be explained. 2: That's not a hypothesis, that's a conclusion reached from the premises I presented.
Smokeskin wrote:
What about your intuitions about happiness and ethical behavior?
As said, axioms are fundamentally non-rational. What I'm against is using intuitions instead of logic.
Smokeskin wrote:
The concept may be new, but the underlying intuition seems to have been there all along. Lots of people in the present day have never considered the concept but seem to grasp it when it's explained to them. Honestly, I don't see caring about your soul as being different from caring about your continuity. Belief in souls seems to be the same, just combined with the (most likely mistaken) idea that our consciousness continues to exist after our body perishes.
Seeing little difference between your own belief and belief in souls does not do your belief any favors, I'm afraid.
Smokeskin wrote:
It seems that for all practical intents and purposes you also care about the continuity of your consciousness, and only when/if uploading becomes possible would you stop caring.
I don't care about continuity of consciousness. I care about memories, values and other verifiable mental phenomena. You keep saying that I care about continuity of consciousness, but can you mention a single action or belief of mine which requires me to care about continuity of consciousness as opposed to caring about memories, values and mental phenomena other than continuity of consciousness? I believe you when you say that you care about continuity of consciousness, why can't you grant me the same courtesy?
Smokeskin wrote:
If psychosurgery becomes possible, why should we care about happiness? Why not just edit that away, or at least remove its connection to inefficient things like fun and compassion. It would be the competitive thing to do.
That's assuming that we should value being competitive. Hume's law, all over again.
Smokeskin wrote:
It's the question of axiomatic values - they come from our instincts. I am not comfortable stripping away meaning from our existence. Nick Bostrom has an excellent paper on it: http://www.nickbostrom.com/fut/evolution.html .
I fail to see the relevance to the existence of continuity of consciousness. Continuity of consciousness is not a value, and rational values need to be directed towards something that exists. I don't deny you the right to care about continuity of consciousness, I just call valuing it irrational. The moment you can show to me that continuity of consciousness exists in the same way that happiness does, I'll stop calling that value irrational.
Smokeskin wrote:
The main difference is that you claim to experience continuity yourself, but you believe it to be illusory (or you have a memory of continuity but believe it to be a false memory).
Actually, I don't believe this is true. I may be mistaken, but as far as I know I have specifically claimed to not experience continuity of consciousness. You claim that we both experience continuity of consciousness but that I see it as an illusion, I claim that we both experience memories but that you interpret it as continuity of consciousness while I interpret it as memories.
Smokeskin wrote:
That's begging the question, and you're ignoring that we all experience qualia.
Well, everyone except the p-zombies.
Smokeskin wrote:
It isn't just what you value, it is also that we remember very little, and a lot of that we remember wrong, our memories change substantially over time, and many are just plain made up. That's what the video is about.
I don't dispute any of that.
Smokeskin wrote:
The brain and the processes in it IS the controller.
I'm not sure what your point is, to be honest.
Smokeskin wrote:
That seems to be a reasonable consequence of not believing in qualia.
I'm glad we understand each other!
President of PETE: People for the Ethical Treatment of Exhumans.
nerdnumber1 nerdnumber1's picture
This, of course, makes anyone who voices concerns about whether the soul is conserved in an upload seem like a dick, since that would mean saying that 99.9% of our species are soulless facsimiles of people who died years ago. For this reason, most people try not to think too hard about it, especially if they have been resleeved before.
LogosInvictus LogosInvictus's picture
There are some serious flaws with the automatic assumption that continuity of consciousness exists, even if we step back from a purely rational standpoint. Within the canon of Buddhist philosophy, continuity of consciousness is an illusion created by the self-illusion's perception of its own fundamentally impermanent existence. Even if Buddhism accepts a dualist ideology, it doesn't affect the acceptance of resleeving - transferring your consciousness into a new form is not any different from anything that happens all the time, everywhere, anyway. From a purely logical standpoint, there's no proof that continuity of consciousness exists as anything other than a perception caused by a confluence of different mental processes taken collectively.

To wander a bit, let's look at the example of a video game. In the context of a video game, I perceive my character as interacting with its environment. I perceive (in most modern games at least) persistent changes to the environment caused by that character, and I further perceive - after a fashion - the character's self-narrative, which one could liberally define as continuity of consciousness. However, all of these perceptions are the result of informational processes which, if they are well-programmed, run completely unseen to create the illusion of a persistent gaming environment. In the same way, our feeling of continuity of consciousness is a result of processes and subprocesses and evolutionary pressures that create the illusion of continuity of consciousness. This, I think, is the basis of Alkahest's argument. Within the context of resleeving, all of these processes can be assumed to be copied perfectly.

Smokeskin, on the other hand, appears to be arguing that there is a fundamental separateness between consciousness and the processes that we are to assume create it. Therefore, the argument can be presumed to be that resleeving, forking, etc.
do not transfer what I will call, for lack of a better term, continuity, even though all the processes (the software, if you will) that create it are transferred. Instead, the transfer creates a new continuity which is fundamentally different from its "original", even if they produce 100% identical output.

To use our example of the video game, however - if I copy all of the software processes that create the game, and run them on hardware identical to the hardware for which the game was originally designed, then I have the video game. The virtual reality created by the game is in all ways identical to the original. The same applies to a forked or copied Ego - all input (including previous experiences, current situation, biological inputs, memory artifacts found in the morph, etc.) being equal, two Egos copied from the same source will produce the same output. Even if continuity of consciousness is more than an illusion, both Egos will experience it to an identical degree. One will not feel more "real" than the other to any external observer, including itself. How, then, can science measure the difference?

Alkahest seems to be arguing that the primary cause (or a primary cause, at least) of the illusion of continuity of consciousness is memory. Smokeskin's counterargument is that memory is faulty at best. If we accept that continuity of consciousness exists separately from memory, then we have moved out of the realm of quantifiable science, as there is no physical effect which is predicated upon the existence of continuity of consciousness, and there is nothing that we can observe, test or falsify which proves or disproves the existence of continuity of consciousness as anything other than an illusion or perception created by the processes which we consider to be our mind or self.
If we accept that continuity of consciousness IS a perception created by a confluence of what are essentially software processes, then we can at least address the argument with a modicum of logic. If continuity is the perception of sequential memory (which seems to be a reasonable approximation of Alkahest's argument on the subject), and memory is faulty (which seems to be a reasonable approximation of Smokeskin's response), then we can presume that some, or all, of our perception of continuity of consciousness is erroneous. So it seems, based on what information we have that can actually be understood as within the realm of inquiry, that continuity of consciousness does not, in fact, exist except as an artifact of the processes which we call the mind.

Going back to the original question of the thread, with regard to the belief in the soul and its place in the transhuman future, we are going to find ourselves with a lot of answers. The most common religions in EP largely consider the soul to be immutable and eternal. Some might tie the soul to the continuity of consciousness - this will be most prevalent in religions influenced by non-gnostic Christian, Judaic and Islamic memes. As I mentioned earlier, Buddhism does not typically link the soul to a specific instance of consciousness or a physical body, and at least some Buddhists would welcome resleeving and physical immortality as a way to circumvent the cycle of death and redeath and achieve enlightenment in a single effective lifetime. Because Judaism does have a certain connection to lineage, some extremely conservative groups might only allow resleeving into a body grown from their own or their family's genetic material, if they allow it at all. Similarly, I could see certain sects of Islam disdaining biomorphs as being too close to treading in God's domain and, therefore, only allowing resleeving into synthmorphs (the more 'unnatural' the better in some cases).
Christian groups might rationalize the idea of a new consciousness as being a new being in need of salvation - there could even be a whole movement devoted to Rebaptism. Others might perceive physical immortality and resleeving, biomodification, etc. as allegorical analogues to the idea of rising up in a perfected body to be with Christ. Others might even take the philosophy of "once saved, always saved" and other concepts like generational sin to whole new levels with the advent of resleeving. To wit, there's a lot of interesting ground to cover with regard to the place of the soul and religious philosophies in regard to some of EP's central technologies and I, personally, feel kind of sad that most religious movements tend to get relegated to "oh, they're all bioconservatives" at the table.
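The identical-output point in the post above can be sketched as a toy deterministic program. Everything here (the ego_response function, the state dictionary, the seed) is invented purely for illustration, not a claim about how egos actually work: the only point is that identical state plus identical input yields identical output, so behavior alone cannot distinguish original from copy.

```python
import random

def ego_response(state, situation):
    """A toy deterministic 'ego': identical state + identical input
    always yield identical output."""
    rng = random.Random(state["seed"])  # all randomness is fixed by the state
    mood = rng.choice(["calm", "anxious", "curious"])
    return f"{state['name']} feels {mood} about {situation}"

original = {"name": "Ego-A", "seed": 42}
fork = dict(original)  # a perfect copy of the mental state

# Both copies produce exactly the same response to the same situation...
print(ego_response(original, "resleeving") ==
      ego_response(fork, "resleeving"))  # prints True
# ...so no external observer can tell them apart from behavior alone.
```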
"I've never understood that. Why does the universe give us puzzles with no answers?" "Pay back, maybe?"
Decivre Decivre's picture
A thought for people to ponder: we discuss continuity as a concept because it is relevant to the specific setting of Eclipse Phase, but let's take it out of the equation. Let's assume that your mind, for all intents and purposes, feels no change in continuity during transfer. Forking or egocasting your ego to another system feels akin to instantaneous teleportation from locale to locale. Restoration after death feels like an instantaneous transition from the moment of death to the moment of reinstantiation. It should be theoretically possible, since the game is proposing that our mind is software... and software too can be restored from an instance without a break in computation, despite a break in time. So if [i]dis[/i]continuity were nonexistent as a concept, how would we then define consciousness without the qualia of continuity to gauge it?
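The restore-without-a-break idea can be sketched as a toy checkpoint/restore, assuming the mind-as-software premise the post appeals to (the tick counter and the JSON snapshot format are invented for illustration). The process "dies", wall-clock time passes, and the restored state simply resumes; nothing inside the state records the gap.

```python
import json
import time

def step(state):
    """One unit of computation: the 'mind' advances by one tick."""
    state["ticks"] += 1
    return state

state = {"ticks": 0}
for _ in range(3):
    state = step(state)

snapshot = json.dumps(state)  # checkpoint, then the process "dies"

time.sleep(0.1)  # wall-clock time passes; nothing is running

restored = json.loads(snapshot)
restored = step(restored)  # computation resumes seamlessly
# From the inside, nothing marks the discontinuity: the state is just ticks == 4.
print(restored["ticks"])  # prints 4
```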
Transhumans will one day be the Luddites of the posthuman age. [url=http://bit.ly/2p3wk7c]Help me get my gaming fix, if you want.[/url]
Smokeskin Smokeskin's picture
LogosInvictus wrote:
There are some serious flaws with the automatic assumption that continuity of consciousness exists, even if we step back from a purely rational standpoint. Within the canon of Buddhist philosophy, continuity of consciousness is an illusion created by the self-illusion's perception of its own fundamentally impermanent existence. Even if Buddhism accepts a dualist ideology, it doesn't affect the acceptance of resleeving - transferring your consciousness into a new form is not any different from anything that happens all the time, everywhere, anyway. From a purely logical standpoint, there's no proof that continuity of consciousness exists as anything other than a perception caused by a confluence of different mental processes taken collectively. To wander a bit, let's look at the example of a video game. In the context of a video game, I perceive my character as interacting with its environment. I perceive (in most modern games at least) persistent changes to the environment caused by that character, and I further perceive - after a fashion - the character's self-narrative, which one could liberally define as continuity of consciousness. However, all of these perceptions are the result of informational processes which, if they are well-programmed, run completely unseen to create the illusion of a persistent gaming environment. In the same way, our feeling of continuity of consciousness is a result of processes and subprocesses and evolutionary pressures that create the illusion of continuity of consciousness. This, I think, is the basis of Alkahest's argument. Within the context of resleeving, all of these processes can be assumed to be copied perfectly. Smokeskin, on the other hand, appears to be arguing that there is a fundamental separateness between consciousness and the processes that we are to assume create it.
I most certainly do not. Where did you get that idea?
Quote:
Therefore, the argument can be presumed to be that resleeving, forking, etc. do not transfer what I will call, for lack of a better term, continuity, even though all the processes (the software, if you will) that create it are transferred. Instead, the transfer creates a new continuity which is fundamentally different from its "original", even if they produce 100% identical output.
So two computers roll off an Intel Copy Exactly production line. They're for all intents and purposes indistinguishable in their form and function. Yet they are two different computers. You can run a program on one of them, copy its state to the other, and resume running it there, but that doesn't make it the same machine. It just runs the same program.

The human version of that experiment is making a copy of my mind in a machine and asking me to kill myself. Of course I'm not going to do that. I care about me, and the existence of a copy changes that very little (that my family wouldn't experience loss etc. is a comfort, though).

Why do I only care about me, this me? Why do I fear death even if a copy exists? Well, that's feelings for you. Why do I love my children? Why can intense and prolonged pain from disease be easily forgotten, while the equivalent degree of pain from torture often results in severe psychological problems? Just like I'm not going to abandon my children because of some study that says the childless are happier, and I'm not going to be able to trivialize torture just because having cancer was just as painful, I'm not going to dump my self-preservation instinct and sense of identity over some armchair philosophizing.
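The two-computers analogy above can be sketched in code: copying a program's state to a second machine makes the states indistinguishable, while the machines themselves remain two distinct objects (the Machine class and its fields are invented purely for illustration).

```python
import copy

class Machine:
    """A stand-in for one physical computer running a program."""
    def __init__(self, name):
        self.name = name          # physical identity: which box this is
        self.program_state = {}   # the running program's state

    def load_state(self, state):
        # Resume a program from a snapshot of another machine's state.
        # deepcopy so later changes on one machine don't touch the other.
        self.program_state = copy.deepcopy(state)

a = Machine("computer-A")
b = Machine("computer-B")

a.program_state = {"counter": 41}
a.program_state["counter"] += 1   # the program runs on A

b.load_state(a.program_state)     # transfer the state to B

# The program states are now indistinguishable...
print(b.program_state == a.program_state)  # prints True
# ...but the machines remain two distinct objects.
print(b is a)                              # prints False
```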
Quote:
To use our example of the video game, however - if I copy all of the software processes that create the game, and run them on hardware that is identical to the hardware for which the game was originally designed, then I have the video game. The virtual reality that is created by the game is in all ways identical to the the original. The same applies to a forked or copied Ego - all input (including previous experiences, current situation, biological inputs, memory artifacts found in the morph, etc.) being equal, two Egos copied from the same source, will produce the same output. Even if continuity of consciousness is more than an illusion, both Egos will experience it to an identical degree. One will not feel more "real" than the other to any external observer, including itself. How, then, can science measure the difference?
Two distinct copies that each feel like themselves and consider the other one just a copy is completely consistent. Science can easily measure the difference between them, just as it can measure the physical distance between the two virtually identical Intel computers.
Quote:
Alkahest seems to be arguing that the primary cause (or a primary cause, at least) of the illusion of continuity of consciousness, is memory. Smokeskin's counterargument is that memory is faulty at best. If we accept that continuity of consciousness exists separately from memory, then we have moved out of the realm of quantifiable science, as there is no physical effect which is predicated upon the existence of continuity of consciousness, and there is nothing that we can observe, test or falsify which proves or disproves the existence of continuity of consciousness as anything other than an illusion or perception created by the processes which we consider to be our mind or self. If we accept that continuity of consciousness IS a perception created by a confluence of what are essentially software processes, then we can at least address the argument with a modicum of logic. If continuity is the perception of sequential memory (which seems to be a reasonable approximation of Alkahest's argument on the subject), and memory is faulty (which seems to be a reasonable approximation of Smokeskin's response), then we can presume that some, or all, of our perception of continuity of consciousness is erroneous. So it seems, based on what information we have that can actually be understood as within the realm of inquiry, that continuity of consciousness does not, in fact, exist except as an artifact of the processes which we call the mind.
That argument gets you nowhere. The same can be said of any internal experience, and it doesn't make anything trivial or nonexistent just because it is a mere artifact of the mind. Love doesn't exist except as an artifact of the processes which we call the mind. Perception doesn't exist except as an artifact of the processes which we call the mind. It is all just electrochemical processes in that 1.5 kg lump in our skulls. Yet that is where almost all of the exciting, important things happen.
Quote:
Going back to the original question of the thread, with regard to the belief in the soul and its place in the transhuman future, we are going to find ourselves with a lot of answers. The most common religions in EP largely consider the soul to be immutable and eternal. Some might tie the soul to the continuity of consciousness - this will be most prevalent in religions influenced by non-gnostic Christian, Judaic and Islamic memes. As I mentioned earlier, Buddhism does not typically link the soul to a specific instance of consciousness or a physical body, and at least some Buddhists would welcome resleeving and physical immortality as a way to circumvent the cycle of death and redeath and achieve enlightenment in a single effective lifetime. Because Judaism does have a certain connection to lineage, some extremely conservative groups might only allow resleeving into a body grown from their own or their family's genetic material, if they allow it at all. Similarly, I could see certain sects of Islam disdaining biomorphs as being too close to treading in God's domain and, therefore, only allowing resleeving into synthmorphs (the more 'unnatural' the better in some cases). Christian groups might rationalize the idea of a new consciousness as being a new being in need of salvation - there could even be a whole movement devoted to Rebaptism. Others might perceive physical immortality and resleeving, biomodification, etc. as allegorical analogues to the idea of rising up in a perfected body to be with Christ. Others might even take the philosophy of "once saved, always saved" and other concepts like generational sin to whole new levels with the advent of resleeving. To wit, there's a lot of interesting ground to cover with regard to the place of the soul and religious philosophies in regard to some of EP's central technologies and I, personally, feel kind of sad that most religious movements tend to get relegated to "oh, they're all bioconservatives" at the table.
Religious details tend to matter only to people of that religion. Others only care when the religious try to force their ideas on them. That's the difference between a new age mystic and a fundamentalist Christian: they both have some silly ideas, but only the latter will forcibly try to make you conform to them. EP places religion in much the same role as it has today - except for bioconservative agendas and terrorist activities, nobody outside of religion really cares.
Nebelwerfer41 Nebelwerfer41's picture
I've been reading this thread all afternoon, and while towards the end it feels like people talking past each other in an effort to "win" the argument, I like it. I can see where both sides are coming from. The ideas bantered around will help me roleplay a character who was egocasted out of a dying flat morph against the wishes of his bioconservative family. He views it as if he had died and is a separate person from the ego that inhabited his old morph.
