You know those sci-fi teleporters, like in Star Trek, where you disappear from one location and instantaneously reappear in another? Do you trust that they are safe to use?

To fully understand my question, you need to understand the safety concerns regarding teleporters as explained in this video: https://www.youtube.com/watch?v=nQHBAdShgYI

Spoiler:

I wouldn’t, because the person who reappears ain’t me, it’s a fucking clone. Teleporters are murder machines. Star Trek is a silent massacre!

      • penguin@sh.itjust.works

        So you’d be fine with a scientist creating a perfect clone of you, and then killing you, letting the clone take your place?

        If it had the same memories.

        • MonkderZweite@feddit.ch

Yes. Since I would still be alive and have no memories of being killed. There’s no distinction between a perfect clone and me. Sorry if you don’t like the idea of a “you” being only memories.

            • MonkderZweite@feddit.ch

Then let me tell you that consciousness is based on memory. Memory copied => “you” copied; debate done.

              • penguin@sh.itjust.works

                Consciousness is not based on memory or else computers would be considered conscious.

And according to what you’re saying, a clone with all of your memories would mean you have two points of view. I could take your clone into a different room and you’d be able to tell me what they see. But it obviously wouldn’t work like that, because your own sense of self would still be locked in your head and the clone would get its own sense of self, albeit one with the same memories.

                • MonkderZweite@feddit.ch

> Consciousness is not based on memory or else computers would be considered conscious.

What I meant is that memory plays a key role.

Consciousness is, simplified, a set of self-feeding loops over input and memory, with emotions and attention (the amygdala) as regulatory mechanisms.

And what we consider consciousness only exists because of short-term memory and our vast mental capabilities. Arguably, every higher animal has a sort of consciousness, just far more limited, and maybe with a more limited set of regulators (memories), given our social nature.

> would mean you have two points of view.

No, the input is not shared between the two beings, even if they are two of the same.

> and the clone would get its own sense of self, albeit one with the same memories.

Exactly. But because he has the same body, the same memories, and the same feelings, he is you. That would change over time if the original you is not deconstructed, because the “you” of today is not the “you” of yesterday, thanks to memories, gene expression, yadda yadda.

                  • penguin@sh.itjust.works

There is no reason why what you describe should give rise to consciousness rather than just a biological artificial intelligence. The sense of self, the perspective that feels like me peering out through my eyes, is not explained by anything you said.

                    A copy of me does not equal me because we’d both have separate senses of self. Having copied memories does nothing to affect that.