- cross-posted to:
- [email protected]
A guy recently linked this essay. It's old, but I don't think it's significantly wrong (despite the GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.
Eh. I’d argue that insofar as “you” means anything, forks would both be equally the person; there’s no “original” who is more the person. It’s a point of divergence: both have equal claim to all experiences and history up to that point. Privileging the “original” over the “copy” is cultural prejudice; subjectively, they’re the same person up to the moment of divergence.
I don’t think that’s the right way to untangle that dilemma ethically, because it can lead people to jump to galaxy brained “solutions” like “but what if you can make sure only one of you exists at once?” that don’t make any sense or answer anything but are still active cognitohazards for some people.
You, as in the instance that is in there right now, would continue along its own discrete path for as long as it exists. If another instance were made and split off, that would be a person, a non-contiguous you, but it would not be the same you that is there right now. That distinction becomes important when dealing with cognitohazards like terminating the current instance as the new one is spun up, so that “you” get to be the one in the machine and there’s no perceptual break between them.
I’d argue that the ethical way to deal with forking copies like that would be to find ways to keep them linked up and at least partially synced, effectively making them component parts of a larger contiguous whole instead of just duplicating someone in a way that inevitably means at least one of the copies gets fucked over by whatever circumstances inspired the copying.

So instead of the you that’s here now and the you spun up in a decade on a computer, there’d be the you that’s here now plus a new secondary brain on that computer, both of which communicate, share some sensory data, and operate almost as if you’d just added more hemispheres to your brain.

At some point after that, maybe you could start considering individual copies ablative the same way bits of brain are: things you don’t want to lose, but which you can survive losing and can potentially repair and replace given time, because of how redundant and distributed brain functions are.