Some companies are trying to offer the illusion of that option already, but it looks pretty half-assed. You do things like answer a bunch of questions, give an app access to your emails, etc., and it tries to create a convincing "you" that your bereaved loved ones can "talk with."
Not, in any meaningful sense, you -- just an algorithm trying to imitate you, and probably not that well ... yet.
But let's suppose that the algorithms keep getting better, and that data collection keeps getting more robust, such that it could be convincing even to people who knew you when you were alive.
Let's cover what seems to be one big hangup first:
What makes all of this especially fraught is that the dead person may not have given consent.
Who cares? The dead person is dead.
If there's no afterlife, the whole idea of consent is meaningless. Even if data can be "owned," it would be owned by the deceased's estate/heirs, and they should be free to do with it as they please (note: Data can't be "owned," and anyone who comes across the kind of data we're talking about here should be free to use it in any non-fraudulent/non-violent way they please).
If there is an afterlife, do "intellectual property rights" extend across multiple lives (possibly in different realities), and if so how are those "rights" handled/executed/adjudicated? I suppose there might be a court in "higher" realities to make such decisions in "lower" realities, maybe even with a multi-dimensional enforcement mechanism of some kind ... but if so, I haven't run into those entities and haven't heard of any communications from, or actions by, them in this reality. It's a question we can't answer and need not worry about.
Would I be willing to provide data for one of these busted-ass chatbot thingums? Well, if a loved one told me it was important to him or her, I'd at least consider it. But I'd caution them not to place much stock in the idea that the resulting product was "me."
Would I be interested in having a loved one provide such data for my use? Well, if Tamara offered to, I'd probably accept. If she pre-deceases me, maybe I'd take some comfort in being able to banter with her about trivial things in hope of enjoying better retention of my memories of her. But given the current apparent state of the tech, I wouldn't expect it to work very well.
Now, if it ever becomes possible to actually identify, isolate, and upload a human consciousness to a computer system in such a way that it's arguably me or you "living" in there, I'd definitely be interested, provided that the "consciousness" retained agency, up to and including the ability to commit "suicide" in a non-recoverable way.
Note that I say "arguably." I'm aware that there are arguments against the proposition that such a "consciousness" would really "be" the person it was copied from. I tend to agree with those arguments, but to also expect that the "consciousness" would at least be a "person." Which is why I would not want such "persons" to be electronic prisoners sentenced to eternal, un-endable life. That is where I'd locate the issue of "consent," provided the data comprising the "consciousness" wasn't gathered coercively.