Alexa will soon mimic voices, including your dead relatives

#2
C C Offline
Digital voice cloning is fairly harmless. It's when they start programming the outward personality of individual _X_ that issues of digital immortality enter the fray.

Such a construct will be a one-sided or fragmentary depiction unless multiple people who regularly interacted with _X_ over the years contribute to it. The private identity of _X_ will not be captured to any great degree unless they left behind a lot of journal notes concerning their internal thoughts and tendencies (Sylvia Plath would be a good example of that).

In the future, _X_ will simply formulate and incrementally update their own postmortem replacement. There will likely also be bias inherent in that, due to _X_ wanting to leave behind a more idealized template of themselves, sans the flaws or whatever habitual shortcomings irritated them about themselves. Narcissists, OTOH, might be far less tormented in that respect -- but also deficient at inferring how others perceive them (any of us may have come across as an overall sphincter to somebody we interacted with once or several times in the past).

Digital immortality becomes "ghost-like" when the replica personality is embedded in a domestic or business environment -- part of an AI system regulating an entire house or company edifice, along with connections to robots, display screens, appliances, and tools. "Haunted abodes" will finally be instantiated. Since ghost legends are already somewhat renowned for exhibiting only limited behavioral and psychological characteristics of the original human, the incompleteness and superficiality of the "resident spirit" will be less glaring in that recreational conception of "what's going on".
#3
Leigha Offline
(Jun 24, 2022 06:35 PM)C C Wrote: Digital voice cloning is fairly harmless. It's when they start programming the outward personality of individual _X_ that issues of digital immortality enter the fray.


At first, I thought it might be an interesting experiment, but... it honestly sounds creepy. ''Hey, grandma - find my playlist.'' I wouldn't want my grandmother to know that I've basically relegated her to the role of my virtual assistant.

These types of ''ideas'' have a dark side...

Imagine: you're startled awake in the middle of the night by your deceased relative's voice saying, ''I don't understand, can you repeat that?''

You're not sure whether you're dreaming or not... and as time goes on, you're sharing your ''ghost story'' on YouTube.

I can see this happening. lol Siri blurts out random comments when it picks up sounds coming from the TV, or a conversation I'm having with friends, for example. How unnerving it would be if it were my deceased grandmother's voice I was hearing instead.
#4
confused2 Online
Like Leigha, I wouldn't want my dead friends and relatives cast in the role of digital assistant - not least because being helpful or even half sober would be totally out of character for most of them.
Not for me or mine, but an earlier thread did suggest [edit: IMHO] a better kind of immortality... like a character in a game...
https://www.bbc.co.uk/programmes/article...h-the-dead

