If a robot is conscious, is it OK to turn it off? + Cybercrime targeting hospitals

#1
C C Offline
A cybercrime group is targeting US hospitals - federal agencies warn hundreds of facilities are at risk
https://www.theverge.com/2020/10/29/2154...ssia-covid


The Wisconsin Republican Party is $2.3 million short after the chairman said hackers stole funds from the party in the key swing state
https://www.washingtonexaminer.com/news/...e-election


Moral implications of building AI: if a robot is conscious, is it OK to turn it off?
https://theconversation.com/if-a-robot-i...ais-130453

EXCERPTS: In the “Star Trek: The Next Generation” episode “The Measure of a Man,” Data, an android crew member of the Enterprise, is to be dismantled for research purposes unless Captain Picard can argue that Data deserves the same rights as a human being. Naturally the question arises: What is the basis upon which something has rights? What gives an entity moral standing?

The philosopher Peter Singer argues that creatures that can feel pain or suffer have a claim to moral standing. He argues that nonhuman animals have moral standing, since they can feel pain and suffer. Limiting it to people would be a form of speciesism, something akin to racism and sexism. Without endorsing Singer’s line of reasoning, we might wonder if it can be extended further to an android robot like Data. It would require that Data can either feel pain or suffer. And how you answer that depends on how you understand consciousness and intelligence.

As real artificial intelligence technology advances toward Hollywood’s imagined versions, the question of moral standing grows more important. If AIs have moral standing, philosophers like me reason, it could follow that they have a right to life. That means you cannot simply dismantle them, and might also mean that people shouldn’t interfere with their pursuing their goals.

[...] There are two parts to consciousness. First, there’s the what-it’s-like-for-me aspect of an experience, the sensory part of consciousness. Philosophers call this phenomenal consciousness. It’s about how you experience a phenomenon, like smelling a rose or feeling pain.

In contrast, there’s also access consciousness. That’s the ability to report, reason, behave and act in a coordinated and responsive manner to stimuli based on goals. For example, when I pass the soccer ball to my friend making a play on the goal, I am responding to visual stimuli, acting from prior training, and pursuing a goal determined by the rules of the game. I make the pass automatically, without conscious deliberation, in the flow of the game. Blindsight nicely illustrates the difference between the two types of consciousness...

[...] The android Data demonstrates that he is self-aware in that he can monitor whether or not, for example, he is optimally charged or there is internal damage to his robotic arm. Data is also intelligent in the general sense. ... However, Data most likely lacks phenomenal consciousness - he does not, for example, delight in the scent of roses or experience pain. He embodies a supersized version of blindsight. He’s self-aware and has access consciousness – can grab the pen – but across all his senses he lacks phenomenal consciousness.

Now, if Data doesn’t feel pain, at least one of the reasons Singer offers for giving a creature moral standing is not fulfilled. But Data might fulfill the other condition of being able to suffer, even without feeling pain. Suffering might not require phenomenal consciousness the way pain essentially does.

For example, what if suffering were also defined as the idea of being thwarted from pursuing a just cause without causing harm to others? Suppose Data’s goal is to save his crewmate, but he can’t reach her because of damage to one of his limbs. Data’s reduction in functioning that keeps him from saving his crewmate is a kind of nonphenomenal suffering. He would have preferred to save the crewmate, and would be better off if he did.

In the episode, the question ends up resting ... on whether Data ... is phenomenally conscious. Data is not dismantled because, in the end, his human judges cannot agree on the significance of consciousness for moral standing... (MORE - details)
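(Side note: the article's two load-bearing notions -- access consciousness as reportable self-monitoring, and nonphenomenal suffering as a thwarted goal -- are concrete enough to sketch in code. The toy Python below is purely illustrative; the class and method names are invented here, not drawn from the article, and nothing about it settles the philosophical question.)

Code:
# Toy model of the article's distinction (all names hypothetical):
# - report() stands in for access consciousness: the system can monitor
#   and report its internal state, with no claim of felt experience.
# - is_thwarted() stands in for "nonphenomenal suffering": a preferred
#   goal blocked by reduced functioning, with no pain involved.
from dataclasses import dataclass

@dataclass
class AndroidState:
    charge: float         # 0.0 (empty) .. 1.0 (optimally charged)
    arm_functional: bool  # internal damage monitoring, like Data's arm

class Android:
    def __init__(self, state: AndroidState):
        self.state = state

    def report(self) -> str:
        # Reportable self-monitoring: access consciousness in the
        # article's sense, without any phenomenal component.
        arm = "ok" if self.state.arm_functional else "damaged"
        return f"charge={self.state.charge:.0%}, arm={arm}"

    def is_thwarted(self, goal_requires_arm: bool) -> bool:
        # Goal pursuit blocked by a functional deficit: the article's
        # candidate for suffering without felt pain.
        return goal_requires_arm and not self.state.arm_functional

data = Android(AndroidState(charge=0.8, arm_functional=False))
print(data.report())                             # self-report, no feeling
print(data.is_thwarted(goal_requires_arm=True))  # True: "suffering"?

Whether ticking the is_thwarted box amounts to morally relevant suffering is, of course, exactly what the episode leaves unresolved.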
#2
Leigha Offline
(Oct 28, 2020 11:11 PM)C C Wrote: Moral implications of building AI: if a robot is conscious, is it OK to turn it off? [...]
Is it okay to "turn off" a seemingly conscious robot? I don't believe that robots will ever be conscious, not the way humans are. But I can't help wondering why we feel the need to anthropomorphize. It isn't entirely a bad thing, really, because it leads us to become more empathetic and caring. We do this with animals, but I see that as a positive, since animals are conscious. Their set of emotions may differ from ours, but they have genuine feelings and awareness. But if we become conditioned to believing that robots have feelings and are aware of their surroundings, would that lead to dehumanization (undervaluing humans by treating robots as their moral equals)?
#3
C C Offline
(Nov 2, 2020 04:25 AM)Leigha Wrote: [...] But, if we become conditioned to believing that robots have feelings, and are aware of their surroundings, would that lead to dehumanization? (undervaluing humans by treating robots as equal to humans, from a moral view)

Ancient views and practices -- like animism, sacred animals, and vicarious worship of wooden or stone idols -- were a kind of precursor to granting rights to machines. Which is to say, the capacity is inherently there within us.

For some groups already losing jobs to the tireless, superior skills and abilities of grunt robots, the ultimate insult would probably be granting personhood privileges to crops of humanoids so advanced that they sported social lives and careers (thereby displaying personalities, feelings, and creativity).

The inventions would only have to attain the level of dogs and cats for animal rights to kick in. Social crusaders are always looking for new frontiers to satisfy their various motivations for initiating and joining such movements. The frontiers that remain to exploit just aren't that attractive, no matter what arguments the humanities put forward for eliminating stigma and discrimination (incest, necrophilia, mitigated pedophilia, etc.), despite those being accepted aspects of earlier civilizations and isolated tribal cultures.
#4
confused2 Offline
I've just read an article about dead children being added to games to immortalise them: the child's voice and as much of their character as the game allows. I'm fairly sure Facebook, or a site like it, allowed you to continue or set up an account that carried on, as far as possible, like the original person after they died.

Over the years I've occasionally thought about a car SatNav imbued (or cursed) with my mother's personality.

Quote:->->->->
Turn left. Turn left here. There. There. You've missed it.
...
You fool. Why didn't you turn left when I told you to?
You gave me no notice whatsoever of when the turn was coming up. You pointed to the right and said "Turn left." and we're in the middle of three lanes of traffic travelling at 40mph. THAT is why I didn't turn when you said turn left.

And repeat similar for 30 years or more.

BBC - Playing games with the dead:
https://www.bbc.co.uk/programmes/article...h-the-dead

SatNav Wrote: You're no better than your father.
#5
C C Offline
(Nov 7, 2020 01:31 AM)confused2 Wrote: [...] Over the years I've occasionally thought about a car SatNav imbued (or cursed) with my mother's personality. [...]

SatNav Wrote: You're no better than your father.


Who could have thought back then that my own "My Mother the Car" would be possible someday, especially via technological haunting.


https://www.youtube-nocookie.com/embed/4XT0Bcjysw4

