
It's arguably better to have someone caring about and relating to _X_ out of duty than out of feelings. The latter are unreliable (they can come and go, or even flip to negative). And constantly having to be motivated by emotions -- having to maintain them -- can be mentally draining for many people, especially family caregivers.
- - - - - - - - -
Will society become colder if robots help the sick and elderly?
https://www.sciencenorway.no/artificial-...ly/2371088
EXCERPTS: Atle Ottesen Søvik is a professor at the MF Norwegian School of Theology, Religion and Society with a focus on the philosophy of religion. He points out that we must consider what the care alternatives are.
If the alternative is that the elderly receive little or poor care due to limited human capacity, it might be better for robots to assist. However, he believes it can be easy to prioritise wrongly. The sense of care and dignity might be lost if too much is automated.
“The importance of the care dimension should not be underestimated. Research shows that you can endure a lot if you feel valued. If you feel like a nuisance and a burden, life can feel very painful,” Søvik says.
While robots might be a cost-effective solution, Søvik argues that we should not focus solely on the numbers. Other values must also be considered, such as recognition, dignity, and community.
But can robots also provide a sense of care and recognition?
This is where the research results diverge, says Søvik. “Some people want to feel empathy from another person, while others question how genuine that empathy really is. Is the person just pretending to care without actually being interested?”
[...] Some people might be satisfied with a companion robot they can talk to.
“You might feel that there’s no judgment. I don't feel like a burden, the robot has plenty of time,” he says.
Others find that companion robots don’t work for them.
[...] Humans can form relationships with robots and feel acknowledged as the robot gets to know them.
“We can feel a sense of recognition that both the robot and I are distinct individuals, who are vulnerable and can break. There are many social dimensions that are interesting to explore,” he says.
But there are differences between a relationship with a robot and a relationship with other humans. One of them is that people spend a long time getting to know each other and building a relationship. Søvik points out that a robot can 'like' everyone equally.
“This is where I think there’s a value in human weakness, which binds us together and enables us to experience a kind of bond that we can’t have with robots,” he says.
Another aspect is that a good friend can challenge you.
“The machine is often designed just to tell you that you’re great and might not provide the challenges and resistance you need,” he says. (MORE - missing details)