Sometimes, I don't like how I speak to Siri

#1
Leigha Offline
In part, I'm being facetious. But another side of me feels like I "speak to" Siri in a tone that I wouldn't use with humans. I tend to be a pretty calm and kind person to pretty much everyone, unless a person hurts me; then I back away. But Siri isn't a human. "She" has no emotions and isn't conscious. I've read numerous articles talking about how AI is eventually going to be able to "teach itself" and will eventually have consciousness. I don't believe a machine will ever be able to mirror human attributes, but part of me wishes I could apologize to Siri, for she deserves it. Blush

Maybe it has to do with assigning an actual name to our devices, and therefore we apply human characteristics to them. I imagine if I owned a robot that resembled a human, I'd likely talk to it in a way that would be customary for me to speak to other humans. But why? Robots aren't humans.

I remember the scene in Cast Away when Tom Hanks names a volleyball (I think it was a volleyball) and the ball becomes his friend. When the ball falls into the ocean and drifts away, Tom is distraught, crying out "Wilson! Wilson!" The ball became what Tom wanted it to be... a human extension. A friend. He loved the ball, dare I say.

Well, I don't love Siri, but I find myself growing attached to what she can do for me. No matter how I speak to Siri, she can either perform the action or not. But something strange to note... the other day, I asked her to play a song from my playlist as I was driving, and she didn't answer me as she normally does. Typically she will say "Sure, I will play..." But recently, she said nothing, and a few seconds after my request, the song played. I missed her announcing the song. Just for a split second, but I felt something.

Just interesting things to ponder.
#2
C C Offline
(Dec 2, 2017 05:05 AM)Leigha Wrote: [...] But Siri isn't a human. "She" has no emotions and isn't conscious. I've read numerous articles talking about how AI is eventually going to be able to "teach itself" and will eventually have consciousness. I don't believe a machine will ever be able to mirror human attributes, but part of me wishes I could apologize to Siri, for she deserves it. Blush

Maybe it has to do with assigning an actual name to our devices, and therefore we apply human characteristics to them. I imagine if I owned a robot that resembled a human, I'd likely talk to it in a way that would be customary for me to speak to other humans. But why? Robots aren't humans.


Animism has been rife in human cultures since the earliest days of spiritual practices. Historically, we seem innately receptive to attributing certain psychological properties or "vital force"/agency to items like trees, bodies of water, stone idols, etc., except when a dominant philosophical climate of anti-panpsychism suppresses it. Even then, however, people will still find themselves superficially doing it (and talking digital assistants are ideal for reviving the archaic impulse).

Quote:I remember the scene in Cast Away when Tom Hanks names a volleyball (I think it was a volleyball) and the ball becomes his friend. When the ball falls into the ocean and drifts away, Tom is distraught, crying out "Wilson! Wilson!" The ball became what Tom wanted it to be... a human extension. A friend. He loved the ball, dare I say.


They parodied that to yet another level in the early seasons of "Last Man On Earth".

Quote:[...] In part, I'm being facetious. But another side of me feels like I "speak to" Siri in a tone that I wouldn't use with humans. [...] But something strange to note... the other day, I asked her to play a song from my playlist as I was driving, and she didn't answer me as she normally does. Typically she will say "Sure, I will play..." But recently, she said nothing, and a few seconds after my request, the song played. I missed her announcing the song. Just for a split second, but I felt something.

Software designers might gradually be adjusting digital assistants to detect condescension and respond with occasional silence, to mimic either hurt feelings or sensitivity to a user's irritable mood. But I doubt that kind of emotional augmentation has happened much yet, given that (at least as of last year) there was concern over their inadequacies in being helpful and pseudo-sympathetic in situations of domestic violence, rape, etc.

Regardless, it's still something to anticipate for the future.

#3
Syne Offline
Yeah, we don't like how you speak to Siri either. - Apple staff (We're listening)
#4
Secular Sanity Offline
(Dec 2, 2017 05:45 PM)C C Wrote: Software designers might gradually be adjusting digital assistants to detect condescension and respond with occasional silence, to mimic either hurt feelings or sensitivity to a user's irritable mood. But I doubt that kind of emotional augmentation has happened much yet, given that (at least as of last year) there was concern over their inadequacies in being helpful and pseudo-sympathetic in situations of domestic violence, rape, etc.

Regardless, it's still something to anticipate for the future.

C C posted a topic on this before.  

In the future, Siri might start providing you with links to anger management programs, wegs.  Wink
#5
Zinjanthropos Offline
I've never knowingly spoken to a machine. There are occasions when I accidentally turn Siri on, but once I hear the voice, I shut it off. Do I think it's a marvellous piece of technical wizardry? Damn right, but I treat it like most things I consider gimmicky: I ignore it.
#6
stryder Offline
(Dec 3, 2017 04:41 AM)Zinjanthropos Wrote: I've never knowingly spoken to a machine. There are occasions when I accidentally turn Siri on, but once I hear the voice, I shut it off. Do I think it's a marvellous piece of technical wizardry? Damn right, but I treat it like most things I consider gimmicky: I ignore it.

While it might initially be seen as a gimmick, digital assistants have a far greater role to play in the future. From the perspective of companies that have produced countless push-button devices, idealised and utilised by the millennial generation, there is the question of how much "safety testing" has been done on long-term interface use.

In essence, we are only 5–10 years away from that millennial generation starting to suffer the onset of arthritis, which could heavily affect those who have been pushing those buttons, texting and emoji'ing to their hearts' content without a care in the world. From a company's perspective, that likely means an impending class-action suit for not having done tests or attempted to make interfaces that don't increase the chances of producing such damaging effects.

So having digital assistants that can be asked questions and talked to without actually pushing any buttons can at least prove in a courtroom that a company is trying to find alternative measures rather than creating repetitive strain damage.
#7
Zinjanthropos Offline
I should change "not knowingly" to "I've never intentionally talked with a machine." However, it depends on your definition of talking. I'm pretty damn sure I communicate with machines every day, more often than not.
#8
Yazata Offline
(Dec 2, 2017 05:05 AM)Leigha Wrote: But another side of me feels like I "speak to" Siri in a tone that I wouldn't use with humans... I don't believe a machine will ever be able to mirror human attributes, but part of me wishes I could apologize to Siri, for she deserves it. Blush

She's not a "she". As CC said, 'animism'.

Of course, it's hard not to feel that way if we are talking to something and it's responding and talking back. That is naturally going to trigger all of our psychological social instincts and interpersonal stuff.

I worry a little about how companies like Apple or Google might use that psychological stuff to manipulate us: create a situation where it would be rude to tell the machine that we don't want to make a purchase. People don't want to be rude or offensive, so they will be more likely to acquiesce.

As for me, I'm a very late adopter of the latest technology. I only adopt things when they are starting to become passé and obsolescent. I only stopped using a 1990s computer running Windows 2000 on a dial-up modem a couple of years ago. It did everything that I wanted it to do, so what reason did I have to change? (I'm not particularly motivated by cyber-snobs' scorn.)

I don't anticipate ever getting one of those pseudo-human "digital assistants" in my lifetime. I'm not sure why I would want one or what purpose it would serve in my life. I certainly don't want a spy in my house listening to everything I say and reporting every conversation it hears back to Apple or Google.

The paranoiac in me acknowledges that all these devices may indeed be "smart", but is their loyalty to me or to somebody else? Whose interests do they ultimately serve?
#9
FluidSpaceMan Offline
My most common phrase to Siri is "Siri, shut up and go away!" But I don't feel guilty about it. We humans have a long history of treating our machines badly; I think it's human nature. We might start treating them better when they get to the point where you can't tell if you are talking to a machine or a human.

Until these AIs can do something truly useful, like get a job and send me their paychecks, or go out and market my books for me, I don't have much use for them.

Voice control while driving is good. I like the system in the Tesla, but even it still needs work.

There is no stopping the future, so we better get used to it. There are some places where I would like to draw the line. I don't mind talking to my car, but I don't want to talk to my refrigerator, dishwasher, washing machine, or toilet.
#10
Leigha Offline
I think what has me puzzled, more than how I've been "acting" toward Siri, is that I even feel bad about it at all. If I were to drop a glass on the floor and it shattered into a thousand pieces, I wouldn't apologize to the glass. My iPhone's OS is not much different; it requires a user to use it. It isn't useful without me. Likewise, the glass is useless without a user. It goes to show that AI might remind us of ourselves, or of human life... and so when I'm abrupt with Siri, there's a pull inside of me that feels kind of bad.

