Artificial intelligence could identify gang crimes -- and ignite an ethical firestorm
http://www.sciencemag.org/news/2018/02/a...-firestorm
EXCERPT: When someone roughs up a pedestrian, robs a store, or kills in cold blood, police want to know whether the perpetrator was a gang member: Do they need to send in a special enforcement team? Should they expect a crime in retaliation? Now, a new algorithm is trying to automate the process of identifying gang crimes. But some scientists warn that far from reducing gang violence, the program could do the opposite by eroding trust in communities, or it could brand innocent people as gang members.
That has created some tensions. At a presentation of the new program this month, one audience member grew so upset he stormed out of the talk, and some of the creators of the program have been tight-lipped about how it could be used.
“This is almost certainly a well-intended piece of work,” says Google software engineer Blake Lemoine, who is based in Mountain View, California, and has studied ways of reducing bias in artificial intelligence. “But have the researchers considered the possible unintended side effects?”
For years, scientists have been using computer algorithms to map criminal networks, or to guess where and when future crimes might take place, a practice known as predictive policing. But little work has been done on labeling past crimes as gang-related.
In the new work, researchers developed a system that can identify a crime as gang-related based on only four pieces of information [...] To classify crimes, the researchers invented something called a partially generative neural network. A neural network is made of layers of small computing elements that process data in a way reminiscent of the brain’s neurons. A form of machine learning, it improves based on feedback—whether its judgments were right. [...] Hau Chan, a computer scientist now at Harvard University who was presenting the work, responded that he couldn’t be sure how the new tool would be used....
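The article doesn't detail the model's architecture or its four input features, but the general idea it describes — a classifier over a handful of features that improves from feedback on whether its judgments were right — can be sketched with a toy example. Everything below is illustrative: the feature names and data are invented, and a single sigmoid neuron stands in for the (unspecified) partially generative network.

```python
import math
import random

def sigmoid(x):
    """Squash a score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

def train(examples, epochs=2000, lr=0.5):
    """Fit a one-neuron classifier on 4-feature examples.

    Each pass computes a prediction, measures how wrong it was
    (the 'feedback' the article mentions), and nudges the weights
    to reduce that error — plain gradient descent.
    """
    random.seed(0)
    w = [random.uniform(-0.1, 0.1) for _ in range(4)]
    b = 0.0
    for _ in range(epochs):
        for features, label in examples:
            pred = sigmoid(sum(wi * xi for wi, xi in zip(w, features)) + b)
            err = pred - label  # feedback: was the judgment right?
            for i in range(4):
                w[i] -= lr * err * features[i]
            b -= lr * err
    return w, b

def predict(w, b, features):
    """Return the model's probability that a crime is gang-related."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, features)) + b)

# Invented toy data: four made-up numeric features per incident.
examples = [
    ([1.0, 0.8, 0.9, 0.2], 1),  # labeled gang-related
    ([0.0, 0.1, 0.2, 0.9], 0),  # labeled not gang-related
    ([0.9, 0.7, 1.0, 0.1], 1),
    ([0.1, 0.2, 0.1, 0.8], 0),
]
w, b = train(examples)
```

After training, `predict(w, b, ...)` scores new incidents; a real system would use a far larger network, real features, and labeled case data — which is exactly where the article's concerns about biased labels and misclassified people come in.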
MORE: http://www.sciencemag.org/news/2018/02/a...-firestorm
AI can now create fake porn, making revenge porn even more complicated
https://theconversation.com/ai-can-now-c...ated-92267
EXCERPT: In January this year, a new app was released that gives users the ability to swap out faces in a video with a different face obtained from another photo or video – similar to Snapchat’s “face swap” feature. It’s an everyday version of the kind of high-tech computer-generated imagery (CGI) we see in the movies. You might recognise it from the cameo of a young Princess Leia in the 2016 Star Wars film Rogue One, which used the body of another actor and footage from the first Star Wars film created 39 years earlier. Now, anyone with a high-powered computer, a graphics processing unit (GPU) and time on their hands can create realistic fake videos – known as “deepfakes” – using artificial intelligence (AI).
Sounds fun, right? The problem is that these same tools are accessible to those who seek to create non-consensual pornography of friends, work colleagues, classmates, ex-partners and complete strangers – and post it online....
MORE: https://theconversation.com/ai-can-now-c...ated-92267