Casual Discussion Science Forum @scivillage

Full Version: Ethics alone won’t fix Big Tech’s problems

EXCERPT: . . . The Chinese government is using a “vast, secret system” of artificial intelligence and facial recognition technology to identify and track Uighurs—a Muslim minority, 1 million of whom are being held in detention camps in China’s northwest Xinjiang province. This technology allows the government to extend its control of the Uighur population across the country.

[...] A.I. systems also decide what information is presented to you on social media, which ads you see, and what prices you’re offered for goods and services. They monitor your bank account for fraud, determine your credit score, and set your insurance premiums. A.I.-driven recommendations help determine where police patrol and how judges make bail and sentencing decisions.

As our lives intertwine with A.I., researchers, policymakers, and activists are trying to figure out how to ensure that these systems reflect and respect important human values, like privacy, autonomy, and fairness. Such questions are at the heart of what is often called “A.I. ethics” (or sometimes “data ethics” or “tech ethics”).

[...] In discussions about emerging technologies, there is a tendency to treat ethics as though it offers the tools to answer all values questions. I suspect this is largely ethicists’ own fault: Historically, philosophy (the larger discipline of which ethics is a part) has mostly neglected technology as an object of investigation, leaving that work for others to do. (Which is not to say there aren’t brilliant philosophers working on these issues; there are. But they are a minority.) The result ... is that the vast majority of scholarly work addressing issues related to technology ethics is being conducted by academics trained and working in other fields.

[...] we also need to understand why attempts at building “good technologies” have failed in the past, what incentives drive individuals and organizations not to build them even when they know they should, and what kinds of collective action can change those dynamics. To answer these questions, we need more than ethics. We need history, sociology, psychology, political science, economics, law, and the lessons of political activism. In other words, to tackle the vast and complex problems emerging technologies are creating, we need to integrate research and teaching around technology with all of the humanities and social sciences.

Moreover, in failing to recognize the proper scope of ethical theory, we lose our grasp of ethical practice. It should come as no surprise that ethics alone hasn’t transformed technology for the good. Ethicists will be the first to tell you that knowing the difference between good and bad is rarely enough, in itself, to incline us to the former. (We learn this whenever we teach ethics courses.) Acting ethically is hard. We face constant countervailing pressures, and there is always the risk we’ll get it wrong. Unless we acknowledge that, we leave room for the tech industry to turn ethics into “ethics theater”—the vague checklists and principles, powerless ethics officers, and toothless advisory boards, designed to save face, avoid change, and evade liability.

Ethics requires more than rote compliance. And it’s important to remember that industry can reduce any strategy to theater. Simply focusing on law and policy won’t solve these problems, since they are equally (if not more) susceptible to watering down. Many are rightly excited about new proposals for state and federal privacy legislation, and for laws constraining facial recognition technology, but we’re already seeing industry lobbying to strip them of their most meaningful provisions. More importantly, law and policy evolve too slowly to keep up with the latest challenges technology throws at us, as is evident from the fact that most existing federal privacy legislation is older than the internet.

The way forward is to see these strategies as complementary, each offering distinctive and necessary tools for steering new and emerging technologies toward shared ends. The task is fitting them together. (MORE - details)