How pandemic reshaped collaboration in sci + Health care algorithms rife with bias

#1
C C Offline
How the pandemic has reshaped collaboration — and competition — in science
https://www.statnews.com/2021/06/22/scie...aboration/

INTRO: For as long as people have pointed telescopes at the night sky and slipped drops of pond water under microscopes, competition has been as much a part of the scientific enterprise as curiosity, creativity, and discovery. And for centuries, that has served humanity well. Rivalries push fields forward: Tesla versus Edison sparked the electrical revolution; Pasteur versus Koch showed us how to fight once-invisible sources of infection; Joliot-Curie versus Meitner ushered in the nuclear age.

But a global health crisis is no time for guarding secrets. In the last year and a half, Covid-19 showed the world what’s possible when scientists put collaboration first.

On Tuesday, as part of the Milken Institute’s Future of Health Summit, STAT executive editor Rick Berke asked a panel of experts how the pandemic has reshaped the cultural landscape in science. They hailed the widespread adoption of preprints for accelerating the exchange of information and ideas, but said there’s a need to rethink long-standing incentive structures that have favored the individual and rewarded people for staying in their own lanes.

“There’s still that tension,” said Kathryn Richmond, senior director of the Paul G. Allen Frontiers Group, who oversees $200 million in grants to early-stage researchers who might not qualify for traditional funding opportunities. “Even now, 20 years after the Allen Institute was founded on the concept of team science and making these brain atlases so that the whole field can use them, when we bring people in to join the organization, sometimes they say, ‘I can sit on these results for a little bit, can’t I? And just dig a little deeper?’ And it’s like ‘no, this is open science — the data, the methods, it all goes out before you publish.’”

Here are other highlights from the event, edited lightly for clarity... (MORE)


‘Nobody is catching it’: Algorithms used in health care nationwide are rife with bias
https://www.statnews.com/2021/06/21/algo...hospitals/

INTRO: The algorithms carry out an array of crucial tasks: helping emergency rooms nationwide triage patients, predicting who will develop diabetes, and flagging patients who need more help to manage their medical conditions.

But instead of making health care delivery more objective and precise, a new report finds, these algorithms — some of which have been in use for many years — are often making it more biased along racial and economic lines.

Researchers at the University of Chicago found that pervasive algorithmic bias is infecting countless daily decisions about how patients are treated by hospitals, insurers, and other businesses. Their report points to a gaping hole in oversight that is allowing deeply flawed products to seep into care with little or no vetting, in some cases perpetuating inequitable treatment for more than a decade before being discovered.

“I don’t know how bad this is yet, but I think we’re going to keep uncovering a bunch of cases where algorithms are biased and possibly doing harm,” said Heather Mattie, a professor of biostatistics and data science at Harvard University who was not involved in the research. She said the report points out a clear double standard in medicine: While health care institutions carefully scrutinize clinical trials, no such process is in place to test algorithms commonly used to guide care for millions of people.

“Unless you do it yourself, there is no checking for bias from experts in the field,” Mattie said. “For algorithms that are going to be deployed in a wider population, there should be some checks and balances before they are implemented.”

The report, the culmination of more than two years of research, sets forth a playbook for addressing these biases, calling on health care organizations to take an inventory of their algorithms, screen them for bias, and either adjust or abandon them altogether if flaws cannot be fixed.
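In plain terms, the playbook amounts to a repeatable audit: list the algorithms in use, check how their errors and scores break down across patient groups, and fix or retire anything that fails. Below is a minimal sketch of what such a bias screen might look like in Python; the column names (risk_score, needed_care, group) and the threshold are hypothetical placeholders for illustration, not details taken from the report.

Code:
# Minimal sketch of the kind of bias screen the report's playbook calls for:
# compare an algorithm's error rates and score levels across patient groups.
# Column names and the threshold are hypothetical; a real audit would use the
# deployed model's own fields and decision cutoff.
import pandas as pd

def screen_for_bias(df: pd.DataFrame, threshold: float = 0.5) -> pd.DataFrame:
    """Per group, report how often the algorithm misses patients who actually
    needed care (false-negative rate) and how mean scores compare with the
    observed rate of need (a crude calibration check)."""
    rows = []
    for group, sub in df.groupby("group"):
        flagged = sub["risk_score"] >= threshold      # patients the algorithm would flag
        needed = sub["needed_care"].astype(bool)      # patients who actually needed care
        missed = (~flagged & needed).sum()            # needed care but not flagged
        fnr = missed / needed.sum() if needed.sum() else float("nan")
        rows.append({
            "group": group,
            "n": len(sub),
            "mean_score": sub["risk_score"].mean(),
            "rate_needed_care": needed.mean(),
            "false_negative_rate": fnr,
        })
    return pd.DataFrame(rows)

# Hypothetical usage on an audit table with risk_score, needed_care, group columns:
# print(screen_for_bias(audit_df))

In practice, large gaps in false-negative rates between groups, or scores that track observed need poorly for one group, are the kind of red flags the report says should trigger adjustment or, if the flaw can't be fixed, abandonment of the algorithm.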

“There is a clear market failure,” said Ziad Obermeyer, an emergency medicine physician and co-author of the report. “These algorithms are in very widespread use and affecting decisions for millions and millions of people, and nobody is catching it.” (MORE)
#2
Syne Offline
Does the article, or anyone cited, ever actually get around to giving any examples of racial or economic bias? Or is that just a trendy thing to exploit for attention?

