How should we treat science’s growing pains?
https://www.theguardian.com/science/poli...wing-pains
EXCERPT: Jerome Ravetz has been one of the UK’s foremost philosophers of science for more than 50 years. Here, he reflects on the troubles facing contemporary science. He argues that the roots of science’s crisis have been ignored for too long. Quality control has failed to keep pace with the growth of science. [...] As noted already in the Guardian’s science pages, there is no lack of initiatives to tackle science’s crisis in all its aspects, from reproducibility to the abuse of metrics, to the problems of peer review. This gives good grounds for hope that the crisis will eventually be resolved, and that it will not become a general crisis of trust in science. Should that occur, and ‘science’ ceases to be a key cultural symbol of both truth and probity, along with material beneficence, then the consequences could be far-reaching. To that end, we should consider what lies behind the malpractices whose exposure has triggered the crisis over the last decade.
It is clear that a combination of circumstances can go far to explain what has gone wrong. Systems of controls and rewards that had evolved under earlier conditions have in many ways become counterproductive, producing perverse incentives that become increasingly difficult for scientists to withstand. Our present problems can be explained partly by the transformation from the ‘little science’ of the past to the ‘big science’ or ‘industrialised science’ of the present. But this explanation raises a problem: if the corrupting pressures are the result of the structural conditions of contemporary science, can they be nullified in the absence of a significant change in those conditions?
We should explore how these new conditions lead to these new pressures. There are two familiar qualitative aspects of the steady quantitative growth of the scientific enterprise....
In doctors we trust, especially when they admit to bias
https://www.sciencedaily.com/releases/20...155235.htm
RELEASE: A doctor's guidance may reassure us more than we realize, especially if she says she is likely to recommend treatment in her own field of expertise, a tendency known as "specialty bias."
Doing research in a real-world health care setting, a Cornell expert and her colleagues have found that when surgeons revealed their bias toward their own specialty, their patients were more likely to perceive them as trustworthy and more apt to follow their recommendation to have surgical treatment. The research was published this week in the Proceedings of the National Academy of Sciences.
The study has important implications for professional advisers of any stripe and policymakers who deal with disclosure rules, said Sunita Sah, a physician and assistant professor of management and organizations at Cornell's Samuel Curtis Johnson Graduate School of Management.
"If an adviser discloses a bias, it should alert the recipient to some uncertainty regarding the quality of the advice. 'Perhaps I need to discount this a little bit.' Disclosure of bias, if anything, should decrease the weight that patients put on their physicians' recommendations," said Sah, an expert on conflicts of interest and disclosure. "But, instead, we find that patients report increased trust and they are more likely to take the physician's treatment than patients who do not hear their physician disclose a bias."
Sah and her colleagues based their findings on 219 transcripts of conversations between surgeons and male patients in four Veterans Affairs hospitals in which the surgeon revealed a diagnosis of localized prostate cancer to the patient. While discussing treatment options, some surgeons freely admitted to having a bias toward their own specialty, with statements such as, "I'm a surgeon, so I'm biased toward recommending surgery." Patients who heard their surgeon disclose their specialty bias were nearly three times more likely to decide to have surgery than patients who did not hear their surgeons disclose a bias.
The researchers also conducted a randomized lab experiment. In this study, 447 men watched video clips of an actor portraying a surgeon, who described two treatment options: surgery and radiation. In the "disclosure" group, the men heard the actor disclose his bias toward surgery, similar to the surgeons in the Veterans Affairs hospitals. The control group saw the same video, except for the bias disclosure. The men who heard the disclosure were more likely to choose surgery than the control group and reported higher trust in the doctor's expertise.
Sah and her colleagues also found that surgeons who disclosed bias toward their specialty, or who discussed a potential meeting with a radiation oncologist about radiation treatment, tended to give stronger recommendations for surgery.
"Bias disclosure can have a profound influence on adviser recommendations and the choices their advisees make," said Sah. "Professional advisers and policymakers should implement such disclosures with care."