https://magazine.hms.harvard.edu/article...nformation
EXCERPT: In my current practice in exurban Tennessee, I often encounter a different set of realities. Many of my patients live paycheck to paycheck, and I’m not infrequently asked to postpone a treatment date to the first of the next month, after a patient gets paid. Sometimes patients decline a referral to a specialist surgeon in Nashville because they cannot afford the gas money to make the trip. There are varying levels of health literacy and social support. At the moment, I am pondering what to do with a patient who needs a bone marrow biopsy but requested sedation due to a phobia of needles and does not have anyone to call on to serve as his designated driver home.
Through these vulnerabilities seeps misinformation. When someone in the community or an online personality promises cures that feel more natural, affordable, and empowering, it is easy to understand the appeal. By contrast, the treatments I offer — infusions, radiation, surgery — are intimidating, disruptive, and toxic. If you already feel left out of a health system that seems distant, confusing, and expensive, why not place your trust in the person on YouTube who seems to get it?
I find it fascinating that the rise of misinformation and distrust appears to have coincided with the advent of artificial intelligence tools, which are considered to be particularly potent for flattening the information divide between physician and patient. But while ChatGPT and other chatbots are empowering a certain subset of patient-customers, others are worse off today than they have ever been — mired in conspiracy theories and pure quackery, despite living within striking distance of the most sophisticated science and medicine in all of human history... (MORE - missing details)
