Mount Sinai has become a laboratory for AI, trying to shape the future of medicine. But some healthcare workers fear the technology comes at a cost.
There are a lot of doctors in favor of AI too. Imagine real-time patient monitoring that can alert doctors to, say, a possible heart attack. It's something that has been worked on for at least the last 15 years.
This looks like another instance where AI could really make doctors' and nurses' lives easier and provide more and better care at lower cost - but in the hands of greedy corporate types it won't go that way.
I’m not an expert in ML or cardiology, but I was able to create models that could detect heart arrhythmias with upwards of 90% accuracy, higher than a cardiologist, and do so much faster.
Do I think AI can replace doctors? No. The amount of data needed to train a model is immense (granted I only had access to public sets), and detecting rarer conditions was not feasible. While AI will beat cardiologists in this one aspect, making predictions is not the only thing a cardiologist does.
But I think positioning AI as a tool to assist in triage, and to provide second opinions could be a massive boon for the industry.
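For anyone curious what that kind of beat-level arrhythmia classifier looks like, here's a minimal sketch. Everything here is illustrative: the "ECG" beats are synthetic stand-ins so the example runs on its own, where real work would use labeled beat windows from a public set like MIT-BIH, and the features and model choice are just one reasonable setup, not the commenter's actual pipeline.

```python
# Sketch of a beat-level arrhythmia classifier. The "ECG" data below is
# synthetic stand-in noise, purely to keep the example self-contained;
# real work would use labeled beat windows from a public ECG dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def fake_beat(arrhythmic: bool, n: int = 180) -> np.ndarray:
    """Crude stand-in for a ~0.5 s beat window at 360 Hz."""
    t = np.linspace(0, 1, n)
    if arrhythmic:
        # wider, lower-amplitude "PVC-like" complex
        beat = 0.6 * np.exp(-((t - 0.5) ** 2) / 0.02)
    else:
        # sharp, narrow R-wave spike
        beat = np.exp(-((t - 0.5) ** 2) / 0.002)
    return beat + rng.normal(0, 0.05, n)  # add sensor noise

# 1000 labeled beats: 0 = normal, 1 = arrhythmic
X = np.array([fake_beat(i % 2 == 1) for i in range(1000)])
y = np.array([i % 2 for i in range(1000)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

On this toy data the two beat shapes are trivially separable, which is exactly why the rare-condition problem the comment mentions is the hard part: real arrhythmia classes overlap and are badly imbalanced.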
It can also help get better medical advice into people’s hands, when good advice can be extremely inaccessible under our current, maximally enshittified healthcare system.
That is a good thing and a bad thing. Self-diagnosis will inevitably end in misdiagnosis.
I think AI has the potential to increase the number of patients seen, and maybe even decrease cost, but in the enshittified American system I’m willing to bet it would not be close to the best outcome.
They worry about the technology making wrong diagnoses,
You know who I’ve seen make “wrong diagnoses” over and over again? Human fricken doctors. And not to me (a healthy, upper middle class white male professional) but to my wife (a disabled woman with a debilitating genetic disease from a shitty part of Texas). We had to fight for years and spend tons of money to get “official” diagnoses that we were able to make at home based on observation, Googling and knowledge of her family history. I’ve watched male neurologists talk to ME instead of her while staring at her boobs. I’ve watched ER doctors have her history and risks explained to them in excruciating detail, only to send her home (when it turns out she needs emergency surgery).
revealing sensitive patient data
Oh, 100%, this is gonna happen.
becoming an excuse for insurance and hospital administrators to cut staff in the name of innovation and efficiency.
Oh, 100% this is ALSO gonna happen. My wife recently had to visit the ER twice, undergo scary spinal surgery, and stay over for two weeks. The NUMBER ONE THING I noticed was that this state-of-the-art hospital, in a small, wealthy, highly gentrified town, was DANGEROUSLY understaffed. The nurses and orderlies were stretched so thin they couldn’t even stop to breathe (and they were OFTEN cranky and RUSHING through delicate tasks where they could easily make mistakes). This reckless profiteering is already a problem (one that probably needs more aggressive regulation to deal with it; nothing else will work). If AI exposes it further and pushes it to a breaking point, maybe that could ultimately be a good thing.