We are already starting to 👀 see this.
I have been a 🤱 neonatal critical care doctor for over two decades now.
Understanding physiology has been a mainstay of our therapies, interventions, and care, perhaps more so than in any other non-organ-specific medical specialty.
Our institution was afforded the opportunity of hosting Patrick McNamara from the University of Iowa, a brilliant neonatal cardiac and hemodynamic physiologist, scientist, and physician, who visited to share his wealth of knowledge.
🙏 Thanks for visiting and enlightening.
He highlighted that our knowledge, understanding, and application of the basic and important physiological principles underlying fetal and neonatal hemodynamics are diminishing. It is true.
In the field of cardiology, I have been witness to such change.
When I was a fledgling resident at the University of Michigan and requested a ❤️ cardiology consult on a patient, a human cardiologist would arrive at the bedside and magically diagnose a baby with congenital heart disease using a stethoscope, a thorough exam, and astute skills (still in awe of Dr A. Rosenthal).
Now when I make the same request, I see an echocardiography machine get wheeled to the bedside by a tech... and the static/dynamic structural and flow data acquisition begins... later to be interpreted by an echo cardiologist in a separate building, in front of a computer screen... rarely ever visiting the patient's bedside.
The days of the stethoscope as a fundamental cardiology tool are gone.
Is the bedside human interaction gone too?
Sadly, from my perspective, technology over the past two decades has continued to pull us physicians further and further away from our patients. EHR-- say no more.
We have more data. But do we have more knowledge and wisdom?
We spend less time with patients. Are we providing better care with our tech?
In the following article, the authors highlight how reliance on #AI will likely change (and has already begun to change) even the most rigorous of scientists:
-The Dunning-Kruger effect: a tendency to overestimate one's knowledge, leading to incorrect assumptions.
-Automation complacency: we start to trust AI and automation too much and stop double-checking accuracy. It becomes routine. Think clinical decision support tools.
-Illusion of explanatory breadth: studies will become more focused on issues and problems that AI can be applied to, and less focused on those that require more human work (and may be harder to study).
-Data bias: biased training data restricts objectivity and perpetuates inequities and disparities.
We need to venture carefully, cautiously, rigorously, and ethically as AI is applied to healthcare and scientific investigation.
#UsingWhatWeHaveBetter and wiser!
https://lnkd.in/g_zJfBFK