Later today, I’ll be debating Avisha NessAiver, data scientist and creator of @DistilledScience, about whether AI will replace doctors! The debate will be moderated by Steve Wardell, host of the Digital Health Investor Talk podcast. Recently, Bill Gates suggested that within 10 years many of today’s jobs, including doctors’ jobs, will be done by AI. That is quite a statement, and coming from someone like Bill Gates, it deserves to be taken seriously. As a practitioner of AI in healthcare and the author of one of the few books on this topic, however, I don’t necessarily buy what one of the great minds of our time is envisioning for my field. Being a great mind in computer science and an amazing businessman may still not be enough to fully understand healthcare. I’ve been saying for quite some time that certain industries are more complicated than others, and without having worked inside an industry, it can be very difficult to truly understand and appreciate how everything works.
Having practiced medicine and spent time in clinics, hospitals, procedure rooms, imaging suites, and other settings, I saw that everything around me was fluid. The unpredictability of what happened next was a feature, not a bug. A patient wakes up one morning with back pain so severe that they can’t get out of bed and make it to their doctor’s office for their appointment. Meanwhile, another patient walks in with no past history and tells you they’re not feeling well. It is now your job to figure out why today, of all days, they noticed a new problem. No matter how well prepared you were for the patients you were scheduled to see, with the help of data analysis, AI, software, and other tools, you now have to deal with unforeseen issues. You have to start with no data and figure out what’s going on. Is this patient in your office because they really have a major physical issue that is finally producing some vague symptoms? Or are they experiencing personal difficulties, with physical symptoms that are the manifestation of a sad mind?
I encountered this scenario many times as a practicing physician. More important than anything else in these situations were my observations about the patient’s appearance, state of mind, baseline demographics, and many soft, intangible factors. Often, I got to the root cause of the issue by establishing trust and spending a few minutes getting to know the patient. That human connection is frequently therapeutic in itself. Someone in an emotionally challenging state often needs to connect with another human being who can listen and empathize. Done right, this often leads patients to open up about what is really going on, and you can avoid expensive and time-consuming tests. Of course, sometimes something really is going on and a proper diagnostic work-up needs to be undertaken. Even then, the human element is a huge part of what unfolds next. People with emotional or physical problems are in a vulnerable state and need human guidance and connection. There is no single right way to proceed, and a clinician who connects with patients and understands who they are will make the right choices based on their personality, mental and emotional state, preferences, and more. All of this lies outside the purview of AI and whatever it will or will not do in healthcare.
It’s important to keep in mind that I’m one of the biggest proponents of AI for health and healthcare; otherwise, I wouldn’t have spent three years writing a book about this topic. Today, we have a huge shortage of resources in healthcare, and the problem is only getting worse! The projected shortage of nurses and doctors over the next decade will be a major problem that affects both the quality and the accessibility of care. We cannot simply train enough doctors and nurses to close the gap, because training takes a long time. While that training pipeline needs to expand so we can avoid the same problem in the future, technology will be the answer for mitigating the many undesirable effects of the projected shortages. In the past, technology has not been very effective at providing relief to the healthcare workforce. Why? Because of some of the issues I mentioned earlier. The practice of medicine does not happen on paper, or better stated, in a computer! Often, the big clue about what is going on comes from the patient in the last moments of a visit, either verbally or during an exam, or from a family member or friend who is accompanying them. The information you use to make a decision sits in different places: the electronic health record, the radiology system, an outside lab network, a genetics analysis, and more. This kind of multimodal data is difficult for computers to analyze, especially since some of it is never recorded anywhere. As a result, traditional software has not been much help with providers’ long to-do lists.
AI does indeed hold the promise to change that. AI agents are becoming much more powerful, capable of ingesting multimodal data, analyzing and reasoning through it, and taking action. That last part, taking action, is key in healthcare. What we need is automation: assistance for the healthcare workforce that does not merely remind someone to do something, but actually does it. While these are the early days for agentic AI, it’s reasonable to expect it to be the breakthrough technology that takes over many administrative and clinical tasks and provides much-needed relief. What it will not do, however, is create the human connection that is an integral part of practicing medicine. That connection is also key to drawing out the information that guides your diagnostic and management decisions, and to driving adherence to the prescribed treatment. Arriving at the right course of action is often a back-and-forth with the patient.
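To make the "ingest, reason, act" loop concrete, here is a minimal illustrative sketch in Python. It is not any real product or vendor API: every data source, function name, and action below is hypothetical, standing in for whatever EHR, lab, and imaging integrations a real deployment would use, and the final step deliberately queues work for a human to approve rather than acting autonomously.

```python
# Illustrative sketch only: all data sources, functions, and actions are
# hypothetical placeholders for real EHR, lab, and imaging integrations.
from dataclasses import dataclass, field


@dataclass
class PatientContext:
    """Multimodal data pulled from separate systems before the reasoning step."""
    ehr_notes: list = field(default_factory=list)
    lab_results: dict = field(default_factory=dict)
    imaging_reports: list = field(default_factory=list)


def gather_context(patient_id: str) -> PatientContext:
    # In practice these would be calls into an EHR, a lab network, a PACS, etc.
    return PatientContext(
        ehr_notes=["Follow-up visit for hypertension."],
        lab_results={"creatinine_mg_dl": 1.1},
        imaging_reports=["Chest X-ray: no acute findings."],
    )


def reason(context: PatientContext) -> str:
    # Placeholder for the analysis/reasoning step (e.g., a call to a model).
    if context.lab_results.get("creatinine_mg_dl", 0) > 1.5:
        return "flag_renal_function_for_clinician_review"
    return "draft_routine_followup_reminder"


def act(decision: str) -> None:
    # The "taking action" step: queue an administrative task for human sign-off.
    print(f"Action queued for clinician approval: {decision}")


if __name__ == "__main__":
    context = gather_context("patient-123")
    act(reason(context))
```

Even in this toy version, the loop only automates the administrative plumbing; the judgment calls and the conversation with the patient remain with the clinician.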
A patient’s preferences are not pre-recorded data that can be loaded into a questionnaire and analyzed by AI. The options need to be explained to them, clarified as they ask questions, and the experiences of others like them shared in a way they can relate to. Therein lies the big difference between healthcare and other sectors that have seen significant automation, like farming. More than 99% of the jobs and activities in farming were mechanized and automated over the last 150 years. Why can’t we do the same in healthcare? Because humans are not plants. They have emotions and behave unpredictably. The number one predictor of whether a patient will follow medical advice is their relationship with their medical team. AI will be an enabler of that relationship, but it will never establish a human connection with a patient.
About 10 years ago, I was on stage at a digital health conference with a Silicon Valley venture capitalist and billionaire. At the time, he predicted that within five years there would be no need for radiologists, because AI would read all radiographic studies better than they could. Well, ten years on and counting, our shortage of radiologists has never been worse than it is today. So when I heard Bill Gates’ recent prediction about not needing doctors in 10 years, it reminded me of that earlier prediction by another very smart and successful billionaire. My takeaway from all of this is that while a high IQ and amazing insights can make you very successful in many areas, they do not necessarily make you an expert in everything. When you hear predictions like this, don’t take them as gospel. All of us are right and wrong about a lot of things all the time. Relying on people with deep expertise in your area of interest is still the best way to understand what may happen in the future!