Researchers at Cornell University set out to build a more transparent AI system, one that counsels doctors the way a human colleague would.
After interviewing and surveying twelve doctors and clinical librarians, the researchers found that when these medical experts disagree on what to do next, they turn to the relevant biomedical research and weigh its merits. The team then built an AI tool based on GPT-3, a predecessor of the large language models that power OpenAI’s ChatGPT.
The tool’s interface is straightforward: it presents the AI’s suggestion on one side and contrasts it with the relevant biomedical literature on the other. Still, the jury is out on how the Cornell researchers’ AI would hold up if subjected to a similar analysis.
Overall, while these tools may be helpful to doctors who have years of expertise to inform their decisions, we are still a long way from an “AI medical advisor” that can replace them.