The AI Factor
Great examples of the power of machine learning on medical data are easy to find today. Dr. Atul Butte, director of the Institute for Computational Health Sciences at UCSF, employed AI on big public datasets to isolate a molecular treatment for small-cell lung cancer; Butte predicts computational optimization can bring the cost of drug development down from $1 billion to a mere $100,000, and its timeline from 10 years to 2. Many public datasets are being created: Harvard’s Personal Genome Project, MHealth, Chicago Health Atlas and HealthData.gov are just a few among dozens of accessible libraries.
However, most live medical data in EHRs is decidedly not AI-ready. It has to be extracted and standardized, which is laborious work. Machine learning can then scour the data to discover “hidden layers,” i.e., patterns never seen before. But that intelligence can’t then just turn around and run on the live EHR data, especially when it’s looking for something extremely complex, such as a constellation of genes interacting with lifestyle factors. Even when EHRs do comply with a standard, that EHR architecture typically was not built from the ground up for AI optimization.
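To make the extraction-and-standardization step concrete, here is a minimal sketch in Python. The vendor formats, field names, and unit conventions are hypothetical, invented for illustration; real EHR normalization would target an actual standard such as HL7 FHIR and handle far more fields and edge cases.

```python
# Hypothetical sketch: mapping heterogeneous EHR exports onto one common
# schema with consistent units (kg, cm) before any machine learning runs.
# All field names below are invented for illustration.

def standardize_record(raw: dict) -> dict:
    """Normalize a raw record from one of two hypothetical vendor formats."""
    if "pt_weight_lb" in raw:  # "vendor A": flat keys, imperial units
        return {
            "patient_id": raw["pt_id"],
            "weight_kg": round(raw["pt_weight_lb"] * 0.453592, 1),
            "height_cm": round(raw["pt_height_in"] * 2.54, 1),
        }
    return {                   # "vendor B": nested keys, metric units
        "patient_id": raw["patient"]["id"],
        "weight_kg": raw["vitals"]["weight"],
        "height_cm": raw["vitals"]["height"],
    }

records = [
    {"pt_id": "A-001", "pt_weight_lb": 150.0, "pt_height_in": 66.0},
    {"patient": {"id": "B-042"}, "vitals": {"weight": 80.0, "height": 175.0}},
]
standardized = [standardize_record(r) for r in records]
```

Only after every record lands in the same schema, with the same units, can a model meaningfully look for patterns across millions of patients; this per-vendor mapping work is exactly the laborious part described above.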
Anyone and everyone will promise artificial intelligence insights from data. But it’s those who are already delivering them, across millions of patients, on live data, who have the distinct advantage.
Interoperability is the holy grail. The problem is that people have to get paid for it.
– Geoffrey Clapp, healthcare entrepreneur and advisor