Best is back
I think about the days before electronic health record systems and how patient charts worked. Clinics and hospitals had medical record rooms. These rooms held thousands of individual patient charts in the form of folders, alphabetized much like books in a library. Clinical staff would physically check out a patient record, review the information, and contribute new information to the medical record before checking the chart back into its place in the medical record room. These charts were often couriered to other facilities when patients needed care elsewhere, and the chart would, hopefully, get couriered back to its original storage location. The entire process sounds awful and frightening, and it was, but clinical staff overcame many shortcomings in the healthcare system in order to take care of patients effectively.
Today these same patient records are stored electronically in databases. Each healthcare entity has one or more databases, and clinical staff access information about a patient through software applications on the hospital network. These software applications, depending on the specialty or function, can look quite different; however, they all access the same patient data. This works better than the physical medical record rooms of yore, but still has room for improvement. For example, in 2022 we still face the challenge of exchanging all of a patient's data electronically with other healthcare facilities. In other words, patient record exchange between two different databases. The ability to exchange patient records electronically between disparate databases is called interoperability, which is difficult to describe and even harder to pull off. We've come a long way in the last four decades, but we're still not quite in a place where all the right patient data is available to the right clinicians at the right time and in the right format.
What if the physical medical record room, with one patient record that is checked out, shared with other clinicians, and checked back in, could exist in the cloud? What if there were a master patient clinical record that, depending on where the patient shows up, is checked out electronically so the patient's information can be reviewed and more information contributed before the record is checked back in? A virtual medical record room of sorts. I'm not sure if that's exactly what Larry has in mind, but my sense is it would be something along these lines. We are living in a time between physical, paper patient records and virtual patient records. Patients today have many different records. Each healthcare institution the patient has presented at will have at least one version of that patient's medical record, and no two of those records are identical or compatible. Better than paper, but we can do better.
I mentioned cloud computing above and I'd like to talk about that term for a bit, because Larry famously discussed this back in a 2009 interview. I say "discussed," but he passionately bashed folks who made it sound like "cloud computing" was some new phenomenon. The fact is cloud computing simply describes computers and processors on a network. By that definition, the very first computers that processed requests on a local network were already cloud computing. Back in those days, the time it took to process requests and send the data back to dumb terminals was lengthy. Often hours or days. Humans grew impatient and demanded local processing power to complete tasks much faster than relying on a centralized network of computers to process requests. This was the dawn of the personal computer (PC). People had desktop computers at first, then laptops for more portability.

It sounds like a major problem was solved, but those early computers pale in comparison to what's available today. My first computer in college (mid-90s) had 640KB of RAM. You read that right…kilobytes. Not megabytes. Certainly not gigabytes. I remember compiling my C++ programs on this bad boy, and I would have to think hard about when to hit the compile button because I knew it would take roughly 15 minutes to complete that task. Brutal. Obviously, microprocessors and operating systems have come a long way. Network bandwidth and connectivity have also come a long way. We went from dial-up to DSL to cable to fiber. We now live in a world where fiber connectivity is widely available. The combination of increased processing power and gigabit internet connectivity has forced us to question the need for local processing. So much so that we have now shifted back to the original design of the first computers, because processing requests can now take fractions of a second instead of hours or days. We still use personal computers, but more and more they are a tool to get to the things we need in the cloud.
Larry's point in that 2009 interview is that this entire cloud computing world existed before, so it's not new. It just has a new name. Personal computers were better, but not the best.
Where am I going with this? Well, I believe a similar phenomenon will occur in healthcare around the concept of "best-of-breed." Best-of-breed allows clinical users to have access to the highest quality solutions from any software vendor instead of selecting one vendor to supply all workflow solutions. When I first started in healthcare back in 1998, best-of-breed was the norm. Healthcare organizations would cobble together integration between disparate systems using integration protocols like HL7. The integration back then was painful, awful, expensive (insert your own adjective here). Integration was so painful, in fact, that the entire industry seemed to embrace the concept of selecting one vendor for all their departmental application needs. In the 2000s, many healthcare organizations would procure one EHR vendor to supply most, if not all, of their application needs. This is the world we live in today. What happens if we solve the integration problem in a profound way, kind of like what happened with cloud computing when processing speed and internet connectivity were solved? Well, we go back to the best-of-breed approach to selecting applications. This doesn't mean the core EHR system goes away. In fact, healthcare organizations will still land on one EHR system, but they will select the EHR vendor that offers the best integration with other software vendors' solutions. This gives healthcare organizations back the control they had previously. Best-of-breed was the right model. Just like cloud computing, it will return.
How will best-of-breed return? We're already starting to see this in the form of EHR vendors having their own app stores. Cerner has Code. Epic has App Orchard. MEDITECH has Greenfield. Athena has More Disruption Please (MDP). Allscripts has the Allscripts Developer Program (ADP). You get the idea. The EHR vendors have focused on developing HL7 FHIR resources to offer a more RESTful way of integrating. These RESTful APIs are powerful for software vendors who need to exchange data with the core EHR systems. Healthcare organizations, for the first time in many years, have the ability to evaluate the best software solutions for their staff based on their individual workflow needs. The EHR vendor that succeeds in this new world may not be the one we expect to win. Just because a vendor is on top of the market today doesn't mean they will remain there. Just remember there was a time when McKesson HBOC was the #1 EHR. GE was also #1 for some time. The only thing that remains the same is that everything changes. The vendors that evolve toward open integration with best-of-breed solutions will thrive. The other vendors will suffer. The healthcare industry's move away from best-of-breed to a single vendor for all solutions was better for a period of time, but it's not the best. #BestIsBack
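To make the FHIR point concrete: a FHIR server exposes resources like Patient at predictable REST endpoints (e.g., GET [base]/Patient/{id}), and every resource is plain JSON that any vendor's software can parse. Here's a minimal sketch using only the Python standard library. The resource below is illustrative sample data modeled on the FHIR spec's example patient, not output from any real EHR:

```python
import json

# A minimal FHIR R4 Patient resource, shaped like what an EHR's
# RESTful API would return from GET [base]/Patient/{id}.
# All values here are illustrative sample data.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"use": "official", "family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

patient = json.loads(patient_json)

# Every FHIR resource declares its type; clients branch on this field
# rather than on the URL they fetched.
assert patient["resourceType"] == "Patient"

# "name" is a list because a patient can have several HumanName entries
# (official, maiden, nickname, ...). We take the first one here.
official = patient["name"][0]
display = " ".join(official["given"]) + " " + official["family"]

print(display)                # Peter James Chalmers
print(patient["birthDate"])   # 1974-12-25
```

The point of the sketch is the interoperability win: because the resource shape is standardized, a best-of-breed vendor writes this parsing logic once and it works against any FHIR-conformant EHR, rather than building a custom HL7 v2 interface per site.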