Is Health IT Still the Prescribed Treatment for what Ails Healthcare?
Ever wonder why Health IT hasn’t revolutionized healthcare the way IT has nearly every other industry in the world? We do – and figuring out how IT can still improve healthcare is one of the reasons Health Solutions Research exists.
Information technology, and advances in technology in general, typically decrease costs throughout the value chain from supplier to consumer, streamline communications, and empower consumers by giving them greater access to information – and therefore a greater say in what products and services are provided to them, and how. Taken together, these forces drove the IT revolution.
Even the theatre industry has benefited from IT.
Managing the sound and lighting effects in theatre productions is much less expensive and more user-friendly now than, say, 30 years ago, allowing theatre companies to lower the cost of stage productions and make culture more accessible to the masses.
Similar cost reductions and operational efficiencies are being seen globally. Currently, Uber and AirBnB are causing major disruptions by bringing a new, tech-based model to the transportation and hotel industries. (And, in the process, creating an entirely new economy.)
But this hasn’t happened in healthcare.
Healthcare still has not fully adopted IT, and much of it operates as it did in the 1950s. (Look, for instance, at the similarities between a hospital patient room from the 1950s and one today.)
And the underlying cost structure hasn’t changed – or come down – the way it has in other industries. In fact, healthcare costs continue to rise year over year at a rate greater than inflation.
This isn’t entirely for lack of trying – either on the part of IT firms or the medical community. For instance, the Digital Health wing of the annual CES conference in Las Vegas is growing year after year with more and more tech-based offerings aimed at improving access to healthcare and/or changing the industry itself. But not a lot of that change has happened.
The question is, why?
More precisely: “Why has IT been unable to penetrate healthcare and deliver the cost reductions it has a well-established track record of producing in other industries?” And its close corollary: “What can we do about it?”
To understand the situation well enough to answer the question, we have to realize that the IT and healthcare industries differ in one important way.
The information technology field follows the “If you build it, they will come” mantra made famous by the Kevin Costner film, “Field of Dreams”. (As we’ll see, this approach would be a nightmare in healthcare.)
And, more recently, Steve Jobs was famous for not listening to the consumer – because, in his view, consumers don’t know what they want until you put it in front of them. [Interesting analysis of Jobs’ quote on this subject on the HelpScout blog.] And really, most customers, and the public in general, don’t know what they want from IT or what IT can accomplish in the first place. So anticipating – or creating – customer needs is a viable approach, one Apple has executed, and continues to execute, exceptionally well.
But this doesn’t work in healthcare, where the medical community is dealing with human life. Doctors can only use drugs, medical devices, and treatments that have been widely tested in controlled settings, proven effective, and whose side effects are well known and documented.
This requirement for testing is foreign to the IT industry. In IT, it is not uncommon to release products with known bugs and allow consumers to identify the problems and deficiencies in the software or solution. Microsoft was famous for taking this approach, and Google has taken the practice one step further by releasing beta software for public use. [Disclosure: I love Google and Microsoft – research for this and every blog post starts at Google, and drafts are written in Word.]
Beta refers to software that, while functional, is not fully ready for commercial use.
Yes, there are sound business reasons for the practice of releasing software that may not be fully ready for commercial use – especially in Google’s case, since its products are released for free. Flaws in software can be hard to find or may manifest only under certain conditions, and there will always be more customers for a piece of software than there can ever be testers, so relying on customer reports is often the most efficient and reliable way to surface bugs.
But in healthcare, this doesn’t work. We don’t want to wait for people to die to realize a drug or other treatment may have serious side effects. Anything intended for use with real patients must be thoroughly tested on patients in controlled and monitored settings.
So even if you put an IT solution that could improve the quality of care in front of doctors and other providers, they won’t use it unless you can also point to documented evidence of its efficacy. The uptake, or adoption, of new drugs and technological approaches requires a greater level of evidence than the IT industry may be used to providing.
So where do we go from here?
What’s required is a merging of the medical industry’s approach to clinical trials (i.e., controlled testing with all results documented) with the iterative, Agile development process from the technology industry.
Agile development is a process for building a software or technical solution that breaks the overall effort into multiple stages, called sprints. The end of each sprint provides an opportunity to test – for example, to run a small clinical trial.
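To make that pairing concrete, here is a minimal, purely illustrative Python sketch (every name in it, such as run_pilot_trial and develop_with_embedded_trials, is hypothetical, not an existing tool). It models a development loop in which each sprint ends with a small pilot evaluation whose documented findings gate whether the increment is accepted or sent back for rework.

```python
from dataclasses import dataclass, field


@dataclass
class SprintOutcome:
    feature: str                                  # the increment built during the sprint
    pilot_passed: bool                            # outcome of the small, controlled pilot
    findings: list = field(default_factory=list)  # documented observations from the pilot


def run_pilot_trial(feature: str) -> SprintOutcome:
    """Stand-in for a small, controlled pilot with real clinicians.

    A real trial would have predefined endpoints and documented results;
    this stub simply records one finding and reports a pass.
    """
    return SprintOutcome(feature, True, [f"clinician feedback recorded for '{feature}'"])


def develop_with_embedded_trials(backlog):
    """Run one sprint per backlog item; only pilot-validated increments are accepted."""
    accepted, needs_rework = [], []
    for feature in backlog:
        outcome = run_pilot_trial(feature)  # the sprint's exit gate
        (accepted if outcome.pilot_passed else needs_rework).append(outcome)
    return accepted, needs_rework


if __name__ == "__main__":
    done, redo = develop_with_embedded_trials(["medication-reconciliation view"])
    for outcome in done:
        print(outcome.feature, outcome.findings)
```

In a real project, the stubbed pilot would be replaced by an actual controlled evaluation with clinicians and predefined endpoints, and any increment that failed its pilot would feed back into the next sprint’s backlog rather than ship.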
Merging the two approaches may lengthen the timeline and raise the cost of developing a health IT solution, but any added expense will be more than repaid in the quality of the result and in its long-term uptake by the medical industry.
Performing clinical trials earlier in the development of a health IT solution lets the medical community contribute early on, helping adapt the technology to the specific needs of their patients and potentially making providers more willing to incorporate the solution into their practice of medicine going forward. At the very least, earlier involvement trains providers in the use of the technology – especially important when the new technology will change the workflows associated with patient care.
In essence, IT can still be the remedy for what ails the US healthcare system, and incorporating a clinical-trial approach into the Agile development process for technology aimed at patient care may be just what the doctor ordered.