Picture this: You want to check something from your doctor, so you log into your medical portal to see your health information; only, instead of seeing your own records, you see another patient's data. Naturally, you wonder, “Does that mean someone can see mine as well?”
This exact scenario has reportedly occurred at least twice in the last week, according to patients using the affected electronic health record (EHR) system.
The problem is that almost no software program is without its own set of bugs; they may manifest as something as minor as forcing an app to close, or as shocking as displaying another person's medical information.
In the case of this EHR, the glitch apparently occurs only when specific functions are performed in a particular way, making it almost impossible to quantify the exact number of affected patients. A greater problem is that the glitch is a clear violation of HIPAA. Though it appears to be accidental, the fact remains that the software is sharing patients' private, protected health information with other patients.
While it is possible to go into the software and pull reports, without strict audit logs it is practically impossible to determine which patients saw the wrong information, or whose information was seen. It also makes it difficult to know exactly which of the providers using the EHR need to be contacted.
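To make that concrete, here is a minimal sketch of the kind of access-audit log that would answer both questions above: which patients saw the wrong chart, and whose chart was seen. The names (`AccessAuditLog`, `record_view`, and so on) are illustrative assumptions, not taken from any real EHR product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class AccessEvent:
    viewer_id: str      # who was logged in when the chart was displayed
    patient_id: str     # whose record was displayed
    timestamp: datetime


class AccessAuditLog:
    """Append-only log of every record view (hypothetical sketch)."""

    def __init__(self):
        self._events = []

    def record_view(self, viewer_id, patient_id):
        # Log every display of a chart, even "normal" ones; after a
        # glitch, the abnormal views are found by querying, not guessing.
        self._events.append(
            AccessEvent(viewer_id, patient_id, datetime.now(timezone.utc))
        )

    def viewers_of(self, patient_id):
        """Everyone whose session displayed this patient's chart."""
        return {e.viewer_id for e in self._events if e.patient_id == patient_id}

    def records_seen_by(self, viewer_id):
        """Every chart this viewer's session displayed."""
        return {e.patient_id for e in self._events if e.viewer_id == viewer_id}
```

With a log like this, the breach scenario described above becomes answerable: if patient A's portal session displayed patient B's chart, `viewers_of("patient_B")` identifies exactly who saw the exposed record, and `records_seen_by("patient_A")` identifies exactly which records were exposed to them.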
Auditing is one of those back-room functions. It's like spending money on rewiring your home's electricity instead of remodeling your kitchen: it is extremely important because it keeps you safe, but it is also very un-sexy, and like homeowners, business owners do not want to spend money on something whose value is not apparent until something goes wrong.
If the auditing function is weak, then assumptions must be made; and you know what they say about assumptions.
Who’s to Blame?
In this situation, it appears that everyone loses.
Nobody is perfect, but imagine you work in the healthcare industry, and you receive a notification from the software company, a company that you trust with your patients' most sensitive information, that says a bug in the system has caused half of your patients' private information to be breached. And if that wasn't bad enough, because this caused a clear HIPAA violation, you have to notify not just your patients but also the Office for Civil Rights (OCR).
It's like paying a professional to do your taxes and then being audited, only to find that the accountant committed fraud but that you are responsible. You thought you did everything right, and you did; your only mistake was choosing the wrong partner. In this case, the impacted practices will not only have to deal with (understandably) upset patients, but will also incur legal and other related expenses.
That being said, depending on the number of patients impacted and the underlying circumstances of the violation, it is possible that the OCR will open an investigation into both the software company and the medical practice. That may get the practice off the hook for this particular charge, but it also opens it up to being dissected by the OCR, which, if its HIPAA compliance is not on point, could lead to additional penalties or even fines.
Software companies need to make sure that their programs are tested to their breaking points; prevention is always preferable to a cure. Their healthcare customers are held to the high standards of HIPAA, and that is always important to keep in mind. Audit logs should be robust and detailed enough to demonstrate compliance.
Healthcare practices need to take responsibility as well, because they are the parties held to HIPAA standards. If the software is not up to par, they need to bring it into compliance as soon as possible. And both parties need to cover themselves with business associate agreements.
As the butterfly effect suggests, one little action, one tiny bug, can cause a huge impact.