In my role as Vice-Chair of the HIT Standards Committee, I join many of the subcommittee calls debating the standards and implementation guidance needed to support meaningful use. Over the past few months, I've learned a great deal from the Privacy and Security Workgroup.
Here are my top 5 lessons about security for healthcare information exchange.
1. Security is not just about using the right standards or purchasing products that implement those standards; it's also about the infrastructure on which those products run and the policies that define how they'll be used. A great software system that supports role-based security is not so useful if everyone is given the same role and access permissions, and running great software on a completely open wireless network can still compromise privacy. (A brief sketch of the role-based point follows this list.)
2. Security is an end-to-end process. The healthcare ecosystem is only as secure as its weakest link. Thus, each application, workstation, network, and server within an enterprise must be secured to a reasonable extent. Only by creating a secure enterprise can healthcare information exchange be secured between enterprises.
3. As stated in #1, policies define how security technology is used. However, the US does not have a single, unified healthcare privacy policy - we effectively have 50 of them, since HIPAA does not pre-empt more stringent state laws. This means that products will need the technology capabilities to support heterogeneous policies. For example, one organization may allow a clinician to authenticate with a simple username/password, while a government agency might require a smart card, biometrics, or a hardware token. (See the second sketch after this list.)
4. Security is a process, not a product. Every year hackers will innovate and security practices will need to be enhanced to protect confidentiality. Security is also a balance between ease of use and absolute protection. The most secure library in the world would be one that never checked out books.
5. Security is a function of budgets. I spend over $1 million per year on security work at BIDMC. Knowing that rural hospitals and small practices have limited budgets, we need to set security requirements at a pace they can afford. Imposing Department of Defense 'nuclear secrets' security technology on a small doctor's office is not feasible. Thus, the Privacy and Security Workgroup has developed a matrix of required minimum security standards to be implemented in 2011, 2013, and 2015, realizing that some users will go beyond these minimums.
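To make point 1 concrete, here is a minimal sketch in Python of a role-based access check. The role names and permissions are hypothetical, not drawn from any particular EHR product; the point is that the code only protects anything if the role assignments behind it are differentiated by policy.

```python
# Minimal role-based access control sketch (hypothetical roles/permissions,
# not any specific EHR product's model).

ROLE_PERMISSIONS = {
    "physician":     {"read_chart", "write_orders", "read_labs"},
    "nurse":         {"read_chart", "read_labs"},
    "billing_clerk": {"read_demographics"},
}

def is_allowed(user_role: str, permission: str) -> bool:
    """Return True only if the user's role grants the requested permission."""
    return permission in ROLE_PERMISSIONS.get(user_role, set())

# The software works, but the provisioning policy decides its value: if every
# user is simply given the "physician" role, the check always succeeds and the
# role-based model provides no real protection.
assert is_allowed("physician", "write_orders")
assert not is_allowed("billing_clerk", "read_labs")
```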
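And for point 3, a similarly hedged sketch of how a single product might enforce heterogeneous authentication policies. The organization names and required factors are made up for illustration, not taken from any standard's vocabulary.

```python
# Minimal sketch of policy-driven authentication requirements
# (hypothetical organizations and factor names).

AUTH_POLICY = {
    "community_clinic": {"password"},
    "state_agency":     {"password", "hardware_token"},
    "federal_agency":   {"password", "smart_card", "biometric"},
}

def login_satisfies_policy(org: str, factors_presented: set) -> bool:
    """One product, many policies: require whatever the organization's policy demands."""
    required = AUTH_POLICY.get(org, {"password"})
    return required.issubset(factors_presented)

assert login_satisfies_policy("community_clinic", {"password"})
assert not login_satisfies_policy("federal_agency", {"password", "smart_card"})
```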
Privacy and security are foundational to ARRA and meaningful use. Since patients will only trust EHRs if they believe their confidentiality is protected by good security, there will be increasing emphasis on better security technology and implementation over the next few years.
Although some may find increased security cumbersome, our goal of care coordination through health information exchange depends on robust security technology, infrastructure and best practices.
3 comments:
I worked for years in the legal community building Internet-connected software. Now as I delve into the Health IT world, I notice many of the same issues.
It's not insurmountable.
We discovered that a system's ability to do a few things helped with the security process:
1. Audit - keep a trail of what happened and when. This trail should be detailed yet easy to report against, whether you are investigating an incident or simply trying to understand how the system works so you can improve it. That means every portion of the system needs to support the audit trail, not just the "application". If the hosting OS doesn't let me know that someone logged in and became root, that's a problem. Every event should be auditable and reportable, and this should be built in so even small practitioners can take advantage of it. (A minimal sketch follows this list.)
2. "Heuristic" systems - Like the doctor said, Security isn't some product, it's a process. But products can help enable and streamline that process. I like heuristic and trained systems, like Intrusion Detection/Prevention, application firewalls, network behavior analysis and systems like those that help us be proactive in security and, like audit trails, help us know what's going on.
3. Like the doctor points out, the weakest link is your vulnerability. I can have whiz-bang Infosec, but if I can call your receptionist and con her into giving me passwords/information, it's all useless. That's why audit trails and early detection are important. If you have enough layers in place and good ways to monitor them, someone's going to hit your tripwire before they get to your pot of gold.
4. Encrypt everything. The whole chain. From point A to point B. I don't care if you have private lines or your computer is locked behind steel doors. Encryption is easy for IT professionals to do these days. Really. Encrypt it all. And keep a key escrow somewhere very, very safe just in case you lose your keys. (See the second sketch after this list.)
5. Educate. I like to think that my profession is oddly mysterious and inaccessible to all but the most devoted of this cabal, but the truth is... it's not so fancypants. As people, we know it's best practice to lock our cars, turn on the alarms in our office before we go home, and not reveal our PINs. Educate people who work with computers the same way. There are some basic principles that, when taught, can drastically improve security across the board just by making the average person an active participant in your security.
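As a rough illustration of point 1 above, here is a minimal audit-trail sketch in Python using only the standard library. The field names are hypothetical, and a real deployment would also need tamper evidence, retention rules, and access controls on the log itself.

```python
# Minimal structured audit-trail sketch: one JSON record per line so the
# trail is detailed but still easy to report against.
import json
import time

def audit_event(source: str, actor: str, action: str, detail: str) -> str:
    """Build one audit record as a JSON line."""
    record = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "source": source,   # application, host OS, network device, etc.
        "actor": actor,
        "action": action,
        "detail": detail,
    }
    return json.dumps(record)

# Events from every layer land in the same reportable trail, not just the app:
with open("audit.log", "a") as log:
    log.write(audit_event("ehr_app", "dr_smith", "view_chart", "patient=12345") + "\n")
    log.write(audit_event("host_os", "jdoe", "privilege_escalation", "su to root") + "\n")
```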
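And for point 4, the mechanics really are easy. Here is a minimal sketch using the third-party cryptography package's Fernet recipe (my choice of library, not anything mandated above - any mature library would do). Key management and escrow remain the hard part.

```python
# Minimal symmetric-encryption sketch using the third-party "cryptography"
# package (pip install cryptography). This shows only the mechanics; key
# management and escrow are the parts that need real planning.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, store a copy in key escrow
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"Patient 12345: HbA1c 7.2%")
assert cipher.decrypt(ciphertext) == b"Patient 12345: HbA1c 7.2%"
```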
I probably have more, but I really liked this column and just wanted to respond quickly.
Good points. I wonder if not having a national security policy is the reason we have so much more difficulty adopting EHRs and HIE than other countries do.
Perhaps there is more incentive to access PHI because much of our healthcare is a for-profit venture?
I see the future of healthcare data warehousing as an environment in which patients have control of their information and move from simply being patients to being fully informed consumers.