Wednesday, December 9, 2009

Advice to Beacon Communities

On December 2, David Blumenthal announced the $235 million Beacon Community Program to accelerate and demonstrate the ability of health IT to transform local health care systems.

As we all think about how best to submit our applications (15 communities will be chosen), here are a few guiding principles:

1. This must be a public/private partnership - we need to ensure the "public good" needs of population health reporting, disparities in care reporting, immunization registries, administrative simplification, and biosurveillance are met. We must ensure the needs of private practices to achieve meaningful use are considered, including electronic lab workflow, e-prescribing, and clinical summary exchange. We must include payers, both public and private.

2. We need to leverage the work that has already been done. In the case of Massachusetts, we have multiple organizations, such as:

Massachusetts eHealth Collaborative - implements EHRs

New England Healthcare Exchange Network - exchanges healthcare data

Massachusetts Health Data Consortium - develops healthcare IT policy and educates the community

Eastern Massachusetts Healthcare Initiative - provides a guiding coalition of payer and provider CEOs

Massachusetts eHealth Institute - the state organization serving as the distribution point for Federal funds

Massachusetts Health Quality Partners - the quality analysis organization

Boston Public Health Commission - the public health reporting entity

All of these organizations need to work together to create a single Beacon Community application. Having multiple applications from a region purely because of political infighting does not telegraph the kind of collaboration needed to be a beacon for others.

3. The focus must be on quality and efficiency, not IT for the sake of IT. There must be a measurable outcome - better wellness, fewer strokes/cardiac events, fewer hospitalizations.

4. There must be great governance - a steering committee and a series of working groups that can make tough decisions on detailed issues.

5. The strategy should be easily understood by all, e.g. 40% of all clinicians in our community will have certified EHRs that are meaningfully used and exchanging data with other providers, payers, and patients for coordination of care and quality improvement. We will create the foundation for healthcare reform in which organizations are held accountable and rewarded for patient wellness, not for delivering more fee-for-service care.

Thus, gather your stakeholders, create a collaboration, and ensure that IT raises the bar for everyone. Closed and proprietary IT is no longer a strategy, since healthcare is a zero-sum game balancing the interests of employers, payers, and providers in the service of the patient.

Tuesday, December 8, 2009

Advice to Health Information Exchanges

With the HIE grants around the corner, many communities are asking me for advice to prioritize their local/regional infrastructure and applications. Here are my recommendations based on leading an HIE for the past 10 years.

1. Define your business requirements based on the value proposition for participants. This will maximize the likelihood that stakeholders will pay for the services they receive, ideally based on gainsharing a portion of their cost avoidance, e.g. if a lab result costs $1.00 to send via email and $0.20 to send via the HIE, the hospital can pocket $0.80 and everyone wins.
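As a back-of-the-envelope sketch, the gainsharing arithmetic might look like this (the 25% HIE share below is a hypothetical fraction, not a quoted rate):

```python
def gainshare(legacy_cost_cents: int, hie_cost_cents: int,
              hie_share: float = 0.25) -> tuple:
    """Split the cost avoided by routing a transaction through the HIE.

    Returns (participant_keeps, hie_fee) in cents. The default 25%
    HIE share is a hypothetical gainsharing fraction.
    """
    saved = legacy_cost_cents - hie_cost_cents
    hie_fee = round(saved * hie_share)
    return saved - hie_fee, hie_fee

# The lab example above: $1.00 legacy delivery vs. $0.20 via the HIE.
keeps, fee = gainshare(100, 20)
```

With a zero HIE share, the hospital keeps the full $0.80 of avoided cost; any positive share funds the exchange out of savings rather than out of new spending.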

2. Ensure you have a business model - subscription-based is good, since grants are not a business model. Transaction fees are an impediment to commerce - they are a disincentive to using the HIE. A fixed subscription fee encourages maximal use of the HIE.

3. Ensure you have policies for consent, auditing, and authorization of trading partners. Policies constrain technology choices and ensure the exchange is trusted.

4. Start with Push transactions. This is a very important point. I've written several blogs about the importance of using the simplest possible standards to achieve the business goal. A RESTful approach which enables clinical data to be pushed from one clinician to another is simple to implement since it leverages existing standards and capabilities of web servers without requiring mastery of more complex techniques. In Massachusetts we push admission notification, discharge summaries, ED summaries and other CDA documents. We're currently implementing push of quality data to registries and public health surveillance to the Boston Public Health Commission. The advantage of Push is that it does not require a master patient index, since the messaging is provider to provider or organization to organization. Consent is easy since the patient simply consents for the push. Wes Rishel has written several great blogs about this approach, which are important to read as they reflect some of the discussion in the HIT Policy Committee's NHIN Working Group.
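A minimal sketch of what a RESTful push could look like, using only the Python standard library - the gateway URL is hypothetical, and a production push would add TLS mutual authentication and document metadata:

```python
import urllib.request

def build_push_request(endpoint: str, cda_xml: bytes) -> urllib.request.Request:
    """Build an HTTP POST that pushes a CDA document to a receiving
    organization's gateway. The endpoint URL is hypothetical."""
    return urllib.request.Request(
        endpoint,
        data=cda_xml,
        headers={"Content-Type": "application/xml"},
        method="POST",
    )

# Example: a discharge summary pushed from one organization to another.
req = build_push_request(
    "https://gateway.example.org/documents",  # hypothetical endpoint
    b"<ClinicalDocument><!-- CDA payload --></ClinicalDocument>",
)
# urllib.request.urlopen(req) would send it; omitted here since the
# endpoint is fictional.
```

The appeal of this approach is exactly what the paragraph above describes: an ordinary web server on the receiving end is enough, with no master patient index or document registry required.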

5. Have a vision for Pull transactions. At some point, the usual ED use case will be mentioned - how do you pull together the entire history of a patient from all the sites in a community where they have received care? You'll need a master patient index, a record locator service (which institutions have records on the patient), and a means to pull summaries from each site. This pull infrastructure is much more complicated and expensive. Consent is much more complex - who can pull what from where and when? Instead of a provider-to-provider push involving two clinicians, pull could result in hundreds of people viewing a record. Push is great, but it really does not help the Emergency Department gather data about a patient for life-saving treatment. Eventually we'll need to implement Pull.
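A toy, in-memory sketch of the Pull moving parts described above - master patient index, record locator service, and per-site summary retrieval - with entirely hypothetical identifiers:

```python
# Hypothetical data: demographics -> community patient ID (the MPI),
# patient ID -> sites holding records (the record locator service),
# and per-site clinical summaries.
MPI = {("SMITH", "1970-01-01"): "mpi-123"}
RLS = {"mpi-123": ["Hospital A", "Hospital B"]}
SITE_SUMMARIES = {
    ("Hospital A", "mpi-123"): "A: allergy list, med list",
    ("Hospital B", "mpi-123"): "B: recent ED visit summary",
}

def pull_history(name: str, dob: str) -> list:
    """Aggregate a patient's summaries from every site that holds records."""
    patient_id = MPI.get((name, dob))
    if patient_id is None:
        return []                    # unknown patient: nothing to pull
    sites = RLS.get(patient_id, [])  # record locator: which sites have data
    return [SITE_SUMMARIES[(s, patient_id)] for s in sites]

history = pull_history("SMITH", "1970-01-01")
```

Even in this toy form, the extra machinery is visible: two lookup services must exist and be correct before a single summary can be retrieved, which is why Pull is so much harder than Push.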

As I've said in my blog about the Genius of the AND - the right approach is Pull and Push - REST and SOAP.

In the upcoming weeks, we'll have the interim final rule on standards. In the upcoming months the NHIN Working Group will provide us with policy guidance. The 5 points above plus the work of the HIT Policy/HIT Standards Committees should provide the guidance that HIEs need.

Monday, December 7, 2009

Consistent Time

I was recently asked by my staff how we should coordinate the time of day across organizations which exchange healthcare information. In a future which treats data from outside data sources as appropriate for clinical decision-making, you can imagine the following data exchange:

Hospital 1 posts lab result 12:01pm
Hospital 1 sends result to Hospital 2 12:02pm
Hospital 1 revises lab result 12:15pm
Hospital 1 sends revision to Hospital 2 12:16pm
Hospital 2 enters an order 12:17pm

Time synchronization among participants in a healthcare information exchange is important. If Hospital 2's clocks were 3 minutes slow, it would be challenging to know if the order was entered based on the original or revised lab result.
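The ambiguity can be made concrete with a few lines of Python - if Hospital 2's clock runs 3 minutes slow, its logged order time falls before the revision was even sent:

```python
from datetime import datetime, timedelta

# Timeline from the example above, on Hospital 1's (correct) clock.
revision_sent = datetime(2009, 12, 7, 12, 16)       # revised result sent
order_entered_true = datetime(2009, 12, 7, 12, 17)  # true order time at Hospital 2

# If Hospital 2's clock is 3 minutes slow, its audit trail records:
skew = timedelta(minutes=3)
order_entered_logged = order_entered_true - skew    # appears as 12:14pm

# The logged order time now precedes the revision, so the combined
# audit trail wrongly suggests the order used the original result.
ambiguous = order_entered_logged < revision_sent
```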

HITSP has published T16, the Consistent Time Transaction to help address this problem. It's based on an IHE profile created to support the synchronization of security audit logs.


Here is the relevant section (IHE ITI TF Vol. 2, Section 3.1.4.1) from the IHE Consistent Time (CT) profile:
"The NTP transactions are described in detail in RFC1305. There is also extensive documentation on the transactions and recommendations on configurations and setup provided at http://www.ntp.org. Rather than reproduce all of that material as part of this Framework, readers are strongly encouraged to explore that site. The most common mode is the query-response mode that is described below. For other forms, see RFC1305 and the material on http://www.ntp.org.
The Time Server shall support NTP (which implicitly means that SNTP clients are also supported). Secure NTP may also be supported. The Time Client shall utilize NTP when it is grouped with a Time Server, or when high accuracy is required. For ungrouped Time Clients with 1 second accuracy requirements, SNTP may be useable. Time Clients may also support Secure NTP."
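The query-response mode referenced above boils down to simple clock algebra: the client records its send and receive times, the server records its receive and send times, and the clock offset and round-trip delay fall out of four timestamps. A sketch of the RFC 1305 calculation:

```python
def ntp_offset_delay(t1: float, t2: float, t3: float, t4: float) -> tuple:
    """Standard NTP clock algebra (RFC 1305).

    t1: client transmit time (client clock)
    t2: server receive time (server clock)
    t3: server transmit time (server clock)
    t4: client receive time (client clock)
    Returns (offset, round_trip_delay) in the same units as the inputs.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Illustrative numbers: client clock 5 seconds slow, 1 second of
# network transit each way, zero server processing time.
offset, delay = ntp_offset_delay(t1=100, t2=106, t3=106, t4=102)
```

The client then slews its clock by the computed offset; repeated exchanges let it average out network jitter.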

Although originally designed for audit trails, consistent time has been extended to other transactions, since organizations have realized that having synchronized clocks really helps documentation integrity and workflow. As the use of consistent time is extended beyond audit trails, there are interesting questions about just how precisely synchronized devices in a network should be - a few seconds, one second, a subsecond?

At BIDMC, we point to stratum 1 time servers - machines directly attached to atomic reference clocks.

The interesting question for HIEs is what should be synchronized.

My hospital servers are all synchronized against one set of time sources.

Our HIE, NEHEN, has suggested that all the gateways used to exchange data among multiple hospitals should be synchronized with one time source to ensure that all send and receive timestamps for clinical data exchange are consistent. Otherwise, data might arrive at one hospital before it leaves another!

However, since HIE gateways will be synchronized with one time source and hospital internal servers may be synchronized with others, the HIE time may vary from the hospital time.

Maybe the right answer is that, as part of our national healthcare IT effort, we should mandate that all hospitals and HIEs use a single set of known-adequate time servers to ensure all healthcare time is consistent.

For the moment, the following strategy seems reasonable:

1. Require that hospitals use NTP to ensure their internal time stamps are consistent. This will ensure that audit trails within an organization, whether merged in an audit repository or just reported from disparate systems upon request, are consistent.

2. Synchronize health information exchange gateways within an HIE to a single time source so that transactions have consistent send and receive times.

If we know the hospital audit trail time stamp is consistent and we know the HIE send/receive times are consistent, we can recreate any event that is disputed.

Expecting every hospital to change its time synchronization servers to those used by the HIE is unrealistic - what if the hospital participates in multiple HIEs?

At some future time, we all may change to a national healthcare time server that is part of the NHIN, but for now the hospital use of NTP will be decoupled from the HIE use of NTP.

Friday, December 4, 2009

Cool Technology of the Week

In June, I bought a Strida folding bike and changed my commute pattern so that I park outside of town and cycle to all my meetings.

Now that the days are shorter, I'm cycling in twilight and I needed a set of lights for visibility. I chose the Knog Frog Lights from Australia.

These lights are water-resistant, weigh 12 grams, burn for 160 hours on 2 coin cells, and have a flexible silicone body with an integrated clipping/quick-release mount. The 10,000 millicandela LED is visible up to 600 meters. I purchased a white light for my handlebars and a red light for my seatpost. I push on the silicone housing and set them to flashing mode at dusk.

They've been a great addition to my Strida and keep me safe in Boston traffic.

2 bike lights burning for 160 hours that weigh under an ounce. That's cool!

Thursday, December 3, 2009

My Christmas List

As I prepare the list of the few gifts I want for Christmas, I'm guided by first principles. Here's a great column that reflects on what we have and what we keep. Know what you own and ensure it's the minimal right stuff.

For me, there are only two items on my Christmas list
- A small home workbench that enables me to help my family with electrical, plumbing, carpentry and painting projects.
- A bike repair stand that attaches to the workbench so that I can maintain the gears, cables, and components of my family's bikes.

As stated in the column above, I try to own little, but if there are tools I use every day, I try to own the right ones. My personal tools are all Craftsman and my bike maintenance equipment is from Park Tool.

The good news about being very limited and precise with your belongings is that there are no impulse buys, no Black Fridays and no shopping stress, just a plan.

I plan to buy them online with Sears and REI gift cards, then assemble them on the weekend after Christmas.

Wednesday, December 2, 2009

Strong Identity Management

In addition to audit trails, a key component of enforcing security policy is ensuring the identity of those who use applications. In the November 19th HIT Standards Committee testimony, we heard about the need for strong identity management.

Currently, most systems support username/password authentication with various rules, such as those we use at BIDMC:

Passwords must be at least eight (8) characters in length
Passwords must contain characters from at least three (3) of the following four (4) classes:
English upper case letters A,B,C,...Z
English lower case letters a,b,c,...z
Westernized Arabic numerals 0,1,2,...9
Non-alphanumeric ("special characters") such as punctuation symbols: !,@,#...
New passwords must be different from previously used passwords.
Under no circumstances should passwords contain your username or any part of your full name or other easily identifiable information.
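A sketch of how the character-class rules above might be checked programmatically (the password-history rule needs stored hashes and is omitted):

```python
import string

def meets_policy(password: str, username: str) -> bool:
    """Check a candidate password against the BIDMC-style rules above:
    minimum length, at least three of four character classes, and no
    embedded username."""
    if len(password) < 8:
        return False
    if username and username.lower() in password.lower():
        return False
    classes = [
        any(c in string.ascii_uppercase for c in password),  # upper case
        any(c in string.ascii_lowercase for c in password),  # lower case
        any(c in string.digits for c in password),           # numerals
        any(not c.isalnum() for c in password),              # special chars
    ]
    return sum(classes) >= 3
```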

However, it's clear that something stronger than a username/password will be needed for e-prescribing controlled substances. The DEA has insisted upon NIST Level 3 authentication. What do levels of authentication mean?

Level 1 is the lowest assurance and Level 4 is the highest. The levels are based on the degree of confidence needed in the process used to establish identity and in the proper use of the established credentials.

Level 1 - Little or no confidence in the asserted identity’s validity. Level 1 requires little or no confidence in the asserted identity. No identity proofing is required at this level, but the authentication mechanism should provide some assurance that the same claimant is accessing the protected transaction or data.

Level 2 - Some confidence in the asserted identity’s validity. Level 2 requires confidence that the asserted identity is accurate. Level 2 provides for single-factor remote network authentication, including identity-proofing requirements for presentation of identifying materials or information.

Level 3 - High confidence in the asserted identity’s validity. Level 3 is appropriate for transactions that need high confidence in the accuracy of the asserted identity. Level 3 provides multifactor remote network authentication.

Level 4 - Very high confidence in the asserted identity's validity. Level 4 is for transactions that need very high confidence in the accuracy of the asserted identity. Level 4 provides the highest practical assurance of remote network authentication. Authentication is based on proof of possession of a key through a cryptographic protocol.

If Level 3 authentication is implemented in healthcare for prescribing controlled substances, strong identity management may be expanded to other aspects of healthcare such as signing notes, signing orders, or gaining physical access to restricted areas.

Given the workflow implications of an added authentication burden, it's important to choose the right technology approach.

There are a wide range of two-factor authentication methods, including security tokens, smart cards, biometrics, certificates, soft tokens, and cell phone-based approaches.

I've had experience with each of these. Here's a summary of my findings

Tokens - you'd think tokens would be easy to use, but we had a high login failure rate, challenges with tokens getting lost/destroyed (in the laundry), time synchronization issues (as the battery begins to age, the clock inside the token may begin running slowly), and clinician dissatisfaction with having to carry yet another device. A clinician with multiple affiliations has an even worse problem - multiple tokens to carry around. Token and licensing costs were high.

Smart cards - we use smart cards for physical access and they work well. They are foolproof to use, can be laundered without an issue, and are inexpensive. The only problem with using them in software authentication is the expense of adding smart card readers to our 8000 workstations. Buying and maintaining 8000 USB devices is costly. However, they are still a serious consideration, since clinicians like the idea of walking up to a device and using something they already have - a badge - to authenticate.

Biometrics - I've written about our use of BIO-key in the Emergency Department. Biometrics are convenient because you can just swipe a finger, which you always have with you (we hope). Many laptops have built-in fingerprint readers, and the BIO-key software easily integrates web applications into Active Directory. As with smart cards, the only challenge is installing and maintaining fingerprint scanners on 8000 existing desktops. Biometrics have been very popular with our clinicians, and we've had a very low false negative rate (and zero false positives).

Certificates - managing certificates for 20,000 users is painful. We've done it and although I am a strong believer in organization level certificates, I remain unconvinced that user level certificates are a good idea. Maybe new approaches like Microsoft's Infocard, which presents digitally signed XML-based credentials, will make storage and presentation of cryptographic credentials easier.

Soft tokens - these are just a software version of a hardware token, running on a mobile device or desktop. Since software must be installed and maintained on each device, they can be a challenge to support.

Cell phone-based approaches - Harvard Medical School recently implemented two-factor authentication with cell phones as a way of securing password reset functions. It's been popular, easy to support, and very low cost. Companies such as Anakam offer tools and technology to implement strong identity management on cell phones via text messaging, voice delivery of a PIN, or voice biometric verification. Per the Anakam website, their products achieve full compliance with NIST Level 3, are scalable to millions of users, cost less than hard tokens or smart cards, are installable in the enterprise without added client hardware/software, and are easy to use (all you have to do is answer a phone call or read a text message).
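A minimal sketch of the text-message PIN flow - the server issues a short-lived, single-use PIN and verifies the user's reply. The 5-minute window and 6-digit length are assumptions, and actual SMS delivery is stubbed out:

```python
import secrets
import time

PIN_TTL_SECONDS = 300      # assumed 5-minute validity window
_pending = {}              # username -> (pin, expiry time)

def issue_pin(username: str, now: float = None) -> str:
    """Generate a 6-digit PIN for out-of-band delivery to the user."""
    now = time.time() if now is None else now
    pin = f"{secrets.randbelow(1_000_000):06d}"
    _pending[username] = (pin, now + PIN_TTL_SECONDS)
    # In a real system, the pin would be texted or spoken to the
    # user's registered phone here.
    return pin

def verify_pin(username: str, pin: str, now: float = None) -> bool:
    """Check the user's reply; each PIN is single use and expires."""
    now = time.time() if now is None else now
    record = _pending.pop(username, None)   # pop makes it single use
    if record is None:
        return False
    expected, expiry = record
    return now <= expiry and secrets.compare_digest(pin, expected)
```

The second factor here is possession of the phone; the knowledge factor (the usual password) is checked separately before the PIN is ever sent.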

Thus, my vote for achieving NIST Level 3 is to choose among smart cards, biometrics, or cell phone-based approaches, depending on the problem to be solved and the workflow that is being automated. Although we've not yet implemented cell phone approaches for EHR authentication, I can imagine that our 2011 authentication strategy might be:

Physical Access (hundreds of existing doors that have smart card readers) - Smart cards

Fast trusted login in the Emergency Department (100 devices that are kept in a closed physical space) - Biometrics

Generalized two factor authentication for e-prescribing controlled substances (thousands of devices and hundreds of users) - Cell phone approaches

With strong identity management, our audit trails will have greater value. It will be challenging for a user to claim that they were not the person performing the transaction. The combination of trusted identity and complete audit trails is key to a multi-layered defense against privacy breaches.

Tuesday, December 1, 2009

Standardizing Audit Trails

Over the past month, the HIT Standards Committee and its Privacy and Security Workgroup have discussed the simplest set of security standards for 2011 and beyond.

We've had debates about audit trails, strong identity management, and consistent time. Each of these topics deserves its own entry, and I'll start with audit trails. Thanks to John Moehrke of GE for background information - he was part of the team that wrote ATNA, the IHE profile on auditing, which is based on ASTM E2147, the healthcare-specific audit trail standard.

What is an audit trail and how does it differ from a disclosure log?

Audit is simply a record of system events -- it does not get into "why" as does accounting for disclosures (which includes release of information to external organizations for a specific purpose). HIPAA requires that organizations:

1. Implement hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use electronic protected health information; and

2. Implement procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports.

The first HIPAA requirement is what should be captured in the EHR product certification criteria, and the second is a "meaningful use" policy statement. While the ATNA profile draws upon the ASTM E2147 standard in its specification of the data elements to be captured in an audit trail, it goes beyond what either ASTM or HIPAA require in its focus on the concept of centralizing audit information by having multiple systems send audit messages to an audit repository. The ASTM standard specifies the data elements to be captured when auditing accesses to protected healthcare information, but goes beyond ATNA by also specifying a list of data elements to be included in an accounting of disclosures.
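To make the repository concept concrete, here is a sketch of building a minimal RFC 3881-style AuditMessage, the XML payload that ATNA nodes forward to a central audit repository (typically over syslog). The event code and field choices are illustrative, not a complete profile:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def build_audit_message(event_code: str, outcome: int, user_id: str) -> bytes:
    """Build a minimal RFC 3881-style AuditMessage. Element names
    follow RFC 3881; the specific fields chosen here are illustrative."""
    msg = ET.Element("AuditMessage")
    event = ET.SubElement(msg, "EventIdentification", {
        "EventActionCode": "R",                      # R = read/view
        "EventDateTime": datetime.now(timezone.utc).isoformat(),
        "EventOutcomeIndicator": str(outcome),       # 0 = success
    })
    ET.SubElement(event, "EventID", {"code": event_code})
    ET.SubElement(msg, "ActiveParticipant", {"UserID": user_id})
    return ET.tostring(msg)

# Hypothetical event: a clinician viewing a record (code illustrative).
record = build_audit_message("110101", 0, "dr.jones")
```

The centralization ATNA adds is simply that every participating system emits records like this to one repository, so a single query can reconstruct who touched a patient's data across systems.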

The issue with audit trail standards is determining how specific to be about their requirements. Technical requirements for auditing accesses in EHR systems, from most restrictive/intrusive to least could be:

1. Cross-enterprise ATNA - the ability of an organization in a health information exchange to query another organization's audit trail. This is helpful in maintaining trust because there is a virtual audit trail for the community.

2. Intra-enterprise audit integration - the capability to construct a continuous audit trail across systems within an organization using ATNA (such as the Veterans Administration Auditing Service project).

3. The first HIPAA requirement above, which is a policy "standard" not a technological one.

Initially, we debated the need for cross-enterprise audit trails. Shouldn't ARRA privacy requirements be met with just #3, a policy?

In our Security Testimony on November 19th, we heard that availability of audit trails between organizations is one of the most important elements of building trust in a health information exchange.

Automating audit trails at the organizational level, #2, seems like a wonderful solution to security, breach notification, and privacy. There have been very public exposures of health data, and in each case the people who violated privacy were discovered through the use of audit logs.

When it comes to real world implementation of audit logs, I know there is a big difference between legacy software and new applications. Typically EHRs have built up audit logging incrementally, based on customer requests, not via foundational design. Ideally, audit logging would have been architected to be a replaceable code module or web service, but this is not likely in existing applications. However, Healthcare Information Exchange/NHIN is largely a greenfield. This is new code that could be held to higher standards.

Thus, one possibility is to use #3 (a policy) for organization-level audit trails, but to use a standards-based audit repository (ATNA) for Health Information Exchanges. For internal security events, it could be acceptable to record audit logs in a proprietary format as long as they are exportable to a text file format with a well-formed timestamp.
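A sketch of what such an exportable text line might look like - an ISO 8601 UTC timestamp followed by tab-separated fields. The field set is an illustrative minimum, not a mandated format:

```python
from datetime import datetime, timezone

def export_line(event: str, user: str, patient: str) -> str:
    """One exported audit-log line: a well-formed ISO 8601 UTC
    timestamp plus tab-separated fields. Fields are illustrative."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    return f"{stamp}\t{event}\t{user}\t{patient}"

# Hypothetical event: a clinician viewing a record.
line = export_line("record-viewed", "dr.jones", "mrn-0042")
```

Because the timestamp is unambiguous and machine-parseable, logs exported this way from different proprietary systems can still be merged and sorted during an investigation.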

We'll be having an ongoing dialog about this topic over the next month. I welcome your input.