Tuesday, May 26, 2009

A Personal Reflection on Standards Harmonization

As HITSP prepares for the demands of ARRA by reorganizing its work around meaningful use rather than use cases, here is my view of the state of standards harmonization in the US. This is my personal opinion, not a statement from HITSP or ONC.

1. Medication management and e-prescribing

This area is very mature and widely implemented.

NCPDP Script 10.5 is the right messaging standard to support e-prescribing workflow in ambulatory and long term care settings.

The National Library of Medicine's RxNorm is the right vocabulary to specify medication names.

The Food and Drug Administration's Unique Ingredient Identifier (UNII) is the right vocabulary for chemical substances and is especially useful in allergy checking.

Structured SIG, although still evolving, is good enough to describe the way to take a medication.

The Veterans Administration's National Drug File Reference Terminology (NDF-RT) is the right vocabulary for medication class and is especially useful in drug/drug interaction checking and formulary enforcement.

There are few controversies in the medication standards area. The only outstanding issue is that some of these standards, such as Structured SIG and RxNorm, are relatively new and continue to evolve.
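
To illustrate why a medication-class vocabulary like NDF-RT is so useful for drug/drug interaction checking, here is a minimal sketch. Everything in it — the class names, the interaction table, and the helper function — is invented for illustration; a real system would look up classes in NDF-RT itself and key medications by RxNorm code.

```python
# Hypothetical sketch of class-based drug/drug interaction checking, in the
# style that NDF-RT enables. Classes and interaction pairs are illustrative.

# Map each medication to its drug classes (a real system would key by RxNorm
# code and query NDF-RT for the classes).
MED_CLASSES = {
    "warfarin": {"anticoagulant"},
    "aspirin": {"nsaid", "antiplatelet"},
    "lisinopril": {"ace_inhibitor"},
}

# Pairs of classes known to interact (illustrative only).
INTERACTING_CLASS_PAIRS = {
    frozenset({"anticoagulant", "antiplatelet"}),
    frozenset({"anticoagulant", "nsaid"}),
}

def check_interactions(med_list):
    """Return the set of interacting class pairs found on a med list."""
    hits = set()
    for i, med_a in enumerate(med_list):
        for med_b in med_list[i + 1:]:
            for class_a in MED_CLASSES.get(med_a, ()):
                for class_b in MED_CLASSES.get(med_b, ()):
                    pair = frozenset({class_a, class_b})
                    if pair in INTERACTING_CLASS_PAIRS:
                        hits.add(pair)
    return hits
```

The point of the class-based design is that one rule ("anticoagulant + NSAID") covers every drug in each class, instead of enumerating every drug pair.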

2. Laboratory

HL7 2.5.1 is good enough for results reporting to EHRs, public health, and biosurveillance.

LOINC is the right vocabulary for lab test names.

UCUM, although very new, is a reasonable vocabulary to describe units of measure.

The only controversy around lab is the timing of implementation, given that thousands of commercial labs in the US need to update their interfaces to support HL7 2.5.1, LOINC, and UCUM. Of these, standardizing units of measure with UCUM is probably the most contentious, since UCUM is new to lab stakeholders. I've recently spoken with healthcare IT leaders from other countries, and all agree that standardizing units of measure for labs is a priority and should move forward.
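
The interface work that lab stakeholders face is largely a normalization step: mapping each lab's local unit spellings onto UCUM codes. Here is a minimal sketch; the local spellings on the left are invented examples, while the codes on the right are valid UCUM expressions.

```python
# Sketch of the unit-normalization step a lab interface might perform when
# adopting UCUM. Local spellings (keys) are invented; values are real UCUM.
LOCAL_TO_UCUM = {
    "mg/dl": "mg/dL",
    "mmol/l": "mmol/L",
    "gm/dl": "g/dL",
    "thou/ul": "10*3/uL",  # e.g. a white blood cell count
}

def to_ucum(local_unit):
    """Normalize a lab's local unit string to UCUM, or raise if unmapped."""
    key = local_unit.strip().lower()
    if key not in LOCAL_TO_UCUM:
        raise ValueError(f"no UCUM mapping for unit {local_unit!r}")
    return LOCAL_TO_UCUM[key]
```

In practice each lab would maintain such a mapping once per local unit string, after which every outbound result carries an unambiguous, computable unit.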

3. Clinical Summaries

Just about all stakeholders agree that clinical summaries (problem list, medication list, allergy list, diagnostic test reports, discharge summaries, other documents) should be represented in XML.

The question is which flavor of XML: HL7's Clinical Document Architecture (CDA) or ASTM's Continuity of Care Record (CCR).

HITSP harmonized these two approaches with the HL7 Continuity of Care Document (CCD).

The major controversy in the area of clinical summaries is the nature of the XML format and schema. Some have described the CDA as overly complex XML. I've also heard that some believe CCR's XML could be improved. My hope is that all stakeholders continue to work together to converge on a single, simple XML representation of a clinical summary that works for everyone and is more similar to the typical XML structures used widely on the web.
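
To make the "simple, web-style XML" aspiration concrete, here is a sketch of a deliberately flat clinical-summary fragment. The element names, attributes, and the code value are all invented for illustration — this is NOT the CCD or CCR schema, just the kind of shallow structure typical of XML on the web.

```python
# Sketch of a flat, web-style XML clinical-summary fragment. Element names
# and the code value are illustrative, not any real schema.
import xml.etree.ElementTree as ET

def medication_entry(name, code, sig):
    """Build one medication element with a name, a coded value, and a sig."""
    med = ET.Element("medication")
    ET.SubElement(med, "name").text = name
    ET.SubElement(med, "code", system="RxNorm").text = code  # code illustrative
    ET.SubElement(med, "sig").text = sig
    return med

summary = ET.Element("clinicalSummary")
meds = ET.SubElement(summary, "medications")
meds.append(medication_entry("lisinopril 10 mg oral tablet",
                             "0000000", "one tablet daily"))
xml_text = ET.tostring(summary, encoding="unicode")
```

A structure this shallow is easy to generate and parse with commodity web tooling, which is the argument made by those who find CDA's nesting and header machinery heavyweight for simple summaries.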

4. Quality Measures

The National Quality Forum's HITEP efforts have fostered a new way to represent quality measures in terms of a collection of data types. Additional work needs to be done to uniformly map these data types to specific standards. It's likely that the same standards mentioned above for medications, laboratory and clinical summaries will be suitable for transmitting quality measures to data marts.
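
The idea of expressing a measure against EHR data types, rather than against claims or chart abstraction, can be sketched as a numerator/denominator computation over coded records. The measure below ("diabetics with a recent HbA1c result") and the patient records are invented for illustration.

```python
# Sketch of a quality measure computed from coded EHR data, in the spirit
# of the NQF/HITEP data-type work. Measure logic and records are invented.

def hba1c_measure(patients):
    """Return (numerator, denominator) for a hypothetical HbA1c measure."""
    denominator = [p for p in patients if "diabetes" in p["problems"]]
    numerator = [p for p in denominator if p.get("has_recent_hba1c")]
    return len(numerator), len(denominator)

cohort = [
    {"problems": {"diabetes"}, "has_recent_hba1c": True},
    {"problems": {"diabetes"}, "has_recent_hba1c": False},
    {"problems": {"hypertension"}},
]
```

Once measures are stated this way, the same coded problem lists, medication lists, and lab results that flow in clinical exchange can feed the quality data marts directly.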

The only controversy in the world of quality measures is the need to rewrite existing measures in terms of EHR data types. The National Quality Forum will be the catalyst for such a project.

5. Common Data Transport

The above discussion on medications, labs, summaries and quality has been about content and vocabularies, not the secure transmission of data from place to place. How should transport work for all healthcare data exchange?

Just about everyone agrees that the Internet (TCP/IP with HTTPS) is the right approach. However, there are controversies about the other standards to be used: enveloping, authorization/authentication, and architecture.

Some have proposed simple RESTful web services. Some have suggested that SOAP with WS* constructs provides a more solid security framework.

In Massachusetts, we've used CAQH CORE Phase II with SOAP over HTTPS and X.509 certificates. We do nearly 100 million transactions a year with this approach and it works very well.
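
The client side of an X.509-over-HTTPS exchange like the one described above can be sketched with the Python standard library alone. The host name, path, and certificate file names below are placeholders, not our actual endpoints.

```python
# Sketch of an HTTPS client configured for X.509 mutual authentication,
# as in a CAQH CORE-style SOAP-over-HTTPS exchange. Paths are placeholders.
import ssl

def make_client_context(ca_file=None, cert_file=None, key_file=None):
    """Build a TLS context that verifies the server and, when a cert is
    supplied, presents our X.509 client certificate to it."""
    ctx = ssl.create_default_context(cafile=ca_file)
    if cert_file:
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return ctx

# With real certificate files, an exchange would look like:
#   import http.client
#   conn = http.client.HTTPSConnection(
#       "gateway.example.org",
#       context=make_client_context("ca.pem", "client.pem", "client.key"))
#   conn.request("POST", "/core/submit", body=soap_envelope,
#                headers={"Content-Type": "application/soap+xml"})
```

The key property is that authentication happens at the transport layer on both sides: the client verifies the server's certificate, and the server refuses connections from clients without a trusted certificate.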

HITSP will work on harmonizing common data transport as part of its 2009 Extensions and Gaps efforts. It will harmonize the transport work done to date, the efforts of the NHIN pilots and the requirements of the Common Data Transport Use Case recently released to HITSP by ONC.

In general, how do we resolve the remaining standards controversies by the end of 2009, when an interim final rule must be finalized per ARRA? Here's my understanding of the process:

1. The HIT Policy Committee will propose a set of priorities for "meaningful use", likely in the next 60 days. The National Coordinator will deliver these to the HIT Standards Committee.

2. The HIT Standards Committee and its 3 workgroups (Clinical Operations, Clinical Quality, Privacy and Security) will determine what existing accepted /recognized standards best support the HIT Policy Committee's priorities, likely in the next 90 days. The HIT Standards Committee will draw on the work of harmonization organizations (including HITSP), standards development organizations, and implementation guide writers. The Standards Committee will also engage NIST for standards testing.

3. The National Coordinator will review this work and, if appropriate, deliver it to the Secretary of HHS for acceptance and publication in the interim final rule.

There will be several periods of public comment and administrative review along the way.

What will HITSP's role be in this process? Initially it will provide expert testimony about harmonized standards to the HIT Standards Committee. In general, HITSP responds to the priorities established by the Office of the National Coordinator. It is independent of any particular administration/political party. If there is a need to approach priorities in a different way, HITSP will align to do that, just as it has with the ARRA focused efforts of the past 60 days.


Bernz said...

It's questions like this that make technologies like Enterprise Service Buses interesting to me (having just implemented one to bring together systems that don't WANT to speak to each other).

I mean, I love standards. I wish more organizations agreed on things and made our lives easier. But that's not the reality I (currently) live in. So I've used Enterprise Service Bus technology to help out.

The idea is this: System A and System B both input and output structured data. They both have APIs. They speak in different ways. You CAN write a mapping and messaging layer, but an ESB is like a pre-written map. It sits in the middle and accepts IO in hundreds of different forms. You still have to map the IO, but usually it's a simple GUI interface to do so. After implementation, it doesn't matter if the system speaks JSON or SOAP or has its own XML... it just has to be structured. It also means that, combined with a BPM, you can wrap business rules around it.

Still, life would be easier if we all used an agreed upon format. But in case we can't, ESBs and other easy-mapping technologies will exist.
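
The mapping step an ESB performs can be sketched in a few lines: accept structured input in different wire formats and normalize it to one internal shape. The formats, field names, and canonical record below are invented for illustration.

```python
# Sketch of ESB-style format normalization: different wire formats in,
# one canonical record out. Field names are illustrative.
import json
import xml.etree.ElementTree as ET

def normalize(payload, fmt):
    """Map a JSON or XML payload to a canonical {'id', 'name'} record."""
    if fmt == "json":
        data = json.loads(payload)
        return {"id": data["patientId"], "name": data["patientName"]}
    if fmt == "xml":
        root = ET.fromstring(payload)
        return {"id": root.findtext("id"), "name": root.findtext("name")}
    raise ValueError(f"unsupported format: {fmt}")
```

Everything downstream of this function sees one shape, which is exactly the leverage an ESB (or any adapter layer) buys you when the endpoints refuse to agree.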

John said...


Thanks for your leadership in this area. Bringing real-world applications and HITSP leadership to the HIT Standards Committee will be very helpful.

The information you shared provides useful context about where the process and the data are leading us. I thought I might add a perspective or two to your comments around Common Data Transport.

My company VisionShare supplies connectivity solutions to over 3000 hospitals, 2000 clinics and a host of other entities (long term care, home hospice, billing services, etc.). We have our secure connectivity solutions in more than 9000 locations across the US. They are connected mainly via an HTTPS solution secured by server- and client-side X.509 certificates. These solutions have also been approved for use in Medicare applications.

We have anticipated a national standard for some time now. We were also a member of the CORE Phase II Connectivity Subgroup and participated in those recommendations. CORE Phase II resulted in the specification of two connectivity technology standards: "SOAP+WSDL" and "HTTP MIME Multipart" (i.e., multipart/form-data).

At VisionShare we have used a multipart/form-data approach to quickly and cleanly integrate with several trading partners. We've used SOAP in a few instances, but in general have found that integration partners prefer the perceived simplicity of the multipart/form-data approach. We also opened up our network through an X.509 client-cert authenticated REST-based API and have found that, with good detailed technical documentation, integration partners are often able to integrate with no issues or questions (from a large variety of platforms).
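
The "perceived simplicity" of multipart/form-data comes through in how little machinery it takes to build a request body by hand. Here is a sketch using only the standard library; the field names and boundary string are illustrative, not the CORE Phase II field definitions.

```python
# Sketch of hand-building a multipart/form-data body, as in the CORE
# Phase II "HTTP MIME Multipart" option. Field names and boundary are
# illustrative.

BOUNDARY = "corephase2boundary"

def encode_form_data(fields):
    """Encode a dict of string fields as a multipart/form-data body.

    Returns (body, content_type)."""
    lines = []
    for name, value in fields.items():
        lines.append(f"--{BOUNDARY}")
        lines.append(f'Content-Disposition: form-data; name="{name}"')
        lines.append("")  # blank line separates headers from the value
        lines.append(value)
    lines.append(f"--{BOUNDARY}--")  # closing delimiter
    body = "\r\n".join(lines) + "\r\n"
    content_type = f"multipart/form-data; boundary={BOUNDARY}"
    return body, content_type
```

Because the format is readable with a text editor and debuggable with any HTTP tool, integration partners can usually get a first successful exchange working without specialized SOAP tooling.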

While SOAP and WS-* standards have their place (and are heavily favored in the NHIN specs), our experience is that there is a complexity tradeoff that occurs when choosing SOAP and WS-*. In production, virtually 100% of our integration partners have enough in-house knowledge of HTTP(S) and XML to easily comprehend and quickly utilize a REST API. SOAP and WS-* do not seem to share such widespread understanding.

We believe that while picking a scalable standard is very important, there will be a busy body of work for some time: deploying protocol converters and solutions that integrate new standards with legacy systems.

Most importantly we’ve learned that security need not be the enemy of scalability.

Best regards,

John Feikema | President & CEO | john.feikema@visionshareinc.com

e-Older American said...

Perhaps, rather than just an agreed upon format, an agreed upon open source enterprise gateway would be the best path to follow.

I am planning to download the code and software development kit (SDK) for the CONNECT Gateway, which is now becoming available to the public. Three primary elements (described on the CONNECT website) make up the gateway:

"The NHIN Gateway implements the core NHIN services enabling such functions as locating patients at other health organizations within the NHIN, requesting and receiving documents associated with the patient, and recording these transactions for subsequent auditing by patients and others.

Other features include authenticating network participants, formulating and evaluating authorizations for the release of medical information, and honoring consumer preferences for sharing their information.

The Enterprise Service Component (ESC) provides default implementations of many critical enterprise components required to support electronic health information exchange, including a Master Patient Index (MPI), Document Registry and Repository, Authorization Policy Engine, Consumer Preferences Manager, HIPAA-compliant Audit Log and others.

Agencies are free to adopt the default enterprise component implementations packaged in the CONNECT ESC or to plug in existing agency implementations of these service components. This component also includes a software development kit (SDK) for developing adapters to plug in existing systems such as electronic health record solutions to turn on information flows to support the secure exchange of health information across the NHIN. This makes CONNECT a platform for participation in health information exchanges.

The Universal Client Framework enables agencies to develop end-user applications using the enterprise service components in the ESC. This makes CONNECT a platform for innovation."

A hands-on CONNECT Seminar will be held in Washington, D.C. on June 29th and 30th. I hope that a Webcast will be made available for interested persons, like myself, who are unable to attend in person.

cbmd4u said...


Your summary is quite compelling as an overview. I have been following HL7, ASTM, and the CCR/CCD/CDA controversy since its inception. How is it possible that Google went with CCR instead of the "HARMONIZED" CCD? Do they really believe that CCR is a more "correct" use of XML?

Looking forward to the Board Meeting on Monday.
Chris Bickford MD, La Jolla / Scripps Health