Friday, August 29, 2008

Cool Technology of the Week

In my role as Chair of the Healthcare Information Technology Standards Panel (HITSP), I've worked with 500 stakeholder organizations to harmonize the standards for clinical summaries, labs, e-prescribing, public health reporting, quality measurement, and personal health records.

These standards are now beginning to be implemented widely in the private and public sectors since being mandated for Federal procurement in January of 2008.

Massachusetts and its hospitals have embraced these new standards and today we're sending thousands of transactions with them. One of our more exciting implementations went live earlier this month - lifetime medical record exchange with the Social Security Administration (SSA) for disability processing.

Using HITSP standards for data content (CCD) and transmission (SOAP), BIDMC has built a web service that provides real-time exchange of data with the Social Security Administration. The client and the web service communicate using XML messages that follow the SOAP standard. The client sends a SOAP message containing patient demographics and a copy of the digitized consent, and the web service responds with a SOAP message containing the patient's Continuity of Care Document. Transport-layer security is enforced via SSL. Message-layer security is enforced using X.509 certificates and digital signatures: every SOAP message is digitally signed with the sender's certificate, and both the digital signature and the sender's certificate are validated for data origin authentication.
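To make the request/response pattern concrete, here is a minimal sketch of building the client's SOAP request. The element names, the RecordRequest structure, and the field layout are illustrative assumptions, not the actual SSA/HITSP schema; in production the envelope would also carry an XML digital signature and be POSTed over SSL.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(last_name, first_name, birth_date, consent_id):
    """Build a SOAP envelope carrying patient demographics and a
    reference to the digitized consent document (hypothetical schema)."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    req = ET.SubElement(body, "RecordRequest")   # illustrative element name
    demo = ET.SubElement(req, "Demographics")
    ET.SubElement(demo, "LastName").text = last_name
    ET.SubElement(demo, "FirstName").text = first_name
    ET.SubElement(demo, "BirthDate").text = birth_date
    ET.SubElement(req, "ConsentDocumentId").text = consent_id
    return ET.tostring(env, encoding="unicode")

# The web service's response Body would carry the patient's CCD.
xml_text = build_request("Doe", "Jane", "1960-01-15", "CONSENT-123")
```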

A de-identified sample of an actual SSA transmission is available online.

It contains:

Patient Demographics:

Birth time
Phone number
Next of Kin information

Condition/Problems: (fully coded)

Current Problem list from webOMR
All inpatient visit diagnoses in the date range requested
All ED visit diagnoses in the date range requested
All Outpatient visit diagnoses in the date range requested

Results (human readable) limited to date range requested

All Lab results (chem, hem, urinalysis, blood gases, blood bank, etc.)

All reports from:
CT Scan Reports
Pathology/Biopsy Reports
X-ray Reports
MRI Reports
Cardiac Catheterization Reports
EEG (electroencephalogram)
ECG/EKG Reports
PET Scan Reports
Pulmonary Function Test Reports and Tracings
Graded Exercise Test/Exercise Tolerance Test Reports (cardiac tests)

Procedures (fully coded) limited to date range requested

Inpatient procedures
ED procedures
Outpatient procedures

Encounter (human readable) limited to date range requested

All discharge summaries from Inpatient stays – (identical to MA Share document)
All Operative notes
All ED visit summaries
All Notes on patient from webOMR
Phone calls
Encounter summaries

Thus, we've leveraged the internet (not private networks or VPNs), existing web standards (SOAP, SSL), and a structured, vocabulary controlled XML exchange (CCD) to rapidly implement interoperability with patient consent.

I've read some articles and heard some rumors that HITSP efforts are not XML based, are not vocabulary controlled, and are not embracing existing web standards. The SSA project illustrates the reality of how HITSP interoperability specifications empower connectivity at low cost with little effort using XML, vocabularies, and existing web standards.

Thursday, August 28, 2008

100 Things to Do Before You Die

Today I heard about a sobering accidental death.

Dave Freeman, co-author of “100 Things to Do Before You Die,” a travel guide and ode to odd adventures that inspired readers and imitators, died on Aug. 17 after a fall at his home in the Venice section of Los Angeles. He was 47.

Dave and I were only 6 months different in age. Over the past 10 years while I've been climbing, hiking, and traveling the world, I've tried to live each day to the fullest, creating my own 100 Things list. Here are a few completed entries on my list from my recent vacation and previous travels:

1. Explore the excavations of old Jerusalem with an archeologist. I did consulting for the government of Israel on coordinating electronic health records for mass casualty incidents. I asked for a day with an archeologist as my compensation.

2. Walk Hadrian's Wall (separates England and Scotland in the UK) coast to coast and have a pint of ale in a rural Scottish Pub with farmers who've never met an American.

3. Kayak across the Baltic Sea (it took 3 days)

4. Gather crawfish by moonlight in Sweden (before I was a vegan)

5. Explore the fjords of Norway on foot

6. Bicycle throughout East Anglia in England, exploring the backroads and hedgerows.

7. Walk the 7 hills of Rome and climb the stairs to the top of St. Peter's

8. Climb the Untersberg in Salzburg, Austria

9. Play the Turkish Ney in a Mosque in Istanbul

10. Walk the John Muir Trail

11. Climb every mountain in New England in every season of the year

12. Spend a weekend meditating and praying with Buddhist Monks at a temple on Mt. Koya

13. Spend the night on top of Half Dome

14. Play the Didgeridoo in Australia

15. Scuba dive with Grey Whales

16. Watch the sunrise from the top of Mt. Fuji

17. Walk the old city of Prague where my ancestors made Pilsner

18. Play a flute concert on the top of the Eichorn Pinnacle in Yosemite (photo above)

19. Have dinner in the Smithsonian at night after it's closed

20. Walk 5 miles on the Great Wall of China

21. Climb the Grand Tetons at dawn

22. Camp under the stars at 11,000 feet in Yosemite during a meteor shower

23. Get licked on the face by a giraffe

24. Watch a sunrise from Machu Picchu

25. Ride a sleeper train with my family from Tokyo to Sapporo Japan.

Wherever I wander, I always look for the road less traveled. I never sit in a hotel room or seek the comforts of home - there will be time for that later. By filling each moment with lasting experiences instead of buying 'stuff', I have very memorable encounters with people and places that provide me with long-term satisfaction when I return home.

Dave - I wish you well and I hope you experienced your 100. I've already made a list for 2009 that includes sunsets on Santorini, playing the Hungarian Furulya with shepherds near Budapest, and climbing the East Face of Mt. Whitney. Whenever my time comes, I'll know that I balanced each day with work, family, and personal pursuits that include the best things I can find in this world.

Wednesday, August 27, 2008

High Performance Computing for Research

In the past, I've written about the Harvard infrastructure I oversee in support of the research community.

High Performance Computing clusters are a collaboration of the research community and IT to create a shared infrastructure for the benefit of all. We've learned a great deal about how to balance central/local IT offerings and how to build highly reliable, low cost research IT infrastructure. Last year we organized a summit to share lessons learned. Over 100 leaders in high performance computing (HPC) gathered at Harvard Medical School to share ideas and hear presentations from their colleagues.

The Summit was a great success, with over 90% of attendees rating it as extremely valuable and saying they learned something that would be of immediate use to them. And this was the point of the Summit -- to bring together leaders to share ideas and approaches. Sharing stories and best practices (and worst practices!) allows the field to grow more rapidly and efficiently.

Biomedical Informatics is at an exciting crossroads: the computational challenges facing researchers, clinicians and public health professionals now exceed the computational power typically available in an academic biomedical setting. This is exciting because it means that the advances in high performance computing from other disciplines (e.g. physics) can be brought to bear on the great challenges of life sciences, health and medical research. The opportunities to develop new therapies, monitor trends in ambulatory hospital data, and catch and avert drug-related mishaps (e.g. Vioxx) are truly astounding. With the advent of the $1,000 “ome” (genotype, phenotype, labs), the capacity to analyze and predict longitudinally and in real time, as well as the ability to test hypotheses retrospectively, will challenge the computational boundaries of all biomedical research organizations. Computational power is now at the very core of our ability to rapidly advance the state of clinical care and healthcare. In fact, some new labs at Harvard Medical School do not even have a wet lab component, but do all of their work through "in silico" simulations and modeling. All of this leads to an exciting time for people who need to build and provide HPC infrastructure to the research community!

One unique part of the summit is the use of audience participation devices which allow the organizers and participants to poll all the attendees. Here are a few of the results of last year's audience participation surveys:

* 43% of biomed HPC facilities are building new data centers -- but 35% are leasing commercial data center space instead!
* Around 50% of biomed HPC clusters now use some sort of parallel or distributed filesystem -- with 95% of leaders planning to have one in production or implemented within the next two years
* Most facilities (63%) still rely on gigabit Ethernet as their primary interconnect, followed by 10G Ethernet (17%) and Infiniband (12%)
* Around half of biomed HPC shops use virtualization in their production environment; when they do, they use VMware (66%) and Xen (23%)
* Platform LSF is the primary job scheduler, used in nearly half of all biomed HPC shops -- the remainder are divided amongst Sun Grid Engine, Open PBS and Univa

The Summit has four focus areas this year:

* Managing biomedical storage growth -- with many institutions seeing storage growth measured in petabytes the challenges with managing, archiving and retrieving accelerate
* Bringing HPC to the users -- many scientists need access to HPC resources but don't have the technical skills to adapt to command lines and shell scripts. New frameworks allow researchers to do common tasks through a web GUI or via automated web services.
* Connecting to the grid -- many small and medium biomedical HPC shops would like to participate in the national grid efforts, but knowing where to start and how to participate is an ongoing challenge
* Trends in Biomedical HPC -- hear from the experts about the latest trends in Biomedical HPC including the new Centers for Translational Science and the growing demand for large scale HPC services in Biomedical Research

I'd like to invite anyone interested in High Performance Computing in the Biomedical fields to join their colleagues and other leaders in this rapidly evolving field to join us at Harvard Medical School in October. Registration is open at

I hope to see you there!

Tuesday, August 26, 2008

Pharmacy Initiated Renewals

I've written extensively about e-Prescribing and have summarized the transactions as

Check health plan eligibility
Enforce formulary rules
Check dispensed/reimbursed drug history for drug/drug interactions
Route the prescription to a retail or mail order pharmacy

I have not discussed the refill/renewal process.

One of the clear return on investment cases for e-Prescribing is the notion that a pharmacy can initiate a renewal request rather than requiring the clinician's office to process a patient call.

A patient simply walks into a pharmacy and asks for a renewal. If no refills are available, the pharmacy system autogenerates a renewal transaction which is sent to the original prescribing physician's electronic medical record system. Once there, it appears in a Task or "To Do" list for physician approval. Once approved, the prescription is sent back to the pharmacy to be filled. Refill requests can also be generated via the pharmacy's web page. For a maintenance medication, a patient can set up automatic dispensing of refills every 30 days (or at whatever frequency) via a chain's web site; the patient then receives a call and/or email when the prescription is ready. If there are no refills left, the pharmacy will initiate the refill request so that the prescription will be ready when the previous dispensing is used up.

There are a few tricky informatics and workflow issues such as

1. How do you match the renewal and the original prescription?

If the original new prescription was sent electronically, the EMR or e-prescribing system can insert a unique identifier for the patient / med into the prescription. This identifier is stored in the pharmacy system and echoed in the NCPDP standard refill request transaction. There is a 100% match if the EMR / e-prescribing system has been coded correctly.

Thus all EMR vendors and EMR self builders should create and send prescriptions electronically with this unique identifier. Then, as refill requests begin to flow, they will be easily matched in the EMR workflow.

If a new prescription was not sent electronically, it will be matched based on the prescriber's physician identifier. If the physician is registered with a SureScripts Prescriber ID (SPI), then the pharmacy will send a refill request automatically at any time a patient calls into the pharmacy and there is a need for physician approval to continue the therapy. This is completely automated in the pharmacy system and applies to any prescription in the system prescribed by that doctor.

When a refill request comes in without the unique matching identifier, it can still be matched automatically in many cases. The refill request will have the patient demographic information, especially the last name and birth date, and will have the NDC of the dispensed medication. By finding the patient using last name and birth date and then using the NDC to identify the therapeutic class by using the EMR’s drug file and then matching that class against the patient’s active med history list, the EMR can provide an exact match or narrow down to a couple medications for the prescriber to select for the refill approval or denial.
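The two-stage matching logic above can be sketched as follows. The field names, the prescription records, and the NDC-to-therapeutic-class lookup are simplified, hypothetical stand-ins for the EMR's actual data structures.

```python
def match_refill(refill, prescriptions, ndc_to_class):
    """Return candidate original prescriptions for a refill request."""
    # Stage 1: exact match on the unique identifier the EMR inserted
    # into the original e-prescription and the pharmacy echoed back
    # in the NCPDP refill request.
    rx_id = refill.get("rx_reference_id")
    if rx_id:
        exact = [p for p in prescriptions if p["rx_id"] == rx_id]
        if exact:
            return exact
    # Stage 2: fall back to demographics (last name + birth date),
    # then narrow by the therapeutic class of the dispensed NDC
    # matched against the patient's active medication list.
    cls = ndc_to_class.get(refill["ndc"])
    return [
        p for p in prescriptions
        if p["last_name"] == refill["last_name"]
        and p["birth_date"] == refill["birth_date"]
        and ndc_to_class.get(p["ndc"]) == cls
    ]
```

Stage 2 may return more than one candidate, in which case the EMR presents the short list to the prescriber for selection, as described above.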

2. What do you do if the original prescribing physician is out of the office?

In all our various workflow implementations that create physician work queues, we need to think about coverage patterns. Typically, we enable covering physicians to examine the work queues of other physicians or enable support staff to monitor queues so they can escalate pharmaceutical renewals and other time-sensitive notifications.

Here's the current status of Pharmacy Initiated renewals in the US and Massachusetts:

Surescripts/RxHub currently has over 50,000 active physicians in the network across the country. Approximately 60% of these clinicians are using a certified EMR system (i.e. Allscripts (Touchworks and Healthmatics), NextGen, eClinicalWorks, Kryptiq/GE, Epic etc.). In July 2008 over 35% of the 6.5 million e-prescription transactions processed were refill requests and refill responses. Half of these transactions were processed by EMR users via pharmacy initiated electronic refill transactions and the corresponding electronic refill responses (using NCPDP Refill Request and Refill Response standard transactions and not New Prescriptions).

In July 2008 in Massachusetts there were about 25,000 pharmacy generated electronic prescription renewal requests going into EMR systems and being responded to (approved or denied) by clinicians and sent back to pharmacies electronically.

Historically, Pharmacy Initiated Renewals were supported mostly by stand-alone eRx applications, but now this transaction is supported by EMRs. Over the next year, the large academic systems in Massachusetts plan to include these transactions and this workflow in their EMRs. The end result will be a further decrease in the administrative burden of medication management on physician offices, reducing costs and enhancing patient service.

Monday, August 25, 2008

A Connectivity Holiday

From August 8 to August 22, I took a two week vacation and a connectivity holiday. What did that mean?

From August 8 to 12, I was on the John Muir trail, 50 miles from the nearest cell tower and had no ability to connect to email, the web, voice etc. I left all my devices in the car.

When I returned to Tuolumne Meadows, I turned on my Blackberry, downloaded over 1000 messages, and used a variation of my email triage criteria.

If the email was a CC or FYI, I quickly read it and deleted it without responding.

If the email was about a product or from a vendor, I deleted it without reading it. Post vacation, I'll have plenty of time to review products.

If the email was from my staff asking me to help with a project or budget issue, I responded.

If the email was from a customer containing a question or complaint, I responded.

Each day from August 13-22, I used this same technique. The end result was that I sent about 10 emails per day. I did no phone calls.
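The triage technique above amounts to a simple decision rule. Here is a lighthearted sketch of it in code; the message fields and the staff/customer lists are simplified stand-ins, not a real mail API.

```python
def triage(msg, staff, customers):
    """Return the action the vacation triage rules prescribe for one message."""
    if msg.get("cc_or_fyi"):            # CC or FYI: quickly read, then delete
        return "read and delete"
    if msg.get("vendor"):               # product/vendor mail waits until after vacation
        return "delete unread"
    sender = msg["from"]
    if sender in staff:                 # staff project or budget issue
        return "respond"
    if sender in customers:             # customer question or complaint
        return "respond"
    return "leave for later"            # everything else can wait
```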

When I reconnected to a network on August 22, I simply highlighted the thousands of email in my inbox and pressed delete.

It was liberating.

I know that I left hundreds of vendor questions unanswered. I also know that I read every FYI email via my Blackberry and that I responded to every customer/staff need. Over the next few weeks, I'm sure I'll receive many resends from folks who wanted a response while I was on vacation, but they will be far fewer than the thousands I received during that time.

My ability to send 10 emails a day and keep the peace while on vacation raises the issue - have we created an email culture that is so overwhelming that we need to spend hours a day just answering email? Maybe a bulk delete - the equivalent of declaring email bankruptcy - is something I should try episodically as a way of cleaning the slate.

If there are issues that have not been resolved or there are areas where I need to intervene, I'll receive another email asking for help.

My experience over the two weeks of my vacation taught me that we are often too quick to send an email, escalate a problem, or delegate simple issues. In the days before email, we may have been more productive just because instant communication was not available and we just worked out problems on our own.

I want to thank all my staff at BIDMC, HMS, MA-Share, NEHEN, and HITSP for supporting my time away. The fact that I could delete thousands of email without a consequence is a tribute to their ability to resolve complex issues independently.

My connectivity holiday also included complete separation from news, RSS feeds, and my blog.

All that keyboard time was replaced with family time and the joy of not knowing what time or day it was.

When I returned, I asked others for a summary of the news of the past two weeks. Basically, the Olympics occurred, the Democratic National Convention geared up, and Clark Rockefeller was found to be a con man.

I always learn a great deal from vacations - alpine climbing skills, time with family, and a focus on the basics of eating, sleeping, and avoiding sunburn. This year I learned that an email and connectivity holiday is possible.

If I did not respond to your email, send it again if the issue is still important. Otherwise, relish the digital silence!

I'm rejuvenated and have many new ideas for projects, blog entries, and cool technologies. The aura of my vacation will last for a few more weeks and I look forward to challenges of the Fall season ahead.

Thursday, August 7, 2008

Into the Wild Part II

This morning I begin my trek of the John Muir trail. I'm traveling fast and light with a 13 pound pack including food and water. From August 8 to August 12, I will not have access to voice or data.

From August 12 to 15, I'll be climbing in Tuolumne Meadows and will have spotty access to email.

From August 15 to 21, I'll check email once a day from the Eastern Sierras.

My blog will return on August 22 when I'm back on a network.

And now, off Into the Wild!

Wednesday, August 6, 2008

Storage for Research

I've written several blog entries about the challenges of providing enterprise storage for clinical and financial operations. We have half a petabyte of spinning storage at BIDMC and 200 Terabytes at Harvard Medical School.

At HMS, we've recently hired BioTeam to assist us with planning the next generation of storage for the research community. Although the acquisition and management of storage for the research community can be done by the same team that has been responsible for our enterprise storage, there will be unique challenges providing storage to the research community over the next few years that will require a different approach.

I recently attended a community-wide storage planning conference with all the HMS research stakeholders. We heard from them about research projects in imaging and genomics that will generate terabytes per day. Most of this data can be archived after analysis then eventually deleted.

This means that IT must be able to provision three kinds of storage - high performance storage for analysis on our Linux clusters, mid-tier storage for near term access and very inexpensive archival storage.

Further, IT needs to be extremely nimble in providing this storage on demand to support the evolving needs of investigators. Of course, a charge-back model will be mutually agreed upon so that the demand is not infinite and the budget is not zero.
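A charge-back model for the three tiers might look roughly like this sketch. The tier names and the per-terabyte rates are made-up illustrative numbers, not our actual pricing.

```python
# Hypothetical monthly rates per terabyte for each storage tier.
RATE_PER_TB_MONTH = {
    "high_performance": 1000,  # fast storage attached to the Linux clusters
    "mid_tier": 300,           # near-term access after analysis
    "archive": 40,             # very inexpensive long-term archival
}

def monthly_charge(allocations):
    """allocations: dict mapping tier name -> terabytes allocated to a lab."""
    return sum(RATE_PER_TB_MONTH[tier] * tb for tier, tb in allocations.items())

# A lab with 2 TB of high performance scratch and 50 TB of archive:
charge = monthly_charge({"high_performance": 2, "archive": 50})
```

The point of the model is exactly what the paragraph above says: a non-zero price per tier keeps demand finite while letting investigators choose the cost/performance trade-off themselves.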

BioTeam prepared this excellent overview which frames the issues.

Over the next few months, we'll be considering our hardware options, service levels and processes to support the new storage requirements of the research community. It's a challenge that all IT organizations in academia must embrace.

Tuesday, August 5, 2008

The iPhone is what I want, the Blackberry is what I need

I've been a Blackberry advocate in the past, so immediately my objectivity analyzing the new iPhone 3G will be called into question.

If it helps, I have an entirely Apple household - a Macbook Air, a Macbook Pro, an iMac 20, and 3 iPod Nanos. All my blogs are published from my Macbook Air via Firefox. I truly attempt to be Geneva - neutral - in my testing of every technology.

For the past week, I've been running an iPhone 3G in parallel with my Blackberry Curve. I've tested its integration with Exchange email, its network support on 3G/2.5G/WiFi, its battery life, its App Store, and its user interface.

There are many aspects of the iPhone which are truly innovative and work extremely well. There are other aspects which I found frustrating, such as the touch screen keyboard and the short battery life. Whether or not the iPhone is the perfect mobile device for you depends on your use case for needing mobility.

Here's the detail of my experiences.

AT&T provided me an iPhone for testing, including an account with a generous voice and data plan. The first step was to activate the phone. We ran into several difficulties that were unrelated to the iPhone itself - AT&T had mistakenly associated the iPhone account with the personal account of a faculty member who had a past due balance. Thus, they refused to activate the iPhone until this other person's balance was paid. Several phone calls and emails later, the issue was resolved and we were able to activate the account. It's interesting that account activation requires the iPhone to be directly connected to iTunes via USB. For folks who have not used iTunes previously, the notion of a phone with a dependency on music management software will be a bit of a learning curve.

Once the phone was activated, my next step was to integrate it with Microsoft Exchange. The Apple engineering to integrate the iPhone into enterprise Exchange environments via Active Sync is top notch. The problem that I ran into was a usability problem, which surprised me, since Apple is so good with usability.

To integrate the iPhone into Exchange, you need to type your email address, domain\username, password, and server name. Typing on the iPhone requires the use of a touch screen keyboard, because there are no true keys on the device. The first screen of the keyboard contains only the letters A-Z without punctuation, symbols or numbers. Pressing the .?123 key produces the numbers 0-9, a forward slash, punctuation and a few symbols. On this screen is a mysterious key labeled "#+=". Pressing it produces many commonly needed keys including the backslash. I think you can see where this is going.

I typed in my email address, then had to type in my domain\username. Since I could not find the backslash, I assumed that the iPhone required me to use a forward slash. I entered my password (which is alphanumeric, mixed case and a non-English word requiring me to toggle through 3 keyboard screens multiple times), and server name. Then the truly odd behavior began.

For whatever reason, the iPhone's calendar can sync with Exchange when the domain\username has a forward slash instead of a backslash, but email does not work. Even worse, since Active Sync perceives something is wrong with the username/password, it keeps retrying and locks the Active Directory account. I sat with our server administrators and unlocked my Active Directory account numerous times as we tried various combinations of username/password entries. I finally thought to try a backslash instead of a forward slash in my username, discovered the appropriate key, and got everything working. Once I did, the push email through Active Sync was very impressive, delivering calendars, contacts and email with a speed very similar to the Blackberry.

I then began to reply to email and that's where my next adventure started. I had to type a medical consult in response to a Poison Control query about a mushroom ingestion. I typed

"The patient will be fine"

which appeared as

"The patient will be gone"


The iPhone keyboard is a touch screen with small non-tactile keys. A tall adult male with large hands (me) is going to "fat finger" the keys, so Apple has implemented a T9 like system to predict what words you meant to type. The F and G are next to each other as are the I and O on a QWERTY keyboard. FINE and GONE are valid combinations of pressing F/G and I/O with a fat finger.
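The FINE/GONE ambiguity above can be demonstrated with a tiny sketch: treat each key press as ambiguous between the intended letter and its QWERTY neighbors, and accept any dictionary word consistent with the presses. The neighbor table below is a deliberately small slice of the keyboard covering just this example.

```python
from itertools import product

# Each letter maps to itself plus its immediate QWERTY neighbors
# (only the letters needed for the FINE/GONE example are listed).
NEIGHBORS = {"f": "fg", "g": "gf", "i": "io", "o": "oi"}

def candidates(typed, dictionary):
    """All dictionary words reachable by fat-fingering `typed`."""
    options = [NEIGHBORS.get(c, c) for c in typed]
    return sorted(w for w in ("".join(t) for t in product(*options))
                  if w in dictionary)
```

With both "fine" and "gone" in the dictionary, a fat-fingered "fine" legitimately yields either word, which is exactly the ambiguity the predictive keyboard has to guess its way through.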

Then I wrote

"The patient will live for a good long time"

which appeared as

"The patient will lice for a food long time"

I've spent days working with the touch screen keyboard, and although I'm modestly improved, the non-tactile T9-driven keyboard is just not as fast as the Blackberry keyboard.

On average, email takes me 3 times longer on the iPhone keyboard than on a Blackberry, assuming I want perfectly correct messages. The extra time is spent using the backspace key to correct words that were fat-fingered or T9'ed into some truly amusing typos, e.g.

"The Lord is my Shepherd, I shall not want"


"The Lord is my Shepherd, I shall NPR want"


"The quick brown fox jumped over the lazy dogs"


"The quick brown fox jumped OCR the lazy dogs"

I'm sure the dogs enjoyed the Optical Character Recognition.

Just to be sure that my experiences were not the product of my unhip, 46-year-old, email-centric work life, I spoke with several twentysomethings about their experiences with the iPhone. They all confirmed that they spend a great deal of time reviewing any professional email they send. Their consistent comment to me was that the iPhone is a great consumer device but a problematic corporate device.

So, the bottom line is that the iPhone is a great web browser, a great application platform, and a great way to read email. The App Store enables a new kind of commerce - the micro-app, truly democratizing application distribution in the way that Web 2.0 democratized publishing. BTW - the phone works well too.

However, the iPhone is not a perfect email appliance for someone who has to answer hundreds of emails each day with total accuracy.

Thus, if you want a truly innovative mobile computer for consumers that manages multimedia, has hundreds of add on applications through the App Store, and that connects to WiFi and 3G networks (note, these result in a battery life of a few hours at best), then you want an iPhone.

If an email appliance is your use case for mobility, you need a Blackberry.

I will watch the evolution of the iPhone very closely. If the keyboard issue is resolved, then Blackberry will have true competition among corporate email users. Until then, the device on my belt will be a Blackberry.

Monday, August 4, 2008

Draft FY09 Information Systems Operating Plan

Every summer, I work with my Governance Committees, the Medical Executive Committee, the Clinical Operating Oversight Committee, the Clinical Chiefs, the VPs, our users, and all my staff to create a draft operating plan for the following year.

In general, in non-profit organizations, there is not a significant increase in operating budget year to year, so we do not typically create an audacious plan that requires significant new resources. Instead, we work within the governance process to determine the highest priorities based on safety/compliance, return on investment, strategic importance and impact. We then execute those priorities with the scope, timing and resources which fit within our budget.

The good news is that for the past 10 years we've been able to achieve a high degree of customer satisfaction because we complete just about every stakeholder's high priorities.

My draft FY09 Operating Plan for Beth Israel Deaconess is available online.

It includes numerous clinical, financial and infrastructure goals which will enhance the functionality, reliability and security of our applications and systems.

Feedback is always welcome.

Friday, August 1, 2008

Cool Technology of the Week

A GPS is a great tool for identifying your location or for Geocaching, but most modern GPS devices do not make it easy to track where you've been, keep a record of your travels, or repeat past routes.

The Trackstick II receives signals from twenty-four satellites orbiting the earth. With this information, the Trackstick II can precisely calculate its own position anywhere on the planet to within fifteen meters. It records its own location, time, date, speed, heading and altitude at preset intervals. Your exact location and the route traveled can be viewed and played back directly within Google Earth. There are no monthly fees.

Applications include
* Employee and vehicle monitoring
* Mileage recording and verification
* Public Safety
* Law Enforcement
* Homeland Security
* Keeping a record of personal outdoor travel
* Keeping a record of trips as a tourist
* Child / Family Safety

It's powered by 2 AAA batteries and connects to a computer via USB 2.0 for uploading to Google Earth.
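The Trackstick's own software produces the Google Earth file, but for the curious, a minimal track-to-KML conversion looks roughly like this sketch (the sample coordinates are made up):

```python
def track_to_kml(points):
    """points: iterable of (longitude, latitude, altitude_m) tuples.
    Returns a KML document Google Earth can open as a track line."""
    coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in points)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Recorded track</name>
    <LineString>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>{coords}</coordinates>
    </LineString>
  </Placemark>
</kml>"""

# Two illustrative points near Tuolumne Meadows (lon, lat, altitude in meters):
kml = track_to_kml([(-119.558, 37.873, 2627.0), (-119.560, 37.875, 2650.0)])
```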

A self contained GPS with automated travel recording that interfaces to Google Earth. That's cool!