Friday, April 29, 2011

Cool Technology of the Week

A major theme in healthcare IT lately has been the value of unstructured healthcare data, which can be mined using natural language processing and search technologies to produce meaningful knowledge.

Although the transformation of unstructured data into structured data is a new concept in healthcare, there's a commercial website that illustrates its power - TripIt.com.

TripIt is an itinerary consolidation and sharing tool that's very simple to use.   You email any trip confirmations (air, car, hotel, etc.) to plans@tripit.com and TripIt combines all of the elements into one itinerary. That itinerary can then be saved to your calendar, viewed on the web, accessed via mobile devices, and shared with others.

There are three major functions – itinerary collation, itinerary management and itinerary sharing.

To test itinerary collation, I emailed an Expedia confirmation from an upcoming Alaska trip (I'm keynoting a HIMSS event in Anchorage in June, then climbing for a few days).    The free text was transformed perfectly into the structured data shown in the graphic above, including automatic weather and map information.   There are iPhone, Android, and Blackberry apps to access this structured data via mobile devices.  The iPhone app worked perfectly.
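
To get a feel for what this kind of extraction involves, here's a minimal Python sketch of the unstructured-to-structured idea. The confirmation format and field patterns below are invented for illustration - TripIt's actual parsers handle hundreds of real-world confirmation formats and are far more sophisticated.

```python
import json
import re

# An invented confirmation snippet -- real Expedia emails are much messier.
confirmation = """
Expedia itinerary confirmation
Flight: Alaska Airlines 123
Depart: Boston (BOS) Jun 05, 2011 8:00 AM
Arrive: Anchorage (ANC) Jun 05, 2011 2:35 PM
"""

# Simple field extractors; a production system would use NLP, not regexes.
patterns = {
    "flight": r"Flight:\s*(.+)",
    "depart": r"Depart:\s*(.+)",
    "arrive": r"Arrive:\s*(.+)",
}

itinerary = {}
for field, pattern in patterns.items():
    match = re.search(pattern, confirmation)
    if match:
        itinerary[field] = match.group(1).strip()

# The structured result can now feed calendars, mobile apps, and sharing.
print(json.dumps(itinerary, indent=2))
```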

I use Apple Mail, and by simply clicking on the calendar integration feature of TripIt, my full itinerary was automatically added to my calendar.



I shared my itinerary with my wife by inviting her to join TripIt via her Gmail account.  I also added my TripIt itinerary to my Facebook wall.

A natural language processing application that turns unstructured confirmation emails into structured data accessible from the web, mobile devices, and social networks.   That's cool!

Thursday, April 28, 2011

My 2011 Garden Plan

It's Spring in New England and I'm preparing my gardens.

This year, I planted oak leaf lettuce and spinach in a cold frame and selected seeds for a Summer raised bed garden of eggplant, cucumbers, peas, beans, and heirloom cherry tomatoes.

Five years ago, my wife and I joined the waiting list for a space in the Wellesley Community Garden on Brookside Road.   We were just notified that we'll be granted a space this year.   This means that we'll have a 32 x 25 foot plot to share with another family.   Our plan is to install several raised beds and plant Japanese pumpkins (Kabocha) and other vegetables that require generous amounts of sunny, well-drained, flat ground, which we do not have in our backyard because the 100-foot hemlocks cast shade much of the year.

All our seeds come from the Kitazawa Seed Company, a truly remarkable supplier.

For the next few weekends, I'll be tilling soil, hauling mulch, building fences, and installing raised beds.    My plan for new fencing to keep rabbits, squirrels and chipmunks from eating our fresh produce is pictured above.   I found two great design resources - one about wire fencing and one about raised beds.

We've lived in New England for 15 growing seasons, so I've learned not to plant tender seedlings until after mid-May.   It's still possible to have a hard freeze in April, despite the temptation to plant induced by the occasional 70 degree day.

As my daughter goes off to college and we enter the next stage of life (51-60),  the time in our backyard garden and our new community garden space will be very therapeutic.

The rituals of the planting/harvesting cycle, the anticipation of fresh vegetables, and the physical labor of small scale farming melt away all the problems of the week.   I look forward to a weekend in the dirt!

Wednesday, April 27, 2011

National Strategy for Trusted Identities in Cyberspace

On April 15, 2011, the White House released the National Strategy for Trusted Identities in Cyberspace (NSTIC) during a launch event that included U.S. Secretary of Commerce Gary Locke, other Administration officials, and U.S. Senator Barbara Mikulski, as well as a panel discussion with private sector, consumer advocate, and government ID management experts.

What is a trusted identity in cyberspace?   This animation describes the scope of the effort.  It includes smartcards, biometrics, soft tokens, hard tokens, and certificate management applications.

NSTIC envisions a cyber world - the Identity Ecosystem - that improves upon the passwords currently used to access electronic resources. It includes a vibrant marketplace that allows people to choose among multiple identity providers - both private and public - that will issue trusted credentials proving identity. 

Why do we need it?

NSTIC provides a framework for individuals and organizations to utilize secure, efficient, easy-to-use and interoperable identity solutions to access online services in a manner that promotes confidence, privacy, choice and innovation.

Shopping, banking, social networking, and accessing employee intranets result in greater opportunities for innovation and economic growth, but the online infrastructure for supporting these services has not evolved at the same pace. The National Strategy for Trusted Identities in Cyberspace addresses two central problems impeding economic growth online:
1) Passwords are inconvenient and insecure.
2) Individuals are unable to prove their true identity online for significant transactions.

Identity theft is costly, inconvenient and all-too-common
*In 2010, 8.1 million U.S. adults were the victims of identity theft or fraud, with total costs of $37 billion.
*The average out-of-pocket loss of identity theft in 2008 was $631 per incident.
*Consumers reported spending an average of 59 hours recovering from a “new account” instance of ID theft.

Phishing continues to rise, with attacks becoming more sophisticated
*In 2008 and 2009, specific brands or entities were targeted by more than 286,000 phishing attacks, all attempting to replicate their site and harvest user credentials. 
*A 2009 report from Trusteer found that 45% of targets divulge their personal information when redirected to a phishing site, and that financial institutions are subjected to an average of 16 phishing attacks per week, costing them between $2.4 and $9.4 million in losses each year.

Managing multiple passwords is expensive
*A small business of 500 employees spends approximately $110,000 per year on password management. That’s $220 per user per year.

Passwords are failing
*In December 2009, the RockYou password breach revealed the vulnerability of passwords. Nearly 50% of users' passwords included names, slang words, or dictionary words, or were extremely weak, like "123456".

Maintenance of multiple accounts is increasing as more services move online
*One federal agency with 44,000 users discovered over 700,000 user accounts, with the average user having 16 individual accounts.

Improving identity practices makes a difference
*Implementation of strong credentials across the Department of Defense resulted in a 46% reduction in intrusions.
*Use of single sign-on technologies can reduce annual sign-in time by 50 hours/user/year.

The next step is creation of a national program office to manage the project and coordinate public-private efforts.    I look forward to a voluntary, opt-in strong identity for e-commerce.   Who knows - if this effort is successful, maybe we can move forward with a voluntary, opt-in strong identity for healthcare.

Tuesday, April 26, 2011

Business Spam

Our Proofpoint Spam filters remove the Nigerian businessmen and Viagra ads from my email stack.   However, it's really challenging to auto-delete legitimate business email from major companies that I would just rather not read.

Business Spam (BS) is what I call the endless stream of chaff filling my inbox with sales and marketing fluff.  If a colleague emails me about a cool new emerging technology, I'm happy.   If a trusted business partner gives me a preview of a new product and offers me the opportunity to beta test it, I'm thrilled.  If Bob at XYZ.com describes their cloud-based, software-as-a-service, offshore, outsourced, app store compliant product line that's compiled in PowerPoint (i.e. does not yet exist except in sales and marketing materials), I press delete as fast as I can.

Since there are multiple domains that can be used to reach me (bidmc.harvard.edu, caregroup.harvard.edu, caregroup.org, etc.), many email list sellers vend 5 or 6 variations of my email address, resulting in 5 or 6 copies of each life-changing offer in my inbox.

Now I know why some say email is dead.   Email is a completely democratic medium.  Anyone can email anyone.  There are no ethical or common sense filters.  The result is that Business Spam will soon outnumber my legitimate email.

Social networking architectures offer an alternative.   I'm on Facebook, Twitter, LinkedIn, Plaxo etc.   In those applications, individuals request access to me.   Based on their relationships to my already trusted colleagues and my assessment of their character, I either allow or deny access.  Once I "friend" them, appropriate communications can flow. If the dialog becomes burdensome or inappropriate, I can "block" them.

In order to stay relevant, email needs to incorporate social networking-like features.   It should be easy to block individuals, companies, or domains that I do not want to hear from.   Today, when a vendor ignores my pleas to remove me from their emailing list (demonstrating a lack of compliance with anti-spamming policies), I ask our email system administrator to blacklist their entire domain, preventing the flow of their Business Spam across the enterprise.

For those of you who use unsolicited business email as a marketing technique, beware.   Your message is not only diluted by the sheer volume of companies generating Business Spam, but it also creates a negative impression among your recipients.

My advice - send your customers a newsletter describing your products and services.  Ask them to opt in to receive future messages.  If they do not respond, stop emailing them.   It's just like a Facebook request - you pick your friends and your friends pick you.

The alternative is that all your communications will be deemed Business Spam and blocked at the front door.    Do you really want all your customers to say your emails are BS (Business Spam)?

Monday, April 25, 2011

Facebook's Green Data Center

In my roles as CIO at Harvard Medical School and Beth Israel Deaconess Medical Center, I oversee 4 data centers (one primary and one disaster recovery site for each institution).   Over the past several years, I've not been challenged by data center real estate, I've been challenged by power and cooling demands.

My teams have invested substantial time and effort into enhancing our power usage effectiveness (PUE) - the ratio of total facility power consumption (including cooling and transformer losses) to the power actually used by the computing equipment itself.
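
As a quick illustration of the arithmetic, here's a minimal Python sketch. The kilowatt figures are invented to match the 1.82 ratio cited below; they are not BIDMC's actual loads.

```python
# PUE = total facility power / power delivered to IT equipment.
# Illustrative numbers only: a facility drawing 1,000 kW in total
# whose computing gear consumes 550 kW.
total_facility_kw = 1000.0
it_equipment_kw = 550.0

pue = total_facility_kw / it_equipment_kw
print(f"PUE = {pue:.2f}")  # 1.82 -- every watt of computing needs 0.82 watts of overhead
```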

In the graphic above, BIDMC has achieved a PUE of 1.82, which is low compared to many corporations.  We've done cold aisle containment, floor tile ventilation, and hot air recapture to reduce our Computer Room Air Conditioning (CRAC) needs substantially.  We've matched the average of most green computing initiatives.

Despite all our efforts, we are limited by the constraints of the standard commercial hardware we run and the building we use.

Facebook has designed its own buildings and created its own servers via its Open Compute Project.   Initial power usage effectiveness ratios are 1.07, compared with an average of 1.5 for their existing facilities.

Here's an overview of how they did it.

They've removed uninterruptible power supplies and centralized chilling units, which we cannot do because of architectural/engineering limitations of our building design.   We're likely to achieve a PUE of 1.5, but could only achieve 1.07 by opening a new, purpose-built data center.

Here's a look at the kind of energy efficiency that cloud providers are achieving by creating dedicated mega data center buildings.

On April 28, I'm keynoting the Markley Group's annual meeting and you can be sure that I'll include power and cooling in my list of the things that keep me up at night.

Congratulations, Facebook!

Friday, April 22, 2011

Cool Technology of the Week

I'm a great fan of creating networks of networks for healthcare information exchange.   Point-to-point interoperability does not scale, but creating local or regional collaborations that enable large numbers of organizations to connect with minimal interfacing works very well.

Today, Surescripts announced the Lab Interoperability Cooperative to connect hospital labs with public health agencies.

In Massachusetts, NEHEN has worked with the Boston Public Health Commission and the Massachusetts Department of Public Health to enable all the hospitals in Eastern Massachusetts to send reportable lab, syndromic surveillance, and immunization information by simply connecting HL7 2.5.1 transmissions to a single gateway.
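
For a sense of what a gateway submission can look like, here's a schematic Python sketch that frames a skeletal HL7 2.5.1 ORU^R01 reportable-lab message in MLLP (a common transport for HL7 v2 interfaces, though not necessarily what NEHEN uses) and sends it to a gateway. The host, port, facility names, and message content are all invented.

```python
import socket

# Hypothetical gateway endpoint -- not an actual NEHEN or Surescripts address.
GATEWAY_HOST = "hl7-gateway.example.org"
GATEWAY_PORT = 6661

# A skeletal HL7 2.5.1 ORU^R01 reportable-lab result (segments abbreviated,
# identifiers and codes are illustrative).
segments = [
    r"MSH|^~\&|HOSPITAL_LIS|HOSPITAL|PH_GATEWAY|STATE_DPH|20110429103000||ORU^R01^ORU_R01|MSG00001|P|2.5.1",
    "PID|1||MRN12345^^^HOSPITAL^MR||DOE^JANE||19700101|F",
    "OBR|1|||625-4^Bacteria identified in Stool by Culture^LN|||20110428",
    "OBX|1|CWE|625-4^Bacteria identified^LN||27268008^Salmonella^SCT||||||F",
]
message = "\r".join(segments)

# MLLP framing: <VT> message <FS><CR>
frame = b"\x0b" + message.encode("ascii") + b"\x1c\x0d"

with socket.create_connection((GATEWAY_HOST, GATEWAY_PORT)) as conn:
    conn.sendall(frame)
    ack = conn.recv(4096)  # the gateway replies with an MLLP-framed HL7 ACK
    print(ack.decode("ascii", errors="replace"))
```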

Surescripts has the same plan but on a national scale. Hospitals interested in participating can register by completing the “Phase I Checklist” by April 29, 2011.

The project is funded by a grant from the Centers for Disease Control with participation from the American Hospital Association and the College of American Pathologists. During the two-year grant period, the project will recruit, educate and connect  a minimum of 500 hospital labs to the appropriate public health agencies.  At least 100 will be critical access or rural hospitals.

Based on the Surescripts Network for Clinical Interoperability, the project will support all federal and state policies and standards for health information exchange, including privacy and security standards.

A standards-based network to connect hospital labs and public health agencies.   That's cool!

Thursday, April 21, 2011

Upcoming Conferences

Spring is speaking season and here are two upcoming conferences of interest.  I'll be moderating panels at both.

The first conference is "Enabling the Adoption of HIT to Transform Patient Care" on April 25, 2011 at the Harvard Club of Boston.

This conference features keynotes by Dr. David Blumenthal who will speak about his vision for modernizing health care delivery and Dr. David Bates who will discuss using health IT to improve patient safety.   The conference will also feature two panels. The first will focus on supporting providers to achieve meaningful use of EHR. The second will focus on new and innovative technologies to engage patients and providers in care delivery.

The conference is the result of hard work by the HSPH Public Health & Technology (PHAT) Forum, a graduate student organization whose mission is to provide an interactive, cross-disciplinary forum for exploration and innovation at the intersection of health, information, and technology.

The second conference is the Governor's Health IT Conference hosted by Deval Patrick at the DCU Center, Worcester, MA May 9-10.

Keynotes include Deval Patrick, Dr. David Blumenthal, and Sachin Jain, MD, MBA, Special Assistant to the National Coordinator.    Topics include:
*How the Office of the National Coordinator will fund the deployment of electronic health records and the exchange of data among these systems
*Governor Patrick's proposal for transforming the healthcare payment system
*Medicare and Medicaid initiatives for quality improvement and shared savings
*The contributions that health IT will make to clinical quality, patient-centeredness, and the economic recovery in Massachusetts

See you at these conferences!

Wednesday, April 20, 2011

The April HIT Standards Committee meeting

The April HIT Standards Committee meeting included important discussions about the timeline for the work ahead, how we'll organize to do that work, and how the HIT Standards Committee will interact with the HIT Policy Committee and the S&I Framework to ensure ONC has the final certification and standards criteria needed for Meaningful Use Stage 2.

The meeting began with an overview by Farzad Mostashari, the new National Coordinator.  He described the perfect storm of events we have in healthcare now - Meaningful Use Stage 2 from the HITECH Act, Accountable Care Organizations from the Affordable Care Act, and the Partnership for Patients: Better Care, Lower Cost from HHS.

Paul Tang then reviewed the likely characteristics of Stage 2.   He summarized the public comment on Stage 2, noting strong support for eRx of discharge medications, electronic progress notes, electronic medication administration records, secure messaging, and recording patient preferences for communications.   He noted mixed support for other initiatives, such as the list of care team members and longitudinal care plans.  He highlighted the concerns about timelines for Stage 2, especially the lead time required to create and widely implement new functionality.  Options include reducing the reporting period for Stage 2 from 1 year to 90 days (resulting in a 9 month delay), deferring Stage 2 for a year, and splitting Stage 2 into 2a for increased thresholds on existing measures and 2b for introducing new technology.   We discussed the need for detailed descriptions of the new Stage 2 functionality so that we can determine where new standards are needed. The committee had a robust discussion about the need to work on certification criteria and testing criteria for Stage 2, since the committee's Stage 1 input was limited to standards alone.

Doug Fridsma then discussed the process we'll use this Summer to complete the necessary standards work in support of MU Stage 2.   Steve Posnack provided a visual representation of the interactions among the HIT Policy Committee, Standards Committee, and ONC described in the HITECH Act.

Doug outlined a very practical idea - as the HIT Standards Committee studies the standards to support Stage 2, it should divide them into buckets as follows:

A) Those meaningful use criteria for which no standards are needed - they are process goals
B) Those meaningful use criteria for which an existing standard is a perfect fit
C) Those meaningful use criteria for which standards exist, but they are imperfect and need work
D) Those meaningful use criteria for which no mature standards exist

In the next few weeks, the Standards Committee will develop a detailed workplan for May and June by placing meaningful use Stage 2 priorities into these buckets and then assigning them to ad hoc workgroups ("power teams") to rapidly analyze.   We'll hand off those in buckets C and D to the S&I Framework for further deliberation.    The S&I groups will return their finished work to the Standards Committee for review and final polish.

Jamie Ferguson provided an update on the vocabulary task force, noting its efforts to specify one major vocabulary for each domain area (labs, problems, allergies, medications, etc.), identify vocabulary subsets/codesets that will accelerate interoperability, and ensure that mappings between vocabularies are available where necessary.

Paul Egerman presented an overview of the PCAST Workgroup recommendations.   Doug Fridsma discussed an analysis of existing metadata standards prepared by MITRE.

Paul's work as facilitator of the PCAST Workgroup effort was truly remarkable and his energy resulted in a comprehensive report that suggests a very reasonable path forward.   One early pilot we discussed was the use of the Universal Exchange Language to send data from EHRs to PHRs.   An existing CCR or CCD could be wrapped with patient identification and provenance (who generated the data in what setting) metadata and incorporated into PHRs, assuming existing PHR vendors can be convinced to accept a universal format.
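
As a thought experiment, here's a speculative Python sketch of that wrapping idea. The Universal Exchange Language metadata schema had not been defined at the time, so every element name below is invented.

```python
import base64
import json
from datetime import datetime, timezone

# Read an existing CCD document (the path is hypothetical).
with open("ccd_example.xml", "rb") as f:
    ccd_bytes = f.read()

# Wrap the document with patient identification and provenance metadata.
envelope = {
    "patient": {"assigning_authority": "example-hospital.org", "mrn": "12345"},
    "provenance": {
        "generated_by": "Example Hospital Emergency Department",
        "author_role": "attending physician",
        "created": datetime.now(timezone.utc).isoformat(),
    },
    "payload_type": "CCD",
    "payload": base64.b64encode(ccd_bytes).decode("ascii"),
}

# A PHR could accept this envelope, inspect the metadata, and decode the CCD.
print(json.dumps(envelope, indent=2)[:300])
```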

Jim Walker provided an overview of the Clinical Quality Workgroup  including their plan for supporting meaningful use stage 2 measures.

Dixie Baker and Walter Suarez presented their plan for Provider Directory work.    Upcoming testimony will enable them to make recommendations at the May Standards Committee meeting.

A great meeting.   As a next step, we'll set up a group to evaluate the metadata possibilities for a PCAST pilot.  We'll complete the workplan for the Summer and begin assigning that work.    The next few months will be a sprint as we complete all the work needed to support the next level of Meaningful Use.

Tuesday, April 19, 2011

Mobile Applications for Medical Education

Every year in April, we survey the HMS medical students about their use of mobile devices.

At HMS, we encourage students to buy the device of their choice - iPhone/iPod/iPad, Android, Blackberry, Kindle, etc.  We then support these devices with software licenses and controlled hosted applications.

Our Mycourses Learning Management System has a Mobile Applications tab.  Under General Resources, we offer a mobile version of all course content via connected devices (WiFi, 3G, etc.).   We also offer a Kindle version for downloading course content to the device.

On our Mobile Resources page, we offer downloads of many popular applications.  Most include native iPad support.

What are the most popular in 2011?

Dynamed - a clinical reference tool created by physicians for physicians and other health care professionals, for use primarily at the point of care.

Unbound Medicine uCentral - a collection of popular titles including 5 Minute Clinical Consult, A to Z Drug Facts, Drug Interaction Facts (an interaction checker), Review of Natural Products, Medline Table of Contents Alerts, and Medline Auto Alerts.

VisualDx Mobile - a visual decision support tool. VisualDx merges medical images with a problem-oriented findings-based search.

Epocrates Essentials  - an all-in-one mobile guide to drugs, diseases, and diagnostics which includes Epocrates Rx Pro, Epocrates SxDx, and Epocrates Lab.

iRadiology -  a compendium of over 500 unique images demonstrating classic radiological findings.

I'll post the complete survey for 2011 soon.

Monday, April 18, 2011

The Attestation Experience

This morning at 8am the CMS attestation website went live.

At 8:30am, I completed the attestation for Beth Israel Deaconess Medical Center.

Here's an overview of the experience.

At the top of the Attestation page, you'll see the link "Click here to attest."

Once in the Medicare & Medicaid EHR Incentive Program Registration and Attestation System, you need to choose Eligible Hospitals or Eligible Professionals.   I chose Eligible Hospitals and logged in with the same user ID and password we used to register BIDMC.

1.  You're asked to enter your EHR Certification Number from the Office of the National Coordinator.  This is an interesting concept, because the EHR Certification number is not the same as the Certified Healthcare IT Product List (CHPL) Product number assigned during the certification process.  For example, the BIDMC Online Medical Record was assigned a CHPL Product Number of CC-1112-549900-1 during the CCHIT EACH certification process.

To obtain an EHR Certification Number, go to the CHPL Website.

Click on Ambulatory or Inpatient.   I clicked on Inpatient.

I searched by product name for BIDMC's "Online Medical Record".

I clicked Add to Cart.   Do this for all the products you need to meet 100% of the Certification Criteria.  Note that there is a bug in the CHPL page.  See Keith Boone's blog for the workaround.

Once you've achieved 100% of the required criteria, you can click on "Get CMS EHR Certification ID" in the View Cart area.  I was assigned a CMS EHR Certification ID of 30000001TMQOEAC.

2.  Next you must specify if you've chosen to count all ED visits or use the "observation services" method for calculating ED visits. This includes ED patients admitted to inpatient or observation services and excludes ED patients discharged from the ED.  We used the observation services method.

3.  Next you attest to the Core Criteria.  Here's a copy of my completed submission.   Of interest, no patient requested an electronic copy of their discharge instructions or an electronic copy of their lifetime record.   Our software has the ability to generate these, but since no patient asked for them during the reporting period, the denominator was zero and no numerator needed to be reported.

4. Next you attest to the Menu Set Criteria.  Here's a copy of my completed submission.   We've tested immunization transactions with the Department of Public Health, sent via secure FTP.  We've tested lab results and syndromic surveillance with the Boston Public Health Commission, sent via NEHEN.   Formulary enforcement is included in all our ordering systems.   We have numerous screening sheets and business intelligence tools that generate patient lists based on clinical criteria.

5. Next you attest to the 15 hospital quality measures.  Remember that the ED measures include stratifications for admitted, observation, and psychiatric patients.  The numerators of the ED measures are times measured in minutes.  For all other quality measures you must provide numerators, denominators and exclusion measures using patient counts.   In 2012, CMS will require these measures to be submitted electronically using PQRI XML.   I look forward to the automation of this step, since manually entering more than 50 numbers accurately was challenging.

6. Once you've completed the core, menu set, and quality measures, you're asked to answer a series of questions attesting to the accuracy of your submissions and your authority to perform the attestation.

If you've met all the criteria successfully, your attestation is approved and a submission receipt page appears (the graphic above). Print or save this receipt since it is not emailed to you.

That's it.  It should take 30-60 minutes to complete if you have all your data handy.    I welcome comments on the attestation experience of others so that the Healthcare IT Standards Committee Implementation Workgroup can provide input as Stage 2 is planned.

Friday, April 15, 2011

Cool Technology of the Week

This week's cool technology is not about any specific hardware or software, but is about a trend.

Mobile technology for healthcare is fast replacing desktops and laptops in many settings.

As of this morning, there are 1600 iPhones and 300 iPads connected to the BIDMC network, using our administrative and clinical applications.   These were all purchased by individual clinicians and staff to enhance their productivity.   All we do centrally is provide the server components to access applications (web servers, Citrix, ActiveSync) and enforce mobile device security policies.

Mobile devices for healthcare are becoming increasingly important at the bedside, in the home, and in hostile environments.

Here's a YouTube video illustrating how the Medical Communications for Combat Casualty Care (MC4) handheld is used to record patient encounters on the battlefield.

Given the increasing prevalence of Traumatic Brain Injury (TBI) in the military due to powerful explosive devices, the Army is using handhelds to track and treat personnel with TBI.

The Army is piloting iPads/iPhones/iPods, Android devices, and Windows Smartphones for training.

There's speculation that the military may issue a smartphone to every soldier.

Some IT leaders consider mobile computing to be a burden and distraction - a wild west of client devices brought in by customers demanding new services.   The reality is that CIOs should develop a mobile device strategy assuming that tablets, smartphones and laptops will replace desktops in many settings.  By defining security policies and providing server side applications, IT organizations can become mobile device enablers and leverage the momentum created by users who are investing their own time and resources to make them work.

Mobile devices purchased and supported by users, connected to standardized central services.   That's cool!

Thursday, April 14, 2011

Medicare EHR Incentive Program Attestation To Begin on Monday

On April 18, 2011, attestation for the CMS Medicare Electronic Health Record (EHR) Incentive Program begins. The Medicare EHR Incentive Program Attestation System will be available to eligible professionals (EPs), eligible hospitals and critical access hospitals (CAHs) on that date, enabling the next step in the process for EPs and hospitals to qualify for the Medicare EHR Incentive Program. Once EPs and hospitals successfully attest through the online system, they will then be able to receive Medicare incentive payments from CMS.  I will spend Monday completing the attestation for BIDMC.

CMS has created several resources that will help EPs and hospitals successfully navigate the attestation process, which can currently be found on the CMS EHR Incentive Programs website. CMS is developing a dedicated attestation section on this website, which will be launching Monday.

Sign up for CMS’ EHR listserv updates to receive timely information and updates about the EHR Incentive Programs.

I'll let you know how it goes on Monday!

1951 HZ

I drive a 2005 Prius that just crossed the 100,000 mile mark.   My total cost of owning the car has been very low.  My mileage has averaged 50 mpg.   I'm very satisfied.

In Massachusetts, license plates from 2005 are of the form xxxx yy.  My license plate 1951 HZ is purely random.   Recently, someone with a musical bent asked me:

"You're a thoughtful person.  I'm sure your personalized license plate has some profound meaning.   What subtle and amazing thing happens at a frequency of 1951 Hertz (HZ)?"

Ok, I need some help here.   I've done my best to search the web for phenomena that occur between 1900 and 2000 Hertz.   Here's what I've found so far:

*It's a common form of audio tone remote control  from the 1970s.

*It's a common frequency of frog mating calls.

*It's the frequency of the humpback whale grunt.

*It's the frequency of alveolar nasal consonants (such as when you say "mmmmm").

*It's a point in the audio spectrum used to analyze noise induced hearing loss.

Ok, so far, my profound license plate has me
1. walking into a swamp and asking some wayward frog to hop down to my place
2. getting the attention of a distracted humpback whale
3. having my nasal consonants misunderstood by folks who frequent rock concerts.

I welcome your help - my car needs a sublime story as to why 1951 HZ says something about me, the human condition, or the natural world.   Comments welcome!

Wednesday, April 13, 2011

The Care Connectivity Consortium

On April 6, five healthcare organizations announced the Care Connectivity Consortium to accelerate interoperability activities in the US.   Mayo Clinic, Geisinger, Kaiser Permanente, Intermountain Healthcare and Group Health will share patient identified data in real time using Nationwide Health Information Exchange gateways (NwHIN Exchange) incorporating national and international standards.

Each organization will decide how to support NwHIN Exchange transport specifications independently.  Some may choose to implement the open source CONNECT gateway.  Others will implement their own solutions. For example, Kaiser has a self-developed gateway in production.

Initially, content will include problem lists, medication lists, and allergy lists.   The next phase will include laboratories, vitals, immunizations, and other content.  Of course, if an organization sends more than the minimum required content, the others will be capable of receiving it.  They plan to use the same Clinical Document Architecture (CDA) implementation specifications as the VA/DOD Virtual Lifetime Electronic Record (VLER) project.

In addition to supporting healthcare information exchange for clinical care, they will implement a patient-chosen portable ID, similar to the HealthURL concept I've discussed.

They will also adopt a common Data Use and Reciprocal Support Agreement.

The five organizations share very few patients in common, so the real benefit is that they are implementing efficient, standard health information exchange methods that are extensible to organizations which do share significant numbers of patients.

If they work out standards-based, secure methods for sharing information among themselves, those methods would then be attractive to, and implementable by, other organizations with fewer resources for development, and become usable, useful, de facto standards for information sharing.

I look forward to their progress.  When five major institutions across the country implement common policy and technology for healthcare information exchange, we'll achieve a tipping point and others will rapidly follow.

Tuesday, April 12, 2011

The RSA Attack

I've worked with RSA Security since my days as an informatics fellow when I first used SecurIDs as part of my early health information exchange work.

Just as I was transparent about the CareGroup Network Outage in 2002, RSA has shared all the details of their recent security breach.

It all started with a well crafted phishing email to a non-technical staff member with the subject line “2011 recruitment plan”.

Attached to the email was an Excel spreadsheet that contained an exploit for a known vulnerability in Adobe Flash.

The exploit installed a hard-to-detect remote administration tool named Poison Ivy on at least one RSA computer.   The end result was that an attacker gained access to the RSA network.

The attackers moved from system to system harvesting accounts until they came across those users who had highly privileged access to sensitive systems and data.

An internal staging system was “created” to collect, encrypt, and transmit back lists of usernames/passwords to systems.

Confidential material related to SecurID technology was FTPed to a remote site.

The attackers have not been identified.

The attack was remarkably sophisticated and illustrates the evolution of cybercrime over the past 10 years.    Here are the 4 principal stages:

1st Generation – Because I can
Worms, defacement of web sites

2nd Generation – I can make money
Botnets appear, denial of service attacks, seeking payment to stop attacks

3rd Generation – Organized crime
Large scale management of attacks; coordinated use of tools and techniques; trojans, worms, phishing, and targeted attacks

4th Generation – Selling the tools
Tools to perform attacks become “vended” with 24/7 support available: botnet rentals, sophisticated ID theft services, licensed malware, and exploit knowledge for sale.  Social networks just for cybercriminals appear.  Cybercrime supply chains are formalized and fine-tuned.

I've described security as a Cold War - the faster we implement protections, the faster the cybercriminals innovate.

Thanks to RSA for sharing their experience with the rest of the industry.

Monday, April 11, 2011

Hail to the New ONC Chief

On Friday, HHS Secretary Kathleen Sebelius announced that Dr. Farzad Mostashari will be the next National Coordinator:

"I’m very pleased to announce that Farzad Mostashari, MD, ScM will become the new National Coordinator for Health Information Technology within the Office of the National Coordinator (ONC) effective today. Dr. Mostashari joined ONC in July 2009 and serves as Deputy National Coordinator for Programs and Policy. Previously, he served at the New York City Department of Health and Mental Hygiene as Assistant Commissioner for the Primary Care Information Project, where he facilitated the adoption of prevention-oriented health information technology by over 1,500 providers in underserved communities. Dr. Mostashari also led the Centers for Disease Control and Prevention-funded NYC Center of Excellence in Public Health Informatics, and an Agency for Healthcare Research and Quality-funded project focused on quality measurement at the point of care. He is a graduate of Harvard College, the Harvard School of Public Health, and the Yale School of Medicine, and conducted his training in internal medicine at Massachusetts General Hospital.

Farzad has been a critical member of the leadership team at ONC, and I look forward to continuing to work with him in this new capacity as he builds on the incredible progress made in the adoption and meaningful use of health information technology during David Blumenthal’s tenure."

I agree that Farzad is the logical choice to follow David Blumenthal.

What can we expect from Farzad?

David Blumenthal's role was akin to a startup CEO's.  He had a small staff, $2 billion to spend quickly/wisely, and a bold vision to improve healthcare quality/safety/efficiency using IT.

Now that the ONC staff is hired, the initial regulations are written, and the money is allocated, Farzad must evolve vision and startup into implementation and operations.

ONC has a very broad portfolio at the moment, including:

*Achieving EHR adoption goals by leveraging Regional Extension Centers
*Accelerating health information exchange (HIE) by providing oversight of state HIE plans
*Ensuring the success of the Beacon Communities
*Completing the standards and certification regulations needed for Stage 2 and Stage 3
*Supporting the policy goals of HIPAA, ARRA/HITECH, and Healthcare Reform through continued work by the Federal Advisory Committees working for ONC

Farzad must make midcourse corrections as needed to manage these projects to completion.

Can he do it?   Absolutely.

When the EHR marketplace did not offer the features that New York City needed to measure quality and improve population health, he motivated the industry to change the EHRs.    For example, Farzad greatly influenced the evolution of eClinicalWorks version 7 into version 8.

When the marketplace did not offer standards-based quality registries, he organized the popHealth initiative to transform CCRs and CCDs into quality metrics.  Given Farzad's background and public health orientation, we'll likely see an enhanced ONC focus on population health.

During his tenure as National Coordinator I predict he'll oversee the completion of blueprints for a true Nationwide Health Information Network including certificate management, provider directories, patient matching algorithms, PCAST-inspired metadata envelopes, and transport for query/response transactions (resolving the REST v. SOAP debate).

Yes, the work ahead will be hard.   However, the path forward for Meaningful Use is clear.   With resources and sound project management, we can do anything.

As a next step for the HIT Standards Committee, ONC staff will work with Jon Perlin and me to create a Gantt chart of everything we must do, so we'll have monthly milestones and deadlines to guide us.

Good luck, Farzad - we're here to support you.

Friday, April 8, 2011

Cool Technology of the Week

Structured electronic clinical documentation is the next frontier in EHR implementation.  It's particularly challenging in Emergency Medicine, which is a fast-paced, sometimes chaotic environment.   Documentation can be time consuming and is the most frequently interrupted task.  How can we balance the need for structured data capture with ED workflow?

There is no single right answer - we've used iPads with web applications, voice recognition, and disease specific templates.

We now have a new tool in our quiver, the Digital Pen from Anoto as implemented by Forerun Systems for ED charting.    Forerun is a technology spin-out of BIDMC, and I have no financial relationship with it.

The digital pen is NOT a handheld scanner, capturing graphics or PDFs.   It's a means to capture discrete patient data to support clinical documentation, quality reporting, and regulatory requirements.  It captures data at the bedside with a granularity that dictation cannot.

Here's how it works.

Forms are printed with a special background matrix that identifies the unique form, patient, clinician, and data elements.  Think of it as a page of  2D barcodes.   Forms can be printed as needed at various points in the ED workflow either manually or automatically.

The pen "knows" exactly which patient form is being used and what fields are being entered.   You can write on multiple forms in parallel without confusing it.    Checking a box generates structured data indicating that a sign or symptom is present.   Writing through a word generates structured data indicating that a sign or symptom is absent.   When free text is entered, both the original text and an optical character recognition interpretation are available.

As the pen is inserted in a USB dock, the structured data is uploaded into the EHR.
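
As a thought experiment, here's a toy Python sketch of the general idea of mapping pen strokes to structured fields. The Anoto/Forerun implementation is proprietary; the form layout, coordinates, and gesture handling below are entirely invented.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FormField:
    """A rectangular region of the printed form bound to a data element."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Hypothetical layout for one region of an ED charting form.  In the real
# system, the dot-pattern background identifies the form, patient, and
# clinician, so the pen knows which layout applies.
FORM_LAYOUT = {
    "ED-CHEST-PAIN-v1": [
        FormField("chest_pain_present_checkbox", 10, 10, 14, 14),
        FormField("chest_pain_absent_strikethrough", 20, 10, 60, 14),
    ]
}

def interpret_stroke(form_id: str, x: float, y: float) -> Optional[str]:
    """Return the structured field a pen stroke lands in, if any."""
    for field in FORM_LAYOUT.get(form_id, []):
        if field.contains(x, y):
            return field.name
    return None

# A checkmark at (12, 12) would assert the symptom is present.
print(interpret_stroke("ED-CHEST-PAIN-v1", 12, 12))
```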

Here's an example.

The form as entered by the user.

The discrete data displayed in a web application (including handwriting recognition in red for editing).

The final output with all discrete data converted to structured documentation for clinical use.



Although we hope to use iPads for clinical documentation throughout the institution, there are workflows in which digital pens are faster, easier, and less intrusive to the caregiver/patient interaction.

A pen for patient specific structured electronic clinical documentation.   That's cool!

Thursday, April 7, 2011

Preparing for Scotland

On the Friday of Memorial Day weekend, I'll be lecturing about the US National Healthcare IT program in Edinburgh, Scotland.

Since I'll have Saturday and Sunday free, my hosts graciously arranged a trip to the Scottish Highlands to climb:

Aonach Eagach 
Bidean nam Bian
Ben Nevis (the highest point in the UK) via Carn Mor Dearg.

I asked my hosts what kind of weather to expect in May.  Their answer - Ben Nevis averages 261 gales and 171 inches of rain per year.

I'm used to the mountains in the lower 48 US states - the Sierra, the Tetons, and the Whites.   All are dry climbs with an occasional brief afternoon shower.

For Scotland, I must choose boots that are completely waterproof, grip wet rocks, and are light/compact/easy to pack.     Did I mention that Ben Nevis in May may have snow at the top?

After considering many brands and possibilities, I decided on a new boot technology - the TrekSta Men's Evolution Mid GTX.

TrekSta designed these Gore-Tex lined boots by scanning 20,000 feet to create a three dimensional shape that is quite different from other shoes.    The sole is made from sticky rubber plus a series of fiberglass embedded inserts that add traction on rock, ice, and mud.

The shoes weigh 15 ounces each, are ankle height, are completely waterproof, fit like a glove, and are stiff enough for technical climbing.

It was tough to find a boot that would work in the constant rain of the Scottish Highlands, but the TrekStas seem like an amazing departure from the road usually travelled by boot manufacturers.

Armed with my usual total body Gore-Tex and the same clothing approach I've used in New England, I'll let you know how I fare in the Scottish Highlands next month.

Wednesday, April 6, 2011

The Cost of Storing Patient Records

Yesterday, I participated in a National Library of Medicine Conference called "Long term Preservation and Management of the EHR."    Given that the EHR is a legal record, a source of data for clinical care, and a repository of knowledge for clinical research, how do we preserve it for a sufficiently long period of time to maximize value to patient, caretaker, and scientist?

Here are the program details.

I presented an overview of our tiered storage approach to information lifecycle management at BIDMC.

One controversial item was my conclusion that the storage costs per patient to retain data are insignificant.

Here's the calculation.    At BIDMC we generate approximately 1 terabyte of clinical text data (structured and unstructured) per year.    We generate approximately 19 terabytes of image data per year (radiology, cardiology, pathology, gastroenterology, pulmonology, Ob/Gyn, etc.).    We have approximately 250,000 active patients.    20 terabytes / 250,000 = 80 megabytes per patient per year.

There are many kinds of storage and many ways to calculate cost.   Rather than specify a vendor or an infrastructure, I'll use storage numbers from a non-BIDMC site for purposes of computation.

The other site offers 2 kinds of storage:

Standard storage, which has a marginal cost of $0.34 per gigabyte added (or $0.68 per gigabyte with replication).

High performance storage, which has a marginal cost of $0.55 per gigabyte added (or $0.89 per gigabyte if it is replicated onto standard storage).

Let's choose high performance replicated storage at $0.89 per gigabyte.    In Massachusetts, we retain medical records for 15 years and images for 7 years.     Let's compute the cost of storing the 80 megabytes per patient per year (4 megabytes of text and 76 megabytes of images) for these regulatory lifetimes.

Text storage = 4 megabytes added per person per year.    We'll need to compute the cost of storing old data plus adding new data every year i.e.

Year 1 = 4 megabytes
Year 2 = 4 megabytes old + 4 megabytes new
Year 3 = 8 megabytes old + 4 megabytes new
Year 4 = 12 megabytes old + 4 megabytes new

and sum all these costs over 15 years.     Let's use the formula for summing the numbers 1 through n, n*(n+1)/2, with n = 15 years at $0.89/gigabyte:

4 megabytes * (15*16/2) * $0.89 / 1000 = 42 cents per patient for the first 15 years

After year 15, we can begin deleting the oldest data, so we'll always have just 15 years of data: 4 megabytes * 15 * $0.89 / 1000 = 5 cents per year thereafter.

Image storage = 76 megabytes added per person per year, retained for 7 years:

76 megabytes * (7*8/2) * $0.89 / 1000 = $1.89 per patient for the first 7 years

After year 7, we can begin deleting the oldest data, so we'll always have just 7 years of data: 76 megabytes * 7 * $0.89 / 1000 = 47 cents per year thereafter.
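
Here's a short Python script that reproduces the arithmetic above, in case you want to plug in your own data volumes or storage prices:

```python
COST_PER_GB = 0.89  # dollars/GB/year, replicated high-performance storage

def cumulative_cost(mb_per_year: float, retention_years: int) -> float:
    """Cost of accumulating mb_per_year for retention_years, paying to hold
    all retained data each year: 1 + 2 + ... + n year-slices = n*(n+1)/2."""
    mb_years = mb_per_year * retention_years * (retention_years + 1) / 2
    return mb_years / 1000 * COST_PER_GB  # convert MB to GB

def steady_state_cost(mb_per_year: float, retention_years: int) -> float:
    """Annual cost once the retention window is full and old data is deleted."""
    return mb_per_year * retention_years / 1000 * COST_PER_GB

print(f"Text,   first 15 years: ${cumulative_cost(4, 15):.2f} per patient")  # ~$0.43
print(f"Images, first 7 years:  ${cumulative_cost(76, 7):.2f} per patient")  # ~$1.89
print(f"Text steady state:   ${steady_state_cost(4, 15):.2f} per year")      # ~$0.05
print(f"Images steady state: ${steady_state_cost(76, 7):.2f} per year")      # ~$0.47
```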

So when we debate the question of storing data for later reuse, keep in mind that the cost per patient is 42 cents for the first 15 years of text and $1.89 for the first 7 years of images.

The equivalent of Moore's law applies to storage - continuously decreasing costs and higher density.   We'll also have cloud storage options (although only a few public cloud providers offer HIPAA compliant storage with indemnification for privacy breaches).

In my analysis above, some may question the cost per gigabyte I used.  Feel free to multiply it by 10 such that text records could be stored for $4.20 per patient for 15 years.   It's still very economical.

In the interest of completeness, let's examine fully loaded cost.  At BIDMC, we have multiple storage platforms.  About 40% of the cost is depreciation on capital budgets.   The rest is staff, software/hardware maintenance, and other operating cost.   The average cost among these collective platforms runs $1.27 per GB or $1,270 per TB per year, fully loaded.

Of course, there are other considerations:

1.   The  definition of the "official medical record" is in flux.  The usual process for most diagnostic and treatment modalities is to cull the media so that only the important content is saved.   For example, in a sleep study, you would not save uneventful sleep time.    If medical/legal issues push us toward saving raw content, especially video, the amount of data per patient is going to rapidly expand.

2.  At BIDMC, technologies and vendors have been stable for many years.    This makes backward compatibility issues much more manageable.   By staying with the same vendors and technologies, we've not been challenged with migrating our clinical data to a new database or vendor.

3.   The increased use of multimedia in clinical care may also expand the amount of storage per patient.    Voice files (call center, voice mail, raw transcription, and the like) might someday be required to be saved for medical/legal reasons.

4.  As data expands, so does the burden of dealing with release of information requests, backup/recovery, disaster replication, testing new versions, and other application life cycle requirements.  We seldom operate with just two copies of the data.  There are usually two copies locally, sometimes more for high availability, and another copy at our disaster recovery site.    We may store additional copies for testing new versions of software, snap backups, and the like.

5.  Emerging factors contribute to costs.   e-Discovery can expand our overall costs because backups must be retained indefinitely.   The "digital footprint" of patient data is changing.  Text alone is manageable, but the imaging/diagnostic components are ever growing, both in number and in size.

Yes, costs add up over time for large patient populations, but the cost of storing text data is so minimal that we have not deleted a single datum from the electronic health record since I became CIO in 1997 and we have no plans to do so!



Tuesday, April 5, 2011

Meaningful Use Attestation

Last week CMS announced that attestation for the Medicare EHR Incentive Program (Stage 1 of Meaningful Use) will open on April 18, 2011.

Their announcement includes screenshots of the attestation website that will be used by eligible professionals, eligible hospitals, and critical access hospitals to document meaningful use of certified EHR technology.

Prior to attestation, all providers must register through CMS' web-based Medicare and Medicaid EHR Incentive Program Registration and Attestation System.

The screen shots (such as the sample above) suggest that attestation will require detailed data entry including numerator, denominator, and exclusion results for meaningful use core measures, meaningful use menu measures, core clinical quality measures, alternate core clinical quality measures (required only if a core quality measure has a denominator of zero), and additional clinical quality measures (applies only to eligible professionals).

In preparation for all of these metrics, BIDMC created reports last Summer (sample for August-October 2010) to help us track our progress.   For the measurement period from January 1 to April 1, 2011,  BIDMC has achieved all meaningful use thresholds using our CCHIT EACH Certified EHR, so we plan to attest on April 18.

Once eligible professionals, eligible hospitals, and critical access hospitals complete a successful online submission through the Attestation System, they qualify for a Medicare EHR incentive payment.

For the Medicaid EHR Incentive Program, providers will follow a similar process using their state's Attestation System.  Information on state Medicaid timelines and programs is available here.

It will be fascinating to see how many hospitals and eligible providers attest in the early months of the program.     CMS plans to take a careful look at the progress on Stage 1 before finalizing its plans for Stage 2.

Monday, April 4, 2011

The Accountable Care Organization NPRM

Since the Affordable Care Act passed, many senior healthcare executives have told me - "I do not know what an Accountable Care Organization is, but I know we need to be one!"

On March 31, HHS released the Accountable Care Organization Notice of Proposed Rulemaking, so now we know what an Accountable Care Organization must be.  From the Introduction:

"This proposed rule would implement section 3022 of the Affordable Care Act which contains provisions relating to Medicare payments to providers of services and suppliers participating in Accountable Care Organizations (ACOs). Under these provisions, providers of services and suppliers can continue to receive traditional Medicare fee-for-service payments under Parts A and B, and be eligible for additional payments based on meeting specified quality and savings requirements."

Here's a bookmarked copy of the 429 page Notice of Proposed Rulemaking signed by Don Berwick and Kathleen Sebelius.

There are numerous summaries online from various stakeholders

Don Berwick's NEJM Perspective

Kaiser's excellent list of resources

The Healthcare.gov overview

I've read the regulation and there are a few items to highlight from an IT perspective:

Electronic Health Records appear on 39 pages

Health Information Exchange appears on 12 pages

To be an ACO, you must first achieve meaningful use, embrace interoperability, and pool clinical data for quality measurement.

From Page 404:

"Electronic health records technology. 
(1) At least 50 percent of an ACO's primary care physicians must be meaningful EHR users, using certified EHR technology as defined in §495.4, in the HITECH Act and subsequent Medicare regulations by the start of the second performance year in order to continue participating in the Shared Savings Program.
(2) CMS may terminate an ACO agreement under § 425.14 of this part if fewer than 50 percent of an ACO's primary care physicians are not meaningfully EHR users, using certified EHR technology as defined in §495.4, the HITECH Act and subsequent Medicare regulations by the start of the ACO's second performance year."

Pages 174-194 outline 65 quality metrics that can only be measured with a cross-organizational quality registry.  Claims analyses are not enough.  From pages 170 and 173:

"We propose that ACOs will submit data on these measures using the process described later in this proposed rule and meet defined quality performance thresholds.

Better Care for Individuals:
* Patient/Caregiver Experience
* Care Coordination
* Patient Safety

Better Health for Populations:
*Preventive Health
*At-Risk Population/Frail Elderly Health"

The NPRM is well aligned with Meaningful Use (which appears on 27 pages).  The rule notes that the CMS programs - Meaningful Use, the Medicare Improvements for Patients and Providers Act (e-prescribing incentives), and Accountable Care Organizations - require separate but aligned IT efforts:

"Page 220
 We note that including metrics based on EHR Incentive Program and eRx Incentive Program data does not in any way duplicate or replace specific program measures within each of the two respective programs or allow eligible professionals to satisfy the requirements of either of the two programs through the Shared Savings Program. To receive incentive payments under the EHR incentive or eRx programs (or to avoid payment adjustments), eligible professionals will be required to meet all the requirements of the respective EHR and eRx programs."

So what is an ACO?

It's a group of healthcare providers who have implemented electronic health records, health information exchange, and quality data warehouses to coordinate care and measure population health.  

Our marching orders are clear.   We must implement Meaningful Use Stage 1/2/3, Medicare e-Prescribing incentives, and Healthcare Reform in parallel.

The words ICD-10 and 5010 do not appear in the NPRM.  I continue to hope that ICD-10 is deferred until 2016 to free up the resources we need to create the EHRs, HIEs, and Registries in support  of Accountable Care Organizations.

Friday, April 1, 2011

Cool Technology of the Week

Evolving accountable care organizations will need to coordinate care, measure quality, and implement the tools needed to manage wellness. All these goals require novel IT infrastructure and applications.

I've prepared a roadmap which illustrates how every EHR and Hospital Information System among all our affiliated partners exchanges (or will soon exchange) data.   Yesterday, an important part of that strategy, bidirectional data exchange between BIDMC's systems and Atrius' Epic system, went live.

Since January 2010, all Atrius Epic clinical users have been able to view BIDMC data about the patients we share in common.

Since March 31, 2011, all BIDMC clinical users have been able to view Epic data about the patients we share in common.

A link called Atrius Epic Web appears on the BIDMC Patient Profile screen when a patient is shared between Atrius and BIDMC.  The link displays automatically if the patient has a primary care provider who is affiliated with Atrius.

When a BIDMC clinical user wants to access Epic records, they click on the link, and a comprehensive web-based summary of the patient's Epic records (the graphic above) appears, without requiring a new login or re-specifying the patient.   The patient lookup is captured in the audit logs of both BIDMC and Atrius.

The entire exchange is done using simple secure RESTful calls between web servers at BIDMC and Atrius.
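
For illustration, a server-to-server call of this sort might look like the Python sketch below. The endpoint, parameter names, and token scheme are hypothetical - the actual BIDMC/Atrius interface details are not public.

```python
import requests  # third-party: pip install requests

# Hypothetical Atrius-facing endpoint -- invented for illustration.
ATRIUS_SUMMARY_URL = "https://epicweb.example.org/patient-summary"

def fetch_epic_summary(shared_mrn: str, service_token: str) -> str:
    """Request a web-viewable Epic summary for a shared patient over TLS.
    In practice, both sides would also record the lookup in audit logs."""
    response = requests.get(
        ATRIUS_SUMMARY_URL,
        params={"mrn": shared_mrn},
        headers={"Authorization": f"Bearer {service_token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.text  # HTML summary rendered inside the BIDMC session
```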

Comprehensive bidirectional data sharing in support of patient care among independent organizations.    That's cool.