Wednesday, April 30, 2008

Decision Support Service Providers

I've recently written about decision support and speculated on the ways we can transform data to information to knowledge to wisdom.

Over the past few weeks, I've seen a convergence of emerging ideas that suggest a new path forward for decision support. Application Service Providers offer remotely hosted, high value Software as a Service applications at low cost. I believe we need Decision Support Service Providers (DSSP), offering remotely hosted, low cost knowledge services to support the increasing need for evidence-based clinical decision making.

BIDMC has traditionally bought and built its applications. Our decision support strategy will also be a combination of building and buying. However, it's important to note that creating and maintaining your own decision support rules requires significant staff resources, governance, accountability, and consistency. Our Pharmacy and Therapeutics Committee recently examined all the issues involved in maintaining our own decision support rules and you'll see that it's an extensive amount of work. We use First Data Bank as a foundation for medication safety rules. We use Safe-Med to provide radiology ordering guidelines based on American College of Radiology rules. Our internal committees and pharmacy create and maintain guidelines, protocols, dosing limits, and various alerts/reminders. We have 2 full time RNs just to maintain our chemotherapy protocols.

Many hospitals and academic institutions do not have the resources to create and maintain their own best practice protocols, guidelines, and order sets. The amount of new evidence produced every year exceeds the capacity of any single committee or physician to review it. The only way to keep knowledge up to date is to divide the maintenance cost and effort among many institutions.

A number of firms have assembled teams of clinicians and informatics experts to offer these kinds of knowledge resources. UptoDate maintains world class clinical information with thousands of authors reviewing literature and providing quarterly revisions. Safe-Med has a large team of experts codifying decision support rules and building the vocabulary tools needed to make them work with real world clinical data. Medventive provides the business intelligence tools needed to create physician report cards and achieve pay for performance incentives.

However, none of these firms can plug directly into an electronic health record in a way that offers clinicians just in time decision support.

Here's a strawman for the way a Decision Support Service Provider should work:
a. A hospital or clinic selects one or many Decision Support Service Providers based on clinician workflow needs, compliance requirements and quality goals
b. Electronic health record software connects to Decision Support Service Providers via a web services architecture, including appropriate security to protect any patient specific information transferred to remote decision support engines. For example, an EHR might transfer a clinical summary such as the Continuity of Care Document to a Decision Support Service Provider along with a clinical question to be answered.
c. A clinician begins to order a therapy or diagnostic test. The patient's insurance eligibility and formulary are checked via a web service. The patient's latest problem list, labs, and genetic markers are compared to best practices in the literature for treating their specific condition. A web service returns a rank ordered list of desirable therapies or diagnostics, based on evidence, and provides alerts, reminders, or monographs personalized for the patient.
d. Clinicians complete their orders, complying with clinical guidelines, pay for performance incentives and best practices.
e. The decision support feedback is realtime and prospective, not retrospective. Physicians get CME credit from learning new approaches to diagnosis and treatment.
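The ranking step in (c) can be sketched in a few lines of Python. Everything here is hypothetical - the record shapes, the evidence scores, and the drug names are invented for illustration, not drawn from any real Decision Support Service Provider API - but it shows the shape of the service: take a patient summary plus a body of guidelines, filter by the patient's specifics, and return a rank ordered list.

```python
def rank_therapies(patient, guidelines):
    """Return candidate therapies ranked by evidence score, filtering out
    any that conflict with the patient's allergies."""
    candidates = []
    for g in guidelines:
        if g["condition"] not in patient["problems"]:
            continue  # guideline doesn't apply to this patient
        for therapy in g["therapies"]:
            if therapy["drug"] in patient["allergies"]:
                continue  # drug/allergy check
            candidates.append(therapy)
    # Highest evidence score first
    return sorted(candidates, key=lambda t: t["evidence_score"], reverse=True)

# Illustrative inputs only - not clinical guidance
patient = {
    "problems": ["type 2 diabetes"],
    "allergies": ["metformin"],
}
guidelines = [
    {
        "condition": "type 2 diabetes",
        "therapies": [
            {"drug": "metformin", "evidence_score": 0.95},
            {"drug": "sulfonylurea", "evidence_score": 0.70},
        ],
    },
]
print([t["drug"] for t in rank_therapies(patient, guidelines)])  # ['sulfonylurea']
```

In a real deployment this logic would live behind the web service, with the EHR sending a CCD and receiving the ranked list plus alerts and monographs in response.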

In order to do this, EHR vendors must work with Decision Support Service Providers to implement the uniform architecture and interoperability standards needed to integrate decision support into EHR workflow. I would be happy to host a Harvard sponsored conference with all the stakeholder companies to kick off this work.

Of course, some may worry about the liability issues involved in using a Decision Support Service Provider. What if clinicians comply with flawed guidelines or fail to comply with suggested therapies and bad outcomes occur?

An excellent summary of Information-Based Liability and Clinical Decision Support Systems is available on the Clinical Informatics Wiki.

Based on my review of the literature, I believe decision support liability is a new area without significant case law. The good news is that there are no substantive judgments against clinicians for failing to adhere to a clinical decision support alert. As a licensed professional, the treating clinician is ultimately responsible for the final decision, regardless of the recommendations of a textbook, journal, or Decision Support Service Provider. However, as Clinical Decision Support matures and becomes more powerful and relevant, I believe that there could be greater liability for not using such tools to prevent harm.

This blog entry is a call to action for EHR vendors and emerging Decision Support Service Provider firms. It's time to align our efforts and integrate decision support into electronic health records. Working together is the only affordable way for the country to rapidly implement and maintain high quality decision support.

Tuesday, April 29, 2008

Integrating the Electronic Record

This is the third blog in my series of 2008-2009 areas of personal focus - clinical documentation, decision support, and the integrated electronic record.

At BIDMC, we buy and we build applications. We've purchased vendor systems for Lab, PACS, anesthesia documentation, ICU charting, and cardiology data management. We've built our electronic health record, provider order entry and our portals, which means that we create the enterprise front ends used by clinicians. All our home built applications share one fully integrated database. However, we must integrate data from dozens of vendor applications with our home built applications, which we've done via a service oriented architecture. Further, we have to integrate data from multiple outside organizations to reconcile medications and ensure continuity of care. Our strategy is to converge all viewing, ordering, documentation, and sign on to a single application - our self built electronic health record called webOMR. Of course, some departments such as the lab will use purchased applications such as the laboratory system for their internal departmental workflow, but the clinicians will never know that we have a purchased lab system, since all their work will be done via webOMR. Here are the top 40 ways we integrate data among our various applications.

1. WebOMR displays laboratory and radiology results from our community hospitals - Beth Israel Deaconess Needham, Mt. Auburn, and New England Baptist - using real time integration from Meditech via a service oriented architecture.
2. WebOMR supports full ambulatory order entry with automated electronic routing of radiology, cardiology and laboratory for our affiliated community hospital, BID Needham.
3. WebOMR will soon display an indicator on the patient summary if community hospital data is available for viewing, reminding clinicians to look for these external results.
4. WebOMR includes viewing of all endoscopy reports and photos from our Gcare endoscopy system.
5. WebOMR includes viewing of all echocardiology reports from our Encor echo reporting system.
6. We've built a 'popup' version of WebOMR for eClinicalWorks and CareTracker users, so private clinicians in the community can view all BIDMC data on their patients without having to leave their office EHR.
7. We've also included ordering of BIDMC labs and radiology diagnostics within the 'popup' version.
8. WebOMR displays all problems, medications, allergies, visits, notes/reports, and labs from the Joslin clinic, an affiliated but separate institution.
9. WebOMR includes viewing of all Radiology images from GE Centricity PACS.
10. WebOMR includes viewing of Radiation Oncology Reports from Impac.
11. WebOMR includes viewing of Electrophysiology Reports from GE Cardiology applications.
12. WebOMR includes viewing of Anesthesia flow sheets from Philips Compurecord.
13. WebOMR will include viewing of IMDSoft ICU Charting later this year.
14. WebOMR will include Enterprise Image viewing of all non-radiology imaging modalities by next year.
15. WebOMR will include viewing of scanned inpatient papers records by June 2008.
16. WebOMR includes integration of pediatric immunization records from our community health centers.
17. WebOMR displays inpatient discharge meds from Provider Order Entry and offers one click conversion of inpatient medications to outpatient medications.
18. WebOMR includes full e-prescribing integration to our statewide eRx Gateway including eligibility checking, formulary enforcement, routing to retail and mail order pharmacies, and community-wide medication history with drug/drug and drug/allergy checking.
19. WebOMR pushes discharge summaries to community clinicians via our statewide Clinical Data Exchange Gateway.
20. WebOMR offers a single point of entry to all other systems including all inpatient functions such as POE. Single signon enables clinicians to logon once to use all functions.
21. Webomr and Provider Order Entry share and exchange data such as immunizations, allergies, and health care proxies. Our Perioperative Management System (OR Management) also shares allergies and health care proxies.
22. Our Cardiology systems consolidation (planned for 2009-2010) will include full webOMR viewing of cardiology data.
23. Inpatient documentation including Medication reconciliation, History and Physicals, Sign out/Team Census enhancements, and Discharge enhancements planned for 2009-2010 will ensure that all inpatient documentation is available electronically in webOMR.
24. Our transplant record will soon be fully integrated into webOMR.
25. Our Perioperative Information Management system offers fully integrated OR workflow management. We are planning PACU documentation and its integration with ICU documentation now.
26. Our ED Dashboard offers fully integrated viewing of all outpatient and inpatient data about the patient in one place.
27. Our Oncology Management System offers fully integrated Heme/ONC Management incorporating all lab, medication and history data for both outpatient and inpatient settings in webOMR.
28. Our Radiology Portal offers fully integrated radiology services supporting key clinical, quality assurance, and administrative workflow functions in one place.
29. Antibiotic precautions (e.g., MRSA, VRE) are displayed in our admitting and outpatient scheduling systems.
30. Perioperative Information Management, Appointment Scheduling and Admitting all share information to eliminate duplicate data entry as an operating room case and its associated admission and pre-admission testing appointment are scheduled.
31. Appointment scheduling integration with webOMR includes printing the webOMR med list at appointment check in, prompts for diabetic ophthalmology screens in scheduling, reports of missing and unsigned notes within scheduling, and webOMR display of recent and future appointments.
32. Admitting/Emergency Department integration with webOMR includes displaying recent Emergency Department and inpatient visits.
33. Registration integration with webOMR includes displaying insurance summary, Medicare D coverage, and medication assistance counseling alerts to assist with prescribing in webOMR.
34. Our physician billing application, ETicket, automatically adds patients to providers’ censuses for operating room cases after patients are entered into the Perioperative Information Management System.
35. ETicket automatically adds patients to providers’ censuses after encounters in Labor and Delivery Triage and OB/Gyn triage.
36. All our electronic payer transactions for eligibility, referral/authorization, claims, and remittance are fully integrated into our scheduling, registration and billing systems.
37. Our capacity Dashboard integrates real time utilization data from admitting, Emergency Department, Provider Order Entry, Cath Lab, Transfer Log, and ICU Callout systems.
38. Our Diabetes Registry incorporates data from BIDMC and Joslin to facilitate comprehensive, integrated reporting on diabetic care.
39. We maintain data marts for a broad range of clinical systems, including laboratory, microbiology, blood bank, radiology, inpatient pharmacy, outpatient medications, OR, demographics, admitting and scheduling. These support data mining, research, and quality/performance measurement within and across systems.
40. A web page that tracks the health of all this integration runs 7x24 in our command center, refreshing itself every minute. In addition to monitoring many different infrastructure level parameters, this caretaker also checks on various applications, including integration engine feeds, Medquist (dictation) transactions, and various Emergency Department functions.

From the list above, I hope I've illustrated that interoperability between systems at BIDMC includes human readable data exchange, semantic interoperability for decision support, and process interoperability, linking workflow among various systems.

With this degree of data integration, we're approaching the tipping point which will enable us to transition from a hybrid paper/electronic record to a fully electronic record, and we have a retreat this Summer to plan for that transition and adjust our governance as needed.

Will we ever be done with 100% integration of every element of data from purchased and built systems? No, but the journey moves fast enough to keep the patients safe and the clinicians satisfied.

Monday, April 28, 2008

A Field Trip to Dell

From 4/15 through 4/17 my BIDMC team visited the Dell facilities in the Austin area for an executive briefing on several areas of their operations and futures. Today's blog is about their lessons learned.

Data Center Tour
They toured the main Dell data center in Round Rock. This was one of Dell's two Tier III data centers, reduced from the dispersed 16 data center model they used from 2001-2003. Cooling and power were left the way that they had been designed at the time of the acquisition. The system delivers cool air from the ceiling and from the floor. They acknowledged that it was not optimal and are making plans for a re-vamp. More on the plans under the “Data Center Engineering Lab” section below.

The site is a completely "lights out" facility housing over 7,500 servers and associated storage. They use remote control power strips to control the power cycling of systems. They initially used APC Power Distribution Units, but saw a market opportunity and designed their own switchable PDU product. The site was manned by a data center manager and security guards.

They have a very stringent access control policy for employees. For an employee to gain access to perform work, there must be an active, approved ticket in their tracking system. The employee swipes their badge for access, and the guard, using the swiped data, is presented with any tracking tickets assigned to that employee. There must be an approved ticket matching the time window for which they are attempting access; if there is no match, they do not gain access.
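The ticket-matching rule is simple enough to express directly. This is a sketch in Python of the check as described; the field names are my own invention, not Dell's actual tracking system.

```python
from datetime import datetime

def may_enter(badge_id, swipe_time, tickets):
    """Grant access only if an approved ticket for this employee covers
    the time of the badge swipe."""
    for t in tickets:
        if (t["employee"] == badge_id
                and t["approved"]
                and t["start"] <= swipe_time <= t["end"]):
            return True
    return False

# Illustrative ticket: approved work window on the afternoon of the visit
tickets = [{
    "employee": "E123",
    "approved": True,
    "start": datetime(2008, 4, 16, 9, 0),
    "end": datetime(2008, 4, 16, 17, 0),
}]
print(may_enter("E123", datetime(2008, 4, 16, 10, 30), tickets))  # True
print(may_enter("E123", datetime(2008, 4, 16, 22, 0), tickets))   # False
```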

They use an automated build procedure for all of their servers with a standard base image. A key component of that build is their Dell Open Management software agent on all servers.

The Open Management software provides an inventory of physical equipment and running software. This is used for their inventory control and for ensuring disaster recovery currency. In addition to the software based inventory, they perform a physical inventory every 3 to 4 months.

Lessons learned
• BIDMC does not fully exploit the value of the HP Insight manager/agent we own. Like the Dell management software, it can deliver server power consumption graphs, server temperature data and disaster recovery data on the services and applications running.
• BIDMC would benefit from a more tightly controlled process for hardware deployment in which data is recorded in a database to be compared against the Insight reports and network switch data for cross inventory validation.

Manufacturing Tour
They toured the manufacturing plant for servers and gaming systems. The manufacturing process is tightly integrated with the ordering system and is a just in time inventory model. The parts for a system arrive by truck 2 hours prior to being required. The product used for any particular system build is determined by the order placed.

The quality control infrastructure is extensive. Once a machine is built, it is burned in twice: once briefly at the technician assembly area, where it is confirmed that the system powers up and the requested BIOS and any preloaded software images are applied. It then passes on to a diagnostic rack where it is hooked up and specific diagnostics are run on all of the hardware.

Quality control is also applied to all of the cosmetics, from the positioning of the tags on cards and the case to the appearance of the assembled product. If even an external inventory or FCC tag is misaligned, the system will be rejected.

They use an interesting system to rank each technician, team, group and plant. Each one of these has its own metrics, and the metrics roll up. The more complex the system being built, the more points awarded. The sooner a quality problem is located, the fewer points deducted; the later a problem is detected, the more points are deducted progressively down the chain. This gives a high incentive to catch a problem as early as possible. If a final Q/A engineer catches the problem, the whole plant takes a quality hit.
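The incentive structure can be illustrated with a small scoring sketch. The weights below are entirely invented - Dell did not share their actual formula - but they show how a late-caught defect hurts far more than an early-caught one.

```python
# Hypothetical penalty weights: the later the catch, the bigger the hit
STAGE_PENALTY = {"assembly": 1, "diagnostics": 3, "final_qa": 10}

def technician_score(builds):
    """Award points per build complexity; deduct more for defects
    caught later in the chain."""
    score = 0
    for b in builds:
        score += b["complexity"]  # more complex build, more points
        for defect_stage in b["defects"]:
            score -= STAGE_PENALTY[defect_stage]
    return score

builds = [
    {"complexity": 5, "defects": []},
    {"complexity": 8, "defects": ["assembly"]},   # caught early: -1
    {"complexity": 8, "defects": ["final_qa"]},   # caught late: -10
]
print(technician_score(builds))  # 5 + (8 - 1) + (8 - 10) = 10
```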

Lessons Learned
• Dell offers the ability to have all servers ordered pre-imaged with an image the customer supplies and controls. This would seem to be a big time saver for BIDMC.
• The pre-loading of the image would also be beneficial for BIDMC workstations. A base image could be flashed with a script that would run on initial boot up. The script could prompt for the type of system this was to be, point back to a second script, and the system initialization would be completed. In that model, the system could be unpacked and moved to the user's desk; on connecting the system, the tech would answer the initial boot question and walk away.
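The first-boot flow above amounts to a lookup from the role the tech selects to a role-specific second-stage script. A minimal sketch, with role names and script paths invented for illustration:

```python
# Hypothetical mapping from system role to its second-stage setup script
ROLE_SCRIPTS = {
    "clinical_workstation": "deploy/scripts/clinical.cmd",
    "office_workstation": "deploy/scripts/office.cmd",
    "kiosk": "deploy/scripts/kiosk.cmd",
}

def second_stage_script(role):
    """Return the follow-on script for the role the tech selects at first boot."""
    if role not in ROLE_SCRIPTS:
        raise ValueError("unknown system type: " + role)
    return ROLE_SCRIPTS[role]

print(second_stage_script("kiosk"))  # deploy/scripts/kiosk.cmd
```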

Data Center Engineering Lab
They met with a representative from the Data Center Engineering Lab to talk about the future of data center design - in particular, cooling.

Dell maintains a 1500 sq ft data center in which to test different cooling technologies and techniques. The racks in the test areas are filled with shells of servers that are basically large toasters. They can be controlled to generate heat to simulate data center load.

The most promising cooling technique is hot aisle containment. There are a number of methods to contain and provide cooling, and this is the method they will be using in their own data center. Their studies have shown the technique provides a smaller footprint and more efficient, targeted cooling, which allows the data center ambient temperature to be raised. It also reduces the power used for cooling and allows for higher power/utilization of the servers, thus getting the power supply efficiencies that come with higher utilization.

Dell Servers/Blades
Dell is about to begin shipping their next generation blade server technology. Their chassis holds 16 blade servers or combinations of blades and storage as needed. The design objectives and features are similar to other vendors. They claim a 7 to 10 year life cycle for the chassis, the ability to mix and match different generations of blades that might be issued during the chassis life and fully redundant power and embedded switches. Dell also stresses a green theme, highlighting dynamic fan and power supply efficiencies. They have built in some temperature and load sensing to auto adjust the power consumption and shutdown some components.

From a needs perspective, Dell stated they see the blade server market as solving three problems: footprint, cabling, and power.

Lessons Learned
• The packaging and engineering of the blade servers is well done. The granularity of control for power and components is a nice touch.
• Dell ranks their applications by server need. Based on the applications needs the application is placed in VM, on a blade, or on a standalone server. This requires upfront assessment of an application before deployment.

• Dell is working to complete their line of services and products to meet enterprise needs. They have made a significant improvement/advancement in their blade servers. They are positioning themselves as leaders in data center design relative to power/cooling.

• Dell has a systems management software package, Open Management. This provides full management services for their own and third party products. The interesting features were the recording, graphing, and reporting of each server's power consumption and operating temperature. Limits/ranges can be set to trigger alerts.

• There is an opportunity through the use of addressable power strips to use some logic/intelligence to turn off systems when they are not required. The best opportunity is in the evenings to possibly turn off portions of clusters or systems that do not need to be operational during the evening.

A great visit and many lessons learned about data center management in general. Thanks to Dell for their time.

Friday, April 25, 2008

Cool Technology of the Week

This week I led the Spring IT Town meeting at Harvard Medical School and one of the issues we discussed was the difficulty of protecting intellectual property on hard drives we're replacing. How does a drive containing sensitive clinical, financial or research data get sanitized when EMC or other storage vendor replaces a drive that has gone bad?

I found a great overview of the approaches to the data sanitization problem for hard drives.

I also found an article from NIST which lists every type of media (e.g. PDAs, tape, fax, servers, network devices, and more) and the preferred method for sanitizing them.

These articles referred to "secure erase" for ATA drives. I've not heard of this before, but the University of California at San Diego's Center for Magnetic Recording Research describes the protocol in detail. Apparently, it is available in most ATA drives manufactured in the past several years. They also have a free software utility that accesses the secure erase feature on ATA drives. One CIO describes her search for commercial products to address this issue. A commercial appliance is available from Ensconce Data Technology which incorporates the same data sanitization technology.
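On Linux, the secure erase feature is typically driven with hdparm: set a temporary ATA security password, then issue the erase command. The sketch below only builds the command lines rather than running them, since the real thing is destructive and requires the drive's security state not to be frozen; the device path and password are placeholders.

```python
def secure_erase_commands(device, password="temp"):
    """Build the two-step hdparm sequence for an ATA secure erase:
    set a user security password, then erase with that password."""
    set_pass = ["hdparm", "--user-master", "u",
                "--security-set-pass", password, device]
    erase = ["hdparm", "--user-master", "u",
             "--security-erase", password, device]
    return [set_pass, erase]

# Print the sequence for review before anyone runs it for real
for cmd in secure_erase_commands("/dev/sdX"):
    print(" ".join(cmd))
```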

My cool technology of the week is the secure erase feature on ATA drives, which solves many of our problems for sanitizing data before we return drives to manufacturers.

Given the visibility of identity theft and a focus on securing data in the enterprise, I hope to inventory our disposal methods and the techniques being used to sanitize each type of media. Among the items of interest would be cell phones, Blackberry devices, problem disks replaced by EMC, tape, USB drives, and others. I'm seeking to add staff to our security and disaster recovery areas at HMS and this task along with user education/communication will be high on their agenda.

Thursday, April 24, 2008

Go Climb a Rock

It's Thursday, so it's time for a personal introspection blog.

At times I get quizzical looks for being vegan, playing the Japanese flute, or wearing black. However, the most unusual looks from my peers occur when they see photos like this one of Dark Shadows on Mescalito North in Red Rocks, Nevada.

Winter mountaineering is a good way to get away from your cell phone (the battery life is 2 minutes at -40F), but why climb a rock?

Think of climbing a mountain as a giant Rubik's cube - a wonderful mental exercise. Climbing requires a well orchestrated combination of gear, route finding, movement, and teamwork to make it to the top.

Here's how it's done.

1. Make the approach - some climbs are a 5 minute walk from the road and some are 10 miles through bush, over stream crossings, and up thousands of feet. The Grand Teton is usually a day of hiking followed by a 2am departure which puts you at the base of the technical portion of the summit pyramid at sunrise. The climb to the top takes until 8am or so, enabling a retreat before the usual afternoon thunderstorms hit the summit. Many Yosemite approaches, such as Half Dome, require hours of tricky route finding on poorly marked trails before the climbing begins. Of course, you must carry everything with you - climbing shoes, harness, rope, food, water, and extra clothing during the approach.

2. Prepare to climb - At the base of the climb, you put on your harness, unwind your rope, put on your climbing shoes, and do safety checks with your partner. Safe climbing requires at least two people - one to climb and one to control the rope, catching the leader if a fall occurs. Once gear and knots are checked, climbing can begin.

3. Lead the pitch - The leader uses a combination of footholds, handholds, cracks, and friction to climb up the rock face. Every 10 feet or so, the leader places a piece of protection into a crack and clips the rope to that protection. Modern climbing is "clean climbing" that does not involve pitons or any gear that damages the rock. Spring loaded cams, metal nuts/stoppers, and slings around natural features such as trees provide temporary protection that support the rope so your climbing partner can catch a fall.

If you consider the physics of leading, it's where the bulk of the risk of climbing occurs. If you place a piece of protection every 10 feet, you'll fall 20 feet if you slip off the rock, since 10 feet of rope to the last piece of protection means you'll fall 10 feet below that piece before you stop. A 170 pound climber falling 20 feet generates a lot of force on the climber's body, the rope and the person controlling the rope.
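To put a rough number on that, here's a back-of-the-envelope calculation of the energy such a fall delivers, ignoring rope stretch and belay friction (both of which matter a great deal in practice).

```python
def fall_energy_joules(weight_lb, fall_ft):
    """Potential energy released by a fall: m * g * h, after
    converting pounds to kilograms and feet to meters."""
    kg = weight_lb * 0.4536
    meters = fall_ft * 0.3048
    return kg * 9.81 * meters

gap_ft = 10
fall_ft = 2 * gap_ft  # fall to the last piece, then the same distance past it
print(round(fall_energy_joules(170, fall_ft)))  # roughly 4600 joules
```

That energy has to be absorbed by the dynamic rope, the protection, and the belayer - which is why climbing ropes are designed to stretch.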

Routes are divided into segments of under 200 feet in length because that is the length of the most common climbing rope.

Route difficulty is graded on a scale from 5.1 (climbing a ladder) to 5.15 (basically climbing an inverted slab of glass with butter on your hands). I lead 5.8 and follow 5.10a, which is climbing a vertical to overhanging wall with dime-sized handholds and footholds.

4. Prepare the belay - Once you've climbed the pitch, you must set up a safe and secure attachment to support your climbing partner as he or she climbs. Typically, I place 3 pieces of protection in the rock for redundancy, then use a 16 foot loop of rope or sling material called a cordelette to provide a secure attachment point for me and my climbing partner. Once I'm safe and I've prepared the rope to support my climbing partner, I give the signal to my partner to begin climbing.

5. Bring up your partner - When your partner climbs, it's called seconding the pitch, since you were the primary or lead climber. Seconding is not particularly risky because you are supported by a rope from above if you fall. Mentally it's easier, but it has the same physical challenges as leading the climb.

6. Repeat - My climbing partner and I do swinging leads, alternating leading and seconding, so we each take half the risk. We do this until we're at the top of the mountain. Some major climbs in Yosemite can be 15 pitches or more, which can take half a day to climb. We do not typically do "big walls" which require you to sleep on a portaledge strapped onto the face of the mountain. We'd rather have a very long climbing day and then descend before nightfall.

7. Descent - In some ways, descent is the most dangerous part of the climb. Rappelling down the mountain is risky because a gear failure means death or injury. When we're climbing up, we typically do not fall - the gear is there for safety just in case. When you rappel down the mountain, the gear is all that is keeping you from falling. For this reason, we typically walk off most climbs, down hiking trails rather than rappelling. We always carry headlamps and a rain jacket in our packs when doing long alpine routes, just in case we're delayed or trapped by a storm.

Great climbing areas in the Northeast are the Shawangunks in New Paltz, New York; Cathedral Ledges and Whitehorse Slabs in North Conway, New Hampshire; Franconia Notch near Lincoln, New Hampshire; and Rumney near Plymouth, New Hampshire.

In the US, places like Red Rocks, Nevada; Joshua Tree, California; and the Tetons, Wyoming are extraordinary. However, nothing quite measures up to the grandeur of Yosemite and the Eastern Sierra. My Yosemite climbing schedule this year is:

August 12 Tenaya Peak
August 13 Cathedral Peak/Eichorn Pinnacle - per my New Year's Resolutions, I'll play a Japanese Flute concert from the top of Eichorn Pinnacle if the weather holds.
August 14 Fairview Dome
August 15 North approach to Mt. Conness
August 16 Tioga Crest

Rock climbing is an incredible way to focus the mind, experience the outdoors from a new perspective, build teamwork, and work with technical gear all at the same time. As I age, the routes I'll attempt will likely get easier, but whether they are 5.5 or 5.10a, the joy of a mile of air under your feet is the same!

Wednesday, April 23, 2008

Designing the Ideal Electronic Health Record

Yesterday, I keynoted a Veterans Administration meeting via teleconference (part of my effort to reduce travel, improve my carbon footprint, and be increasingly virtual) on the topic of designing the ideal electronic health record.

I was posed a simple question - If I had infinite resources, infinite time, and no legacy compatibility issues, how would I design the electronic health record of the future?

Here's my answer:

The web is the way. Given the 24x7 nature of healthcare, the need for physicians to be in many physical locations, and the multitude of clinician computing devices, the ideal EHR should be web-based, browser neutral and run flawlessly on every operating system. I highly recommend the use of AJAX techniques to give physicians a more real time interactive experience. Client/Server may have some user interface advantages, but it's just too challenging to install thick clients on every clinician computing device. Citrix is an expensive and sometimes slow remote access solution. Native web works.

Data in medicine is stored hierarchically i.e. a patient has multiple visits with multiple labs, with multiple results. This is a tree of data with the patient as the root and the lab values as the leaves. Using a hierarchical database such as Intersystems' Cache ensures that data for clinical care is stored in this tree format and thus can be very rapidly retrieved, ensuring fast response times for clinician users. For population health, clinical research, and performance reporting, relational databases work very well. Thus, I recommend a hierarchical database for the clinical care applications and relational data marts for the research applications.
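The tree shape is easy to see with a toy example. Caché stores data in hierarchical globals; the dict-based sketch below is only an analogy, but it shows why retrieving one patient's chart is a single walk down the tree rather than a multi-table join.

```python
# Toy chart: patient -> visits -> labs (values are illustrative)
chart = {
    "patient-001": {
        "visits": {
            "2008-04-01": {"labs": {"HbA1c": 7.2, "LDL": 96}},
            "2008-04-22": {"labs": {"HbA1c": 6.9}},
        }
    }
}

def latest_lab(chart, patient, test):
    """Walk the tree from the patient root down to the most recent
    value of a given lab test."""
    for date in sorted(chart[patient]["visits"], reverse=True):
        labs = chart[patient]["visits"][date]["labs"]
        if test in labs:
            return date, labs[test]
    return None

print(latest_lab(chart, "patient-001", "HbA1c"))  # ('2008-04-22', 6.9)
```

For population queries ("all patients with HbA1c over 9"), this layout is awkward - which is exactly why the relational data marts exist alongside it.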

The ideal EHR should incorporate decision support in laboratory, medication, and radiology ordering. EHRs should include "Event Driven Medicine" alerts about critical clinical issues and patient specific reminders about preventative/wellness care. Event Driven Medicine is the transformation of data into information, knowledge and wisdom based on decision support, business rules and timely notification of clinicians.
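An Event Driven Medicine rule is essentially a condition attached to an incoming result. A minimal sketch - the rules and thresholds below are invented for illustration and are not clinical guidance:

```python
# Each rule: which test it watches, the triggering condition, the alert text
rules = [
    {"test": "K",   "check": lambda v: v > 6.0, "alert": "Critical high potassium"},
    {"test": "INR", "check": lambda v: v > 5.0, "alert": "INR above safe range"},
]

def on_new_result(test, value):
    """Fire every alert whose rule matches this incoming result."""
    return [r["alert"] for r in rules
            if r["test"] == test and r["check"](value)]

print(on_new_result("K", 6.4))  # ['Critical high potassium']
print(on_new_result("K", 4.1))  # []
```

In a production system the alert would be routed to the responsible clinician in real time, which is the "timely notification" half of the definition above.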

The EHR should include an easy to read clinical summary of all active patient problems, medications, visits, and labs and should be able to export this summary to personal health records such as Google Health, Microsoft HealthVault and Dossia.

Problem Lists
Problems should be entered via an electronic pick list of vocabulary-controlled terms using SNOMED CT. The community of caregivers - PCPs, specialists, ED physicians and hospitalists - should be able to maintain this problem list collectively, using social networking type tools. Call it Wikipedia for the patient. All caregivers should be able to associate notes and medications with entries on the problem list, making it easy to filter notes by problem and discontinue medications that are problem-specific when problems are resolved.

Medication Management features should include e-Prescribing for new medications, automatically linked to payer-specific formularies, electronic real time pre-auth/eligibility for high cost therapies, links to lifetime medication history from retail pharmacy and payer databases, and safety checking for drug/drug and drug/allergy interactions. Pharmacy initiated renewal workflow would reduce calls to the physician's office to refill medications.

Ideally, medication reconciliation features should include pre-population of the medication list based on the lifetime medication history from retail pharmacy, payer databases and personal health record applications. Using the same social networking type approach as mentioned with problem lists, all caregivers should be able to update/change/edit/comment on patient medications to keep them current. One click quick picks of commonly used medications should be available to make ordering Tylenol as easy as ordering books on Amazon.
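
A sketch of the pre-population step, assuming hypothetical source feeds: medication histories from several sources are merged and de-duplicated on the drug name, with each entry recording which sources attest to it so caregivers can judge its reliability.

```python
def prepopulate_med_list(*sources):
    """Merge (source_name, [medications]) feeds into one de-duplicated list."""
    merged = {}
    for source_name, meds in sources:
        for med in meds:
            # Keep the first spelling seen; record every attesting source.
            entry = merged.setdefault(med.lower(), {"name": med, "sources": []})
            entry["sources"].append(source_name)
    return list(merged.values())

# Hypothetical feeds from retail pharmacy and payer claims databases.
retail = ("retail pharmacy", ["Lisinopril", "Metformin"])
payer = ("payer claims", ["metformin", "Atorvastatin"])
meds = prepopulate_med_list(retail, payer)
```

A medication attested by two independent sources is a stronger candidate for the reconciled list than one appearing in a single feed.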

Allergies should be recorded by caregivers using vocabulary-controlled entries for therapeutics, foods and environmental substances. Reaction type and severity should be codified, as well as the identity of the allergy observer/documentation source, i.e. did the patient self-report that their Mom saw a rash to penicillin 30 years ago, or did an ICU nurse watch the patient suffer anaphylaxis from penicillin?

Each visit should be documented with a reason for visit (symptoms or problem), a pre-existing condition flag if the patient has had this before, a diagnosis, a list of therapies given, and the followup arranged.

Notes should be entered via structured and unstructured electronic forms. All text data should be searchable, so that physicians can easily locate old notes. Templates that are disease specific and macros that are specialty specific should be available to make documenting easier. Voice recognition for automated entry of free text should be available. Workflow for signing notes and forwarding notes to other providers should be easy to use.

Laboratory results
Laboratory results should be displayable in several ways - by date, by class of lab, by single result trended over time and in screening sheet format. Screening sheets are lists of disease specific lab results combined with decision support. For example, a diabetic screening sheet would include glucose, hemoglobin a1c, lipids, recent eye exam results, podiatry consults, and urinalysis. Alerts and reminders should be generated based on disease state, lab value, and trends.
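
The screening-sheet logic might look like the following sketch. The 7.0 HbA1c target is a commonly cited clinical goal used here as an assumption, not an actual configured rule.

```python
def diabetic_screening_alerts(hba1c_history):
    """Generate alerts from a trended lab.

    hba1c_history: list of (date, value) tuples, oldest first.
    """
    alerts = []
    if hba1c_history:
        latest = hba1c_history[-1][1]
        if latest > 7.0:
            alerts.append("HbA1c above 7.0 target")
        # Trend rule: alert when the most recent value is worse than the prior one.
        if len(hba1c_history) >= 2 and latest > hba1c_history[-2][1]:
            alerts.append("HbA1c trending upward")
    return alerts
```

A full screening sheet would run similar value-and-trend checks over every element of the panel (glucose, lipids, eye exam recency, and so on).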

As results are delivered, especially critical results, clinicians should electronically sign an acknowledgement of notification, ensuring that appropriate next steps are taken for patient care.

Radiology results
As mentioned in my recent blog on image management, all "ologies" should be stored in one place in the EHR and all should be viewable with a single electronic viewer. Radiology, Cardiology, GI, Pulmonology, Echo, Vascular, and Gynecology images should be easily viewable and these images should be managed according to business rules i.e. retained as required for medical record compliance, archived when no longer relevant etc.

Electronic ordering should include medications, Oncology Management, Laboratories, Radiology and general care (i.e. ordering home care supplies, wheelchairs etc). Orders should automatically be routed to the department and staff responsible for executing them.

Health Information Exchange
The EHR should be able to retrieve medication lists and clinical summaries from outside institutions as part of local/regional healthcare information exchange. EHRs should be able to send data to personal health records and receive patient entered data, especially telemetry data from home devices like glucometers, from personal health records.

Data Marts
Every night, data from the EHR should be exported to data marts for appropriate use with IRB approval for clinical trials, clinical research, population health analysis, performance measurement, and quality improvement.

At BIDMC, we're continuously improving our systems and we're well on the road to achieving much of this functionality. Of course, we'll never be done because the goal of the ultimate electronic health record is a continuously evolving target.

Tuesday, April 22, 2008

Decision Support for Inpatient Systems

I've written about the 10 projects that keep me up at night and published my clinical systems mid year strategic update. In addition, as a physician CIO, I tend to have a few projects that receive more attention and guidance from me than others.

The three areas that are my passion and focus in 2008-2010 are Clinical Documentation, Decision Support, and an Integrated Electronic Record throughout the community. I wrote about Clinical Documentation last week. The Integrated Electronic Record is next week. I've covered Outpatient Decision Support earlier, so this week is an overview of all our Inpatient Decision Support efforts at Beth Israel Deaconess. You'll see that we have thousands of decision support rules, updated regularly, which are fully integrated into our inpatient systems.

Drug interactions and alerts
a) Drug duplicate warnings - Identifies duplicates using chemical component information that is included with First Data Bank (FDB), the commercial drug database underlying our pharmacy and provider order entry systems. Component information is updated via monthly FDB updates.
b) Drug-drug interactions - Identifies interactions using chemical components and rules included with the FDB database. Displays only the classes of alerts identified as clinically significant by Pharmacy, to ensure clinicians are not overwhelmed with minor interactions. Rules are updated via monthly FDB updates. Clinical significance filters are maintained in a dictionary by Pharmacy.
c) Drug-allergy interactions - Identifies interactions using chemical components included with the FDB database. Components are updated via monthly FDB updates.
d) Drug Substitutions – Alerts the ordering provider with suggested substitutions for therapeutic equivalents on the inpatient formulary, which is maintained by Pharmacy.
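
A simplified sketch of component-based interaction checking with a clinical-significance filter, in the spirit of the FDB-driven rules above; the drug pairs and severity classes here are invented for illustration.

```python
# Hypothetical interaction knowledge base keyed on unordered drug pairs.
INTERACTIONS = {
    frozenset(["warfarin", "aspirin"]): "major",
    frozenset(["simvastatin", "amlodipine"]): "moderate",
    frozenset(["calcium", "iron"]): "minor",
}

# Only the classes Pharmacy deems clinically significant are displayed,
# so clinicians are not overwhelmed with minor interactions.
SIGNIFICANT = {"major", "moderate"}

def check_interactions(active_meds, new_drug):
    """Return significant interactions between a new order and active meds."""
    hits = []
    for med in active_meds:
        severity = INTERACTIONS.get(frozenset([med, new_drug]))
        if severity in SIGNIFICANT:
            hits.append((med, new_drug, severity))
    return hits
```

The significance filter is the key design choice: it is maintained separately from the knowledge base, so Pharmacy can tune alert volume without touching the monthly vendor updates.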

Default dosing
There are currently 1,545 entries in the Pharmacy Provider Order Entry dictionary with default dosing schemas defined. These are maintained by Pharmacy.

Renal Dosing
Targeted drugs, rules and default dosing are maintained in a dictionary by Pharmacy. There are currently 48 renally dosed drugs.
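
As an illustration of renal dosing logic, the sketch below estimates creatinine clearance with the standard Cockcroft-Gault formula and consults a hypothetical drug dictionary; the drugs and thresholds shown are assumptions, not the actual Pharmacy dictionary.

```python
def creatinine_clearance(age, weight_kg, scr_mg_dl, female=False):
    """Cockcroft-Gault estimate of creatinine clearance (mL/min)."""
    crcl = ((140 - age) * weight_kg) / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

# Hypothetical dictionary: drug -> CrCl threshold below which
# the ordering screen should suggest a dose adjustment.
RENAL_RULES = {"levofloxacin": 50, "enoxaparin": 30}

def needs_renal_adjustment(drug, crcl):
    threshold = RENAL_RULES.get(drug)
    return threshold is not None and crcl < threshold
```

At order time the system computes the patient's estimated clearance from the latest creatinine and flags targeted drugs whose threshold is not met.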

Diagnosis or Indication based dosing
Target drugs, rules and default dosing are maintained in a dictionary by Pharmacy, for medications such as ciprofloxacin and chemotherapy agents like Cytoxan used for non-oncological indications.

Geriatric alerts
Beers criteria are used to alert physicians to age-specific medication issues. Information is updated via monthly FDB updates.

Heparin, insulin and enoxaparin
Ordering of these high risk medications is done via automated protocols and guidelines to promote best practice dosing. These are updated as best practices change.

Information Displayed on Medication Ordering Screens
a) Screens display the patient’s most recent relevant labs specific to the drug being ordered. The relationship between drug and lab(s) to display is maintained in a dictionary by the Pharmacy.
b) Informational text about ordering parameters for the drug is displayed. These are maintained in a dictionary by the Pharmacy. A link to Micromedex is also available on the ordering screen.

Nurse instructions for drug handling
Custom instructions are displayed on Medication Administration Record labels. These are maintained by the Pharmacy.

Information Displayed on Lab Ordering Screen
a) Messages appear during ordering that inform clinicians about best practices such as guidance for ordering heparin dependent antibodies. These are maintained in a dictionary by the Lab.
b) Links appear to the online lab manual for each test ordered. Thousands of lab tests are documented including all the details on how the specimen is collected and processed. The Lab updates this manual regularly.

Blood bank
Specialized screens appear to provide pertinent clinical information when blood products are ordered.

Total Parenteral Nutrition
Guidelines, relevant labs and prior day’s TPN order appear automatically.

Consequent Orders
a) IV access flush ordering - POE automatically generates the correct flush order when documenting IV status
b) Blood products - POE gives current information about active specimens in the blood bank and can generate a type and screen if necessary
c) Mechanical ventilation - POE automatically suggests mouth care orders after ventilation is ordered

Quick access to reference/informational screens
Specific links to guidelines/web sites appear on many POE screens. For example, a link to the current weaning protocol is available directly on the mechanical ventilation order screen.

Quick access to patient specific data in other clinical systems
Specific links to data in other information systems appear on many Provider Order Entry screens. For example, a link to display transfusion restrictions available in the Blood Bank system appears on the blood product ordering screen.

Alerts and reminders for items such as pneumovax and influenza
Providers are prompted at appropriate points over the course of a hospital admission to order and document outcomes.

Order sets
"Packages" of orders that can be rapidly entered as a group are available to all clinicians. These are updated as clinical practice changes. There are currently 95 inpatient admission, postoperative, transfer and procedure order sets (5 megabyte download) and 195 outpatient chemotherapy regimens.
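
Conceptually, an order set is just a named package of orders that is expanded into individual orders and routed to the departments responsible for executing them. A minimal sketch, with invented contents:

```python
# Hypothetical order set: each entry is (responsible department, order text).
ORDER_SETS = {
    "post-op hip replacement": [
        ("pharmacy", "enoxaparin 30 mg SC q12h"),
        ("lab", "CBC in AM"),
        ("nursing", "incentive spirometry q2h"),
    ],
}

def expand_order_set(name):
    """Expand one named package into routable individual orders."""
    return [{"department": dept, "order": text}
            for dept, text in ORDER_SETS[name]]
```

Because the set is maintained centrally, updating one entry updates the practice for every clinician who uses the package.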

Hydration protocol to minimize risk of iodinated contrast nephropathy
Provider Order Entry recommends specific IV fluid orders when ordering pre-procedure prep orders.

You can see that the task of providing comprehensive decision support for our inpatient systems involves thousands of rules, kept updated by pharmacy, lab, and IS staff, as guided by our various steering committees. Only through the use of this much decision support can we ensure that data is transformed into information, knowledge and wisdom.

Monday, April 21, 2008

Accelerating Electronic Health Record Adoption

Over the past 3 years, we've recognized that the use of an Electronic Healthcare Record is a prerequisite to quality, effective, and personalized healthcare. Our Medical Executive Committee changed the Medical Staff Bylaws to require use of our home-built ambulatory care record by all physicians working at BIDMC sites by July 30, 2008.

The justification we used to change the bylaws was:

*Improved documentation - legibility and accuracy
*Support for patient safety initiatives such as medication reconciliation, communicating test results and immunization recording.
*Maintenance of a consolidated, active problem list
*Anytime, anywhere availability to ALL providers including secure access
*Decision support features such as health maintenance reminders
*Supports documentation that contributes to quality patient care
*Templates available for standard documentation by specialty
*Allows for continuity of care with regard to practice/on call coverage and referral communications
*Ability to universally apply documentation improvement techniques
*Moves away from the paper chart, stored in a practice under lock and key, thereby not available for urgent care issues
*Dictation easily imported to notes/letters, etc
*Provides for easier audits vis-a-vis compliance with regulatory statutes
*Forward communication of patient’s health information to referring MDs
*Multiple providers can view/access/document in the record simultaneously - Attending/Residents/RNs

We defined "use" of an electronic medical record as 100% compliance with:

*Online Ordering
*Medication Management including e-Prescribing
*Electronic clinical documentation of visit notes
*Problem List management
*Results viewing and sign off

Our change management approach to move clinicians from paper-based workflows included:

*Intense planning and customization with each department
*Group and Individual Training
*Presentations to Faculty by Physician Super Users - interactive presentation/discussion
*On-line Web OMR Tutorial and Self Help modules
*On site support at Go-live for 2 weeks - Amb Services and I.T. staff partnership
*Refresher training as needed
*Web OMR User Group Developed - focus to improve and prioritize updates to electronic record for clinician users

Our governance model for the effort has been the webOMR user's group, a multidisciplinary committee comprised of the practices and providers that use the tool. It is the primary vehicle to ensure that webOMR meets the needs of the clinicians and to ensure good communication, education & support. Its mission is:

*To provide stewardship for setting priorities for development of the Web OMR application
*To communicate relevant information and recommendations for enhancements to the Web OMR Development team
*To improve quality utilization through communication and training for the user community

We also established the webOMR Advisory Group, a multidisciplinary team incorporating Legal and Compliance that was created as the guiding body for decision making and policy development.

We're nearing the finish line on getting 100% compliance for use of electronic health records by physicians at BIDMC sites. The metrics we've developed to monitor our users and departments illustrate the uptake of e-prescribing - nearly 30% of all prescriptions are electronically routed at this point and that number is growing every day.

As I've described in my blog over the past few weeks, our next challenge is the rollout of an EHR for community physicians at non-BIDMC sites. For convenience, the folks at Solaris Health System packaged all my blog entries about EHRs into a single handy PDF. Feel free to use it!

Friday, April 18, 2008

Cool Technology of the Week

I've written about the challenges of Spam filtering - false positives and false negatives.

Recently we've experienced a doubling of the volume of incoming Spam. It's essentially a "Spam Denial of Service attack" that is overwhelming our Spam filters. The filters have a failsafe behavior that automatically lets Spam through if the servers get overwhelmed. Leaks of Spam and the increasing challenge of providing reliable, secure, 99% spam-free email have caused us to revisit our email configuration and spam filtering products.

Our Spam filtering company, Symantec, provided onsite engineers to examine our configuration and hardware design. They suggested reconfiguration, enhancement of our CPU capacity and an upgrade to the latest software version that "learns" about common email patterns within the organization and whitelists selected traffic, relieving the burden on the spam filtering servers. Symantec also suggested replacing our software-based product with their 8300 series appliance. The appliance is better equipped to process large volumes of mail.

As a class of technologies, Spam filters include pattern recognition, Bayesian probabilistic decision making, and neural network techniques, among others. The best comparison of Spam filters I've found is a recent Infoworld article.
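
A toy version of the Bayesian technique: per-word spam probabilities, which a real filter learns from labeled mail, are combined into a single score for the message. The word probabilities below are made up for the example.

```python
import math

# Hypothetical learned probabilities: P(spam | word appears in message).
WORD_SPAMMINESS = {"viagra": 0.99, "meeting": 0.05, "free": 0.90, "lab": 0.10}

def spam_probability(words, prior=0.5):
    """Naive Bayes combination of per-word evidence, in log space
    to avoid floating point underflow on long messages."""
    log_spam = math.log(prior)
    log_ham = math.log(1 - prior)
    for w in words:
        p = WORD_SPAMMINESS.get(w)
        if p is None:
            continue  # unseen words carry no evidence in this toy model
        log_spam += math.log(p)
        log_ham += math.log(1 - p)
    return 1 / (1 + math.exp(log_ham - log_spam))
```

The false positive tradeoff discussed below lives in the threshold applied to this score: a hospital would set it high, accepting more leaked Spam so that a borderline patient email is never quarantined.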

The article illustrates the difficulty in improving our situation. The Symantec product comes out best in class, but only stops 96.4% of Spam. There were products that did better, but most had offsetting problems with false positives. Only Sendio and Proofpoint had better Spam blocking rates and no "critical" false positives. They both had much higher "bulk email" false positives than Symantec which accounted for the "best in class" rating for Symantec. The Infoworld evaluation was based on the Symantec appliance. The appliance has the same anti-spam engine as their software, but can perform additional functions e.g. better reporting, smtp-based throttling based on locally observed reputation, and others. We are testing the appliance now.

Our challenge is that as a healthcare provider, we cannot have false positives. A critical patient email, lab notification, or followup from a medical colleague must be delivered. We will accept a bit more Spam in order to have fewer false positives.

Thus, for now, we've concluded that an appropriately configured, hardware optimized Symantec configuration is our best bet. The war against Spam is a continuous battle, but for now, 96.4% filtering with very few false positives, wins the race. Hence, Symantec Anti-Spam (formerly known as Brightmail) and their 8300 series appliance is my candidate for the Cool Technology of the Week. Spam is an elusive target, so we'll continue to watch the efficacy of all the available products.

Thursday, April 17, 2008

The Tradition of Coining

Last Friday, I delivered the C. Everett Koop lecture at Dartmouth and after the lecture, Dr. Koop shook my hand and passed me a coin in his palm. The coin, pictured above, contains his official 3 Star Vice Admiral insignia as Surgeon General of the US. I thanked him for his support of healthcare IT and one of the Dartmouth professors explained the tradition of coining:

During World War I, American volunteers from all parts of the country joined newly formed flying squadrons in France. One of the lieutenants ordered bronze medallions struck. These medallions carried the squadron emblem and were given out to all squadron members.

On a flight shortly thereafter, the lieutenant's plane was downed behind German lines and he was immediately captured. The Germans took all of his personal identification except for the bronze medallion, which he wore in a small leather pouch around his neck. He was taken to a small town near the front. Bombardment was heavy that night; he escaped his captors, but without his identification. He made his way to the front lines avoiding German patrols. Eventually, he managed to find a French outpost. Unfortunately for him, the French in that area had been plagued by sabotage. The French were ready to execute him as a spy when he remembered the leather pouch containing the medallion. He showed the medallion to them and they recognized the squadron insignia on it. His medallion bought him enough time to confirm his identity. Now instead of shooting the young lieutenant, they gave him a bottle of wine.

When he was returned to his squadron and his companions heard his story, it became a tradition to ensure that all members carried their medallion at all times. To ensure that each member carried their coins, they instituted the "challenge". A challenger would ask to see your coin. If you couldn't produce your coin, you were required to buy a drink, but if you produced your coin, the challenging member was required to pay for both your drinks.

The coin is a unique way of recognizing service and building camaraderie.

Dr. Koop is an amazing guy. He's 91 years old and is still a very active academician and public servant. I really appreciated his introduction of me at Dartmouth. He said that if I had never been born, the earth would have continued to turn, however, there's a chance it may have turned a bit more slowly. I'm honored and will proudly carry his coin.

Wednesday, April 16, 2008

Management Lessons Learned As a Parent

My daughter, Lara Halamka, is 15 years old. Being a parent has taught me more about leadership and management than any of the Spencer Johnson or Peter Drucker books. I've learned patience, communication and the ability to trust.

Here are my top 10 management lessons learned from being a parent:

1. Yelling never has a positive outcome - In my life as a parent, I've raised my voice twice over the past decade and a half. My daughter can remember both times, even though they occurred in the distant past. My outbursts diminished me and had no positive impact on her behavior. In business, if I ever feel that raising my voice would win the battle, I reflect on my life as a parent and hold back, since I know that confrontation ultimately makes the situation worse. As I've said before, "save as draft".

2. Formal authority rarely works - As a CIO, I would never stand in front of group of stakeholders and say "you must do this, because I'm the CIO." Standing in front of a teenager and saying "you must do this, because I'm your father" is just as problematic. Leadership comes from thoughtful discussion, weighing pro's and con's, then ultimately arriving at a consensus. Shared decision making between parents and children based on a fair, consistent, and predictable process preserves domestic tranquility. IT governance preserves organizational tranquility in the same way.

3. Give permission to make mistakes - Wellesley and the surrounding western suburbs of Boston have had a number of teen suicides over the past few years. Parents apply such pressure to perform that many teens have irrational expectations of perfection for themselves i.e. "you can be valedictorian, captain of the squash team, and a Pulitzer Prize winner by the time you're 18". Making mistakes and learning along the way is the way we learn as children and the way we learn as leaders. In management, I find that setting limits, then offering staff the flexibility to excel on their own is far more effective than micro-management and a constant threat of management retribution.

4. Communication is key - During a teenager's development, Mom and Dad may not be perceived as cool, smart, or fun to be with, but this can change by the day. Keeping the channels of communication open as moods change is key. There will be disagreements, but it's less important to win the argument than to ensure you're still speaking when the discussion is over. The same thing is true with customers and employees - I'd rather hear from them about bad news and fix the problem than not hear anything at all.

5. Get the basics right - Why was religion invented? There are encyclopedias written about that topic, but in my opinion, religion was invented to provide a moral/behavioral framework that puts boundaries on human instincts to compete, reproduce, and survive. We do not have a religious household, but we have a moral household. As a parent, I've tried to be a living example that the nice guy can finish first, that theft and aggressive behavior are wrong, and that kindness and consensus win the day. If my examples lead my daughter to make the right choices when faced with tough decisions, then the basic moral framework we've built will be a foundation for her success. In business, setting a tone of expected behavior by being a living example of ethical, fair, and collaborative behavior spreads to staff and customers.

6. You can criticize ideas but do not criticize people - As the brain matures, sensory input is integrated with experience to produce more robust decision making. During that process there will be many experiments, trial/error, and fine tuning. If my daughter makes a decision that I do not agree with, we can debate her ideas but not her abilities. The same is true with employees and customers. I treat everyone with respect even if I do not agree with their ideas.

7. Build a joy of success rather than a fear of failure - When I was a teen and took the SATs, I had no real knowledge of their importance, I arrived a bit late, and did not stress over the outcome. The result was a scholarship, not because I was smarter than my peers, but because I did not have a fear of failure during the exam. I watch many parents link performance on every test to an admission or rejection from Harvard. Admission to an Ivy League school is equated to happiness. With Lara, we've tried to celebrate success and build a joy of achievement rather than a fear of failure. Thus far, motivation from within to do well seems much more sustainable than a fear of failure imposed by authority figures. Emphasizing growth and achievement among employees creates a higher performance organization than management by intimidation.

8. Delegate responsibility but emphasize accountability - My daughter tends to have the same sleep cycle that I do, often sleeping 4 or 5 hours a night. Her schedule is left up to her to decide, but when the 6:30am alarm sounds, she is accountable for her decision to go to bed late. Rather than enforce a bedtime and wake time, delegating her sleeping hours to her, but holding her accountable for getting to school on time, awake and alert, has enhanced her decision making. Leadership is the work of worry and it's important to learn accountability early. The more responsibility I'm given, the greater the accountability.

9. Respect innovation - As vegans, my wife, daughter and I grow beets, carrots, and turnips each year. Last year, when we picked a bucket of carrots, I recommended to my daughter that we use a sieve to wash off the dirt. She had a different idea of laying them out in the driveway and washing them off with a sprayer. I suggested that sieves have always been used and it's the "right way". Her method, although non-standard, was fast, effective, and efficient. Just because business as usual has always worked, there may be better ways. As I tell my staff - If I become the obstacle to innovation, it's time for me to move on.

10. Accept that the best lessons learned come from independence - For my daughter to develop self esteem, assertiveness, and a willingness to take acceptable risks, she needs to make decisions on her own, even if they are imperfect. If I make decisions for her, she'll be less prepared for life in college and beyond when I may not be present. I give her the best guidance that I can, hope that she develops a strong internal compass, and then let her change from within as she experiences the world. Developing the next generation of leaders in an organization requires the same approach.

I highly recommend parenting over an MBA. Parenthood teaches humility, selflessness, and self-control. No matter what I do in IT, my daughter will be my greatest legacy.

Tuesday, April 15, 2008

The Journey Towards Electronic Clinical Documentation

Over the past 3 years, we've been executing our Clinical Systems strategic plan to enhance clinical documentation, decision support, and data integration for all our care areas. Each of these deserves its own blog entry, and here's an overview of our clinical documentation efforts.

Over the past 5 years, we've used our self-built, web-based Online Medical Record, called webOMR, for ambulatory care automation. By June 30, 2008, all outpatient clinical documentation must be done in webOMR to comply with medical staff bylaws as amended by the Medical Executive Committee. For those areas which require scanning of drawings, we're implementing an outpatient documentation scanning application using Fujitsu scanners and Captiva software, beginning in May.

Emergency Department
Our ED Dashboard is the workflow tool which drives all aspects of patient tracking and results display for the department. We're adding complete electronic documentation to this dashboard by this Summer through a combination of templates/macros, structured forms and free text typing.

We've installed the Philips Compurecord Intraoperative charting application which includes automated interfaces to anesthesia machines, labs and all OR telemetry.

We've installed the iMDSoft Metavision charting application which includes automated interfaces to all ICU monitors, labs, and ventilators. We worked with iMDSoft to implement highly structured documentation which makes rounding and charting more accurate and efficient.

We're studying the right solution for the PACU now and we may implement iMDSoft there, ensuring one patient charting system is used for all perioperative management.

Ward Beds
We have a very comprehensive self-built Online Medical Record for all ambulatory encounters and we're enhancing it to support all clinical documentation on the wards. Here's the step by step implementation plan:

2007 - We implemented scanning of existing paper charts to provide a means of retiring our dependence on paper for medical record coding and historical review

2008 - By June, we'll go live with electronic History and Physicals which include a medication reconciliation function that is tightly linked to the outpatient record. The latter ensures medications will be tracked accurately as patients transition between inpatient and outpatient settings of care.

2009 - We'll expand our automated history and physicals charting applications to support daily inpatient progress notes. Once this is complete, we'll be able to integrate our self-built team census application with electronic charting to automate all signout processes. Automated signout processes will provide a means to document the responsible caregiver for the patient at all times.

2010 - Once all aspects of charting, signout, historical and physicals, operative notes, etc. are completed, we'll be able to create a highly detailed automated discharge summary. Today, we have a discharge document that is sent via the MA-Share infrastructure to the next provider of care and includes meds/problems/followup, but our next version will also incorporate all our electronic documentation features for a truly multidisciplinary continuity of care document for each patient.

By 2010, we will have reached the tipping point such that our need for paper documentation will have diminished and we can officially declare the electronic record as the official medical record. Today, we have a hybrid paper/electronic record during our transition state. Step by step we're on a logical journey toward clinical documentation and we're involving many clinicians, the HIM department and our governance committees in all our efforts.

Monday, April 14, 2008

Most Popular Educational Technologies

In 2001, Harvard Medical School went live with the Mycourses educational portal (check it out by clicking on "take a tour"), which includes content management, collaboration, and online evaluation for faculty and students.

Here's an overview of the most popular technologies in Mycourses and the reasons they've been popular.

Virtual Microscopy
Remember using a light microscope and trying to get a clear, focused image while dripping oil on the 1200x lens? Using a microscope is a different skill than learning pathology/histology, so we teach them separately. Students have a few hours of hands-on experience with lenses and oil, followed by 100 hours of learning pathology/histology via virtual microscopy - streaming, high definition, zoomable, navigable images delivered over the web using technologies from Aperio and MicroBrightField.

With Virtual Microscopy, faculty navigate tissue sections via the web and project them on an HDTV display or LCD projector, pointing out salient areas on a slide in real-time without the use of a 12 headed microscope or other expensive optical technology previously needed for group work. Faculty digitize rare slides and make them accessible to all students and faculty in a very convenient way for both education and research purposes. From the student’s point of view, slides can be reviewed 24x7 from their dorm room.

Visual Encyclopedias
The web is an ideal vehicle for delivering "new media" that goes beyond the text-based content of traditional textbooks. We've created our own specialized visuals for radiology instruction and visual diagnosis, but we've also licensed two commercial products.

VisualDx, an online visual decision support tool, was developed to assist students and physicians in pattern recognition, diagnosis, and treatment. Unlike traditional atlases or textbooks, VisualDx allows one to enter the patient’s key signs and symptoms (eg, dyspnea, abdominal pain, widespread papules), and in seconds the system generates a patient-relevant differential diagnosis.
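The general pattern - map findings to candidate diagnoses and rank by match - can be illustrated with a toy sketch. Note that the knowledge base and scoring below are invented for illustration; VisualDx's actual knowledge base and weighting are far more sophisticated:

```python
def rank_differential(findings, knowledge_base):
    """Rank candidate diagnoses by how many entered findings they match.

    `knowledge_base` maps a diagnosis name to the set of findings
    associated with it. This toy scorer just counts overlap; a real
    system weights findings by specificity and prevalence.
    """
    entered = set(findings)
    scored = []
    for diagnosis, associated in knowledge_base.items():
        overlap = entered & associated
        if overlap:
            scored.append((diagnosis, len(overlap)))
    # highest-scoring diagnoses first
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical mini knowledge base, for illustration only
kb = {
    "measles": {"fever", "widespread papules", "cough", "conjunctivitis"},
    "drug eruption": {"widespread papules", "pruritus"},
    "appendicitis": {"abdominal pain", "fever", "nausea"},
}
print(rank_differential(["fever", "widespread papules"], kb))
```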

Primal Pictures is a 3-D Online Anatomical resource with extremely detailed models of the human body that we've used in the anatomy lab by placing flat screens on mobile mounts above cadavers. The students can navigate three dimensional images, remove virtual tissue layers, and explore the relationships of structures to one another in real time while doing dissection.

Online Procedures and Simulations
Does the bevel go up or down when doing a blood gas? What are the anatomical landmarks when doing a lumbar puncture? By making Flash and streaming video procedure instruction available via the web and mobile devices, we provide our students with just-in-time instruction before they perform a procedure.

We also use Flash for highly interactive simulation/exploration of difficult-to-learn concepts. For example, we've used Flash to teach the relationship between heart sounds, PA catheter tracings, the EKG, and the pressure-volume loop. We have 200 of these simulations covering all aspects of human physiology.

Collaboration tools
We've implemented centralized shared storage for individuals and ad hoc collaborations, which enables any group to exchange files and set file read/write/delete attributes for every participant.
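The access model - explicit read/write/delete rights per participant - can be sketched in a few lines. This is an in-memory illustration of the idea only; the production system enforces these rights on networked storage:

```python
class SharedFolder:
    """Toy model of per-participant file permissions on shared storage.

    Each participant gets an explicit set of rights drawn from
    {"read", "write", "delete"}.
    """
    RIGHTS = {"read", "write", "delete"}

    def __init__(self):
        self._acl = {}

    def grant(self, participant, *rights):
        unknown = set(rights) - self.RIGHTS
        if unknown:
            raise ValueError(f"unknown rights: {unknown}")
        self._acl.setdefault(participant, set()).update(rights)

    def allowed(self, participant, right):
        # anyone not on the ACL has no rights at all
        return right in self._acl.get(participant, set())

folder = SharedFolder()
folder.grant("course_director", "read", "write", "delete")
folder.grant("student", "read")
print(folder.allowed("student", "read"), folder.allowed("student", "delete"))
```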

For real time group collaboration, we've deployed Webex and Elluminate. Although we do have video teleconferencing facilities available, we've found that audio conference calls combined with real time presentation tools are the most effective way to deliver real time educational materials to collaborative groups.

Streaming videos and podcasting
Our most popular application is streaming video of recorded lectures, with over 60,000 views each year. We also podcast all our lectures so that students can replay them on their iPods. We use Apreso for combining lecture slides and videos, Streamsage for full text indexing of spoken words, and Real Server/Player for routine video delivery. One of the most popular features is Enounce Time Scale Modification of Audio, which enables videos to be watched at twice normal speed with frequency correction so that voices sound normal. Students can watch 8 hours of lectures in 4 hours! It is true that attendance at lectures has diminished since we made streaming video available, but attendance at interactive sessions such as tutorials has stayed the same.

Our next generation of portal will include more social networking features, more opportunities for collaboration among the medical school and hospital affiliates, and support for classrooms of the future which incorporate more real time video collaboration and resource sharing.

It's very clear that the web is empowering entirely new ways to deliver educational material and the way we teach must evolve.

Friday, April 11, 2008

Cool Technology of the Week

I travel about 400,000 miles a year.

I can tolerate the late departures and arrivals, the surly airline staff, and the sardine-like seating arrangements, but the unpredictability of the security screening process is a nightmare. Sometimes I arrive at an airport and jog through the security line in minutes. Many times I arrive to find a security line longer than a football field with an hour long wait, causing me to miss my flight.

Given that I'm a trustworthy traveler who only carries a toothbrush and an extra pair of socks, should I wait in the same line as the once-a-year traveler with a bag full of liquids/gels, a giant carry-on suitcase, and a stroller?

The notion of a "Registered traveler", who is trustworthy and carries non-repudiatable identity credentials makes a great deal of sense.

Clear has implemented a fast pass for airport security with a process and a smart card. It's the Cool Technology of the Week. Clear members are pre-screened via a government approval process and carry an identity card which allows them to access designated airport security "fast lanes" nationwide. In my experience at Orlando, Dulles, Reagan, and San Francisco, Clear members pass through airport security faster, with more predictability.

The smart card contains basic demographic data - name and address - but also biometric data including a photograph, height, fingerprints, and iris scans.

Enrollment is a two step process - an online application and in person identity verification.

The identity verification is completed at a Clear enrollment station (airports supporting the technology), where a Clear staffer verifies two government issued IDs, takes your picture, captures your iris and fingerprint scan, then submits everything to the government for clearance.

Clear's identity theft policy is well thought out and minimizes the risk to the Clear members if their database or card technology is compromised.

The price is $100 per year plus the TSA vetting fee of $28.

I plan on completing my Clear enrollment on my next flight to Washington DC in May. Once I have the card, I can bypass security lines and go directly to baggage screening.

The number of airports supporting Clear is growing, but support for the Washington DC and San Francisco airports alone makes it worthwhile for me, since I pass through them dozens of times per year. Making the airport experience a little more predictable is about the best way I can improve my mental health in 2008, so Clear is definitely a cool technology.

Thursday, April 10, 2008

The Cosmopolitan Dating Test

Today's blog is about that fine academic journal of all things health and relationships, Cosmopolitan. I do not read Cosmo, but I know several people who do.

I was recently told about groundbreaking Cosmopolitan research that identified the "4 Types of Men You Never Want to Date". I have to publicly admit that I have failed the "Cosmopolitan Dating Test".

Let's take a look at the 4 personality types that Cosmo has declared to be losers:

1. The Adrenaline Junkie - You definitely want to stay away from rock climbers, alpinists, and ice climbers because they will spend so much time on their outdoor adventures that there will never be quiet time for a bowl of popcorn and "Sleepless in Seattle". They'll be planning their next adventure, coiling their ropes, and checking their gear lists. Next thing you'll know they'll want to climb every mountain in New Hampshire.

2. Nice Guy with a Chip on His Shoulder - I am a nice guy, but alas, I have a Chip in my shoulder containing all my medical records. In addition, the Cosmo researchers warn against the guy with stylized dressing habits, definitely ruling out my black Nehru jacket, black shirt, and vegan shoes. Stylish dressers spend so much time thinking about ways to accessorize that they'll never have time for moonlit walks on the beach.

3. Smooth Operator - The guy with the polished anecdotes about life as a CIO, leadership lessons learned, and spellbinding tales of project management will never have time to whisper sweet nothings in your ear.

4. Workaholic Hotshot - Definitely be wary of the guy with an 80 hour work week who has multiple jobs and doesn't sleep much. He'll be so attached to his Blackberry that there will never be a romantic moment away from a keyboard.

My wife and I have been together for 28 years, through sickness and health, Windows and Mac OS, residency and network outages, so I think it will last. When I explained what a loser I am according to Cosmopolitan, her response was that she would never want to date me, just marry me. Aw shucks...

Wednesday, April 9, 2008

The Challenges of a Software Legacy

Every year as I prioritize new application development, I remind my governance committees that 80% of my staff resources are devoted to keeping existing systems stable, secure, and error free. These staff maintain infrastructure and add incremental improvements to support compliance with new rules, new standards, and new workflow requirements.

As I reflect on Microsoft's current challenges with Vista, I sympathize with their dilemma. On the one hand, the user community expects each upgrade to offer bold new features and innovation. On the other, users expect all their Windows 98, NT, 2000, and XP software to work flawlessly.

The amount of engineering required to ensure this backward compatibility is enormous and in large part explains the difficulty the Microsoft Operating System Development group has in releasing something that is boldly new.

Industry analysts point to the speed of innovation of Google or the fact that Canonical's Ubuntu is rapidly converging on the Windows feature set. Both Google and Canonical have the advantage of little legacy compatibility support.

I've experienced this same burden of a software legacy several times.

At Harvard Medical School, we introduced Mycourses as the educational portal in 2001. Each year through Mycourses, we deliver 60,000 streaming videos, thousands of documents, and hundreds of simulations. Per my recent post about Educational Technology priorities, there is a desire to enhance usability and add many new features. The challenge is that we need to maintain existing features while innovating. This is like changing the wings on a 747 while it's flying. A perfect example is our Surveybuilder and Testbuilder. These web-based applications in Mycourses evolved over years based on hundreds of user feature requests. At this point, adding new features will likely break old ones. Our best approach is to rewrite them from scratch, based on a streamlined set of user requirements. Thus, we'll evolve the existing Mycourses into a new portal framework, then rewrite the applications over time. This will be evolution rather than revolution. Some people will comment that our pace of innovation is slower in 2008 than it was in 2001. That is the reality of an existing legacy of highly functional software.

At BIDMC, we launched our intranet in 1998. At that point in web history, an intranet was just a list of links, not a highly interactive Web 2.0 infrastructure supporting blogs, wikis, forums, and new media. In 2007 we introduced a new intranet based on many modern features such as single sign-on, user customization, support for RSS feeds, and SharePoint features. Today, of our 5000 users, 4000 use the old portal and 1000 use the new portal. Our user survey indicated that the average user just does not want to change. Learning a new portal is more effort than is justified by the new features. In 2008, we're relaunching the portal again, making the look and feel more similar to the old portal while supporting many new collaboration and content management features. In 1998, launching the intranet was simple because there was nothing to compare it with. In 2008, it's very hard because we have to support the legacy of highly functional old portal features while moving forward.

In both cases, we're leveraging our governance processes to build top down and bottom up support for change. We hope that by creating an urgency to change, a vision for the future and a guiding coalition, we'll be able to overcome the burden of our software legacies.

Thus, Vista may have its warts, but I understand the struggle Microsoft faces.

Tuesday, April 8, 2008

Electronic Health Records for Non-owned doctors - Support

This is my tenth entry about providing electronic health records for non-owned doctors. The previous entries have described the efforts to go from vision to live implementation. The subject of this post is support after go-live and ongoing operational funding. As with my post about implementation funding, I've asked all the implementers of EHR projects in Massachusetts to comment on their plans.

At BIDMC, we'll provide a central help desk (Concordant), outsourced desktop/network support (Concordant), and ongoing application support (internal staff, Mass eHealth Collaborative staff and eClinicalWorks). Clinicians will pay a fixed monthly rate for this service. We'll centrally contract for all these services, so the cost will be as low as possible. BIDMC may pay for the ongoing operation of the centrally hosted eClinicalWorks system (i.e. rent in the co-location data center, server support staff) and this is still under discussion.

Caritas is evaluating its strategy for ongoing support. They are considering the possibility of reassigning members of the implementation team to support as implementation is completed. They have not yet identified a specific funding model for support, but are considering an approach similar to BIDMC's.

Children's will provide a model similar to BIDMC's. The help desk function and first-tier application support will be outsourced to a third party vendor (The Ergonomic Group), which will escalate to eClinicalWorks as necessary. Ergonomic will also manage and support network operations at each of the practice sites. Children's will support the central hosting site hardware and infrastructure, as well as all network operations inside the core data center. Clinicians will pay a fixed monthly rate for this service.

Mt. Auburn Hospital/MACIPA
Mt. Auburn/MACIPA will provide a central help desk and ongoing hardware/application support. They are currently retraining clinicians to help them increase their utilization of the product, given that during the initial training there is only so much a physician can absorb. They also intend to hold classes at the IPA periodically. Post-go-live financial support is still being discussed.

New England Baptist Hospital
NEBH will provide an outsourced help desk, ongoing hosting, and application support. Clinicians will pay for non-Meditech interfaces, software maintenance, and connectivity/support to billing companies.

Partners will follow the same model as BIDMC, with clinicians funding ongoing support services.

Winchester Hospital
Community physicians will fund ongoing software and hardware support. The team at Highland Management (a joint venture between the hospital and the IPA) will provide guidance in the development of templates and the use of the system for reporting to meet P4P goals and clinical integration. Winchester IT will also be involved in the development of interfaces and the transfer of patient data for care delivery.

This post marks the conclusion of my first series about electronic health records for non-owned physicians.

Today, the BIDMC Finance Committee approved our pilots, so we'll be moving forward with all the plans I've outlined. This is a major milestone for our project that enables all our contracts, service level agreements and spending to progress.

My next series about this topic will start in July as our pilots go live. I'm sure there will be many more lessons learned to share including comments on budgets, practice workflow transformation and loss of productivity. I hope these first 10 posts about planning the project have been useful to you!

Monday, April 7, 2008

Tamperproof prescriptions

On May 27, 2007, Congress enacted legislation requiring the use of tamper-resistant prescription pads. This primarily affects patients covered by state Medicaid programs.

Following several meetings internally, with state agencies, and with professional organizations, BIDMC elected to use tamper-resistant paper stock in the printers which produce computer-generated prescriptions (e-prescribing, in which orders are routed electronically to the pharmacy, is exempt from the regulation). The new stock will be used for all prescriptions, regardless of payer/insurer, and contains the features proposed by MassHealth as "standards" for Tamper-Resistant Prescription Pads:

It has a greenish hue.

It is perforated (twice) so that up to three prescriptions can be printed on one sheet.

When photocopied, the word "VOID" will appear in multiple locations.

The backside has Rx icons printed in thermochromic ink. This means the icon will disappear when rubbed with your finger and then reappear when you stop.

We're implementing this in two phases. From April 1, 2008 to October 1, 2008, we can still use plain paper, but we've modified prescription printing as permitted by the new regulation:

“Quantity Border and Fill (for computer generated prescriptions on paper only), i.e. Quantities are surrounded by special characters such as an asterisk to prevent alteration; e.g. QTY **50** and Value may also be expressed as text, e.g. (Fifty), (optional).”

“Refill Border and Fill (for computer generated prescriptions on paper only), i.e. Refill quantities are surrounded by special characters such as an asterisk to prevent alteration; e.g. QTY **5** and Value may also be expressed as text, e.g. (FIVE), (optional).”
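The border-and-fill rule above is straightforward to implement in a prescription-printing system. Here's a minimal sketch (the function names and the 0-99 word range are my own illustration, not BIDMC's actual code):

```python
# Number words sufficient for typical prescription quantities (0-99)
ONES = ["zero", "one", "two", "three", "four", "five", "six", "seven",
        "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
        "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
TENS = ["", "", "twenty", "thirty", "forty", "fifty", "sixty", "seventy",
        "eighty", "ninety"]

def qty_words(n):
    """Spell out a quantity from 0-99."""
    if not 0 <= n <= 99:
        raise ValueError("quantity out of supported range")
    if n < 20:
        return ONES[n]
    word = TENS[n // 10]
    return word if n % 10 == 0 else f"{word}-{ONES[n % 10]}"

def tamper_resistant_qty(n):
    """Format a quantity per the border-and-fill rule: asterisk borders
    plus the optional spelled-out value, so an altered digit no longer
    matches the text."""
    return f"**{n}** ({qty_words(n).capitalize()})"

print(tamper_resistant_qty(50))  # -> **50** (Fifty)
```

Surrounding the digits with asterisks prevents prepending or appending digits, and the spelled-out value means a single altered digit contradicts the text.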

Here is a link to an example of our new printed prescriptions. This is live now.

We're also providing a Notice to Dispensing Pharmacies which we're attaching to prescriptions.

By October 1, 2008, we'll replace all plain paper with the tamperproof paper as required by the regulation.

This has been an effort requiring coordination among IT, providers, administration, pharmacies, and government. It's been quite complex and we're hopeful that our planning, communication, and phased implementation effort will be successful.

Friday, April 4, 2008

Cool Technology of the Week

One of the challenges of being a CIO is the "application is slow, can you fix it" phone call. Generally, the network is blamed first, but there are many layers that all need to be examined - desktop, network, server, storage, database, active directory, internet service provider etc. For example, a complaint about email slowness can be caused by a multitude of factors.

We recently worked with our electronic health record infrastructure partner, Concordant, to do an end to end application performance analysis.

The tools they employed were:

WhatsUp Gold for Network and Server Monitoring
Windows Performance Monitor for Server and Client Monitoring
OPNET Ace for End to End Network traffic analysis
Computer Associates eHealth for Network Monitoring

The general approach they used covered three domains. They began by identifying and defining the problems from a user perspective. This helped distinguish issues related to system performance from non-technical issues - e.g., training gaps or improper use of the application - that amplified the technical problems and affected user perception of performance. They used multiple subject matter experts, one focused on each domain, to ensure they had the in-depth knowledge to evaluate each of them.

The three investigation domains and key focus areas within each domain were:

1. End User
End user observation & interviews
Client device performance analysis
Device configuration & log review
Device specification analysis per application vendor recommendations

2. Network
WAN link utilization
Device performance analysis
Device configuration & log review
Packet loss & latency analysis
Traffic analysis

3. Server and Storage
Server and storage performance analysis
Device configuration & log review
Service and process performance analysis
Device specification analysis per application vendor recommendations

The findings from the assessment did not identify a single "magic bullet" cause, but instead multiple smaller issues that combined to degrade system performance.
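The "many small issues" finding is easy to see once you break a transaction's response time into per-layer contributions. A sketch with hypothetical numbers (the layers and milliseconds below are invented for illustration):

```python
def latency_breakdown(layer_ms):
    """Given per-layer latency contributions in milliseconds, return the
    total and each layer's percentage share, largest contributor first."""
    total = sum(layer_ms.values())
    shares = [(layer, ms, 100 * ms / total) for layer, ms in layer_ms.items()]
    return total, sorted(shares, key=lambda t: t[1], reverse=True)

# Hypothetical measurements - no single layer dominates, but together
# they add up to a sluggish 2.6-second screen refresh
measured = {
    "desktop rendering": 600,
    "network round trips": 550,
    "server CPU wait": 500,
    "database queries": 450,
    "storage I/O": 500,
}
total, shares = latency_breakdown(measured)
print(f"total {total} ms; worst layer is only {shares[0][2]:.0f}% of the time")
```

When the worst layer accounts for less than a quarter of the delay, fixing any one thing barely helps - which is exactly why the comprehensive, multi-domain approach above is needed.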

In my experience of troubleshooting complex IT systems, I've found that the comprehensive approach outlined above works very well.

If I had to choose one simple approach to determine the cause of application performance issues, I would:

1. Check that the desktop, the server, and the database host all have their network cards set to auto-negotiate, since performance problems are often caused by duplex mismatches

2. Install OPNET agents on the client and server. More often than not, OPNET rapidly identifies root causes of application performance issues.
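Step 1 above - hunting for duplex mismatches - can be automated once you've collected each interface's negotiated settings (for example, from ethtool output or switch configurations). A sketch of the check itself, with a hypothetical inventory:

```python
def duplex_mismatches(links):
    """Flag link pairs whose negotiation/speed/duplex settings disagree.

    `links` maps a link name to a pair of (autoneg, speed_mbps, duplex)
    tuples, one per side. A classic failure mode: one side forced to
    full duplex while the other autonegotiates and falls back to half.
    """
    problems = []
    for name, (side_a, side_b) in links.items():
        a_auto, a_speed, a_duplex = side_a
        b_auto, b_speed, b_duplex = side_b
        if a_auto != b_auto or a_speed != b_speed or a_duplex != b_duplex:
            problems.append(name)
    return problems

# Hypothetical inventory: each side is (autoneg, speed in Mbps, duplex)
inventory = {
    "desktop<->edge switch": ((True, 100, "full"), (True, 100, "full")),
    "server<->core switch": ((False, 100, "full"), (True, 100, "half")),
}
print(duplex_mismatches(inventory))
```

A mismatched pair still passes traffic, which is why it is so often missed: the link works, but every burst of packets generates collisions and retransmissions that users experience as "the application is slow."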

Based on my positive experience with OPNET, including in this particular project, I'm naming OPNET as the cool technology of the week. Now I can respond to the "application is slow" question with an OPNET answer.