Friday, October 31, 2008

A Tale of Two Cities

Normally, this would be a Thursday blog entry, but because I wanted to devote two days to the ICD-10 question, I deferred my more personal weekly blog entry to today.

Post-Labor Day always brings an immense amount of travel to conferences, Board meetings, and Washington policy events. This week has definitely been planes, trains, and automobiles, with craziness such as breakfast in Boston, lunch in Washington, and dinner in New York, followed by dinner in San Francisco, followed by breakfast in Boston.

Although I do not usually frequent restaurants, I had a unique opportunity this week - to eat at the best vegetarian restaurant in New York City on Tuesday and the best vegetarian restaurant in San Francisco on Wednesday. Here are my brief reviews:

Candle 79 154 East 79th Street New York, NY 10021

Candle 79 is an acclaimed organic vegetarian restaurant specializing in fresh vegetable, tofu, seitan, and tempeh dishes. When we arrived at 8:30pm, the restaurant was packed with a line out the door. It's great to see New Yorkers flock to vegetarian and vegan food. I ordered two specials:

Lentil, Arugula, Wild Mushroom, and Harvest Vegetable salad
Pumpkin seed coated seitan with harvest vegetables and chipotle/black bean sauces

The presentation of the food was remarkable. The salad was extremely fresh, and a great blend of different textures and tastes. The spicy arugula was definitely a contrast to the typical steakhouse iceberg and tomato salad.

The Seitan was heavenly. A perfect blend of sauces, fresh vegetables, and a coating of pumpkin seeds.

The menu was filled with so many choices that I could easily eat at Candle 79 for a month without repeating myself.

Greens Fort Mason Center, San Francisco, CA 94123

When I arrived at 6:00pm, a local CSA was sorting fresh organic vegetables just outside the entrance to Greens. Definitely good karma.

I sat by the window overlooking the San Francisco Marina, watching the homebound traffic cross the Golden Gate Bridge.

Greens is affiliated with Green Gulch Farm/Zen Center and I started off with a great organic salad of fresh picked Green Gulch baby lettuces, heirloom apples, and pomegranate. Superb flavor and texture.

For my entree I had a squash and Japanese eggplant lasagne - not your typical Italian dish. This was a beautifully baked round stack of vegetables wrapped in a pasta covering. Tender broccolini and root vegetables were served as sides.

Service and ambiance were great.

How do I compare the two experiences?

Candle 79 was like New York itself - a bit more edgy, energetic, and filled with folks in their 20s. Vegan dishes filled the menu - no doubt that New Yorker Rory Freedman's cookbooks have popularized vegan cuisine.

Greens was filled with folks in their 50s, engaged in calm conversation, as you might expect in a Zen-inspired restaurant. Most dishes have some artisan cheese in them, although Greens is happy to accommodate vegan diners with special pastas and by leaving ingredients out of their freshly prepared dishes.

Service in both spots was excellent. Vegetarian/Vegan servers are always friendly.

I enjoyed both immensely and look forward to returning to these spots as my travels take me from coast to coast.

Thursday, October 30, 2008

ICD9, ICD10 and SNOMED, a guest blog

Knowing that yesterday's blog about ICD-10 would raise questions about how HITSP will incorporate ICD-10 into its future work products, as well as the role of SNOMED versus ICD-10 as a clinical vocabulary, I asked one of our HITSP Technical Committee Co-chairs to give me his perspective. Jamie Ferguson leads standards efforts for Kaiser, and he gave me his permission to publish his personal thoughts on ICD-9, ICD-10 and SNOMED, which provide helpful background about the issues. Jamie wrote:

"On the point that SNOMED and ICD-10 have the potential to be competing vocabularies, Kaiser Permanente's position is that ICD-10-CM/-PCS and SNOMED-CT are not competing vocabularies. As CMS noted in its NPRM, ICD-10 is a hierarchical classification system for billing and administrative purposes, whereas SNOMED is a knowledge-based ontology for clinical documentation and clinical decision-making purposes. SNOMED-CT is recommended for the US private sector and mandated for federal agencies for clinical documentation purposes as a result of its selection in the CHI initiative of OMB's e-Gov program. A one-way authoritative mapping from SNOMED-CT clinical documentation to the US ICD-10 billing codes is a current project and has been requested to be made official before the ICD-10 compliance date, i.e. published by an authoritative source such as NLM and cited by CMS, in comments on the NPRM from several multi-stakeholder organizations. Also, perhaps most importantly, for clinically-relevant analysis and clinical decision making the value of inference based operations such as subsumptive queries should not be underestimated. Therefore, KP would advise HITSP to make use of an authoritative mapping of clinical documentation (SNOMED) to administrative classification (ICD-10) when it is available and to include it as needed for its scenario solutions.

A further note on this was submitted to HHS by Kaiser Permanente as part of our public comments on the NPRM:

According to CMS, the benefits of transitioning to the ICD-10 code sets will include: 1) more accurate payments for new procedures; 2) fewer rejected or improper claims, i.e., fewer supplemental information requests will be required to support the medical necessity of claims and the details of procedures to be reimbursed; and 3) better understanding of new procedures. These improvements are aimed at providing more consistency in how conditions and treatment are captured for billing purposes because of the increased level of detail and specificity in the new code sets and are appropriately within the scope of the ICD-10 code sets adoption (73 FR 49821-23).

However, CMS has also identified potential benefits that go beyond the primary administrative purposes of the new ICD-10 code sets. These include improved disease management by payers, and better understanding of health conditions and health care outcomes. More specifically, CMS envisions using the ICD-10 codes within claims data to support a broad array of population health research and quality initiatives, such as: outcomes analysis, quality assessments, bio-surveillance, chronic care management, registries (including immunization registries), etc. (73 FR 49821, 49823-25). For these purposes, SNOMED CT is preferred because it is a knowledge-based ontology capable of inferences and subsumptive queries (e.g., "find all disorders that include 'kidney'") whereas ICD is not, which among other reasons makes SNOMED CT the coding system of choice for clinical documentation. Moreover, SNOMED CT codes can also be used for billing and reporting purposes without being inappropriately manipulated as CMS suggests (73 FR 49803). In fact, inference-based query systems render them more useful than billing classification systems for these reporting and analytical purposes related to population health and quality programs.

There is a distinction between coding for billing, which is relatively simple and coding for clinical documentation and decision-making, which is complex by comparison. As a result, we developed our clinical systems based on the SNOMED CT clinical terminology, which was specifically designed to support clinical decision-making and interoperability. For these reasons, we strongly support the continued use of SNOMED CT for these clinically relevant purposes and urge CMS to refrain from mandating the exclusive use of ICD-10 code sets for clinical purposes. We also note that using ICD-10 code sets for some of the cited purposes may conflict with the existing recognized federal standard, such as coding for the chief complaint or reason for visit. Those standards require SNOMED-CT in certain EHR and personal health record (“PHR”) systems, for electronic laboratory reporting, and for bio-surveillance, including public health reporting.

CMS also mentions that adoption will lead to harmonization of disease monitoring and reporting worldwide. However, these ICD-10 code sets are uniquely U.S. versions, so whether such harmonization is achievable is questionable.

HHS and OMB recommend adoption of SNOMED CT as the preferred coding system for clinical documentation in the Consolidated Health Informatics initiative within OMB’s e-Gov program, and SNOMED CT is the required clinical terminology standard in certain recognized HITSP Interoperability Specifications such as IS-01. "

Thus a strategy of using SNOMED CT for clinical observations such as problem lists while using ICD-10 for billing makes a great deal of sense. I imagine that the National Library of Medicine and vendors will develop products that will help turn SNOMED clinical documentation into accurate ICD-10 billing codes to streamline workflow and ensure appropriate coding.
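To make that workflow concrete, here is a minimal Python sketch of how such a one-way crosswalk might be applied to a problem list. The mapping entries and the "REVIEW" fallback are illustrative placeholders only; the authoritative map would come from NLM/CMS when published, not from a hand-built dictionary like this.

```python
# Hypothetical sketch: applying a one-way SNOMED CT -> ICD-10-CM crosswalk
# to a problem list. The map entries below are illustrative placeholders,
# not an authoritative mapping.

SNOMED_TO_ICD10 = {
    "44054006": "E11.9",   # type 2 diabetes mellitus (illustrative entry)
    "38341003": "I10",     # hypertensive disorder (illustrative entry)
}

def to_billing_codes(problem_list):
    """Translate SNOMED CT concept IDs on a problem list into ICD-10-CM codes."""
    billing = []
    for concept_id in problem_list:
        code = SNOMED_TO_ICD10.get(concept_id)
        if code is None:
            # No map entry: flag for a human coder rather than guessing.
            billing.append((concept_id, "REVIEW"))
        else:
            billing.append((concept_id, code))
    return billing

if __name__ == "__main__":
    print(to_billing_codes(["44054006", "38341003", "99999999"]))
```

The key design point is the fallback: anything that does not map cleanly should route to a coder, not be forced into a billing code.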

Wednesday, October 29, 2008

The Transition to ICD-10

I'm always enthusiastic about the adoption of new standards that enhance semantic interoperability. The use of modern vocabulary standards such as ICD-10 improves administrative efficiency, enhances the ability of decision support systems to enforce guidelines, and enables a more granular reimbursement process.

The Centers for Medicare and Medicaid Services (CMS) circulated two Notices of Proposed Rulemaking (NPRMs) on August 22, 2008 that require adoption of new standards for claims submission (X12 5010) and coding (ICD-10):

The 5010 Proposed Rule - Health Insurance Reform; Modifications to the Health Insurance Portability and Accountability Act (HIPAA) Electronic Transaction Standards; Proposed Rule (73 Fed. Reg. 49742)

The ICD-10 Proposed Rule - HIPAA Administrative Simplification: Modification to Medical Data Code Set Standards To Adopt ICD-10-CM and ICD-10-PCS; Proposed Rule (73 Fed. Reg. 49706)

The first step in the transition to ICD-10 is the upgrade of the electronic transaction standard (administrative data communications between payers and providers) from version 4010 to 5010. Once this upgrade is complete, the work on ICD-10 can begin. In the CMS NPRM, the deadline for 5010 implementation is April 1, 2010 and the deadline for ICD-10 is October 1, 2011.

As much as I support ICD-10, I also know that the change management effort to upgrade systems and train personnel will be huge. The Association of American Medical Colleges (AAMC) summarized the issues in a comment letter that was submitted to Secretary Leavitt.

Recently, the New England Health EDI Network (NEHEN), representing the payers and providers of Eastern Massachusetts, wrote comment letters to Secretary Leavitt recommending a longer transition timeline. By consensus, Massachusetts stakeholders recommended a 5010 implementation date of April 1, 2012 and an ICD-10 implementation date of April 1, 2015.

Here are some of the issues NEHEN identified:

HHS expects that HIPAA 5010 and ICD-10 will run as concurrent projects. The supply of experienced and skilled resources to complete work on both efforts is limited. The accelerated implementation of both of these projects would create significant competition for scarce business and technical resources as well as project funding. This is not a recommended approach, as it is a high-risk, high-cost implementation strategy.

The costs of implementing this change are both technological and operational. For example, there must be modifications to existing training curricula as well as claim submission and payment policies to ensure no adverse impact on the revenue cycle. I anticipate a real challenge to train, recruit, and retain ICD-10 savvy coders.

NEHEN also identified several unanswered questions:

When can covered entities expect to receive a complete mapping of ICD-9 to ICD-10 codes, both diagnosis and procedure?

What is the exact timeline for payers to be able to accept both ICD-9 and ICD-10 versus ICD-10 only? How does this impact the response transactions?

Should the code set used be validated based on the date of transmission? The date of service or discharge date? The payment date?

Are paper claim submissions required to use the ICD-10 code sets? If so, what is the timeline for this conversion and acceptance?

How does the change from the ICD-9 to the ICD-10 impact the other code sets used in the transactions (HCPCS, CPT-4, NDC, etc)?
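The code set validation question above (which date determines ICD-9 versus ICD-10) shows how concrete these details get. Here's a minimal Python sketch, assuming purely for illustration that the date of service drives the decision and that the NPRM's proposed October 1, 2011 compliance date is the cutover; this is one possible interpretation, not CMS guidance.

```python
from datetime import date

# Hypothetical sketch of one answer to the validation question above:
# choose the code set from the date of service, with the proposed
# October 1, 2011 compliance date as the cutover.
ICD10_CUTOVER = date(2011, 10, 1)

def code_set_for(date_of_service: date) -> str:
    """Return which diagnosis/procedure code set a claim should use."""
    return "ICD-10" if date_of_service >= ICD10_CUTOVER else "ICD-9"

print(code_set_for(date(2011, 9, 30)))  # ICD-9
print(code_set_for(date(2011, 10, 1)))  # ICD-10
```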

ICD-10 is a needed change to replace the 30-year-old ICD-9 coding vocabulary. As with any change, we need the time and resources to bring the people, processes, and systems to a future state while minimizing the risks of business disruption. I hope CMS will revise its implementation deadlines based on all the comments from healthcare stakeholders so we can align the scope, timing, and resources needed to do the project right.

Tuesday, October 28, 2008

Removing Complexity

"Fools ignore complexity. Pragmatists suffer it. Geniuses remove it."
Alan Perlis (a designer of ALGOL, one of the first programming languages)

Whenever I purchase something for myself or my home, I always think about the complexity that the purchase will add to my life. Adding more stuff to my life can lead to short term gratification, but it also can lead to long term maintenance headaches.

The same can be said of information technology. Here are a few examples:

1. A few years ago, I had dinner with Steve Ballmer and explained that Microsoft should produce secure, reliable products with fewer features and lower cost. Who really wants their outline reformatted by the Outline Wizard in Word? Who really wants to apply the latest emergency patch that's required because of too much code supporting too many seldom used features? He explained that I was mistaken since most people use 95% of the features in Office and the average user prioritizes new features over everything else. We agreed to disagree and he returned to Redmond to manage the creation of Vista.

2. At BIDMC, we buy and build software. Every time we buy a commercial product, we need to think about interfaces from our existing systems to the new product and from the new product to our existing systems. All those interfaces add significant complexity, make recovery from downtime more difficult, and increase the cost of support. Recently, a clinician commented that one of our new software purchases really surprised her, since it added complexity, fractured workflow, and inconvenienced many users for the benefit of a few.

3. When we build software, we are often tempted to add all the bells and whistles requested by the user. For each new custom feature there is a cost of maintenance, additional training, and potential bugs that could compromise stability/reliability. I've been involved in many development projects that eventually became so complex that the software had to be rewritten to ensure usability, security and maintainability.

4. Customizing commercial packages seems like a good idea to get the buy in of stakeholders. Over my past decade as a CIO, I've found that stakeholders come and go, and when they leave, all the esoteric customizations they designed are often retired. In fact, many upgrade projects include the retirement of all the previous customizations that became an impediment to life cycle management of software, added complexity, and over the long term were more hassle than benefit.

5. Best of breed seems like a good idea when you're comparing products based on narrowly focused requirements. We did that with our email system i.e. Exchange for general email functions, Brightmail for spam protection, McAfee for virus protection, Tumbleweed for secure email transmission, SendMail for SMTP gateways etc. The end result was a feature rich system that has been too challenging to maintain and debug. Our next purchase will be an appliance from a single vendor which consolidates Spam filtering and security into a single product.

In short, complexity is generally not a good thing. What am I doing to battle complexity?

I try to use the fewest number of vendors possible - one (or at most two) storage vendors, one desktop vendor, one network vendor, and a very few application vendors. The more vendors, the greater the integration effort, the increased support and maintenance burden and the higher the cost.

I aim to avoid customizing commercial software whenever possible. My experience is that customizations are rarely worth the investment. Once customizations are in place and the users really understand the implications to workflow, cost, and impediments to future upgrades, they are no longer so enthusiastic about them.

I use enterprise-wide generalizable tools whenever possible i.e. one content management system for the web, one means of authentication/single signon, one ERP system for all fiscal/administrative functions.

How are we seeing this "removing complexity" idea play out in the industry?

People are adopting Gmail, Google Apps, and Facebook as "good enough" productivity tools.

People are adopting commodity hardware, clustered together using basic Linux operating systems, instead of proprietary niche solutions.

People are using Software as a Service offerings with thin client computers running nothing more than a browser. Even Microsoft has embraced the new reality of cloud computing, demonstrating a willingness to eliminate the complexity of its current operating system and application environment.

In the world of IT, simplicity is often more reliable, more secure, and more usable. Whenever I'm tempted to add complexity to address the needs of a few customers, I remind myself that Less is More. Per the Alan Perlis quote above, we should all strive to be geniuses!

Monday, October 27, 2008

The Return on Investment of EHRs

An experienced clinician recently emailed me, lamenting that Electronic Health Records do not have a return on investment i.e. doctors buy them, lose productivity, and do not get paid incrementally to justify the acquisition and use of EHRs.

This is indeed a complex issue. My answer to this clinician is below. I thought you'd find it interesting:

"The challenge is how to calculate Return on Investment i.e. who spends and who gets the return.

The literature suggests that e-prescribing reduces costs for pharmacies, payers, and providers (by reducing the burden on administrative staff to process renewals).

Decision support in an EHR results in better coordinated, appropriate, and less redundant care. However, it may be that the clinician pays for the decision support but the payer benefits from it through reduced claims.

To me, the Healthcare system (payers, providers, patients, employers, labs, pharmacies) achieves a substantial ROI through the use of EHRs which keep patients healthy and thus reduce costs.

To make the equation work, payers, hospitals, pharmacies and other beneficiaries of savings have to gainshare, i.e., share the ROI with the providers. At BIDMC, I'm paying 85% of the implementation costs of EHRs for community physicians to better align incentives: doctors do the work, the healthcare system benefits from care coordination, and hospitals, as one of those beneficiaries, can subsidize costs.

Hopefully Medicare will start subsidizing EHRs so that the ROI is better aligned as I suggested in my letter to the President.

Soon, we'll have the outcome from the analysis of the Massachusetts eHealth Collaborative Project, the $50 million BCBS pilot to implement electronic health records in 3 cities. I anticipate that it will conclude that EHRs should be implemented to control costs; however, the economics of how to pay for EHRs and encourage their ongoing use may require a novel scheme to 'redistribute the wealth'."

Friday, October 24, 2008

Cool Technology of the Week

At BIDMC and other CareGroup hospitals, auditing is a critical component of HIPAA compliance and ensuring patient privacy. We currently have 1 billion rows of audit data from 146 mission-critical clinical applications. Our comprehensive audits of every clinical lookup yield 300,000-500,000 transactions per day. HIPAA requires an audit system to record who is looking up what, where and why. We need to keep these audit logs for 20 years.
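As a rough sanity check on those volumes, here is a back-of-the-envelope projection in Python. The 400,000 transactions per day is simply the midpoint of the range above, and the 500 bytes per row is purely an assumption for illustration, not a measurement of our schema.

```python
# Rough projection of audit volume over the 20-year retention period.
# 400,000/day is the midpoint of the range cited above; 500 bytes/row
# is an assumed average row size for illustration only.
rows_per_day = 400_000
years = 20
bytes_per_row = 500

total_rows = rows_per_day * 365 * years
total_bytes = total_rows * bytes_per_row

print(f"~{total_rows / 1e9:.1f} billion rows")             # ~2.9 billion rows
print(f"~{total_bytes / 1e12:.1f} TB before compression")   # ~1.5 TB
```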

The graphic above describes the unique approach we've taken with Microsoft SQL Server 2008 Enterprise Edition to implement a federated audit system that consolidates all our audit logs from multiple SQL Servers and non-SQL sources into one place. We use a SQL Server Integration Services (SSIS) package that runs every 15 minutes to fetch the audit files and upload the data to the central SQL audit repository, capturing:

i. Server level: all login, logout, and failed login events, plus server configuration changes

ii. Database level: Create/Alter/Drop db events

iii. Object level: Create/Alter/Drop object events

iv. Data level: Insert/Update/delete and select events (we didn’t enable Select events in phase I)

Then, we use SQL Reports to query and view the audited data (i.e., who made a change, who modified a table, who inserted/updated/deleted a record).

Our next step is to process all audit data with SQL Server Analysis Services, create cubes to analyze the collected data, and build reports and alerts based on thresholds (e.g., on average there are 10,000 logins per day, and an alert will be raised if we exceed that threshold).
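To illustrate the kind of threshold alert we have in mind, here is a minimal Python sketch against the central repository. The table name, column names, and DSN are assumptions for illustration, not our actual schema.

```python
import datetime
import pyodbc

# Hypothetical sketch of the threshold alert described above. "AuditLog",
# "EventTime", "EventType", and the DSN are placeholders, not real names.
LOGIN_THRESHOLD = 10_000  # average daily logins cited above

def daily_login_count(cursor, day):
    cursor.execute(
        "SELECT COUNT(*) FROM AuditLog "
        "WHERE EventType = 'LOGIN' AND CAST(EventTime AS DATE) = ?",
        day,
    )
    return cursor.fetchone()[0]

def check_login_threshold(day):
    conn = pyodbc.connect("DSN=CentralAuditRepository")  # assumed DSN
    try:
        count = daily_login_count(conn.cursor(), day)
        if count > LOGIN_THRESHOLD:
            print(f"ALERT: {count} logins on {day} exceeds {LOGIN_THRESHOLD}")
    finally:
        conn.close()

# Example usage:
# check_login_threshold(datetime.date(2008, 10, 24))
```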

Microsoft will soon release a Compliance SDK on security and auditing based on its collaboration with BIDMC's SQL team. The SDK will be available for download so that other companies can use our auditing solution as a model.

Creating an enterprise tool for consolidated storage, reporting and alerting of all application audit data - that's cool!

Thursday, October 23, 2008

Dear Mr. President

Technology Review, a great publication from MIT, asked me to write a letter to the incoming President, summarizing the healthcare IT agenda for the next administration. Here's what I wrote:

Dear Mr. President:

As you know, the United States is spending 16 percent of our gross domestic product on health care, a percentage that is likely to rise. That might be reasonable if we were getting correspondingly high quality, but we're not. While we have some of the best individual-care facilities in the world, our system does not rank well against other industrialized nations on basic health measures.

Health-care information tech­nology is one of the major tools the United States can use to constrain cost increases and enhance quality. To date, the U.S. has adopted electronic health records (EHRs) at a much lower rate than most other industrialized nations, including Germany, Canada, the United Kingdom, and Australia. The U.S. spends 43 cents per capita on health-care IT, compared with $193 per capita in the U.K.

Incentives to introduce EHRs and a compelling business case for continuing to use them are crucial to getting the technology adopted on a wide scale. In the outpatient setting, implementing a system of EHRs that providers can easily share costs those providers $40,000 to $60,000. Yet most of the benefits go to payers and purchasers--often the U.S. government. To fix the misalignment, the government should offer incentives directly to providers.

We need to be careful, though, about what actions the government takes. A recent Congressional Budget Office report concluded that imposing penalties for failing to adopt health IT would be more cost effective than providing financial incentives. Primary-care physicians in the U.S. are already struggling with high costs and low reimbursement. Asking them to comply with another unfunded mandate based on penalties rather than incentives won't solve the problem, because it doesn't acknowledge the underlying economic misalignment that has discouraged adoption in the first place. The result won't be more EHRs; it will be fewer medical students choosing primary-care careers, which will fuel even greater increases in health-care costs.

I recommend a three-point plan for your administration:

(1) Provide incentives through Medicare for the adoption and use of EHRs. Target these incentives so that cost savings are shared with clinicians.

(2) Encourage insurers to provide incentives for hospitals to adopt CPOE (computerized physician order entry). This technology, which lets physicians communicate treatment instructions electronically, is the most important tool hospitals can introduce to improve their safety, quality, and efficiency of care.

(3) Continue to provide federal funding for technology and policies that encourage interoperability between health-care providers.

If we coordinate the care of all Americans and ensure that every person has a lifetime electronic record, we will enjoy safer care at a reasonable price.

Wednesday, October 22, 2008

Signature projects for 2009

I've written about the projects that will keep me awake at night and those which are part of 2009 operating plans.

As we approach 2009, there are 10 projects that I consider high risk and high gain. I call them my "signature" projects. These projects have the potential to radically transform the way we work at Harvard, CareGroup, and throughout Massachusetts. Although many are nascent ideas, assigned to my most senior direct reports with only partially defined scope and vision, I am committing my time and reputation to their success:

1. High Performance Computing for the Northeast - By creating a large, community-based 3000 CPU core resource with flexible federated storage options and software licensed at low cost via economies of scale, we can accelerate collaboration among Harvard schools, Boston University, Northeastern, and the numerous industry-based researchers in our region. We've kicked off the project and are now assembling a governance committee to guide the effort.

2. Social Networking for research - By bringing together the community to identify collaborators, mentors, and resources via social networking, we can accelerate innovation. We've launched Catalyst Profiles but now need to enhance it with novel tools that provide additional value to stakeholders such as an eBay like marketplace for exchange of tools, talent and technologies among our faculty. Our success will be measured by the number of users embracing our social networking tools.

3. A hosting center for Electronic Health Records supporting all the doctors in our community - We're already live with our first sites and our success will be measured by our ability to implement electronic health records for every member of the BIDMC community by the end of 2010.

4. A suite of healthcare information exchange solutions - It's clear to me that Healthcare Information Exchange is not a one size fits all project. We've had success in Massachusetts by creating centralized repositories for cities (Newburyport, Brockton, North Adams) and also creating secure peer to peer summary exchange via the internet. We're working on a suite of solutions in 2009 including linking together all private practices within a physician's organization, linking together hospitals, and linking together cities. Each of these solutions requires a different set of tools and a different balance of central/local data stores.

5. Eliminating paper with automated clinical documentation - Although BIDMC is largely electronic in all ambulatory areas, inpatient progress notes are still handwritten. Our goal is to be 85% electronic throughout the hospital by 2010.

6. Supporting virtual teams and flexible work arrangements - As the economy slows, commuting and travel expenses become more painful. Also, there is pressure to reduce the overhead costs of space, parking and utilities. Flexible work arrangements and virtual teams can enhance productivity, reduce costs, and increase employee satisfaction. Thus I will continue to roll out infrastructure and policies supporting virtual teams in 2009. The metric for success is the number of mobile workers we support.

7. Implementing iPod Touch/iPhone 3G and Amazon Kindle applications for enhanced mobile computing - Clinicians are mobile knowledge workers and their productivity depends upon being connected to the applications they need regardless of their location. Although we're already supporting Blackberry, iPhone and wireless connectivity to our applications, I hope to enhance our mobile application support, likely through an initial pilot of an iPhone compatible Emergency Department workflow tool.

8. Deploy a suite of web-based tools which support business process automation - 2009 is all about workflow. Rather than deploy a variety of niche applications supporting administrative needs at Harvard Medical School, I'd rather implement an integrated set of collaboration tools which support the workflow of committee meetings, faculty selection processes, record keeping, and information sharing. I'm working on a requirements analysis for this suite of tools over the next 60 days.

9. Create a cloud of life cycle managed storage for enterprise image management - At Beth Israel Deaconess Medical Center and Harvard Medical School, the demand for tiered storage - fast/high availability/expensive, medium performance/mostly available/$1 per gigabyte, and low performance/less available/really cheap - continues to escalate. Over the next 6 months, we'll study many products from many vendors so that we can offer storage solutions approaching a petabyte at a cost the community can afford.

10. Working with HITSP to embrace a Service Oriented Architecture approach to interoperability - Over the past three years, the Health Information Technology Standards Panel (HITSP) has worked on content/vocabulary and transport standards for electronic health data. In 2009, we'll work on transaction orchestration standards. HITSP has formed a working group to evaluate the value of embracing a Service Oriented Architecture (SOA) for all HITSP interoperability specifications. If this moves forward, HITSP standards are likely to become plug and play, enhancing the interoperability of healthcare data between organizations.

These 10 signature projects are where I'll spend most of my time in 2009, managing change, politics, and innovation at BIDMC, Harvard, MA-Share/NEHEN and HITSP.

Monday, October 20, 2008

The Personal Genome Project

Today at 4pm, my genome will be released publicly at the Personal Genome Project (PGP) Site.

As part of the PGP's first 10 participants, I contributed my entire medical record, phenotype, and genotype in the hope that this data will support research to enhance personalized medicine for future patients.

The first analysis of my genome reveals:

1. I carry a mutation for Hereditary Motor and Sensory Neuropathy with Optic Atrophy (HMSN VI), also known as Charcot-Marie-Tooth disease. Specifically, the base pair change is:

Chromosome 1, MFN2 (mitofusin 2 protein)
HEREDITARY MOTOR AND SENSORY NEUROPATHY VI
H T V R A K Q
Reference: CAC ACG GTC CGG GCC AAG CAG
Me: CAC ACG GTC TGG GCC AAG CAG
H T V W A K Q
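For those curious how the single C-to-T change in the fourth codon turns arginine (R) into tryptophan (W), here is a short Python sketch using the standard genetic code; only the codons that actually appear above are included.

```python
# Translate the reference and personal MFN2 codons shown above using the
# standard genetic code (only the codons that appear are listed).
CODON_TABLE = {
    "CAC": "H", "ACG": "T", "GTC": "V", "CGG": "R",
    "GCC": "A", "AAG": "K", "CAG": "Q", "TGG": "W",
}

def translate(codons):
    return "".join(CODON_TABLE[c] for c in codons.split())

reference = "CAC ACG GTC CGG GCC AAG CAG"
personal  = "CAC ACG GTC TGG GCC AAG CAG"

print(translate(reference))  # HTVRAKQ
print(translate(personal))   # HTVWAKQ (the single C->T change yields R->W)
```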

My father has had Multiple Sclerosis for 18 years, and thus my family has had many discussions about neurological disorders. HMSN typically affects patients in childhood, and thus far neither I nor anyone in my family has been directly affected.

2. I'm heterozygous for Severe Combined Immunodeficiency Disease (Boy in a Bubble Syndrome)

Other than 2 episodes of Lyme disease, I've not had any infections requiring treatment nor has my daughter.

3. I have 2.23 times the average risk for Prostate Cancer.

The papers about this particular mutation studied two simultaneous mutations and I only have one. Thus, it's unclear if a single mutation has the same risk as two.

4. I lack the Kell antigen, which could have implications for future blood transfusions. If I were ever transfused, I could develop antibodies against Kell antigens that could cause a transfusion reaction upon a second transfusion.

5. I have several mutations which put me at increased risk for Tuberculosis.

During residency, I led the TB service at Harbor-UCLA Medical Center. Several of my fellow residents developed positive PPDs, but mine never converted. Thus, after extensive exposure, I've remained PPD negative, so I appear to be doing well despite the genetic risk.

What does all this mean?

1. I will certainly be aware of any neurological or ocular findings in any family member

2. My PSA is 0.4 and my prostate exam is completely normal. I will take any changes in prostate health more seriously than before.

3. I've encouraged other members of my family to get involved as future PGP subjects. Is there any relationship between my father's MS and the mutation I carry which causes Hereditary Motor and Sensory Neuropathy? Is there any relationship between the mutation for Severe Combined Immunodeficiency Disease and my mother's Celiac disease? These and other research questions will be possible as more people, including those in my family, contribute their lifetime medical records and genomes to PGP.

I'll share all my experiences with the Personal Genome Project and the release of my genome via my blog. You'll also find a podcast on the BIDMC website.

Laptop Encryption

In my recent blog about the new Massachusetts Data Protection regulations, I described Section 17.04 subsection (5) which requires "Encryption of all personal information stored on laptops or other portable devices;" by January 1, 2009.

At BIDMC, we've researched several solutions and have chosen McAfee Endpoint Encryption (formerly SafeBoot Encryption) to ensure we comply with these new regulations.

We've done a comprehensive analysis of the application, which I encourage you to download.

In summary, the impact of encryption on disk write and read is so small that users cannot perceive any change in performance.

How will we implement the product?
Today, we have asset control software which lists all laptops received through IS Inventory Control. These records make it easy to contact customers and schedule their laptop hard disks for encryption. During that visit, we will teach them how to use the system with the encryption software installed. On average, we're experiencing a one-time 2.5 hour encryption process. This varies depending upon the speed of the processor, the amount of RAM, and the size of the hard disk. The encryption can also be removed if necessary, but it will take approximately the same amount of time to decrypt the hard disk as it took to encrypt it. Decrypting must be done by IS.
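As a rough scheduling check, the 2.5 hour figure is consistent with a modest laptop drive at typical full-disk encryption throughput. The 80 GB drive size and ~9 MB/s rate in this sketch are assumptions for illustration only.

```python
# Rough estimate of one-time encryption duration for scheduling purposes.
# The 80 GB drive and ~9 MB/s throughput are assumed values; actual times
# vary with CPU, RAM, and disk as noted above.
disk_gb = 80
throughput_mb_per_s = 9

seconds = (disk_gb * 1024) / throughput_mb_per_s
print(f"~{seconds / 3600:.1f} hours")  # ~2.5 hours
```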

What about support?
From a support perspective, McAfee Endpoint uses an enterprise control console, and if passwords are forgotten, encryption access can be reauthorized by contacting IS. We've found the support effort to be less than that of other products we've investigated recently, such as Seagate Full Disk Encryption.

What are the challenges?
Currently there is no McAfee Endpoint solution for Apple products. McAfee is working on one and hopes to release it some time next year. Since McAfee Endpoint encrypts the entire hard disk and the encryption drivers must be loaded to decrypt it, Windows virtualization solutions for Mac OS X such as VMware Fusion or Parallels will not work.

Thus, based on our research, the McAfee encryption solution addresses our requirements for protecting 1,000 laptops to ensure compliance with the new Massachusetts law by January 2009. We'll complement this software solution with education to ensure users avoid storing protected health information or other identifiable personal information on mobile devices whenever possible.

Friday, October 17, 2008

Cool Technology of the Week

A number of new mobile devices are entering the market - the Amazon Kindle 2, the Google G1, and the Blackberry Storm. Each has its own ideal uses and unique technologies. I'll test all of them by the end of the year and report the results.

I am a minimalist and carry few gadgets. My only current personal devices are a Macbook Air and a Blackberry Curve.

I've tried the iPhone 3G and although it's a remarkable device, the lack of a tactile keyboard makes it less than ideal for my daily high volume of email.

Although I have not yet tested the Blackberry Storm (the Verizon/Vodafone model should arrive in my office for testing in a few weeks), I am impressed by the design, which includes a tactile touch screen.

What's a tactile touch screen?

The user distinctly feels the screen being pressed and released with a gentle "click," similar to the feel of a key on a physical keyboard or a button on a mouse. The "clickable" touch screen, which Blackberry calls ClickThrough, gives the user positive confirmation of each selection, resulting in an enhanced touch interface and a more intuitive typing experience. ClickThrough lets you depress any portion of the screen to make a selection, so it feels as if there is a button below your finger; a mechanical switch suspension below the surface of the display makes each press feel like a separate button, even though the entire screen moves as one.

The Storm contains an accelerometer, and when the device is tilted horizontally, a full keyboard appears rather than the tiny compressed keyboard of the iPhone or the "cell phone-like" SureType keyboard of the BlackBerry Pearl. The fact that the BlackBerry Storm enables a full-sized keyboard in horizontal mode with tactile feedback could make the touch screen usable for high volume email users.

I'll report back as soon as I test it, but the idea of a touch screen Blackberry with a full size tactile keyboard is cool!

Thursday, October 16, 2008

The Books on My Nightstand

It's Thursday so it's time for a blog entry related to my personal life. In my blog profile, I've listed some of my favorite books - The Omnivore's Dilemma, The Cactus Eaters, and The Road to Sata. I'm a voracious reader and spend the late evening hours reading multiple books. Here are the books on my nightstand tonight:

Michael Moore's 2008 Election Guide by Michael Moore - the always irreverent Michael Moore (Roger and Me, Sicko etc.) offering his insight into the 2008 presidential campaign

Roughing It by Mark Twain - One of Mark Twain's most popular books about his early life exploring the Gold Country and Eastern Sierra of California, the same locations I've spent my summers for almost 30 years

Waiter Rant by The Waiter - A hilarious tale of real life experiences in restaurants. I'll never eat out again

The Nature of Nature Edited by William H. Shore - an anthology about some of the greatest joys of nature and challenges facing the environment today.

In Defense of Food by Michael Pollan - relearning how we eat in a world filled with high fructose corn syrup and overprocessed foods

The Assault on Reason by Al Gore - an insider's look at how Washington and our political system functions/dysfunctions

Plenty by Alisa Smith and JB MacKinnon - a yearlong journey devoted to eating regionally and sustainably.

The Devil's Whisper by Miyuki Miyabe - a murder mystery by Japan's leading mystery writer

Beyond the White House by Jimmy Carter - an autobiography of our nation's greatest living "post" president describing his life after the White House

These are all great reads and I recommend them highly.

Wednesday, October 15, 2008

Another Success for Virtual Work Teams

I believe that virtual work teams/flexible work arrangements, as I've written about previously, are even more essential in a challenging economy with increasing pressures to improve efficiency and reduce costs.

My friend, Jessica Lipnack, recently did a webcast for the American Management Association about "Leading Virtual Teams."

Over 1000 people attended, demonstrating the overwhelming interest in this topic.

Here's my most recent evidence about the success of working virtually. Starting in July of 2008 we began scanning all inpatient paper charts and making them available online to all our clinicians and coding teams.

Clinicians have been extremely satisfied with an approach that makes all consultant notes, nursing notes, vital signs and outside paper records (from non-BIDMC referring sites) available to care teams electronically.

The most immediate impact has been on the Health Information Management (HIM) department which applies ICD-9 diagnosis codes to hospital stays for billing purposes. Medical record coders are hard to hire and retain. With scanned charts available online, we can create virtual teams of coders located anywhere in the country.

The graphic above illustrates the impact on accounts receivable since we created virtual teams of coders. The measurable results are:

Faster coding of recent discharges (48-72 hours)
Coder recruitment and availability – no vacancies
Coders more likely to work extra hours when needed
Reduced days in accounts receivable

As Jessica suggests, virtual teams work. Yes, management needs to adjust to new ways of ensuring accountability and a bit of infrastructure is required to enable collaboration, but virtual teams have significantly improved HIM business processes in just a few months.

I've committed to all of my organizations (BIDMC, HMS, NEHEN, MA-Share, and HITSP) that we'll use virtual workteams and online collaboration tools whenever possible in 2009. Who needs those frequent flier miles anyway?

Tuesday, October 14, 2008

The Kindle for Medical Education

I've described the iPod Touch as the next great technology for medical education, but the Kindle is also a device with great potential.

We've recently implemented Kindle support for all our 20,000 educational resources at HMS.

Our integration with the MyCourses educational website enables any Word or PDF document to be delivered to the Kindle wirelessly. There is a cost which is clearly explained to the user (10 cents per document, paid to Amazon). Those who don't want to pay the 10 cents can download documents to their PC and transfer them via USB cable. Once the user enters their Kindle account into the MyCourses Kindle setup page (accessible via our resources page or the GoMobile page), any resource which can be sent to the device has a little icon labeled "My Kindle" which, when clicked, sends the resource to the Kindle. It does this by sending the document to the Amazon account as an email attachment, which is then converted into the Kindle's format and delivered to the device over Sprint's Whispernet.
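For the technically curious, the email-attachment step looks roughly like the Python sketch below. The SMTP relay, sender address, and @kindle.com address are placeholders; this is a simplified illustration, not the actual MyCourses code.

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

# Hypothetical sketch of the "send as email attachment" step described above.
# The SMTP host and addresses are placeholders for illustration only.
def send_to_kindle(pdf_path: str, kindle_address: str,
                   sender: str = "mycourses@example.edu",
                   smtp_host: str = "smtp.example.edu"):
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = kindle_address
    msg["Subject"] = "Course document"
    data = Path(pdf_path).read_bytes()
    msg.add_attachment(data, maintype="application", subtype="pdf",
                       filename=Path(pdf_path).name)
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)

# Example usage (placeholder address):
# send_to_kindle("lecture1.pdf", "student_name@kindle.com")
```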

HMS is the first medical school to offer such a green alternative, allowing all compatible resources to be downloaded directly to an eBook reader. At some point it would be nice to bypass the 10-cent fee with a utility that lets us send directly to the device, but it's a reasonable cost when you consider that Sprint provides Kindle users with free wireless access.

We're rolling this out by giving a few students free Kindles to pilot the new MyCourses functionality.

I'll report back how it goes. Since we spend $50,000 a year on paper for printing course documents, I hope it is successful!
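As a rough break-even check (assuming the 10-cent fee applied to every wireless delivery):

```python
# Rough break-even: at 10 cents per wireless delivery, how many documents
# per year before the fees match the $50,000 annual printing spend?
annual_paper_cost = 50_000.00
fee_per_document = 0.10

print(int(annual_paper_cost / fee_per_document))  # 500,000 documents/year
```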

Friday, October 10, 2008

Cool Technology of the Week

Today's post is not about endorsing a specific candidate; it's about the cool use of social networking technology. As soon as McCain and Palin introduce a cool new social networking application for the iPhone, I'll write about it!

I've written extensively about social networking in many venues. We've embraced social networking at Harvard and its affiliates for everything from patient-doctor communications to sharing ideas among researchers.

Barack Obama's campaign has released a very innovative social networking tool for the iPhone in support of the campaign. It inspired me to think about potential applications for similar tools in the research and academic environment.

It's called a "volunteering tool" on the website, but I can see this as a research collaboration tool. It has people grouped by battleground state (think research interest), videos/photos of campaign events (think lecture videos), news and events both national and local (think Harvard-wide vs individual institution), and call stats (think the network analysis to provide metrics of social networking success).

The iPhone SDK has a steep learning curve, but given the hundreds of iPhone apps already available in the App Store, it's clear that the iPhone has become a premier handheld computer environment for innovative software. All apps developed with the SDK run on the original iPhone, the iPhone 3G, and the iPod Touch.

My hat is off to the creative folks at the Obama campaign for an inspirational application that connects people and provides updated campaign information from the convenience of a phone. That's cool.

Thursday, October 9, 2008

Staying Warm in New England

Now that the leaves are falling and frosts are beginning in New England, it's time to retire my summer wardrobe and prepare for the cold, wet, harsh seasons ahead.

Every year, numerous people die of hypothermia in New Hampshire's White Mountains. It's already snowing on Mt. Washington.

To understand how to keep warm when the weather outside is frightful, you first have to understand how the human body loses heat - conduction, convection, evaporation, radiation and respiration.

Conduction is heat loss when the body comes into direct contact with a cold object - snow, a metal trekking pole/ice ax, or a cold rock.

Convection is heat loss when air or water passes by the body such as a brisk wind passing by the surface of the skin.

Evaporation is heat loss when moisture on the body becomes airborne - either sweat or rain water on wet clothing.

Radiation is heat loss that occurs when heat escapes directly into the still air.

Respiration is heat loss that occurs when air colder than body temperature is inhaled, warmed by the body, and then exhaled.

Here's my strategy for avoiding hypothermia from any of these causes.

During Fall/Winter/Spring I wear:

A torso base layer of thin polyester (Arcteryx Rho LT)
A torso shell layer of Gore-Tex Pro Shell (Arcteryx Alpha LT)
A lower extremity combination of thin insulation and shell (Arcteryx Gamma MX)
A head base layer of thin polyester (OR Ninjaclava)
A warm, windproof hat (OR Windpro Hat)
A hand base layer of thin polyester (OR PL100 Gloves)
A waterproof/windproof shell layer (OR Cornice Mitts)
A belay jacket on my upper extremity when I stop moving (Arcteryx Solo)

How do these work?

Conduction - the lower extremity insulation layer slows conduction if I sit on the snow or on a rock. I do not lie down on snow, so my upper extremity does not contact anything cold directly.

Convection - the upper extremity Gore-Tex layer and the lower extremity windproof softshell minimize any wind-related heat loss.

Evaporation - you'll notice that I do not wear any upper body insulation - just a wicking layer of thin polyester. This ensures I do not sweat, even when climbing thousands of feet. I may be a bit cold when I start, but as I climb, the exertion keeps me warm without sweat, minimizing evaporative heat loss.

Radiation - my multilayered hats prevent significant radiant heat loss, since an uncovered head radiates a significant amount of body heat. The belay jacket, which I put on over the Gore-Tex layer when I stop moving, minimizes radiant heat loss when I'm stationary.

Respiration - there's really nothing I can do to eliminate heat loss due to respiration. However, I carry 1 liter of boiling water and 3 Lara Bars to keep myself fed and hydrated. I carry the hot water in a Nalgene polyethylene container (BPA-free) wrapped in an OR water bottle parka. Adding warm liquids during an ice climb or hike helps maintain my core body temperature.
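Here's a rough estimate of how much heat that liter of water actually delivers, assuming (purely for illustration) that it has cooled to about 90 C by the time I drink it:

```python
# Rough estimate of heat delivered by drinking 1 liter of hot water.
# The 90 C drinking temperature is an assumption; body core is ~37 C.
mass_g = 1000            # 1 liter of water
specific_heat = 4.186    # joules per gram per degree C
delta_t = 90 - 37        # degrees C

joules = mass_g * specific_heat * delta_t
print(f"~{joules / 1000:.0f} kJ (~{joules / 4184:.0f} kcal)")  # ~222 kJ (~53 kcal)
```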

Dress right, eat right, keep moving and no matter what the temperature, you'll keep warm.

Remember, in New England there is no bad weather, just poor clothing choices!

Wednesday, October 8, 2008

The Northeast Biomedical High Performance Computing Collaborative

Today at the 2008 Harvard High Performance Computing Summit, we launched a bold new initiative - The Northeast Biomedical High Performance Computing Collaborative.

Massachusetts is a unique place that fosters collaborations. Whether it's the Clinical and Translational Science Awards or the New England Health EDI Network, we seem to be able to put aside our competitive tendencies and share intellectual property for the benefit of all.

Given our success with High Performance Computing at Harvard and the number of institutions needing biomedical computing resources in the Northeast, the formation of a New England-wide High Performance Computing collaborative makes great sense.

We assembled a national group of experts from industry, other high performance computing centers (Texas, San Diego, Germany, Virginia), and academia to discuss our vision. Here's our strawman plan:

Background
Biomedical Informatics is at an exciting crossroads: the computational challenges facing researchers, clinicians and public health professionals now exceed the computational power typically available in an academic biomedical setting. This is exciting because it means that the advances in high performance computing from other disciplines (e.g. physics) can be brought to bear on the great challenges of life sciences, health and medical research. The opportunities to develop new therapies, monitor trends in ambulatory hospital data, and catch and avert drug-related mishaps (e.g. Vioxx) are truly astounding. With the advent of the $1,000 "ome" (genotype, phenotype, labs), the capacity to analyze and predict longitudinally and in real time, as well as the ability to test hypotheses retrospectively, will challenge the computational boundaries of all biomedical research organizations. Computational power is now at the very core of our ability to rapidly advance the state of clinical care and healthcare.

As these computational challenges have risen, however, Harvard, and the Northeast in general, have not taken the lead in high performance computing. Because computing has played a limited role until recently, it has been a secondary consideration at best. As a result, Harvard and the Boston/Cambridge biomedical community as a whole are not viewed as being at the forefront of the computational side of biomedical research. Moreover, most researchers have had to resort to under-the-desk solutions run by well-meaning but ill-prepared postdocs struggling to manage growing computational needs. From the petabytes of data that will be rolling off instruments to the millions of daily data points from the dozens of institutions that will be participating in the new CTSA, it is imperative that the institutions in the Northeast come together to quickly remedy these deficits and move the Northeast to the forefront of HPC infrastructure in the country.

The Vision
The Northeast High Performance Computing Collaborative will be a new institution that will sit under Harvard Medical School administratively but will have governance drawn from the participating research organizations. Under the overall direction of a rotating chairperson, the organizations will be led by a board of directors and a small administrative team. The mission of the collaborative will be to "provide and facilitate access to shared computational infrastructure for biomedical research and reporting, and to create and nurture a community of shared experiences, tools and systems across the academic, biomedical, public and private sector users of biomedical HPC tools." Funded through a combination of philanthropic support, vendor dues, vendor equipment grants, grant funding and user fees, the Collaborative will conduct the following activities:
1) Host and run the NE Biomed Collaborative Research Cluster - this 3000+ core cluster will highlight the participating vendors' technologies and will be available to any researcher affiliated with any of the participating organizations
2) Create the NE Biomed Grid - this grid will link together clusters from participating institutions as well as existing grids such as NSFnet and Amazon EC2
3) Help facilitate wide-scale adoption of the CTSA-adopted credentials to create open and shared exchange and collaboration
4) Create a web-based repository of shared information and connections for the Northeast Biomed HPC community, including an HPC wiki, research software mirrors, and data-sharing tools
5) Make available a team of bioinformatics experts at very low cost to researchers. Vendors would fund positions within this group to provide special expertise in their tools as well as general tools; in addition, several senior bioinformatics staff would be available on a fee basis
6) Run an internship program that couples accomplished young graduate students in computer science and bioinformatics with researchers needing help optimizing their code for the cluster
7) Provide a research testbed for cutting-edge research and technologies
8) Publish an annual HPC outlook report providing data on the state of biomedical HPC
9) Conduct the annual Biomed HPC Summit

Objectives
1) Provide CPU cycles to researchers
2) Provide a repository of shared tools and resources for the entire Boston-based Biomed community
3) Create an open grid across all of Harvard, the hospital affiliates, and related academic institutions
4) Facilitate credentialing of individuals and access to data across institutional barriers
5) Provide a gateway to computational networks such as NSFnet
6) Enable industry-academic partnerships
7) Create a test bed within which commercial companies can test their latest technologies
8) Give PIs access to the latest and greatest technologies
9) Provide bioinformatics support for researchers

Our next step is to refine this vision, formalize the governance of the group, and prioritize the requirements of the stakeholders. This project will be one of my most passionate causes in 2009.

Tuesday, October 7, 2008

The Work of Worry

As I've taken on more responsibility for more organizations, I've discovered that more authority does not lead to more power. It leads to more responsibility. Translated into a simple statement - when everything goes right, many people get the credit. When anything goes wrong, the leader is responsible.

This creates what I call the "Work of Worry."

The burden of ensuring that every aspect of your job - human resources, budgets, customer service, reliability, security, and strategy - is optimized requires constant vigilance and daily management attention.

For example, each night before I go to sleep, I mentally run through every one of my direct reports and ask myself what issues are unresolved, what projects are going off track, what budgets are at risk, and what strategies need adjusting. I make a list and then sleep on it. In the morning, when I'm refreshed, I send out email and schedule meetings to address everything on the list.

This means that no issue remains unaddressed for more than a day. There may be a multiweek process needed to resolve some issues, but at least that process is initiated in a timely way.

When leadership is not a job, but a lifestyle and every aspect of the organization's performance becomes the responsibility of the leader, the work of worry can be intense. It can become challenging to balance responsibility/anxiety with family life, free time, and maintaining a positive mood.

So if you are thinking that your leader's work is not always visible, consider the time that is spent on the "Work of Worry" and ensuring that the organization does the right thing, all the time.

Monday, October 6, 2008

Massachusetts Data Protection regulations

On September 19, 2008, the Massachusetts Office of Consumer Affairs and Business Regulation established significant new regulations, 201 CMR 17.00: Standards for The Protection of Personal Information, which affect how all Massachusetts organizations protect confidential data.

The Boston Globe’s Business Section featured an article titled “Tougher Consumer Data Rule Adopted, Businesses must improve safeguards."

The deadline for compliance is January 1, 2009. (This has been revised to March 1, 2010).

Like all regulations, the cost and effort of implementation depend on how stringently we choose to interpret them. Putting aside the physical security portions of the regulations and focusing on the electronic/IT portions, there are several areas we are working on. Following these to the letter of the regulation will require additional capital and labor. We do not yet have estimates, since we're in the planning phase now.

I have included below the sections of the regulations that I think will impact us the most.

Section 17.03 subsection C - This states that there needs to be an explicit policy that governs how employees are allowed to keep, access, and transport records containing personal information outside of business premises. This has two components: electronic records and physical records. We are reviewing our policies and procedures to close any gaps we may have.

Section 17.03 subsection E - This states "Preventing terminated employees from accessing records containing personal information by immediately terminating their physical and electronic access to such records, including deactivating their passwords and user names." We have kicked off a project to address this point, since our existing processes take a few hours to complete rather than happening immediately.
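
As an illustration only - not a description of our actual provisioning system - here is a minimal sketch of what immediate deactivation could look like, assuming an Active Directory environment reachable over LDAPS and the Python ldap3 library; the server name, service account, and user DN are hypothetical placeholders.

# Minimal sketch: disable a terminated employee's directory account immediately,
# rather than waiting for a batch process. All names and credentials are placeholders.
from ldap3 import Server, Connection, MODIFY_REPLACE

# userAccountControl 514 = NORMAL_ACCOUNT (512) + ACCOUNTDISABLE (2)
ACCOUNT_DISABLED = 514

def disable_account(user_dn):
    server = Server("ldaps://dc.example.org")
    conn = Connection(server,
                      user="CN=deprovision-svc,OU=Service,DC=example,DC=org",
                      password="***",  # retrieved from a credential vault in practice
                      auto_bind=True)
    # Flip the account to disabled as soon as HR reports the termination.
    conn.modify(user_dn, {"userAccountControl": [(MODIFY_REPLACE, [ACCOUNT_DISABLED])]})
    success = conn.result["description"] == "success"
    conn.unbind()
    return success

if __name__ == "__main__":
    disable_account("CN=Jane Doe,OU=Staff,DC=example,DC=org")

In practice, a call like this would be triggered by the HR termination feed and followed by group removals, remote access token revocation, and badge deactivation.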

Section 17.03 subsection H - This refers to vendors/third parties who are provided access to our information and/or obtain copies of any data from us. It requires that we obtain a written certification that the third party has a written, comprehensive information security program that is in compliance with the provisions of the regulations. There may be a need for some capital expenditures late in FY09. We first need to build a policy, educate staff, and determine an auditing technique before pursuing any product-based solutions.

Section 17.03 subsection H - This section requires that we know the location of every paper record and computing system, including laptops and portable devices, that contains personal information (portable devices are not defined in the regulation, so it is unclear whether this includes handheld devices). To conform to the regulation, we will need to put some additional vended solutions in place and add labor to operate them.

Section 17.04 subsection (3) - This requires reasonable monitoring of systems for unauthorized use/access to personal information. We do this today.

Section 17.04 subsection (5) - This section states "Encryption of all personal information stored on laptops or other portable devices;" We have just started to roll out encryption for laptops. The question of what counts as a portable device is a challenge - it could mean USB drives, Blackberries, and cell phones. We're working through the implications of that.

We spend over a million dollars per year on IT security, and this only includes expenses that are purely security related. There are other costs embedded in the software and hardware we buy. For example, when we purchase a server operating system, database product, or network router, the manufacturers have expended effort making these products secure.

As you can see, the regulations will involve a great deal of planning, the addition of new staff, and the purchase of new software to ensure compliance. We are committed to protecting the privacy of patient records, so adding resources to enforce privacy policy with technical security is a "must do".

Friday, October 3, 2008

Cool Technology of the Week

I recently experimented with the new face recognition software in Google's Picasa. For my photos, it was about 90% sensitive and specific.
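
For readers who do not use these measures every day, here is a small worked example of what 90% sensitive and specific means for photo tagging; the counts below are made up purely for illustration.

# Made-up counts, just to illustrate the two measures for face tagging.
# Sensitivity = of the photos that truly contain me, how many were tagged as me?
# Specificity = of the photos without me, how many were correctly left untagged?
true_positives = 90    # photos of me that Picasa correctly grouped as me
false_negatives = 10   # photos of me that it missed (e.g., profile shots)
true_negatives = 90    # photos of other people it correctly left out of my group
false_positives = 10   # photos of other people it wrongly grouped as me

sensitivity = true_positives / (true_positives + false_negatives)   # 0.90
specificity = true_negatives / (true_negatives + false_positives)   # 0.90
print(f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}")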

The face-recognition feature is called "name tags" and it automatically groups photos together based on the people in them.

Here's how I tested it.

All the photos used in my blog are uploaded to Blogger, which stores them on Picasa (since Google owns both sites). I clicked the Add Name Tags button on the right side of the screen, and the Google server began sorting my 70 megabytes of photos. It clustered shots together based on its best guess of the people in them. It identified three major people in my photo collection - me, my wife, and my daughter. It then asked me to tag these three folks and clustered the shots by name into easy-to-navigate slide shows of similar faces. A few photos, such as the shot of Ney player Rifat Varol from Istanbul, did not cluster, and I clicked the skip button to ignore these outlying photos. A few photos of me in profile did not sort automatically, and I labeled them manually.

How does it work? The software performs a few basic measurements, such as the relative distances between a person's eyes, nose, and ears.
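
As a toy illustration of that geometric idea - not Picasa's actual implementation, and with made-up landmark coordinates - here is a short sketch that turns a few facial measurements into scale-invariant ratios and compares two faces:

# Toy sketch of geometric face matching: measure distances between facial landmarks,
# normalize by the eye-to-eye distance so the features are scale-invariant,
# then compare two faces by the distance between their feature vectors.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def features(lm):
    """lm: dict of (x, y) points for left_eye, right_eye, nose, left_ear, right_ear."""
    eye_span = dist(lm["left_eye"], lm["right_eye"])
    return [
        dist(lm["left_eye"], lm["nose"]) / eye_span,
        dist(lm["right_eye"], lm["nose"]) / eye_span,
        dist(lm["nose"], lm["left_ear"]) / eye_span,
        dist(lm["nose"], lm["right_ear"]) / eye_span,
    ]

def difference(face_a, face_b):
    """Smaller value = more likely the same person."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(features(face_a), features(face_b))))

# Two hypothetical faces, the second photographed at twice the scale of the first;
# because the ratios are normalized, the difference is essentially zero.
face1 = {"left_eye": (30, 40), "right_eye": (70, 40), "nose": (50, 60),
         "left_ear": (15, 45), "right_ear": (85, 45)}
face2 = {k: (2 * x, 2 * y) for k, (x, y) in face1.items()}
print(difference(face1, face2))  # 0.0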

With the new Picasa site, you can also choose to make your public albums searchable. This lets people locate your public photos by searching Picasa Web Albums or other Google services such as Image Search. Or you can keep your albums private by clicking on the Unlisted radio button.

I used a Mac with Firefox to test all these features. I'm told that the site also supports a Windows-only upgrade to Google’s Picasa photo management software, which offers photo-editing capabilities, but I did not test this.

Automated face recognition on Google's photo management site. That's cool!

Thursday, October 2, 2008

Rethinking our Food Supply

Every day the news is filled with new stories of foods recalled worldwide because of the Chinese milk scandal - powdered tea mix, chocolate, cookies, cheese, etc.

Given the popularity of pre-prepared foods and the overly processed fast food American diet, do we really know what is in the foods we eat or where they came from?

In the quest for "shareholder value", you can bet that the large food processing companies in the US are using raw materials imported from China.

Food recalls, tales of poisoning, and the rise of "frankenfoods" (genetically modified, overly processed foods that your Grandmother would not have consumed in her lifetime), have motivated many people to rethink the foods they eat. Instead of purchasing pre-prepared meals or buying produce from South America, they are buying regionally from small farmers and producers, as I described recently in my Locavore blog.

To take this one step further, some people are even making their own foods from local raw materials. This week, my family and I started making our own soymilk and tofu from soy beans.

A SoyQuick appliance includes a grinder, a heating element, and computer control to transform soaked soybeans and water into soymilk in under 20 minutes. We select the soybeans (organic) and the water (our local well water, filtered), so we know that the finished soymilk is unadulterated by any chemicals and is very fresh.

Once you have fresh soymilk, making tofu is easy. Unlike cheese, tofu is not a cultured product; it's just the curds of coagulated soymilk, pressed and drained. We simply add 2 teaspoons of nigari (magnesium chloride produced from seawater after the sodium chloride has been removed and the water evaporated) to a batch of soymilk, wait 20 minutes, then press the curds through cheesecloth.

Soy yogurt is easy to make as well. Just incubate soymilk with a non-dairy yogurt culture for 12 hours, then strain (you can use the remaining whey in recipes). Greek-style yogurt (labneh) or yogurt cheese can also be made from soy yogurt.

In the technological world we live in, it's amazing that many people I know are striving for a simpler life with simpler foods. In my lifetime, we've gone from the ultimate processed foods, such as Twinkies and TV dinners, back to making our own tofu from raw soybeans. Hopefully, we'll all live longer and be healthier because of it.

Wednesday, October 1, 2008

Open Source for Healthcare - a Guest Blog

Tomorrow, I keynote the Medsphere meeting in New York City, where I will discuss the Potential and Caveats of Open Source software for healthcare. To prepare, I asked Fred Trotter, a leading expert on free and open source software for healthcare, to comment.

Fred wrote:

The heart and soul of Open Source is and always has been Freedom. That is ideally suited to medicine because doctors need to be able to leverage Health IT to meet their real needs, not just the needs that can be met by proprietary business models. I have, like you, been focused on what makes a good EHR for years, and I have to admit, I still have no idea. The question itself is unfair; it's like asking: "What makes a good car?" The answer will always be: "It depends on how you want to use it."

Open Source gives doctors the opportunity to get 80% of what they need from a common codebase and then make sure that the 20% they uniquely need is actually done right. The proprietary alternative is always one-size-fits-all to a certain extent. "Real profit" in the proprietary vendor business model comes when you can give 1000 doctors exactly the same software, over and over again. This creates a feature-to-funding mapping problem. Proprietary vendors only fund the development that they believe can be sold over and over, in a cookie-cutter fashion. Features that would improve care but cannot be replicated in this fashion fail to appear.

Consider something as simple as oncology. An EHR in a hospital or a practice is normally designed to help find the diagnosis, but for an oncologist, the diagnosis is already well-understood. The oncologist is looking at the same information seeking the best combination of treatments rather than the diagnosis.

The difference is not just in "what" information is being tracked; that is always slightly different from specialty to specialty. Rather, it is a whole new way to approach the same information. I know that you could probably come up with 10 different examples of this kind of "non-trivial rethinking" needed for specific issues. For a given doctor, specialty, or even patient, the design of the software may need to be turned on its head. That kind of flexibility only comes with source-code access.

You will be keynoting an event hosted by a company that supports VistA, so it is critical that you consider carefully the simple question "Why is VistA good?" It is not a trivial issue.

I am also the primary author of "What is VistA really" on the WorldVistA wiki site.

But what are the drawbacks of open source?

It is poorly understood. It is nothing less than the answer for modern Health IT, but important projects continue to struggle for funding. The problem is that doctors do not have the time to understand either software or software licensing. Most doctors operate under a handshake business philosophy. They think: if this deal becomes unfair, I will just leave. It is a privilege of a profession in high demand. As a result, they do not evaluate software licenses or understand the implications of software licensing on the software process. Every software contract that a doctor signs should be compared, point by point, against the GPL.

When the (new) Medsphere talks about Freedom, it is not a political promise or jargon; it is in the contract. The licenses that they use to release software essentially make them co-owners of the software with their clients. I am constantly trying to get doctors to understand that the issue is not what you have to pay for the software now, but what it even means to "get software". If I offered to sell you a watch for $1000 or to rent you one for $1000, you would immediately focus your attention on the "rent" vs. "own" issue. But for some reason the exact same distinction seems to slip past most doctors and hospitals (despite my best efforts to make noise about it).