Now that spring has arrived, many folks around the country are planning their vacations. I've had numerous requests about the best places to see in Boston. Here's my top 10 list:
1. Boston Museum of Fine Arts - check out John White Alexander's "Isabella and the Pot of Basil" (the photo above) and explore the Japanese Temple Room
2. Isabella Stewart Gardner Museum - check out John Singer Sargent's "El Jaleo" in the entryway. Stop for lunch at the Cafe.
3. The Institute for Contemporary Art - check out the Shepard Fairey exhibit
4. The New England Aquarium - check out the jellyfish
5. The Museum of Science - check out the Electrical wing
6. Harvard Square and the Harvard Museums - check out the glass flowers
7. Faneuil Hall Marketplace is a great site for walking. For shopping, check out Prudential Center/Copley Square or Newbury Street.
8. Take a walk in the Boston Common - check out the Granary Burying Ground, the Freedom Trail, the State House, and the Swan Boats. The Theater District is nearby - check out the Blue Man Group
9. Farther afield, check out Concord and Walden Pond to the West, Plymouth Rock to the South and Salem/Marblehead to the North.
10. I often stroll the many forested lands of the Audubon Society nature preserves. Also, the beaches in Ipswich, Manchester by the Sea, and Duxbury are wonderful spots.
Thursday, April 30, 2009
Wednesday, April 29, 2009
NCVHS Testimony about Meaningful Use
Yesterday, I attended the NCVHS public hearing about meaningful use. Here's the agenda and my presentation.
I've described the importance of meaningful use in prior blog posts.
Much depends on the definition of meaningful use, including the characteristics of the EHRs which will qualify for stimulus dollars, the kind of interoperability we'll implement regionally/nationally, and the policies that will be required to support health information exchange.
My specific testimony included an overview of the interoperability needed for quality.
I highlighted the work of the NQF Health Information Technology Expert Panel (HITEP) which selected 84 metrics supported by 35 data types as an initial minimum dataset for quality measurement in 13 care processes. HITEP II will meet next week to further refine this work into a core minimum Quality Data Set (QDS).
I also highlighted the work done in Massachusetts on data exchange including the Massachusetts eHealth Collaborative Quality Data Warehouse.
My summary of the day, based on the testimony of 25 folks, is:
1. The country must roll out EHRs with baseline functionality that at a minimum includes e-prescribing, automated lab workflow, clinical summary exchange, and quality data reporting.
2. Health Information Exchanges will evolve locally based on business cases in communities. The services offered may include e-prescribing, diagnostic test results delivery, quality data warehousing, data normalization into common formats and vocabularies, and "convening services" to create data use agreements for the community.
3. Quality warehouses are needed to provide caregivers with rapid feedback and serve as population health registries. They will often be local based on the political feasibility of co-mingling data.
4. Standards will continue to evolve, but existing standards wrapped in a service-oriented architecture using a common data transport approach are good enough. We should use clinical data preferentially over administrative data for quality reporting, population health analysis, and PHRs.
5. Policies in support of this technology will continue to evolve locally. Although there should be some common national policies, regional variation must be allowed.
Several of my colleagues will testify today. I'll update this blog entry after their testimony.
Tuesday, April 28, 2009
Point to Point Messaging and Persistent Document Exchange
In a recent letter to the HITSP panel describing the interoperability needed for meaningful use, I discussed point to point messaging and persistent document exchange. Here are a few additional details about these approaches.
Point to point does not imply that one EHR is communicating with one recipient via a specialized interface for that interaction. Requiring a custom interface for every connection between two stakeholders would not be scalable. Point to point simply implies that a transient message is sent from a data source, such as a cloud computing EHR hosting center, to a data recipient such as an e-prescribing gateway, a healthcare information exchange, or a payer.
In Massachusetts we use interface engines, gateways such as NEHEN, and community-based health information exchanges such as EHX created by eClinicalWorks to connect thousands of users in dozens of organizations via transient messages.
There has been debate in the informatics community about using point to point messaging as a means of interoperability. Some suggest that all EHRs should have consistent data elements to foster the most complete interoperability. Although a common information model will be helpful in the future, we need to implement "good enough" standards now to improve quality and efficiency in the short term.
Sending packages of content between organizations using a common web-based transport mechanism enables high value data exchanges such as e-prescribing, lab data sharing, and administrative workflow.
Point to point messaging works very well for secure transmission of a content package between two stakeholders. To ensure that HITSP interoperability specifications using point to point approaches are sufficiently complete to test, we need to be very specific about the transport mechanism, as complete as possible in listing the vocabularies/code sets, and as constrained as possible in describing the package contents. ONC will soon release a Common Data Transport Extension/Gap document which illustrates the kinds of secure transport transactions we'll need to harmonize.
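To make the transport discussion concrete, here is a minimal sketch of a point-to-point transient message: a content package is POSTed over HTTPS with mutual TLS to a data recipient. The endpoint URL, certificate files, acknowledgment header, and payload are hypothetical placeholders, not NEHEN's or any vendor's actual interface.

```python
# A transient message: the sender transmits a content package and keeps only an
# acknowledgment; nothing is persisted as a signed document at the source.
import requests  # assumes the requests library is available

def send_transient_message(package: bytes, content_type: str) -> str:
    response = requests.post(
        "https://gateway.example.org/exchange/inbound",  # hypothetical recipient endpoint
        data=package,
        headers={"Content-Type": content_type},
        cert=("sender_cert.pem", "sender_key.pem"),      # placeholder mutual-TLS credentials
        timeout=30,
    )
    response.raise_for_status()
    return response.headers.get("X-Acknowledgment-Id", "delivered")  # hypothetical header

if __name__ == "__main__":
    # Illustrative payload only; a real exchange would carry a standards-based
    # message such as an HL7 v2 lab result or an NCPDP e-prescribing transaction.
    ack = send_transient_message(b"MSH|^~\\&|LAB|...", "text/plain")
    print("Acknowledged:", ack)
```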
What are the disadvantages of the point to point approach?
a. It does not work for complex scenarios such as an Emergency Department requesting the lifetime clinical record of a person from all the places their data exists in the country. That requires a master patient index, a record locator service, or a national healthcare identifier. In the short term, there are enough high value provider to pharmacy, provider to provider, and provider to payer exchanges that waiting to solve the unique patient identifier problem is not necessary.
b. Auditing the transfer of clinical records between two organizations based on transient messages may be more challenging than exchanging persistent documents with a non-repudiable time/date stamp and signature.
c. Reconstructing a damaged clinical record by replaying transient messages from an interface engine may be harder than simply reassembling persistent documents.
While point to point messaging uses a transient message from source to destination, a persistent document transfer uses the HL7 Clinical Document Architecture (CDA r2) to transfer an XML document between two stakeholders. That signed document is persisted by the recipient, providing a very clear audit trail about what information was transferred, by whom, and for what purpose. Examples of persistent document exchange include discharge summaries, quality data sets, and population health metrics being sent from one organization to another.
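As a rough illustration of what a persistent document looks like on the wire, the sketch below assembles a skeletal, CDA-flavored XML header. It is a simplified sketch, not a conformant CDA R2 document; real documents require the full HL7 schema, template identifiers, and a digital signature, and the OID shown is the HL7 example root used only as a placeholder.

```python
# Minimal, non-conformant sketch of a CDA-like document header. The recipient
# would persist the received XML as-is, preserving the audit trail described above.
import xml.etree.ElementTree as ET
from datetime import datetime

NS = "urn:hl7-org:v3"
ET.register_namespace("", NS)

def build_document(patient_id: str, title: str) -> bytes:
    doc = ET.Element(f"{{{NS}}}ClinicalDocument")
    ET.SubElement(doc, f"{{{NS}}}title").text = title
    ET.SubElement(doc, f"{{{NS}}}effectiveTime",
                  value=datetime.now().strftime("%Y%m%d%H%M%S"))
    record_target = ET.SubElement(doc, f"{{{NS}}}recordTarget")
    patient_role = ET.SubElement(record_target, f"{{{NS}}}patientRole")
    ET.SubElement(patient_role, f"{{{NS}}}id",
                  root="2.16.840.1.113883.19",  # HL7 example OID root, placeholder only
                  extension=patient_id)
    return ET.tostring(doc, encoding="utf-8")

print(build_document("12345", "Discharge Summary").decode())
```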
To support meaningful use such as medication workflow, laboratory exchange, clinical summaries for coordination of care, and quality reporting, it's clear to me that we need both point to point messaging and exchange of persistent documents.
I hope this discussion clarifies the kind of short term exchanges that will accelerate interoperability. This morning I'll be in Washington at the NCVHS meeting testifying about meaningful use. I'll post my testimony on my blog as soon as it is delivered.
My personal opinion is that metrics for meaningful use of point to point messaging and persistent document exchange may include:
* Using a product that incorporates HITSP specifications and is certified by CCHIT using its Laika tool to validate conformance
* Passing an online test with a vendor recognized as compliant with HITSP interoperability specifications such as Surescripts
* Participating in a production health information exchange organization which incorporates HITSP standards such as NEHEN
Over the next few months, the entire healthcare IT community will engage in a very important dialog which will finalize these details.
Monday, April 27, 2009
Dispute Resolution in Healthcare
At the recent Health 2.0 conference, I was asked an interesting question. If there is a dispute about any data in healthcare - PHR, EHR, or Health Information Exchange, how is it resolved?
eBay does millions of transactions via the internet and it has automated, web-based dispute resolution workflows. Can healthcare learn something from eBay?
On May 5, I will be attending a workshop in Washington called "Online Dispute Resolution in a Technology-oriented Healthcare World."
The attendees are evenly split between representatives of the Healthcare, Dispute Resolution and Computer Science communities.
The goals of the meeting are:
*Identify the key risks of disputes in the networked health information technology environment.
*Identify the best practices in avoiding and resolving such disputes and the need for new dispute prevention/resolution approaches in problem areas.
*Identify the computing and other research challenges inherent in supporting these practices.
You'll find a list of attendees and the conference background materials online.
As the recent work I've done with e-Patient Dave illustrates, Personal Health Records should have a process for resolving data issues. If such a feature had been built into PatientSite, Google Health, or Microsoft HealthVault, we might have identified the issues with administrative data and PHRs sooner.
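As a thought experiment, here is a minimal sketch of what an automated data-dispute workflow inside a PHR might look like. The states, fields, and example dispute are hypothetical illustrations, not features of PatientSite, Google Health, or any existing product.

```python
# Hypothetical PHR data-dispute workflow: every state change is recorded so both
# the patient and the clinician can audit how a disputed item was handled.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class DisputeState(Enum):
    SUBMITTED = "submitted"
    UNDER_REVIEW = "under_review"
    CORRECTED = "corrected"
    REJECTED = "rejected"

@dataclass
class DataDispute:
    patient_id: str
    disputed_item: str
    patient_comment: str
    state: DisputeState = DisputeState.SUBMITTED
    history: list = field(default_factory=list)

    def transition(self, new_state: DisputeState, reviewer_note: str) -> None:
        self.history.append((datetime.now(), self.state, new_state, reviewer_note))
        self.state = new_state

# Example: a patient questions an old billing-derived diagnosis in their PHR.
dispute = DataDispute("12345", "Hypokalemia (billing code from 5 years ago)",
                      "This was a transient lab finding, not a current problem.")
dispute.transition(DisputeState.UNDER_REVIEW, "Routed to the primary care physician")
dispute.transition(DisputeState.CORRECTED, "Entry removed from the PHR condition feed")
```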
I will report back next week with lessons learned from the conference, including recommended next steps for the software we use with patients at BIDMC.
Friday, April 24, 2009
Cool Technology of the Week
The New England Health EDI Network (NEHEN) and MA-Share are completing their merger over the next month into a new non-profit LLC called the New England Healthcare Exchange Network. The resulting merged organization will provide a single appliance for exchange of benefits/eligibility, referral/authorization, claims/remittance, the full suite of e-prescribing functionality (eligibility, formulary, history, routing, refills), and clinical summary exchange of continuity of care documents.
Many recent articles in the popular press have questioned the cost savings of health information exchange. Here's more data from our experience in Massachusetts.
The quantifiable savings differ for each provider organization depending on its starting point; however, here are some examples of significant savings (a rough dollar illustration follows the list):
*For Brigham & Women's and Mass General, the 'Total Denial Write Off Rate as a Percent of Net Revenue' has dropped from 3.78% to 0.88% and from 4.17% to 1.28%, respectively.
*For Brigham & Women's and Mass General, 'A/R Days' have dropped from 81 days to 55.6 days and from 99 days to 54 days, respectively.
*Since Baystate's go-live with NEHEN in January 2007, they have saved over $1.5M by avoiding per-transaction fees.
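As a rough sense of scale, here is what the write-off improvement above could mean in dollars, using a purely hypothetical $1B annual net revenue figure rather than any hospital's actual financials:

```python
# Hypothetical illustration only: translate the reported write-off rate change
# into dollars for an assumed $1B of annual net revenue.
net_revenue = 1_000_000_000          # assumed figure, not an actual hospital's revenue
rate_before, rate_after = 0.0378, 0.0088
avoided_write_offs = net_revenue * (rate_before - rate_after)
print(f"Write-offs avoided per year: ${avoided_write_offs:,.0f}")  # about $29 million
```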
In general, NEHEN users have experienced the following clinical and administrative benefits:
*Reduction of ambulatory medication errors
*Enhanced communication among providers
*Improvement in the patient experience
*Reduction in claim denials due to lack of good information
–Correcting insufficient or inaccurate eligibility or referral information
–Correcting invalid PCP and DOB information
*Reduction in write-offs due to eligibility issues and exceeding the filing limit
*Improved collection of copays
*Labor savings
–Reduction in ambulatory care staff needed to manage medications
–Reduction in time spent on manual transactions: eligibility, claim status inquiry
–Focus on exception processing
*Reduction in “Days in A/R” and claims rework
–Focus on front-end weighted, clinically driven revenue cycle operations
*Reduction in bad debt
A single healthcare information interchange platform that pays for itself via cost avoidance. That's cool!
Thursday, April 23, 2009
My Work Spaces
I've written about the technologies I use personally, but I've not written about the places I use them.
I have three primary workspaces - my Harvard office, my BIDMC office, and my home. Since being a CIO is a 24x7x365 lifestyle, I do not store paper, supplies or technologies in any of my offices. I can work equally well wherever I am. Here's the overview of where I work:
Harvard office - I'm in Vanderbilt Hall, built by George and Cornelius Vanderbilt in 1927. Everything in my office is from that era. In my twenties I collected Arts and Crafts/Mission furniture from flea markets and estate sales. Back in the '80s, no one really wanted old dark oak pieces, so everything was inexpensive. My desk is a deposition table from the Milwaukee County Courthouse. You can imagine Clarence Darrow pounding his fist on its well-worn oak top. There's a supply of green tea and a whiteboard. Otherwise, there is no paper, no technology, and no phone. I use this office for meetings with Harvard faculty and staff from the Longwood medical area, since it's centrally located near BIDMC, Joslin, Dana-Farber, and Brigham and Women's.
BIDMC office - I'm in the Renaissance Center, a 9-story office building next to the Boston Police Station. Because Renaissance has a large conference room, this is the office I generally use to host visiting groups and foreign dignitaries. It's typical for international visitors to bring some memento from their country, so my office is a shrine to dozens of countries - Japan, China, the UK, Scandinavia, Dubai, Switzerland, and even sub-Saharan Africa. As with my other offices, there is no paper or technology specific to the office.
Home office - At home, I do not have a separate office, but work from the family room. All of my blogs and articles are written in my Morris Chair, which I've used for the past 15 years. I also have a small writing desk, an incense burner, and a fragment of a tree that's hundreds of years old. While climbing, I found the tree at 13,000 feet and noticed something remarkable about it. As a seedling, the tree grew from under a rock, eventually surrounded the rock, and split the rock in half. To me, that's a great metaphor for perseverance. You'll find a cup of green tea but no papers, files, or clutter in my home office space.
It's taken years for me to create workspaces that foster creativity, productivity, and peace of mind. As you'll see from all the photos, less stuff can bring more efficiency.
Wednesday, April 22, 2009
Enhancing our Problem Lists
In my recent posts about the Limitations of Administrative Data and the Lessons Learned, I've concluded that clinical observations of symptoms and conditions coded in SNOMED-CT are the most relevant problem list data to share with patients and use for data analysis.
We have a 3-step approach to implementing SNOMED-CT in BIDMC clinical systems:
1. Our existing problem list dictionary was developed in 1998 and hence it's called BI-98. We contributed it to the National Library of Medicine and it was incorporated into the Metathesaurus.
About 70% of the terms we use are easily mappable to SNOMED-CT codes. The National Library of Medicine will send us a BI-98 to SNOMED-CT mapping in the next few days and we'll incorporate it into our existing dictionary, giving us a SNOMED-CT vocabulary for 70% of the existing problem list entries in our system.
2. The NLM will be giving us a compendium of the 6000 most commonly used terms in the local problem list vocabularies of large health care institutions, along with their equivalent SNOMED CT codes. We'll incorporate that list into our systems and create a novel "Problem List Picker" using AJAX technologies that will assist doctors in choosing the best problem description associated with a SNOMED-CT term. This will give us a great framework for the terminology of newly entered problems (see the sketch after this list).
3. I'm working with other organizations, such as Kaiser Permanente, to gather problem list "best practices". We'll leverage their experience to innovate at BIDMC and I'll share the experience broadly via my blog. I'll post their problem list dictionaries as I receive permission to do so.
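Here is a minimal sketch of steps 1 and 2 above: applying a local-term-to-SNOMED-CT mapping table to an existing dictionary, and the kind of prefix lookup a typeahead "Problem List Picker" could call. The table contents are a tiny illustrative sample, not the actual BI-98 dictionary or the NLM mapping.

```python
# Illustrative sample mapping from local problem list terms to SNOMED CT codes.
BI98_TO_SNOMED = {
    "DIABETES MELLITUS TYPE 2": "44054006",
    "ESSENTIAL HYPERTENSION":   "59621000",
    "ASTHMA":                   "195967001",
}

def map_problem(local_term: str) -> dict:
    """Attach a SNOMED CT code to a local problem entry when a mapping exists."""
    code = BI98_TO_SNOMED.get(local_term.upper())
    return {"local_term": local_term, "snomed_ct": code, "mapped": code is not None}

def picker_suggestions(prefix: str, limit: int = 10) -> list:
    """Return coded suggestions for a typeahead picker as the clinician types."""
    prefix = prefix.upper()
    matches = [
        {"display": term.title(), "snomed_ct": code}
        for term, code in BI98_TO_SNOMED.items()
        if term.startswith(prefix)
    ]
    return matches[:limit]

print(map_problem("Asthma"))        # {'local_term': 'Asthma', 'snomed_ct': '195967001', 'mapped': True}
print(picker_suggestions("DIAB"))   # [{'display': 'Diabetes Mellitus Type 2', 'snomed_ct': '44054006'}]
```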
I look forward to your participation and feedback as we work together to improve the usefulness of data in EHRs and PHRs nationwide.
Tuesday, April 21, 2009
Apologizing with Candor and Grace
As readers of my blog know, I've adopted many aspects of Japanese lifestyle in my household - food, music, and clothing.
Learning to apologize is also something I've learned from the Japanese. You'll find a great description in the Etiquette Guide to Japan by Boye De Mente.
A typical corporate apology in Japan is accompanied by a low bow, a sincere apology, and a possible resignation.
Atoning for a mistake in the US does not require the loss of your job (or anything more extreme).
As I mentioned in yesterday's blog about being a public figure, bad things can happen. You may or may not be able to control them.
When bad things happen, here is the approach I use:
1. Encourage openness and transparency in your staff i.e. do not shoot the messenger. By empowering every person to communicate the events objectively, you'll get to the root cause more rapidly.
2. Ask what can be done to improve the organization rather than blaming any one individual. If an error occurs in medication administration, ask what systems and processes should be improved rather than fire people.
3. Broadly communicate the issue in terms of the lessons learned and continuous quality improvement. The Institute for Healthcare Improvement (IHI) espouses Plan, Do, Study, Act (PDSA). Many IT projects are cutting edge and require incremental fine tuning. We try, we evaluate, we revise, and we try again. Unintended negative consequences during the learning process require full disclosure and an apology.
4. Do not hide information or sugar coat the events. It is far worse to deny the truth, then have to explain the facts later. In a world of instant communication via email, IM, blogs, and Twitter, assume that everyone knows the facts as soon as they happen.
5. Openly discuss the events, their cause, the immediate corrective action taken and the long term changes made to prevent the issue from happening again. Declare that you've made a mistake and that you apologize for it. This may be painful and could result in a great deal of short term publicity, but it's better than a long term investigation and future disclosure of management misdeeds. Imagine what would have happened to Bill Clinton if he said "I did have an affair with that woman and it was wrong. I have taken short term steps to prevent any such incidents from happening again and I will seek counseling from religious mentors and mental health experts to ensure my future behavior is exemplary". The issue would have disappeared in a few weeks.
In my many years of leading change and making mistakes along the way, I've found that great communication, openness, candor, and admission of mistakes, followed by a sincere apology, result in healing the organization and bringing rapid closure to the issue.
Monday, April 20, 2009
The Challenge of Being a Public Figure
Although I'm not really a public figure, I do enough presentations in my roles at BIDMC, Harvard, NEHEN, and HITSP to appreciate the challenges of highly visible corporate and government public figures. Here are my top 10 observations:
1. There is no downtime
While on a plane, train, or in any public space, you cannot be freewheeling with your opinions. Your communications must be thoughtful regardless of venue. Emails must be written with the assumption they will appear in The New York Times. While going about the activities of day to day living, you must always be "on". I've had deep conversations about IT strategy and government policy at the Wellesley Dump.
2. You must be a good listener
Public figures are assumed to have power and there will be many opinions about how to best use that power. Employees, colleagues, and the blogosphere will offer continuous advice as to the best path forward. All of this input should be gathered and acknowledged. Since every action you take will be documented and scrutinized, it's important to incorporate multi-stakeholder input into your decision-making.
3. You must hold yourself to high standards.
Watching the confirmation activities as candidates have been vetted in the new administration, we know that you must be a tax expert, avoid hiring domestic help, and shun association with lobbyists. The good news for me is that my tax returns are simple, I've never had domestic help, and I rarely get out much, so I have few opportunities for any conflicts of interest with lobbyists or other nefarious characters. I married the first woman I dated in college and this year is our 25th wedding anniversary. There are no experiences in my life thus far that Dr. Phil or Jerry Springer would find interesting.
4. You cannot be too extreme in your views
The press has recently observed that some of Obama's bold proposals have been tempered by political reality.
Public figures listen to all sides of an issue then select a path forward that works for most people.
In a recent keynote I did with Senator Whitehouse (D-RI), he noted that politics is like topography - there are peaks and valleys of political issues. Some mountains, like single payer healthcare, cannot be climbed in the short term.
5. You rarely use formal authority
In many societies, policy can be made by benign dictators at an accelerated pace without debate. That's not the way policy is made in the US. Whether in institutions like Harvard University or in government, there is a process for everything. A leader can communicate a vision or assemble a guiding coalition, but rarely can a public figure just declare an action to be done by fiat.
6. It's more about responsibility than power
Public figures take responsibility for all the actions and events that take place in their sphere of influence. My experience has been that lofty positions come with huge responsibility but little power. Many public figures are like the Secretary-General of the UN - charged with communicating a vision, organizing people, and moving issues forward, but without significant power to orchestrate rapid change.
7. Your communications will be interpreted in ways you never intended.
In my own small world of healthcare IT, I find it interesting to read blogs, articles, and news stories which interpret my actions and comments. People will find support for their own views, will extend my opinions to meet their needs, or will create controversy where none exists. I'm always amused when I read headlines such as "Was HITSP work shift a political maneuver?" since politics never crossed my mind when I thought about transport standards and simple EHR data content exchange.
8. There will be good days and bad days
As I begin each day, I never know what press, email, and unexpected events will occur. Some days have a relaxed schedule but turn into a firestorm of communication about controversies I did not anticipate. No day ends without some measure of angry emails, hostile phone calls, and unresolved issues. Each day, I look at the trajectory and ask whether more issues moved forward than backward. On balance, if I feel that I've done everything possible to bring closure to my open issues, it's a good day.
9. You'll receive credit for things you did not do and blame for things you cannot control
Whenever I'm introduced at keynote addresses, my life summary sounds like I'm superhuman. The reality of being a public figure is that you'll get credit for many things done by people working for you or done by colleagues working with you. I constantly credit the team and institution with the accomplishments, not myself. Spreading the credit for success is easy, since "success has a thousand fathers". However, when bad things happen, it's expected that the public figure will accept responsibility, even if the events were not directly controllable. Apologizing with candor and grace will be the subject of another blog. It's an important skill to have.
10. You cannot make everyone happy
There are so many special interests in the world today that there is no such thing as a policy or idea that everyone will accept. A solution based on 90% consensus means that 10% will feel wronged and will oppose the path forward. The best a public figure can do is listen, facilitate, communicate, and then move forward with the optimal thinking at the time. Even while executing a well-orchestrated plan, there will be naysayers, continued debate, and controversy. The public figure should continue to listen, provide mid-course correction as needed, and support forward progress.
I've known many public figures in my career - Milton Friedman, Edward Teller, Condoleezza Rice. I have some sense of the energy they require(d) just to be themselves. Next time you're feeling angst for a public figure, take a moment to empathize with their challenges.
Friday, April 17, 2009
Lessons Learned from e-Patient Dave
I started the week with a blog about the Limitations of Administrative Data, so it's fitting to end the week with lessons learned and next steps.
e-Patient Dave, his doctor Danny Sands, Roni Zeiger from Google, and I spent many hours in online and phone conversation about the data elements in healthcare that are of greatest use to e-patients. Since the American Recovery and Reinvestment Act requires patients be given access to their electronic data, I have wanted to share all data with patients, both clinical and administrative. It's clear from our discussions that sharing billing data with patients is unreliable for clinical history, and it was a mistake to do that.
Administrative data is a coded summary of clinical care that lacks perfect specificity and time references, i.e., just because you had a diagnosis of low potassium 5 years ago does not imply it is a problem today.
Thus, we must be careful about what data we send to PHRs and how that data is presented to patients. Here's the action plan that Dave, Danny, Roni, and I developed to optimize the PHR experience for e-patients:
Problem List
This is useful clinical information as long as clinicians keep it current. Danny has done that with Dave's data, so it's Dave's best current source of relevant diagnoses and ongoing treatment.
Plan
1. Remove our ICD9 administrative data feed from Google so that the clinician's problem list is the only data which populates the Conditions area
2. Continue to improve our problem list functionality in webOMR so that it maps to SNOMED-CT, enabling Google and other PHR vendors to provide medical information and decision support based on a controlled vocabulary instead of just free text
3. Change the BIDMC Google Health Upload screen from "Diagnoses" to "Problem List"
Medication List
Name (with NDC coding), Dosage/Frequency, Prescription, and Date provide good "data liquidity" for active medications. We will continue to investigate the utility of sending inactive medications.
Allergy List
Name, reaction, and level of certainty of the reaction have worked well. However, Google Health does not display the detailed reaction information. We will either insert this information into the Google Allergy notes or work with Google to add a new field.
Procedures
We do not currently send procedures to Google Health, nor do they appear in PatientSite. However, Dave feels they may be useful to e-patients. We will add procedure name and date as a pilot (a sketch of these record structures follows).
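To make the data elements above concrete, here is a minimal sketch of the record shapes a PHR feed could carry. Field names and example values are illustrative assumptions, not BIDMC's or Google Health's actual schema.

```python
# Illustrative record structures for a PHR feed: medications with NDC coding,
# allergies with the detailed reaction retained, and procedures with name and date.
from dataclasses import dataclass
from datetime import date

@dataclass
class Medication:
    name: str
    ndc_code: str        # National Drug Code (placeholder value below)
    dose_frequency: str
    prescribed_on: date

@dataclass
class Allergy:
    name: str
    reaction: str        # keep the detail even if the PHR cannot yet display it
    certainty: str       # e.g., "confirmed" or "suspected"

@dataclass
class Procedure:
    name: str
    performed_on: date

feed = {
    "medications": [Medication("Atorvastatin 20 mg tablet", "00000-0000-00",
                               "1 tablet daily", date(2009, 3, 1))],
    "allergies":   [Allergy("Penicillin", "hives", "confirmed")],
    "procedures":  [Procedure("Colonoscopy", date(2008, 11, 12))],
}
print(feed)
```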
A great week of discussion with many lessons learned. We look forward to our ongoing work with e-patients, doctors, and Google.
Thursday, April 16, 2009
A Winter's Tale
Recently, my friend John Winship was caught in poor visibility, strong winds, freezing rain/snow and was missing for two days in New Hampshire's White Mountains. I asked him about his lessons learned and this is what he wrote (published with his permission). I think you'll find it meaningful and profound.
"Solo expedition to Mt. Rainier in May: Cancelled.
Solo expedition to Mt. Cook/NZ winter 2010: Cancelled.
Although I did not request a rescue, I know that the damage to my body would have been exponential with one more overnight. I have already made a gift to New Hampshire Fish and Game for double the estimated rescue cost. I have also commended them to Governor Lynch for their selflessness and heroism.
This had to happen. I was addicted. I have been pushing the envelope for two years with ever more audacious speed climbs. Last year I survived a slab avalanche on Washington and went on to summit on the same day. Even that experience wasn't enough to wake me up. Although I have promised my wife that I would never go higher than 18,000 feet, I know that at this rate I would have found myself on Everest or worse within three years.
So many armchair critics, and hindsight advice! I carried 90 pounds of gear up! One critic pointed out that snowshoes would have helped on Day 2, and I retorted that a kayak would have been ideal on Day 4! Another well-wisher asked why I did not have a phone and GPS. Apart from the unreliability of lithium above tree line, I pointed out that I might have been tempted, innumerable times on Day 2, to call for a rescue, thus needlessly endangering strangers, when clearly I had the power to get out on my own. That person then had the temerity to reply, "But that's their job." To which I said, "Dying for me is not their job."
The critics have a difficult time comprehending (1) the ethos of self-reliance inherent to the solo alpine style; (2) the calculus of risk, error, and severe consequences in our risk-adverse society; and (3) the fact that being "lost" has a novel, unfamiliar definition for alpinists. I was only "lost" for the three hour duration through which I had been executing an unworkable plan, because I was not where I thought I was. The problem for speed climbers is that "getting lost" usually means getting lost big. At my pace, I had passed a point of no return far too quickly. Once I fully comprehended my peril, I made several adjustments to plan, and made a severe attempt to get out of Dry River Valley (and nearly died in the attempt) before conceding defeat and deciding instead to mixed-climb down the river.
I have the solace of knowing that no one was hurt rescuing me from my blunder. I will continue to day-hike, once I can feel my feet again, but no more pushing the envelope. I have too much to live for."
"Solo expedition to Mt. Rainier in May: Cancelled.
Solo expedition to Mt. Cook/NZ winter 2010: Cancelled.
Although I did not request a rescue, I know that the damage to my body would have been exponential with one more overnight. I have already made a gift to New Hampshire Fish and Game for double the estimated rescue cost. I have also commended them to Governor Lynch for their selflessness and heroism.
This had to happen. I was addicted. I have been pushing the envelope for two years with ever more audacious speed climbs. Last year I survived a slab avalanche on Washington and went on to summit on the same day. Even that experience wasn't enough to wake me up. Although I have promised my wife that I would never go higher than 18,000 feet, I know that at this rate I would have found myself on Everest or worse within three years.
So many armchair critics, and hindsight advice! I carried 90 pounds of gear up! One critic pointed out that snowshoes would have helped on Day 2, and I retorted that a kayak would have been ideal on Day 4! Another well-wisher asked why I did not have a phone and GPS. Apart from the unreliability of lithium above tree line, I pointed out that I might have been tempted, innumerable times on Day 2, to call for a rescue, thus needlessly endangering strangers, when clearly I had the power to get out on my own. That person then had the temerity to reply, "But that's their job." To which I said, "Dying for me is not their job."
The critics have a difficult time comprehending (1) the ethos of self-reliance inherent to the solo alpine style; (2) the calculus of risk, error, and severe consequences in our risk-adverse society; and (3) the fact that being "lost" has a novel, unfamiliar definition for alpinists. I was only "lost" for the three hour duration through which I had been executing an unworkable plan, because I was not where I thought I was. The problem for speed climbers is that "getting lost" usually means getting lost big. At my pace, I had passed a point of no return far too quickly. Once I fully comprehended my peril, I made several adjustments to plan, and made a severe attempt to get out of Dry River Valley (and nearly died in the attempt) before conceding defeat and deciding instead to mixed-climb down the river.
I have the solace of knowing that no one was hurt rescuing me from my blunder. I will continue to day-hike, once I can feel my feet again, but no more pushing the envelope. I have too much to live for."
Wednesday, April 15, 2009
Combating Malware
Every day we're reading about new viruses, trojans, spyware and other malware on the internet. I was recently asked about the need to reinstall the operating system from scratch on a virus-infected machine. Here is the answer from the Security Officer at BIDMC:
Is there a valid technical reason for requiring a rebuild? The answer to this is yes. The thing to focus on here is the "Anti" in the title of Anti-Virus. These applications are intended to stop an infection. Most of them also include a cleaning component, and there are many products marketed solely as cleaning products - Spybot S&D is a good example. The problem with these products is that malware is constantly morphing. You see this often in the names of the malware; they will contain .a, .b, .c, etc. The longer the malware is out, the more variants there are. This means that the cleaning tools need to keep up as well. The fact of the matter is that they cannot. If a system has critical content on it and it appears to be compromised, the only way to ensure it is clean is to completely rebuild the system. The more sophisticated viruses will hide in the boot sector of a drive; others will replace O/S files with variants that contain the virus. The former will load on system startup and leave no tracks for the AV or file-cleaning applications to locate and clean. The latter will look like standard files and be skipped over. We also take the precaution of a system rebuild here at BIDMC when we have a system with clinical or privacy content on it that is believed to have been compromised.
On the discovery component - we are also seeing an uptick in Torpig and Mebroot. Torpig and Mebroot are of the same family - Sinowal. These are high-risk trojans, as their objective is to steal identity information - and they are good at it. These are of the type that embed themselves in the boot sector of the system. As I mentioned above, it is very difficult both to detect and to clean this type of trojan. I had a family member with this. As an exercise, I attempted to clean the boot sector rather than rebuild. I logged over 40 hours of labor on this effort with a wide range of tools - even down to using a disk sector editor to attempt to clean it - with no success.
There is no way to determine the original source of the infection without detailed examination of the system. But this system is used to browse the web, it has Google Desktop loaded, and it is running MS Office. The infection could be sourced from a web site that is believed to be good. We saw this on Boston.com not too long ago. These pages link to active advertising sites that are not in their control. Those advertising sites can and often do have malware in them. Google Desktop in itself is not an issue - but the features it provides link the system to sites in a more automated fashion, which increases the exposure of the system. Lastly, there is Windows itself. During the time between the discovery of a vulnerability and the release of a patch, all systems are vulnerable. In many cases the exposure time is lengthy. Keeping up with patches is critical but in itself does not ensure protection.
There are companies that offer system analysis. In general, you can expect to pay $350-$400 per hour from a quality service. For an 80 GB drive, you are looking at about 4 hours of time for a basic pass over the system. A more detailed analysis will take in excess of 10 hours. As an example, we are performing an analysis now on a system. The forensic copy of the disk to perform the analysis took 2 hours. The first pass analysis took an additional 6 hours. We are now starting the second pass, and that will be 10 to 12 hours. Our times are about 30% more than a commercial provider's due to the equipment we use. This is not a cheap process in money or time.
Due to the high risk that the Torpig and Mebroot trojans present, I would highly recommend completely rebuilding the system, ensuring that the boot sector is wiped and rewritten. I would then ensure, before the system goes back into use, that all Windows, Internet Explorer, and Office patches are applied.
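To make the boot-sector discussion above concrete, here is a minimal, hypothetical sketch (not a BIDMC or vendor tool) of one way to notice that a drive's master boot record has changed since a known-good baseline, by hashing its first 512 bytes. The device path and baseline file name are assumptions, it requires root on a Linux machine, and a detected change is only a hint - as the Security Officer notes, the reliable remedy is still a complete rebuild.

```python
# Illustrative sketch only: hash the first 512 bytes of a disk (the MBR) and
# compare against a baseline captured when the system was known to be clean.
# Requires root on Linux; /dev/sda and the baseline file name are assumptions.

import hashlib

DEVICE = "/dev/sda"                      # hypothetical first-disk device path
BASELINE_FILE = "mbr_baseline.sha256"    # hash saved while the system was clean

def mbr_sha256(device: str = DEVICE) -> str:
    """Return the SHA-256 of the first 512 bytes (the master boot record)."""
    with open(device, "rb") as disk:
        return hashlib.sha256(disk.read(512)).hexdigest()

def check_mbr() -> None:
    current = mbr_sha256()
    try:
        with open(BASELINE_FILE) as f:
            baseline = f.read().strip()
    except FileNotFoundError:
        with open(BASELINE_FILE, "w") as f:
            f.write(current)
        print("Baseline recorded; re-run later to compare.")
        return
    if current == baseline:
        print("MBR unchanged since baseline.")
    else:
        print("MBR differs from baseline - treat the system as compromised and rebuild.")

if __name__ == "__main__":
    check_mbr()
```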
Tuesday, April 14, 2009
A Followup on Wal-Mart's EHR effort
I've recently written about Wal-Mart's effort to reduce the cost of EHR acquisition. Many folks have asked for more details about how the product will be promoted and sold. Here's the answer, based on follow-up calls with the Wal-Mart folks running the project.
Sam's Club currently has "feet on the street" visiting with small business operators. In particular, there are individuals from the Health and Wellness business group that are currently calling on physician members. In the past, they would have spoken to physicians about the $4 Pharmacy program and office supplies. That same group will also be used to promote the EHR Solution. However, that is not enough to spread the word.
In partnering with eClinicalWorks and Dell, Sam's Club will be leveraging their "feet on the street" to get out the message. Sam's will rely on eClinicalWorks to communicate the technical details of how the software product works. ECW will do all the demos. Sam's is working with them to streamline that demo process. A first step is to do a web-demo which is relatively low cost.
Sam's is also beginning the process to partner with state governmental organizations and professional organizations to communicate the value of the program. Those partnerships will also spread the word.
Sam's will eventually create traveling demos so that physicians can be invited to clubs for a more interactive event targeted to their needs.
Per Sam's, all of this may not be enough. Depending on the pace of demand, they may also leverage existing infrastructure from other organizations to get the word out.
So, in summary, you'll not find a doctor at Sam's picking up a case of toilet paper and an EHR; you'll find one-on-one discussion, demonstrations, and the involvement of many professional groups. Just as with the rollout and support, Sam's seems to have really thought this through.
Monday, April 13, 2009
The Limitations of Administrative Data
The data issue described in the Boston Globe this morning is really unrelated to Google, Microsoft, or any Personal Health Record (PHR) provider. In the US, there are two kinds of healthcare data - clinical data in Electronic Health Records (EHRs) and administrative billing data that is used by payers, researchers, and the government. Billing data is imprecise, but it is a starting point to describe the care given by a doctor or hospital. The only thing that's new in 2009 is that PHRs now enable patients to see the kind of billing data that's been used for 20 years for reimbursement, quality measurement, and population health. Blue Cross of Massachusetts and Medicare (in a few pilot states) share billing data with patients via Google Health, so this is not just a BIDMC implementation.
As a society we're likely to see increased data transparency between patients and providers, which will lead to several improvements:
1. Doctors will likely begin using more structured problem lists based on SNOMED-CT, a standardized clinical vocabulary of symptoms and conditions. This will enable their EHRs to better share data with PHRs as well as to more accurately measure quality. The Healthcare Information Technology Standards Panel (HITSP) has harmonized the national standards needed to reduce the dependency on billing data for PHRs, quality measurement, and population health.
2. Eventually, billing data will become more detailed as ICD-10 replaces ICD-9 billing codes in 2013. It will take several years for ICD-10 to be widely adopted and improve data granularity.
3. In the future, patients and doctors will work together to ensure records are up to date and accurate. It's a shared responsibility. Now that the Stimulus Bill requires doctors to make records available electronically to patients, the limitations of billing data will become more widely understood.
In the meantime, BIDMC will take the following actions to accelerate this work:
1. I'm working with the National Library of Medicine to map the most common Problem List terms used at BIDMC to SNOMED-CT, enabling BIDMC to use a clinical vocabulary and not just a billing vocabulary (a small illustrative example of such a mapping appears after this list).
2. I'm working with Google to evaluate the impact of sending our existing free text problem lists instead of billing codes. It will reduce the number of features available to patients, since Google's educational materials are based on billing codes, but it may be more informative to patients to see the text their clinician wrote, not the diagnosis on the bill. Showing problem lists is what we've done in Patientsite for 10 years.
3. We'll hold a conference call with e-Patient Dave, his doctor, Google, and me to review Dave's clinical and administrative data (with his permission), to capture a real world example of the differences between these data sources.
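To make item 1 above concrete, here is a minimal illustrative sketch of mapping free-text problem list entries and ICD-9 billing codes to SNOMED-CT concepts. The two concepts shown are examples only; a real mapping would be driven by the National Library of Medicine's terminology resources rather than a hard-coded table.

```python
# Illustrative only: a tiny hard-coded mapping from free-text problem list
# entries and ICD-9 billing codes to SNOMED-CT concepts. A production mapping
# would come from NLM terminology services, not a table like this.

PROBLEM_TEXT_TO_SNOMED = {
    "type 2 diabetes":     ("44054006", "Diabetes mellitus type 2"),
    "high blood pressure": ("38341003", "Hypertensive disorder"),
}

ICD9_TO_SNOMED = {
    "250.00": ("44054006", "Diabetes mellitus type 2"),
    "401.9":  ("38341003", "Hypertensive disorder"),
}

def to_snomed(problem_text=None, icd9=None):
    """Prefer the clinician's own words; fall back to the billing code."""
    if problem_text and problem_text.lower() in PROBLEM_TEXT_TO_SNOMED:
        return PROBLEM_TEXT_TO_SNOMED[problem_text.lower()]
    if icd9 and icd9 in ICD9_TO_SNOMED:
        return ICD9_TO_SNOMED[icd9]
    return None  # unmapped - leave for human review

if __name__ == "__main__":
    print(to_snomed(problem_text="Type 2 Diabetes"))   # clinical vocabulary path
    print(to_snomed(icd9="401.9"))                      # billing vocabulary path
```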
What is Meaningful Use?
The definition of "Meaningful Use" in ARRA is one of the most critical decision points of the new administration's healthcare IT efforts. That definition will influence the types of products that will be implemented in clinician offices and the types of standards used for healthcare exchange to qualify for stimulus dollars.
For example, if meaningful use is defined as e-prescribing, then standalone products such as Dr. First's Rcopia could be used as part of a clinician's office compliance in lieu of a complete EHR.
If meaningful use is defined as the basics of ordering/viewing labs, then products like 4medica could constitute meaningful use.
If meaningful use requires sophisticated quality measurement, decision support, and workflow redesign to enhance efficiency, then a CCHIT certified comprehensive EHR may be required.
My prediction of meaningful use is that it will focus on quality and efficiency. It will require electronic exchange of quality measures, including process and outcome metrics. It will require coordination of care through the transmission of clinical summaries. It will require decision-support-driven medication management with comprehensive eRx implementation (eligibility, formulary, history, drug/drug interaction checking, routing, refills).
Each year, the definition of meaningful use will be expanded, setting the bar higher and requiring more features and more data exchange.
Thus, in the short term, meaningful use may be a combination of products or an EHR lite. However, over the longer term, a comprehensive EHR will be the best foundation for meaningful use.
The definition of "certified" is also important. Today, CCHIT includes those criteria that make an EHR capable of supporting an optimal set of functionality. If certification is redefined as a baseline set of functionality, then more basic EHR lites may meet the definition of "certified". If certification is based on the criteria as written today and the likely evolving criteria for usability and interoperability, then a comprehensive EHR will be the best foundation.
There are many stakeholders on both sides of this discussion. Small clinician offices with few resources want stand alone e-prescribing and lightweight EHRs to get them started on e-health. Hospitals, larger practices, population health experts, and researchers favor a more comprehensive EHR.
As background, here's the HIMSS strawman proposal for meaningful use.
The next few months will settle this question once and for all. I expect the first recommendations to come from the new HIT Policy Committee and possibly NCVHS, an existing FACA advising HHS. If you have an opinion about meaningful use, participating in any call for public comment will be the best opportunity to contribute it.
Friday, April 10, 2009
Cool Technology of the Week
Although Google does not generally share the details of its infrastructure, Google's hardware architect recently shared the secrets of its servers, data centers and power management.
Having been involved on the advisory council for Google Health, I know that Google runs hundreds of thousands of servers. What I did not know is that it designs and builds its own. The real innovation - each server has a 12 volt battery attached to the motherboard to keep the CPU running in case of power failure. Google does not use centralized uninterruptible power supplies. Building the backup power into each server means costs are matched directly to the number of servers. Google also uses the battery design on its network equipment.
Also interesting is that its data centers are built from standard metal shipping containers, each containing 1,160 servers and drawing 250 kilowatts of power. These shipping containers optimize power distribution, cooling and efficiency to reduce waste heat. For example, Google uses ultra-efficient power supplies that convert AC current to 12 volts DC. It's more efficient to transmit 12 volts over copper wires than 5 volts. All other power conversions take place on the motherboard.
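A quick back-of-the-envelope calculation, using only the figures quoted above, shows what that works out to per server:

```python
# Rough arithmetic based on the figures in this post (not Google documentation).
container_power_watts = 250_000   # 250 kilowatts per shipping container
servers_per_container = 1_160

watts_per_server = container_power_watts / servers_per_container
print(f"~{watts_per_server:.0f} watts per server")   # roughly 215 W, including its share of overhead
```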
Google servers, pictured above, are 3.5 inches thick and contain two processors, two hard drives, and 8 memory slots.
Incredible attention to detail to create highly scalable, very reliable, and maximally "green" server farms. That's cool!
Thursday, April 9, 2009
Vegan Dining in Chicago
While at HIMSS, you can munch on the usual gourmet convention center cuisine - hamburgers, cappuccino, pretzels, or jumbo dogs. Or you could walk a mile or two and enjoy vegan Chicago. Most folks think of Chicago as deep dish pizza, ribs, or steaks. However, there is a remarkable array of vegan-friendly restaurants and delis.
I walked from the Convention Center to Opera at 1301 S. Wabash to sample their vegan specialties. On Friday night I had a starter of a vegan corn fritter, followed by spicy shiitake wontons surrounding an asparagus salad, and then an entree not on the menu but made by the chef for me as an experimental future addition to the vegan menu - pressed tofu, Kung Pao style. A side of bok choy provided a cool counterpoint.
On Monday night, I dropped by for a reprise and had a starter of fresh vegetable Moo Shu in two spicy sauces followed by an entree of Mapo Tofu. Another remarkable meal.
Here are a few resources for Chicago Vegan dining:
The Chicago Diner
Vegan Living overview of dining in Chicago
EcoBusinessLinks overview of dining in Chicago
The Happy Cow overview of dining in Chicago
Next time you're at HIMSS, go for something vegan. It gives you the endurance to run through the HIMSS exhibits in record time!
Wednesday, April 8, 2009
The Data Elements of an EHR
I've recently been asked to provide a list of the data elements of an EHR which might be used as part of the ARRA mandate to exchange data as part of meaningful use. There are a nearly infinite number of actors, actions and events for data exchange, but in the interest of getting "data liquidity" in healthcare, here are the elements that are most commonly used and represent a great starting point for healthcare information exchange. I always strive for parsimony of standards - the fewest that we need for the purpose. Below you'll see that I've included the standards that support the systems we have in place today as well as the XML/Web-based standards that support newer web-centric systems and healthcare information exchanges.
Demographics
Content: HL7 2.x for messaging, CCD for document summaries
Vocabulary: HITSP Harmonized codesets for gender, marital status
Problem List
Content: HL7 2.x for messaging, CCD for document summaries
Vocabulary: SNOMED-CT
Medications
Content: NCPDP SCRIPT for messaging, CCD for document summaries
Vocabulary: RxNorm and Structured SIG
Allergies
Content: HL7 2.x for messaging, CCD for document summaries
Vocabulary: UNII for foods and substances, NDF-RT for medication class, RxNorm for Medications
Progress Notes and Other Narrative Documents (History and Physical, Operative Notes, Discharge Summary)
Content: HL7 2.x for messaging, CCD for document summaries
Vocabulary: CDA Templates
Departmental Reports (Pathology/Cytology, GI, Pulmonary, Cardiology etc.)
Content: HL7 2.x for messaging, CCD for document summaries
Vocabulary: SNOMED-CT
Laboratory Results
Content: HL7 2.x for messaging, CCD for document summaries
Vocabulary: LOINC for lab name, UCUM for units of measure, SNOMED-CT for test ordering reason
Microbiology
Content: HL7 2.x for messaging, CCD for document summaries
Vocabulary: LOINC for lab name/observation
Images
Content: DICOM
Administrative Transactions (Benefits/Eligibility, Referral/Authorization, Claims/Remittance)
Content: X12
Vocabulary: X12, CAQH CORE
Quality Measures
Content: Derived from all the data elements above
Vocabulary: Derived from all the data elements above
Privacy and Security
Transport: HTTPS, SOAP/REST
Transport Orchestration: WS*
Authorization/Access Control: XACML
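As a small illustration of how the list above translates into real messages, here is a sketch of an HL7 2.x lab result (the Laboratory Results entry) that codes the observation with LOINC and the units with UCUM. The facility names, patient, and control numbers are all made up; this is an example of the message format, not an interface specification.

```python
# Minimal sketch of an HL7 2.x ORU^R01 lab result using LOINC for the test
# and UCUM for the units. All identifiers below are fictitious examples.

from datetime import datetime

def build_lab_result_message() -> str:
    ts = datetime(2009, 4, 8, 8, 30).strftime("%Y%m%d%H%M")
    segments = [
        # MSH: message header (sending/receiving systems are hypothetical)
        f"MSH|^~\\&|LAB_SYS|EXAMPLE_HOSP|EHR_SYS|EXAMPLE_CLINIC|{ts}||ORU^R01|MSG0001|P|2.5.1",
        # PID: patient demographics for a fictitious patient
        "PID|1||123456^^^EXAMPLE_HOSP^MR||DOE^JANE||19700101|F",
        # OBR: the order this result fulfills, coded with LOINC
        f"OBR|1|ORD0001||2345-7^Glucose [Mass/volume] in Serum or Plasma^LN|||{ts}",
        # OBX: the result itself - LOINC code for the test, UCUM (mg/dL) for units
        "OBX|1|NM|2345-7^Glucose [Mass/volume] in Serum or Plasma^LN||95|mg/dL^^UCUM|70-99|N|||F",
    ]
    return "\r".join(segments)  # HL7 v2 segments are carriage-return delimited

if __name__ == "__main__":
    print(build_lab_result_message().replace("\r", "\n"))
```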
Given that meaningful use needs to be achieved by 2011-2012, it's clear that we cannot rip and replace existing hospital information systems and EHRs. We need to leverage them, upgrade them over time, and install new systems in an incremental fashion. This is really true of any change in healthcare. If we had a greenfield, we would design healthcare delivery, payment, and infrastructure entirely differently. Unfortunately, we do not have a greenfield; we have limited resources and limited time to achieve healthcare reform, so we need to leverage what we have and evolve it in phases.
HITSP will be reformatting and streamlining its previous work over the next 90 days to support ARRA, the HIT Standards and Policies Committees, and ONC. I hope you agree that the list of EHR data elements above is practical, achievable now, and reasonable.
Tuesday, April 7, 2009
Dispatch from HIMSS
Every year, I walk the floor of HIMSS and summarize the broad trends I see in the industry. Here are my 2009 observations:
1. The Stimulus - There's energy and optimism at HIMSS this year caused by the prospect of $34 billion of stimulus funding for healthcare IT. Since the current healthcare IT annual spend is somewhere between $15 billion and $25 billion, this could double the entire industry.
2. The Economy - the euphoria of the stimulus is tempered by the challenging economy - downsizing, wage cuts, and the collapse of 401k's are a stark contrast to the hope of new EHR rollouts.
3. Software as a Service - It's clear that all these new EHRs are not going to be hosted in the offices of rural solo practitioners. Web-based software as a service hosting centers for EHRs are being discussed by many companies - software producers, infrastructure providers and consulting companies. Many server, storage and virtualization suppliers are offering new products to support the Cloud Computing infrastructure needed to support Software as a Service hosting.
4. Security - with the prospect of every patient in the country having an interoperable EHR by 2014, there are increasing concerns about protecting confidentiality. Companies are creating new security tools, new consent management systems, and new audit reporting systems.
5. Open Source - Just as Linux has become mainstream in corporate data centers, open source EHR and HIE products are becoming more mainstream. A group of open source vendors met with CCHIT to discuss their role in the EHR ecosystem.
6. PHRs - with more EHRs comes the prospect of more PHRs to share electronic data with patients. Google announced its Medicare data sharing pilot. A really interesting question to be answered is the role of PHRs in the meaningful use of EHRs. Will EHR to PHR data sharing qualify for the interoperability requirements of meaningful use? To be determined.
7. Appliances for HIE - In previous years, folks exchanging data among stakeholders focused on content - shall we use HL7 2.x or 3.x, shall we use LOINC or SNOMED-CT? This year, the focus has been on infrastructure - how do we transport data securely from one stakeholder to another? A few companies are offering integration engines and health information exchange appliances to address this secure transport requirement. HHS released its CONNECT open source Nationwide Health Information Network gateway, built by 20 members of the Federal Health Architecture team.
8. Home Healthcare/Telemedicine - Continua Alliance, GE and Intel have all embraced remote monitoring and home care as one strategy to reduce healthcare costs while improving quality.
9. Performance Measurement and Outcomes - Tools for quality warehousing, business intelligence/reporting, and risk adjustment are being offered by many vendors.
10. Decision Support - As more EHRs are rolled out, we'll need decision support rules and services. Several companies offer order sets, knowledgebases, and decision support web services.
A good show with many innovative interoperability products, especially in the interoperability showcase. It was good to catch up with colleagues and vendors - 24,000 of my closest friends!
Monday, April 6, 2009
Saving the Boston Globe
I've joined a dozen other bloggers in posting this message simultaneously:
"We have all read recently about the threat of possible closure faced by the Boston Globe. A number of Boston-based bloggers who care about the continued existence of the Globe have banded together in conducting a blog rally. We are simultaneously posting this paragraph to solicit your ideas of steps the Globe could take to improve its financial picture.
"We view the Globe as an important community resource, and we think that lots of people in the region agree and might have creative ideas that might help in this situation. So, here's your chance. Please don't write with nasty comments and sarcasm: Use this forum for thoughtful and interesting steps you would recommend to the management that would improve readership, enhance the Globe's community presence, and make money. Who knows, someone here might come up with an idea that will work, or at least help. Thank you."
"We have all read recently about the threat of possible closure faced by the Boston Globe. A number of Boston-based bloggers who care about the continued existence of the Globe have banded together in conducting a blog rally. We are simultaneously posting this paragraph to solicit your ideas of steps the Globe could take to improve its financial picture.
"We view the Globe as an important community resource, and we think that lots of people in the region agree and might have creative ideas that might help in this situation. So, here's your chance. Please don't write with nasty comments and sarcasm: Use this forum for thoughtful and interesting steps you would recommend to the management that would improve readership, enhance the Globe's community presence, and make money. Who knows, someone here might come up with an idea that will work, or at least help. Thank you."
My Telepresence Experience
I returned to Boston from HIMSS to spend the day with my daughter on her 16th birthday. We had a great day cooking, hiking, and enjoying a fabulous Japanese meal as a family. However, I was also in Chicago for 2 hours, speaking with the press about the Stimulus Bill, Interoperability, and Decision Support.
The press gathered at the Cisco booth on the floor of HIMSS and I welcomed them to my basement via Telepresence. We chatted for an hour with full life-sized 1080p real-time video. Truly, there was no difference from sitting in a room together, other than the fact that my basement had a bubbling fish tank and 2 rabbits running around.
My Telepresence experience was different from traditional video conferencing in that it did not feel like a video conference. All the other units I've used in the past have had a small grainy picture with tinny sound, often displaying picture within a picture. Telepresence is just high definition video and audio with the feeling that you are in room with the other participants, not on a video conference.
All the eye contact, gestures, and common courtesy you'd use in an in-person meeting are natural in Telepresence. I changed my body position, my eye focus, and my voice direction as I spoke to various participants.
The technology worked perfectly over my home Verizon FIOS 20 megabit connection.
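A rough, illustrative calculation (not Cisco's actual numbers) shows why a 20 megabit connection is enough: uncompressed 1080p video is enormous, but H.264-class compression brings it down to a few megabits per second.

```python
# Order-of-magnitude illustration only; the compression ratio is an assumption.
width, height, bits_per_pixel, frames_per_second = 1920, 1080, 24, 30

raw_bps = width * height * bits_per_pixel * frames_per_second
print(f"Uncompressed 1080p30: ~{raw_bps / 1e9:.1f} Gbps")

assumed_compression_ratio = 300   # hypothetical, typical order of magnitude for H.264-class codecs
compressed_bps = raw_bps / assumed_compression_ratio
print(f"Compressed stream:    ~{compressed_bps / 1e6:.0f} Mbps - comfortably within a 20 Mbps link")
```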
I really enjoyed using the technology to connect with HIMSS while being in Boston with my daughter. At 4am I'm heading back to HIMSS for a day of meetings with many stakeholders. I'm working on becoming 100% virtual, but our culture is not quite ready for that!
Friday, April 3, 2009
The HIT Policy Committee Members
ARRA directed the Comptroller General to appoint 13 members to the HIT Policy Committee for terms of three years, although the members first appointed by the Comptroller General have staggered terms. An additional seven members will be appointed by the Secretary of Health and Human Services, the Majority and Minority leaders of the Senate, and the Speaker and Minority leader of the House of Representatives. The President can appoint other members as representatives of relevant federal agencies.
The 13 members the Acting Comptroller General has appointed across 10 different categories are:
Advocates for Patients or Consumers
1. Christine Bechtel, Washington, D.C. (3 year term)
Vice President, National Partnership for Women & Families
2. Arthur Davidson, M.D., Denver, Colorado (2 year term)
Denver Public Health Department; Director, Public Health Informatics; Director, Denver Center for Public Health Preparedness; Medical epidemiologist; Director, HIV/AIDS Surveillance, City and County of Denver
3. Adam Clark, Ph.D., Austin, Texas (1 year term)
Director of Research and Policy, Lance Armstrong Foundation
Representatives of Health Care Providers, including 1 physician
4. Marc Probst, Salt Lake City, Utah (3 year term)
Chief Information Officer, Intermountain Healthcare
5. Paul Tang, M.D., Mountain View, California (2 year term)
Vice President and Chief Medical Information Officer, Palo Alto Medical Foundation
Labor Organization Representing Health Care Workers
6. Scott White, New York City, New York (1 year term)
Assistant Director, Technology Project Director, 1199 SEIU Training and Employment Fund
Expert in Health Information Privacy & Security
7. LaTanya Sweeney, Ph.D., Pittsburgh, Pennsylvania (3 year term)
Director, Data Privacy Lab, Associate Professor of Computer Science, Technology and Policy, Carnegie Mellon University
Expert in Improving the Health of Vulnerable Populations
8. Neil Calman, M.D., New York City, New York (2 year term)
President and CEO, The Institute for Family Health, Inc.
Research Community
9. Connie Delaney, R.N., Ph.D., Minneapolis, Minnesota (1 year term)
Dean, School of Nursing, University of Minnesota
Representative of Health Plans or Other Third-Party Payers
10. Charles Kennedy, M.D., Camarillo, California (3 year term)
Vice President, Health Information Technology, Wellpoint, Inc.
Representative of Information Technology Vendors
11. Judith Faulkner, Verona, Wisconsin (2 year term)
Founder, CEO, President, Chairman of the Board, Epic Systems Corporation
Representative of Purchasers or Employers
12. David Lansky, Ph.D., San Francisco, California (1 year term)
President and CEO, Pacific Business Group on Health
Expert in Health Care Quality Measurement and Reporting
13. David Bates, M.D., Boston, Massachusetts (3 year term)
Medical Director for Clinical and Quality Analysis, Chief of General Internal Medicine, Partners HealthCare/Brigham & Women’s Hospital
All are great choices and I look forward to working with these folks. They are the new "Board of Directors" for Healthcare IT in the US!
Cool Technology of the Week
I'm flying to HIMSS today to present a keynote with Senator Whitehouse about Health Information Exchange.
I'll describe the work we've done in Massachusetts to create an appliance, using HITSP standards, that transports data for medication management, clinical summary exchange, administrative transactions, and quality reporting.
Over the past several years, I've been involved in many Healthcare Information Exchange projects for academic health centers, communities, and physician groups.
One interesting approach, and my cool technology of the week, is OpenHRE(tm), Free and Open Source Software (FOSS) created and supported by Browsersoft. The goals of the OpenHRE project are:
* to foster development, distribution and support of standard Record Locator, Health Record Exchange and Access Control services held as Free/Open Source Software
* to build a community to this aim
* to realize this goal via a self-sustaining business model and open collaboration among all stakeholders
The OpenHRE.org site is currently hosted as a collaboration among all interested parties, including OpenHRE Community Contributors. These folks have implemented live data exchanges serving rural, metropolitan, and state-level initiatives in Tehachapi, California; Franklin, Louisiana; San Antonio, Texas; and throughout the state of Kansas.
The architecture is simple and is based on the Markle Foundation’s Common Framework that was implemented in the 2004-2005 Nationwide Health Information Network prototype projects. OpenHRE offers an open source record locator service (which provides community master patient index services), basic content exchange services using HL7 2.x and Continuity of Care Document standards, and a web-based clinical data viewer.
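As a conceptual sketch of what a record locator service does (this is not the OpenHRE API, just an illustration of the Common Framework idea): the locator stores pointers to which participating organizations hold records for a matched patient, never the clinical data itself.

```python
# Conceptual sketch only - not OpenHRE code. A record locator service maps a
# matched patient to the organizations holding records; clinical data stays
# at the source and is fetched separately, subject to consent policies.

from dataclasses import dataclass

@dataclass(frozen=True)
class PatientKey:
    last_name: str
    first_name: str
    dob: str  # YYYY-MM-DD

# Hypothetical community index: demographics -> participating organizations
RECORD_LOCATOR_INDEX = {
    PatientKey("Doe", "Jane", "1970-01-01"): ["Community Hospital A", "Clinic B"],
}

def locate_records(key: PatientKey) -> list:
    """Return the organizations that report holding records for this patient."""
    return RECORD_LOCATOR_INDEX.get(key, [])

if __name__ == "__main__":
    print(locate_records(PatientKey("Doe", "Jane", "1970-01-01")))
```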
Importantly, they also provide a consent and information protection framework for easy control of clinical data flows. You'll find all the details in this manual provided by Gijs van Oort, from Healthcare Access of San Antonio, an OpenHRE Community Contributor.
They've developed these components, and their production exchanges, on small budgets using Linux, Java, and MySQL plus a great deal of volunteer time.
Their code is available at the OpenHRE site. Just click on "Downloads" in the Main Menu, and then click on "Software" to access the link to SourceForge. From this site you can download the software, submit to the support or developer forums, report bugs, and request new features.
An open source Health Information Exchange using standards, good policies, and a well thought out architecture.
That's cool!
Thursday, April 2, 2009
New Priorities for HITSP
Some of you may have seen the news alert from Modern Healthcare today, "HITSP suspends activity for 90 days," and the follow-up article at Modern Healthcare's website, "HITSP pauses use case work to focus on EHR stimulus requirements."
I've spoken with the author, Joe Conn, shared my PowerPoint presentation, and have been told that further clarification will appear in Modern Healthcare's Beyond the Headlines tomorrow.
The article notes that the HIT Standards Committee "is the apparent replacement of the current Healthcare Information Technology Standards Panel, which was created in 2005 as a private, not-for-profit organization but funded under a $3.3 million contract from HHS."
The HIT Standards Committee will hopefully be the evolution of NeHC. It is my hope and expectation that HITSP will now receive its priorities from the HIT Standards Committee, not be replaced by it.
I'd like to summarize my webinar message to the HITSP Panel today, which is about reprioritizing and accelerating our work, not suspending it.
The American Recovery and Reinvestment Act contains numerous technology and privacy provisions with aggressive timelines for completion.
Many of these ARRA milestones are related to standards and the work of the Healthcare Information Technology Standards Panel, including:
1. Technology to protect privacy and security
2. A nationwide health information infrastructure that supports exchange of health information
3. The use of a certified health record for each person in the US by 2014
4. Technologies to account for disclosures of health information
5. The use of certified electronic health records to improve the quality of health care, such as by promoting the coordination of health care and improving continuity of health care among health care providers, by reducing medical errors, by improving population health, by reducing health disparities, by reducing chronic disease, and by advancing research and education.
6. Technologies that allow individually identifiable health information to be rendered unusable, unreadable, or indecipherable to unauthorized individuals
7. The use of electronic systems to ensure the comprehensive collection of patient demographic data
8. Technologies that address the needs of children and other vulnerable populations
In order to meet these statutory requirements, HITSP must focus the energies of its volunteers, staff, and leadership on these areas for the next 90 days. This means that HITSP's products to date - 13 interoperability specifications - will be leveraged to create new, streamlined, electronically published standards guides organized around the ARRA EHR interoperability requirements. The end result will be much more compact, easy to implement, and flexible implementation guidance which supports the meaningful use of EHRs and the protection of privacy.
This focus on ARRA will result in a re-examination and adjustment of the current HITSP work schedule for the next 90 days. Some efforts (e.g., SSA, Interoperability Showcase, Quality Measures, NHIN and CCHIT coordination, etc.) are expected to continue on a non-interference basis.
We've circulated our early thoughts about the work ahead to our technical committee chairs and encouraged them to discuss scope, time, and resource needs with their committee members.
I look forward to the great work HITSP will do together over the next 90 days. It will be like running a marathon, but it will be worth it!
Infinite Growth in a Finite World
We're all aware of the Bernie Madoff Ponzi scheme. He used money from new investors to pay unreasonably high, consistent rates of return to his old investors.
Two recent articles by Thomas L. Friedman, The Inflection Is Near? and Mother Nature's Dow ask if we are engaged in a global Ponzi scheme of accelerating consumption and growth.
Most for-profit companies I've worked with as an advisor or Board member measure their success in quarter-over-quarter growth percentages.
In the economy of a bygone era, local businesses were considered successful when they made a high quality product, maintained the livelihood of a few employees, and built relationships with customers. There was a focus on service to the community rather than endless growth in value for shareholders.
Just as Bernie Madoff promised double digit returns, the US economy experienced rapid growth (likely unsustainable) via credit cards, speculation, and mortgaging our children's future.
If our resources are finite - we have a limited amount of fresh water, a fixed set of raw materials, and a cap to the population that can be sustained in the environment - infinite growth is not possible.
I believe that the era of "growth is good" is coming to an end. It is my hope that the era of quality, employee retention, customer satisfaction, and sustainability will replace it.
Call me old-fashioned, but does it make sense for hedge fund managers, venture capitalists, and option traders to make such high returns without really contributing to society? Do they create new ideas, innovative products, or value-added services? Or are they no better than sophisticated speculators in a global Ponzi scheme? The vast sums of money they make come from somewhere, and we're all paying the price now for their creation of derivative investments that were based on the notion that home prices and businesses would have infinite growth in a finite world.
As I approach 50, my view of the world and my own needs have changed significantly. In my 20's I measured success by the amount of stuff I owned, the size of my house, and the speed of my car. Now I measure my success by the amount of stuff I do not have, the smallness of my house, and the carbon footprint of my car.
If we all endeavor to focus on the quality of life, the sustainability of our environment, and the future of our children rather than endless growth, the world will be a better place.
Wednesday, April 1, 2009
The Hive Mind
Over the past few years, I've radically redesigned my approach to learning. In the past, I memorized information. Now, I need to be a knowledge navigator, not a repository of facts. I've delegated the management of facts to the "Hive Mind" of the internet. With Web 2.0, we're all publishers and authors. Every one of us can be instantly connected to the best experts, the most up-to-date news, and an exabyte-scale multimedia repository. However, much of the internet has no editor, so the Hive Mind information is probably only 80% factual - the challenge is that you do not know which 80%.
Here are a few examples of my recent use of the Hive Mind as my auxiliary brain.
I was listening to a 1970's oldies station and heard a few bars of a song. I did not remember the song name, album, or artist. I did remember the words "Logical", "Cynical", "Magical". Entering these into a search engine, I immediately retrieved the lyrics to Supertramp's "The Logical Song". With the Hive Mind, I can now flush all the fragments of song lyrics from my brain without fear.
My daughter asked me a question from her chemistry homework about calculating the mass of nitrogen gas gathered over water. I did remember the ideal gas law (PV=nRT), but I did not recall how to correct for the partial pressure of water using Dalton's Law. One quick search for "nitrogen collected over water" yielded sample problem sets from colleges that refreshed my memory with all I needed to know.
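For anyone who wants the worked calculation rather than the search result, here's a minimal sketch of the Dalton's Law correction using assumed sample values (250 mL of gas collected at 25 C and a barometric pressure of 760 torr); the only constant that needs looking up is the vapor pressure of water at 25 C, about 23.8 torr.

```java
// Hypothetical worked example of the "gas collected over water" correction.
// The input values below are assumptions chosen for illustration.
public class NitrogenOverWater {
    public static void main(String[] args) {
        double totalTorr = 760.0;      // barometric pressure, torr (assumed)
        double waterVaporTorr = 23.8;  // vapor pressure of water at 25 C, torr
        double volumeL = 0.250;        // collected volume, liters (assumed)
        double tempK = 298.15;         // 25.0 C expressed in kelvin
        double R = 0.082057;           // gas constant, L*atm/(mol*K)

        // Dalton's Law: P(N2) = P(total) - P(H2O vapor), converted to atm
        double nitrogenAtm = (totalTorr - waterVaporTorr) / 760.0;

        // Ideal gas law: n = PV / RT
        double moles = nitrogenAtm * volumeL / (R * tempK);

        // Molar mass of N2 is about 28.02 g/mol
        double grams = moles * 28.02;

        System.out.printf("P(N2) = %.4f atm, n = %.5f mol, mass = %.3f g%n",
                nitrogenAtm, moles, grams);
    }
}
```

The point of the correction is simply that the collected gas is a mixture, so you subtract the water vapor pressure from the total pressure to get the nitrogen's partial pressure before applying the ideal gas law.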
While writing, I'm constantly looking up words, concepts, maps, and dates. I know how to look for them and where to find them.
There are a few times when the Hive Mind yields surprising results. I wanted to learn more about the Stimulus Bill's "Healthcare IT Standards Committee". I wanted to check out the "ARRA privacy timeline". Finally, I was looking for information about the "healthcare CIO". All three of these searches returned my own writing as the first hit. The blessing and the curse of Web 2.0 is that blogs are the news and personal opinions can become facts.
At the moment I have a balanced separation between my own mind and the Hive Mind. However, as we Twitter, Facebook, and LinkedIn, I wonder if the separation between our human mind and our network mind will blur.
I remember an Outer Limits episode Stream of Consciousness (actually, I found it in Wikipedia by searching Google for "outer limits episode stream") in which everyone in society is connected to the "Stream" and shares a network-connected existence based on information, not knowledge. In the end, the Stream is destroyed and mankind has to re-learn how to think for itself.
As the closing dialogue of that episode notes:
"We make tools to extend our abilities, to further our reach, and fulfill our aspirations. But we must never let them define us. For if there is no difference between tool and maker, then who will be left to build the world?"
Words to live by as we use the Hive Mind of the internet.