Next Tuesday, my daughter turns 18. She becomes an adult with the ability to vote, take legal responsibility for her actions, and assert her own independence.
In some ways, my job as a parent is done. She has a good moral compass, feels good about herself, and is resilient. She knows when to ask for help and is open with us about her feelings, challenges, and goals. She's decided to skip much of adolescence and go directly from child to adult, bypassing most of the rebelliousness and occasional self-destructive behavior of teens.
She's learned to balance work and play, limit texting and use of electronic devices, and how to build and grow relationships. She has the tools she needs to navigate the next stage of life as she enters college at Tufts University this Fall.
What have I learned from our past 18 years together?
1. Create a non-punitive climate of trust. It's far better to encourage discussion of tough issues than to "shoot the messenger" and create a fear of communication.
2. Strike a balance between too much oversight and too little. In 4 months, she'll leave home and make decisions for herself. She'll decide what to eat and drink, who to spend her time with, and how to balance academics with leisure. Managing her every moment at home with strict oversight may produce short-term success but does not enable her to take ownership of the decisions she makes - good and bad. Providing no oversight can lead to risky and destructive behaviors. We've tried to set wide and reasonable limits, then give her free rein to run her life within those limits. She's learned from her mistakes and is a stronger, more self-reliant person because she had the freedom to choose her own path.
3. As with my professional life, I pay more attention to her trajectory than her position. Humans between 12 and 19 can have highly variable moods, rapidly changing ideas, and contrary behaviors. Reacting to every event day to day is likely to cause frustration on both sides. Chances are that today's troubling issue will be gone tomorrow or next week. Focus on the big picture, not the brushstrokes.
4. Strong negative emotions accomplish nothing. In the past 18 years, I can only remember a few times that I've raised my voice. Not only was it ineffective, but I also spent substantial time repairing the emotional damage done. The term I've used before is "Save as Draft". If you ever feel negative emotions and want to yell, Save as Draft. Have a thoughtful discussion and rethink your emotions based on winning the war, not the battle.
5. Family experiences last a lifetime. Although it may not be immediately clear that time spent together has a profound effect, I can see that my daughter will pursue activities throughout her life inspired by the things we've done together over the past 18 years. Her love of nature, mountains, Japan, gardening, and beaches all come from those hours we spent experiencing the world together.
Of course, she'll have triumphs and tribulations in college. She'll seek our advice and support when she needs it. We'll help her launch a family of her own and continue to share our 50 years of life lessons when they can aid her decision making.
In August, we become empty nesters. Just as we transitioned from the spontaneity of our 20's to the parental responsibilities of our 30's, we're now headed into our next phase.
Thank you Lara for the past 18 years. You've made me a better person and I am confident you'll fledge into a magnificent young woman.
Wednesday, March 30, 2011
The March HIT Standards Committee Meeting
The March HIT Standards Committee meeting focused on the Stage 2 Meaningful Use work ahead, the Direct Project, certificate management, provider directories, devices, and plans to ensure the certification process has the tools and scripts it needs to reduce the burden on vendors and self-certifiers.
Farzad Mostashari, Acting National Coordinator, began the meeting with a discussion of the trajectory we're on. We're guided by policy outcomes - improved quality, safety, and efficiency. The work in 2011 will include the regulatory effort to finalize Stage 2 of Meaningful Use and its accompanying standards and certification criteria. Efforts will be guided by the new Federal HIT Strategic Plan and will require the hardworking teams of volunteers that staff the Policy Committee, the Standards Committee, and numerous other working groups such as the Institute of Medicine Learning Healthcare System group, the PCAST workgroup, and the Privacy/Security Tiger Team. It will be a busy year.
Doug Fridsma reviewed the Timeline and Milestones for NPRM Stage 2, noting the NPRM draft needs to be completed in Q3 2011, the NPRM will be published Q4 2011, and the final rule will be published Q1 2012. This means that the work on the standards needs to be completed this Summer. Timing for Stage 2 is going to be tight because certification tests must be developed, vetted, and implemented by certification bodies so that EHRs have lead-time for software development, certification, upgrade/installation, and training by October 1, 2012. The work ahead is in 4 areas:
1. Vocabulary - we need to reduce the optionality and alternatives for vocabularies and code sets. Multiple vocabularies create significant complexity for vendors and users. We need a library of constrained code sets that comprise 95% of transactional volume and value, e.g., 300 LOINC codes are sufficient for 98% of all ordered lab tests.
2. Upgrade from paper to electronic data transmission - we need to increase the specificity of data transmission standards (e.g., Direct, Connect, and Exchange should support most point-to-point and query/response use cases). A major focus of standards work should be the content, vocabulary, and transmission standards needed for care transitions.
3. President's Council of Advisors on Science and Technology (PCAST) - we should implement pilots and experiments which will provide the foundation for the kinds of query/response transactions suggested by the PCAST report.
4. The Nationwide Health Information Network specifications should be upgraded to include content and transport standards needed for meaningful use.
Next, Arien Malec reviewed the real-world experience deploying Direct. Feedback has been generally positive, and lessons learned include issues of certificate management and challenges of integration into complex corporate email systems. The Standards Committee was very pleased with the progress and looks forward to the upcoming Best Practices and Compliance guides.
Jim Walker accepted the position of Chair of the Clinical Quality Workgroup. The Workgroup scope includes reducing barriers to implementation of quality measures, focusing on the cost/benefit of gathering the necessary data elements from the EHR including exclusionary criteria.
Dixie Baker presented the Privacy & Security Standards Workgroup Recommendations for Certificate Management. The committee accepted all three recommendations made by the Workgroup:
*Requirements and evaluation criteria for digital certificates
*Need for investigation of alternatives for cross-certifying digital certificate issuers with the Federal Bridge (important for interoperability between non-Federal and Federal organizations)
*Policy Questions for the HIT Policy Committee related to creating a trust fabric for health information exchange
Next, Walter Suarez presented the specifications for entity-level provider directories based on input from the HIT Policy Committee. Standards will be needed for:
*Directory Structure and Content
*Submission of directory entries to a National Registry via a publication/posting protocol
*A directory query language
On April 13, the Policy Committee will finish its requirements for Individual Level Provider Directories (ILPDs). By May, the Privacy and Security Workgroup will present its standards recommendations for both entity-level and individual-level provider directories. Although there is much work needed to harmonize standards for the entire directory ecosystem, in the short term the most important work is enabling an EHR to query a provider directory. That is likely to be a Stage 2 certification item.
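The entity-level directory standards themselves are still to be selected. Purely as an illustration - a minimal Python sketch with invented field names and a toy substring match standing in for a real directory query language - the three needs above might look something like this:

```python
# Hypothetical sketch of an entity-level provider directory (ELPD).
# Field names, the publication call, and the query interface are assumptions,
# not the standards the Committee will ultimately select.
from dataclasses import dataclass

@dataclass
class DirectoryEntry:
    organization_name: str   # e.g., a hospital or clinician office
    organization_type: str   # "hospital", "practice", "lab", ...
    exchange_address: str    # Direct address or service endpoint URI
    certificate_pem: str     # public certificate used to secure exchange

class EntityLevelDirectory:
    """Toy in-memory registry standing in for a national directory."""
    def __init__(self):
        self._entries: list[DirectoryEntry] = []

    def publish(self, entry: DirectoryEntry) -> None:
        # "Submission of directory entries via a publication/posting protocol"
        self._entries.append(entry)

    def query(self, name_contains: str) -> list[DirectoryEntry]:
        # "A directory query language" reduced to a substring match
        return [e for e in self._entries
                if name_contains.lower() in e.organization_name.lower()]

# Example: an EHR looks up where to send a care-transition summary.
directory = EntityLevelDirectory()
directory.publish(DirectoryEntry(
    organization_name="Example Community Hospital",
    organization_type="hospital",
    exchange_address="referrals@direct.examplehospital.org",
    certificate_pem="-----BEGIN CERTIFICATE-----...",
))
matches = directory.query("example community")
```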
Jamie Ferguson presented a summary of the March 28 Device Hearings. Major themes were:
*Consumers and patients need device interoperability to be inexpensive and easy
*The last mile of connectivity into home devices is currently a barrier
*EHRs need a place to collect device and patient submitted data
*Incentives need to be aligned so that clinicians are willing to accept and review device data
*Patient identity needs to be specified in the transactions between the device and the EHR/PHR
*Standards need to support end to end communication from the device to the EHR and not just to/from intermediaries/hubs/service providers.
*Device interoperability should be included in Stage 3, not Stage 2
Finally, Judy Murphy and Liz Johnson presented their plan for ensuring that future NIST scripts and supporting tools reduce the burden of certification, including:
*the need to pilot scripts
*the need to ensure medications and labs in the scripts are clinically reasonable
*the need to ensure certification criteria such as security requirements are easily testable
*clarifying how modules can be assembled, inheriting the certification characteristics of each
*ensuring the availability of tools needed to test every standards-based transaction
A great meeting. We're all committed to creating standards and certification criteria for Stage 2 that will address the needs of all our stakeholders.
Tuesday, March 29, 2011
The HIT Standards Committee Device Hearing
Yesterday, the Clinical Operations Workgroup of the HIT Standards Committee held an all-day hearing to identify barriers and enablers for device interoperability, including devices used in clinical settings and the home.
Here are the questions we asked.
We started with a Patient/Consumer Panel
Rob Havasy, Partners Center for Connected Health
Courtney Rees Lyles, MD, Group Health Cooperative
Robert Jarrin, co-chair Continua Health Alliance
We then continued with a Provider Panel
Julian Goldman, MD, Partners Healthcare
Scott Evans, Intermountain Healthcare
Jo Carol Hiatt, MD, Kaiser Permanente
Sara Toscano, RN, BSN, Clinical Informatics Coordinator & Coordinator for IntelliVue Clinical Information Portfolio, VA Maryland Health Care System, Veterans Health Administration
We then reviewed the current work on interoperability and data integration standards
Dale Wiggins, Chief Technology Officer, Philips Healthcare
Charles Jaffe, HL7
Elliot B. Sloane, PhD, Drexel University, IHE
Charles Parisot, GE Healthcare IT
John Garguilo, NIST
We reviewed the issues of Data Accuracy & Integrity
Tim Escher, Epic
John Zaleski, PhD, CPHIMS, Chief Technology Officer, Nuvon
Marlene Haddad, Office of Health Informatics Management, Veterans Health Administration
Karen Thomas, Advanced TeleHealth Solutions
We next heard from experts on Device Security
Patrick Heim, Kaiser Permanente, Healthcare Security Alliance
Todd Cooper, 80001 Experts, LLC
David Fisher, Medical and Imaging Technology Alliance
We had an important discussion of Unique Device Identification
Jay Crowley, Food and Drug Administration, HHS
Betsy Humphreys, National Library of Medicine
James P. Keller, Jr., ECRI Institute
Elliot B. Sloane, PhD, Drexel University, IHE
Michael Howells, Bosch Healthcare
A great day with many thoughtful presentations. I'll review the major themes tomorrow when I write about the March HIT Standards Committee meeting.
Monday, March 28, 2011
The Federal Health IT Strategic Plan
On Friday, March 25, ONC released the Federal Health IT Strategic Plan 2011-2015.
Here's an outline of the five goals and a bit of commentary:
Goal I: Achieve Adoption and Information Exchange through Meaningful Use of Health IT
A. Accelerate adoption of electronic health records
*Provide financial incentive payments for the adoption and meaningful use of certified EHR technology.
*Provide implementation support to health care providers to help them adopt, implement, and use certified EHR technology.
*Support the development of a trained workforce to implement and use health IT technologies.
*Encourage the inclusion of meaningful use in professional certification and medical education.
*Establish criteria and a process to certify EHR technology that can support meaningful use criteria.
*Communicate the value of EHRs and the benefits of achieving meaningful use.
*Align federal programs and services with the adoption and meaningful use of certified EHR technology.
*Work with private sector payers and provider groups to encourage providers to achieve meaningful use.
*Encourage and facilitate improved usability of EHR technology.
B. Facilitate information exchange to support meaningful use of electronic health records
*Foster business models that create health information exchange.
*Monitor health information exchange options and fill the gaps for providers that do not have viable options.
*Ensure that health information exchange takes place across individual exchange models, and advance health systems and data interoperability.
C. Support health IT adoption and information exchange for public health and populations with unique needs
*Ensure public health agencies are able to receive and share information with providers using certified EHR technology.
*Track health disparities and promote health IT that reduces them.
*Support health IT adoption and information exchange in long-term/post-acute, behavioral health, and emergency care settings.
Goal II: Improve Care, Improve Population Health, and Reduce Health Care Costs through the Use of Health IT
A. Support more sophisticated uses of EHRs and other health IT to improve health system performance
*Identify and implement best practices that use EHRs and other health IT to improve care, efficiency, and population health.
*Create administrative efficiencies to reduce cost and burden for providers, payers, and government health programs.
B. Better manage care, efficiency, and population health through EHR-generated reporting measures
*Identify specific measures that align with the National Health Care Quality Strategy and Plan.
*Establish standards, specifications, and certification criteria for collecting and reporting measures through certified EHR technology.
C. Demonstrate health IT-enabled reform of payment structures, clinical practices, and population health management
*Fund and administer demonstration communities to show how the advanced use of health IT can achieve measurable improvements in care, efficiency, and population health.
*Align health IT initiatives and clinical and payment reform pilots and demonstrations.
D. Support new approaches to the use of health IT in research, public and population health, and national health security
*Establish new approaches to and identify ways health IT can support national prevention, health promotion, public health, and national health security.
*Invest in health IT infrastructure to support the National Prevention and Health Promotion Strategy.
*Ensure a mechanism for information exchange in support of research and the translation of research findings back into clinical practice.
Goal III: Inspire Confidence and Trust in Health IT
A. Protect confidentiality, integrity, and availability of health information
*Promulgate appropriate and enforceable federal policies to protect the privacy and security of health information.
*Enforce existing federal privacy and security laws and maintain consistency with federal confidentiality policy.
*Encourage the incorporation of privacy and security functionality into health IT.
*Identify health IT system security vulnerabilities and develop strategic solutions.
*Identify health IT privacy and security requirements and best practices, and communicate them through health IT programs.
B. Inform individuals of their rights and increase transparency regarding the uses of protected health information
*Inform individuals about their privacy and security rights and how their information may be used and shared.
*Increase transparency regarding the development of policies and standards related to uses and sharing of protected health information.
*Require easy to understand reporting of breach notifications.
C. Improve safety and effectiveness of health IT
*Provide implementation and best practice tools for the effective use of health IT.
*Evaluate safety concerns and update approach to health IT safety.
*Monitor patient safety issues related to health IT and address concerns.
Goal IV: Empower Individuals with Health IT to Improve their Health and the Health Care System
A. Engage individuals with health IT
*Listen to individuals and implement health IT policies and programs to meet their interests.
*Communicate with individuals openly and spread messages through existing communication networks and dialogues.
B. Accelerate individual and caregiver access to their electronic health information in a format they can use and reuse
*Through Medicare and Medicaid EHR Incentive Programs, encourage providers to give patients access to their health information in an electronic format.
*Through federal agencies that deliver or pay for health care, act as a model for sharing information with individuals and make available tools to do so.
*Establish public policies that foster individual and caregiver access to their health information while protecting privacy and security.
C. Integrate patient-generated health information and consumer health IT with clinical applications to support patient-centered care
*Support the development of standards and tools that make EHR technology capable of interacting with consumer health IT and build these requirements for the use of standards and tools into EHR certification.
*Solicit and integrate patient-generated health information into EHRs and quality measurements.
*Encourage the use of consumer health IT to move toward patient-centered care.
Goal V: Achieve Rapid Learning and Technological Advancement
A. Lead the creation of a learning health system to support quality, research, and public/population health
*Establish an initial group of learning health system participants.
*Develop standards, policies, and technologies to connect individual participants within the learning health system.
*Engage patients, providers, researchers, and institutions to exchange information through the learning health system.
*Grow the learning health system by adding more members and expanding policies and standards as needed.
B. Broaden the capacity of health IT through innovation and research
*Liberate health data to enable health IT innovation.
*Make targeted investments in health IT research.
*Employ government programs and services as test beds for innovative health IT.
*Monitor and promote industry innovation.
*Provide clear direction to the health IT industry regarding government roles and policies for protecting individuals while not stifling innovation.
As I read the report, there were a few paragraphs I found particularly interesting. I believe they suggest important directions to watch:
*Stages two and three are anticipated to transition gradually away from further process requirements like those included in stage one, to requirements for improvement in outcomes and quality of care.
*For providers ineligible for incentive payments (for example, long-term and post-acute care facilities, community mental health centers, or substance use disorder treatment providers), the government is developing technology and policy solutions that build on meaningful use and fit their unique needs.
*The health information exchange strategy in Goal I focuses on first fostering exchange that is already happening today, supporting exchange where it is not taking place, and creating means for exchange between local initiatives.
*Goal IV recognizes the importance of engaging and empowering individuals with electronic health information in order to move to patient-centered care, and proposes strategies for doing so.
*Goal V includes a clear vision and path forward for building a "learning health system" that will become increasingly prominent over the next several years.
Of note, ICD-10 is mentioned only once (page 24) and X12 5010 is not mentioned at all. The justification for work on ICD-10 is "ICD-10-CM/PCS code sets will enable a more granular understanding of health care treatments and outcomes, and more complete analyses of treatment costs, ultimately allowing for better disease management and more efficient health care delivery." My personal opinion is that we should defer the work on ICD-10 while we're navigating meaningful use stages 1, 2, and 3. Accurate coding requires comprehensive clinical documentation on the front end, including adoption of clinical vocabularies such as SNOMED-CT. Let's enhance our front-end documentation before thinking about the back-end coding.
Overall the Federal Strategic plan is a winner - it melds Meaningful Use, Certification, Health Information Exchange, PCAST, and the Institute of Medicine work on creating a learning healthcare system.
The next National Coordinator will have the benefit of a great strategic plan - David Blumenthal's parting gift.
Friday, March 25, 2011
Cool Technology of the Week
As a Prius driver since 2005, I've closely watched the evolution of hybrid vehicles. The FY10 Prius included an optional solar-powered cooling system. The FY11 Prius offers a plug-in option to charge the batteries from household current overnight.
This week, Google installed a wireless induction unit at its Mountain View headquarters to charge a specially equipped Prius.
The charging system is a prototype product from Plugless Power which works on the principle of electromagnetic induction. A coil in the charging station is connected to an electrical source and another coil is placed in the Prius. Electric current flowing through a primary coil creates a magnetic field that acts on the secondary coil producing a current within it, charging the Prius battery.
A Prius driver parks the car near a charging station. A paddle on the charging station moves to align the two coils and charging begins. The end result is a fully charged battery without wires.
A zero emission vehicle with automatic charging in your parking space.
That's cool!
Thursday, March 24, 2011
What is Leadership?
Thousands of books have been written about leadership. I've posted many blog entries about my leadership lessons learned as a CIO. As I mature (I turn 50 next year), my view of leadership has become increasingly clear. Here's what I look for in a leader (and what I aspire to do myself):
1. Guidance - A consistent vision that everyone can understand and support.
2. Priority Setting - A sense of urgency that sets clear mandates for what to do and, importantly, what not to do.
3. Sponsorship - "Air Cover" when a project runs into difficulty. Communication with the Board, Senior Leadership, and the general organization as needed.
4. Resources - A commitment to provide staff, operating budget, and capital to ensure project success.
5. Dispute resolution - Mediation when stakeholders cannot agree how or when to do a project.
6. Decision making - Active listening and participation when tough decisions need to be made.
7. Compassion - Empathy for the people involved in change management challenges.
8. Support - Trust for the managers overseeing work and respect for the plans they produce that balance stress creation and relief.
9. Responsiveness - Availability via email, phone, or in person when issues need to be escalated.
10. Equanimity - Emotional evenness that is highly predictable no matter what happens day to day.
When my daughter asks me what I do every day, I tell her that I provide guidance and priority setting for my staff, resolve disputes, and continuously communicate. Of the 10 items above, the Resource part is the only item I cannot personally control, since organizational processes beyond my pay grade set budgets (which always seem to mismatch supply with demand - it's a curse of IT.)
When I think about the best times in my own career, the real breakthroughs occurred when leaders created a sense of urgency, provided resources, and broadly communicated. These circumstances led to such innovations as the Mycourses educational portal at Harvard Medical School, the widespread adoption of Provider Order Entry in CareGroup hospitals, and implementation of the BIDMC disaster recovery data center.
Healthcare reform will give us all many opportunities for leadership. We'll have increasing Volatility, Uncertainty, Complexity, and Ambiguity ahead, and by embracing the 10 characteristics above, I'm confident we will succeed.
Maybe we should add "optimism" as the 11th characteristic of leadership. Colin Powell says that optimism is a force multiplier. When workload seems overwhelming, budgets look bleak, a complex project struggles toward completion, or a key staff member departs, a leader will buoy morale by offering words of encouragement that inspire optimism. When I think of the great leaders in history, optimism in the face of seemingly impossible odds (Winston Churchill at the Battle of Britain, FDR in the Great Depression, John Kennedy during the race to the moon) has made it possible for people and nations to accomplish things never believed possible.
Wednesday, March 23, 2011
The PCAST Use Cases
As I posted yesterday, the PCAST Workgroup has discussed use cases which correspond to three levels of healthcare information exchange supported by a Universal Exchange Language (UEL) and Data Element Access Service (DEAS) - "push by patient of data between two points", "simple search for data", and "complex search for data". They are intended to support PHR and EHR health information exchanges for a multitude of uses, including clinical care, population health, and clinical research. A fourth use case incorporates de-identified data.
Use Case 1 - Push by patient between two points.
The patient logs into a tethered PHR via username/password or another authentication mechanism provided by the clinical organization hosting the data. The patient chooses to push the data to the non-tethered PHR of their choice. Many possible architectures and approaches can support this, including download from the tethered PHR followed by upload to the un-tethered PHR, a push directly from the tethered PHR to the un-tethered PHR (as Google Health and Microsoft HealthVault support today), or the use of secure email from the tethered PHR to the un-tethered PHR using the Direct standards via a secure health email address. In each case, the data is sent wrapped in a UEL envelope containing patient identity, provenance, and privacy metadata. UEL metadata might also include non-disclosing information about the categories of health data available in the content package, e.g., medication list, problem list, allergy list, labs, and radiology images.
When the UEL arrives at the non-tethered PHR, the data is shown to the patient, who can elect to incorporate structured and unstructured data into their existing un-tethered PHR dataset. The patient can then choose to share PHR data with clinicians, clinical researchers, or public health by pushing selective PHR data wrapped in a UEL envelope via secure transmission (such as Direct) to recipients of their choice. Organizational certificates are needed for the senders (the un-tethered PHR hosting organization) and the recipients (clinician offices, clinical research organizations, public health organizations). Audit trails are held by senders, recipients, and any Health Information Service Providers used as part of Direct transport. Patient authentication is username/password as required by the PHRs. Provider authentication is username/password or another modality as required by the EHR.
Summarizing the infrastructure for this approach, we will need:
*A UEL that includes patient identity, provenance, privacy metadata, and categories of health data available in the content package. There will need to be semantic standards for this metadata, including the content/vocabulary of identity, provenance, privacy metadata, and categories of health data.
*Applications which are capable of wrapping content packages of clinical data in the UEL
*Applications which are capable of receiving the UEL and unwrapping content packages
*Certificate management to secure the endpoints and support privacy controls
*Policies that support push of data between two points.
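PCAST did not prescribe a concrete syntax for the UEL, so the following is purely illustrative: a minimal Python sketch, assuming a JSON envelope and invented field names, of how a content package might be wrapped with identity, provenance, privacy metadata, and non-disclosing category tags before a patient-initiated push.

```python
# Illustrative only: PCAST does not define a concrete UEL syntax.
# The JSON structure and field names below are assumptions for discussion.
import json
from datetime import datetime, timezone

def wrap_in_uel(content_package: bytes, patient_id: str, source_org: str,
                privacy_policy: str, categories: list[str]) -> str:
    """Wrap a content package in a UEL-style envelope with identity,
    provenance, privacy metadata, and non-disclosing category tags."""
    envelope = {
        "patient_identity": {"id": patient_id},          # who the data is about
        "provenance": {                                   # where and when it came from
            "source_organization": source_org,
            "created_at": datetime.now(timezone.utc).isoformat(),
        },
        "privacy_metadata": {"policy": privacy_policy},   # e.g., "patient-directed-push"
        "categories": categories,                         # non-disclosing: types only, no values
        "content_package": content_package.hex(),         # opaque payload (CCD, images, etc.)
    }
    return json.dumps(envelope)

# Example: a patient pushes a medication list and problem list to an un-tethered PHR.
uel_message = wrap_in_uel(
    content_package=b"<ClinicalDocument>...</ClinicalDocument>",
    patient_id="patient-12345",
    source_org="Tethered PHR at Example Hospital",
    privacy_policy="patient-directed-push",
    categories=["medication list", "problem list"],
)
```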
Use Case 2 - Simple Search
A patient presents to an Emergency Department and notes their records are stored at a specific clinician office and a specific hospital. An Emergency Physician obtains patient consent to retrieve their records. A query is created that includes patient identity, consent information, and provider authentication data. A Data Element Access Service, which serves as an entity-level provider directory, is securely queried to determine the Uniform Resource Identifiers (URIs) of the clinician office and hospital. The query is sent to the URIs, which return a UEL wrapper containing identity information, provenance, patient privacy metadata based on any consents on file at the organizations hosting patient records, and non-disclosing information about the categories of health data available in the content package. The content package inside the UEL includes numerous appropriate vocabularies. The receiving clinician can choose to incorporate structured and unstructured data into the Emergency Department record. All exchanges are query/response. Organizational certificates are needed for the Emergency Department, the clinician office, and the hospital. Audit trails are held by all these organizations. Provider authentication is username/password or another modality as required by the ED information system or national policy.
Summarizing the infrastructure for this approach, in addition to the infrastructure of Use Case 1, we will need:
*Policy for issuing queries to organizations hosting patient records
*A DEAS that includes entity level provider directory information to provide the URIs of provider data sources
*The syntax and semantics of a query for clinical data including identity information that is sent to provider organizations hosting patient information
*Applications which are capable of issuing a query to known URIs
*An approach to disambiguate identity conflicts if the query results in multiple patient matches
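As a rough illustration of this flow (the query fields, the DEAS lookup interface, and the record-source call below are all assumptions, not standards), an ED system might resolve the organizations the patient named to URIs and query each one like this:

```python
# Illustrative sketch of the Use Case 2 flow; nothing here is a defined standard.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ClinicalQuery:
    patient_name: str
    patient_dob: str      # ISO date, e.g., "1990-05-01"
    consent: str          # e.g., "patient-authorized"
    provider_id: str      # the authenticated ED clinician
    provider_role: str    # e.g., "emergency-physician"

def query_record_source(uri: str, query: ClinicalQuery) -> Optional[dict]:
    """Stand-in for a secure query/response exchange with the organization at `uri`.
    A real source would check consents on file and return a UEL-wrapped package."""
    return {"source": uri, "categories": ["problem list", "medication list"]}

def simple_search(deas_lookup: Callable[[str], Optional[str]],
                  named_sources: list[str],
                  query: ClinicalQuery) -> list[dict]:
    """Resolve the organizations the patient named to URIs via the DEAS
    (acting as an entity-level provider directory), then query each URI."""
    results = []
    for org_name in named_sources:
        uri = deas_lookup(org_name)   # DEAS returns the organization's endpoint URI
        if uri is None:
            continue
        response = query_record_source(uri, query)
        if response is not None:
            results.append(response)
    return results

# Example: the patient named her clinician office and a hospital.
deas = {"Example Clinician Office": "https://records.exampleoffice.org/query",
        "Example Hospital": "https://hie.examplehospital.org/query"}
packages = simple_search(deas.get, list(deas), ClinicalQuery(
    patient_name="Jane Doe", patient_dob="1990-05-01",
    consent="patient-authorized", provider_id="ed-clinician-42",
    provider_role="emergency-physician"))
```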
Use Case 3 - Complex Search
A patient presents to an Emergency Department and is non-responsive. However, her wallet contains an ID with name and date of birth. An Emergency Physician, based on policy which grants implied consent for unconscious patients, clicks the external search icon in their EHR. The EHR creates a query containing patient identity, implied consent information, and provider authentication and role, then sends it to a Data Element Access Service. The DEAS returns a list of Uniform Resource Identifiers of the organizations which hold the patient's records. The Emergency Physician's EHR sends a query containing patient identity, consent information, and provider authentication and role to each of the URIs, with a request for problems, medications, or allergies. Each organization returns as many UEL-wrapped data packages as match the query and pass the conditions of the patient privacy metadata based on any consents they have on file. Each UEL-wrapped package includes identity, provenance, and privacy metadata and non-disclosing information about the categories of health data available in the content package. The content package inside the UELs includes numerous appropriate vocabularies. The receiving EHR filters and organizes the information for the clinician, who can choose to incorporate structured and unstructured data into the local Emergency Department record. All exchanges are query/response. Organizational certificates are needed for the Emergency Department, the DEAS provider, and the organizations which contain patient records. Audit trails are held by all these organizations. Provider authentication is username/password or another modality as required by the ED information system or national policy.
Summarizing the infrastructure for this approach, in addition to the infrastructure of Use Case 2, we will need:
*Policy for issuing a query to the DEAS
*A DEAS which contains patient identity information, provider URIs and potentially more granular information about the types of data available at those URIs
*The syntax and semantics of a query including identity information that is sent to the DEAS.
*Applications which are capable of querying a DEAS and then querying URIs of provider data sources specified by the DEAS, assembling the data returned into a meaningful display
*Support for privacy metadata that are returned by the DEAS and provider data sources
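Building on the previous sketch (again with an assumed DEAS interface and invented metadata fields), the receiving EHR's job in the complex search is to locate record holders through the DEAS, gather the UEL-wrapped packages each returns, and filter and organize them before showing the clinician:

```python
# Illustrative sketch of the Use Case 3 ("complex search") flow.
# The DEAS interface and the metadata field names are assumptions.
from typing import Callable

def complex_search(deas_locate: Callable[[dict], list[str]],
                   fetch_packages: Callable[[str, dict], list[dict]],
                   patient_identity: dict,
                   requested_categories: set[str]) -> list[dict]:
    """Ask the DEAS which organizations hold records for this patient,
    query each returned URI, and keep only packages whose non-disclosing
    category tags match what the clinician asked for."""
    query_context = {
        "patient_identity": patient_identity,   # e.g., name and date of birth from an ID card
        "consent": "implied-emergency",         # policy-granted consent for unresponsive patients
        "provider_role": "emergency-physician",
    }
    uris = deas_locate(query_context)           # DEAS returns URIs of record holders
    organized: list[dict] = []
    for uri in uris:
        for package in fetch_packages(uri, query_context):
            # Filter on the envelope's category metadata before display.
            if requested_categories & set(package.get("categories", [])):
                organized.append(package)
    # Organize for the clinician (here: newest provenance first).
    organized.sort(key=lambda p: p.get("provenance", {}).get("created_at", ""), reverse=True)
    return organized

# Example with stub services: one record holder returning a medication list.
locate = lambda ctx: ["https://hie.examplehospital.org/query"]
fetch = lambda uri, ctx: [{"categories": ["medication list"],
                           "provenance": {"created_at": "2011-03-20T12:00:00Z"},
                           "content_package": "..."}]
view = complex_search(locate, fetch, {"name": "Jane Doe", "dob": "1985-02-03"},
                      {"problem list", "medication list", "allergy list"})
```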
Interoperation among Use Cases 1-3
The Use Cases and the Levels of Exchange are not mutually exclusive. If all three are supported, the patient in Use Case 1 can use the simple search of Use Case 2 to query for the URI of a provider they would like to push their information to, and the complex search of Use Case 3 to expose a UEL-wrapped subset of their PHR to the DEAS, tagged with a privacy tag indicating their desire that it be made available to someone giving them care and a provenance tag indicating that they had edited it.
Use Case 4 - De-identified aggregate data mining
A researcher wants to retrieve de-identified mammograms to investigate a new technology that provides computer assisted interpretation. The researcher issues a query to the DEAS requesting de-identified mammograms that are reusable for research based on patient consent. A list of URIs is returned including pointers to mammograms. The researcher queries the URIs and receives de-identified mammograms.
Summarizing the infrastructure for this approach, we will need:
*Policy for issuing research queries to the DEAS
*A DEAS which supports de-identified queries for a specific type of data
*The syntax and semantics of a query including data type information that is sent to the DEAS.
*Provider data sources that are capable of returning de-identified data
*An application that can query a DEAS and query provider data sources
*Support for privacy metadata that include consent to release data for research and ensure de-identification
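To illustrate the data-source side of this use case (the field names and the consent tag are assumptions, not defined policy), a provider system might release only consented packages of the requested type, with identifying metadata stripped:

```python
# Illustrative sketch of Use Case 4: releasing de-identified data for research.
# Field names and the "research-reuse" consent tag are assumptions for discussion.

def release_for_research(stored_packages: list[dict], data_type: str) -> list[dict]:
    """Return only packages of the requested type that carry research consent,
    with patient identity and fine-grained provenance stripped."""
    released = []
    for package in stored_packages:
        if package.get("data_type") != data_type:
            continue
        consents = package.get("privacy_metadata", {}).get("consents", [])
        if "research-reuse" not in consents:
            continue
        released.append({
            "data_type": package["data_type"],
            "content_package": package["content_package"],  # e.g., the mammogram itself
            # patient_identity and detailed provenance deliberately omitted
        })
    return released

# Example: a source holding two mammograms releases only the consented one.
stored = [
    {"data_type": "mammogram", "content_package": "img-001",
     "privacy_metadata": {"consents": ["research-reuse"]},
     "patient_identity": {"id": "patient-12345"}},
    {"data_type": "mammogram", "content_package": "img-002",
     "privacy_metadata": {"consents": []},
     "patient_identity": {"id": "patient-67890"}},
]
research_set = release_for_research(stored, "mammogram")  # one de-identified package
```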
The combination of these use cases and the security model described in yesterday's post provides a clear path forward that enables pilots and research to be done in parallel with the meaningful use activities already in progress.
As I think about the exciting years ahead - a PCAST-inspired expansion of health information exchange, meaningful use stages 2 and 3, and healthcare reform - I am concerned that doing ICD-10 in the middle of all these other activities will overwhelm healthcare systems and IT organizations. My thoughts on rebalancing and aligning all of the projects in front of us will be a blog post for next week.
Tuesday, March 22, 2011
The PCAST Security Model
As the PCAST Workgroup produces its report suggesting implementation options to ONC, the team members have worked hard to communicate the principles embodied in the report so that the group can make informed recommendations.
Dixie Baker and Carl Gunter have produced one of the most useful visual aids - an overview of the security model described in the PCAST report.
It illustrates 10 steps for secure, audited, universal data exchange:
1. A user authenticates him/herself to the local system, within the context of an authorized role.
2. On behalf of the authenticated and authorized user and role, the local server sends metadata search parameters to the Data Element Access Service (DEAS).
3. The DEAS mediates the request and searches metadata as permitted by privacy metadata within the context of the authorized role sent with the query. The transaction is recorded in the audit trail.
4. The DEAS returns a data locator list (Uniform Resource Identifiers) resulting from the metadata search.
5. The local server requests data from the URIs returned by the DEAS.
6. The data storage location server mediates the request and returns encrypted records to the local server.
7. On behalf of the authenticated and authorized user and role, the local server sends the DEAS a request for the encryption key for each data element provided.
8. The DEAS mediates the request and retrieves the key as authorized from the key management service. The transaction is recorded in the audit trail.
9. The DEAS returns the key to the local server, along with digitally signed privacy preferences.
10. After use, the system destroys the cleartext and the key, possibly retaining the ciphertext.
This 10 step process illustrates "Separation of Concerns" - the idea that privacy is protected by isolating components and the responsibility for managing data among separate entities and infrastructure. In this case, the data, the metadata, and the keys to access the data are kept in separate places, minimizing the risk of a privacy breach of any one component of the architecture.
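To see how little the local system needs to know about any one component, here's a minimal sketch in Python of steps 2 through 10, using the symmetric encryption primitives in the "cryptography" package. The URLs, payload shapes, and the choice of cipher are my own assumptions for illustration; the PCAST report does not prescribe them.

# Illustrative sketch of steps 2-10: search the DEAS, fetch ciphertext from the
# provider data source, request the per-element key, then decrypt locally.
# All endpoints and payload shapes are hypothetical.
import requests
from cryptography.fernet import Fernet     # symmetric decryption

def fetch_elements(deas_url, search_params, auth_headers):
    # Steps 2-4: metadata search; the DEAS returns a list of data locators (URIs).
    uris = requests.get(deas_url + "/search", params=search_params,
                        headers=auth_headers, timeout=30).json()["uris"]
    records = []
    for uri in uris:
        # Steps 5-6: the provider data source returns the record in encrypted form.
        ciphertext = requests.get(uri, headers=auth_headers, timeout=30).content
        # Steps 7-9: the DEAS, backed by a key management service, releases the key
        # for this element only if the requester's role satisfies the privacy metadata.
        key = requests.get(deas_url + "/key", params={"uri": uri},
                           headers=auth_headers, timeout=30).json()["key"]
        # Step 10: decrypt for local use; a real client would destroy the cleartext
        # and the key after use, possibly retaining only the ciphertext.
        records.append(Fernet(key).decrypt(ciphertext))
    return records

Note that the data source never sees the key, the DEAS never sees the data, and the local system holds cleartext only transiently - that's the separation of concerns in practice.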
It remains to be seen how all the key exchange will work in practice. As we create secure transport infrastructure, I think we'll see a progression from organizational-level keys (as are used in the Direct Project) to individual-level keys (as might be used to authenticate a clinician accessing data) to keys managed by a third party to decrypt specific data elements (as the PCAST report suggests).
Also, it will be interesting to see if clinicians will accept the idea of saving data from healthcare information exchanges in encrypted form, since if the provider uses the information to make a clinical decision, they'll need to have that information in the health record as documentation supporting that action.
My guess is that we'll see a progression of levels of sophistication as we evolve toward the ultimate PCAST vision. The PCAST Workgroup has discussed push by the patient, simple search, and complex search via use cases which illustrate how we can incrementally adopt PCAST ideas over time. I'll blog about that tomorrow.
Monday, March 21, 2011
A Banner Day for the Direct Project
I delayed my usual morning blog posting so that I could fully describe all of today's Direct Project updates.
This morning, Doug Fridsma at ONC released the progress report below, describing the next steps for wide scale rollout of push transactions (point A to point B transport) using Direct implementation guides.
Today at 1pm, Arien Malec presented a Direct Project overview at the National eHealth Collaborative NHIN University course, announcing the Direct Ecosystem. Beth Israel Deaconess is proud to be an early supporter and implementer of Direct.
On March 29, the Direct Project will be on the agenda for the HIT Standards Committee meeting as we review the progress to date.
Here are Doug's comments
"As an internal medicine physician, I know how hard it was to coordinate patient care across diverse healthcare systems. Primary care providers struggle to keep up with the flow of information coming in and going out of their offices on faxes, couriered documents and hand carried patient notes. The Direct Project was created to address this problem head-on by creating a simple, secure way to send this information electronically, so that providers can concentrate on what counts: excellent patient care.
Today, The Direct Project announced that over 60 healthcare and health IT organizations, including many state based and private sector health information exchanges, leading IT vendors, and several leading integrated delivery systems, have planned support for the Direct Project. The broad reach of so many significant national players is helping the project reach its goal of providing healthcare stakeholders with universal addressing and universal access to secure direct messaging of health information across the U.S. This is quite an accomplishment, given that the Direct Project just started twelve months ago.
This broad swath of support for the Direct Project represents approximately 90% of market share covered by the participating health IT vendors. With over 20 states participating in the project, including many of the largest states in the country, nearly half of the total U.S. population can now benefit from the Direct Project’s growing integration into the national health IT ecosystem. Growing participation with the Direct Project will alleviate a healthcare system awash in a sea of paper and faxes.
The Office of the National Coordinator for Health Information Technology (ONC) convened the Direct Project to expand the existing specifications incorporated in the Nationwide Health Information Network to be as inclusive as possible for any caregiver regardless of their technology used or the size of the organization. The Direct Project is facilitating “direct” communication patterns, meeting the providers where they are today, with an eye toward approaching more advanced levels of interoperability as they invest in health IT systems.
The result of this groundbreaking public/private collaborative is a set of specifications for simple and directed messages among caregivers and to patients.
Widespread Adoption – Up to 160 Million Americans May Soon be Positively Impacted
Many of the country’s largest health IT vendors, most populous states, and robust integrated delivery systems are incorporating Direct Project specifications into their health IT systems. What’s exciting about this growing list of organizations is that over half the country’s population could benefit from the availability of secure, directed health information messaging. The numbers are sure to continue growing in the coming months as more organizations support Direct Project specifications for health information exchange. A complete list of participating organizations, including states, health information exchanges and health IT vendors, is available on the Direct Project website.
Transport of Coordination of Care Messages
The Direct Project also announced finalization of the Direct Project specifications, including the core Direct Project requirements and a specification which describes how EHRs and other health IT systems can leverage the Direct Project to securely exchange direct messages. Such communication is critical, especially when a primary care doctor in the U.S. on average has to coordinate care with 229 doctors across 117 different practices. The Direct Project helps address the technology interoperability challenge created by needing to coordinate with such a large group of diverse organizations. It does so by fulfilling the promise of a real-time secure electronic transport mechanism for referrals and clinical documentation, integrated into the health care workflows and systems across different settings of care. This has enormous impact on the provider’s ability to keep the patient at the center of care. The Direct Project meets providers where they are today and grows with them as they invest in electronic health records, enabling EHR to EHR direct message transport.
Specifications and Compliance
Finally, the Direct Project announced the release of two specifications and a draft compatibility statement that will help stakeholders create software that can speak with other Direct-enabled products and will help organizations deploy that software. The Direct Project specifications documents help define and shape the wider adoption of Direct Project technology by healthcare stakeholders. The Applicability Statement for Simple Health Transport outlines the core requirements for a system to declare itself a fully qualified and compliant Health Information Service Provider, or HISP. The Direct Project Compatibility Statement (in draft) addresses the universality of Direct Project messaging. It defines the conditions to participate in universal addressing and transport. The XDR and XDM for Direct Messaging Specification defines a specific gateway solution between the core Direct Project specification and senders and receivers who use IHE specifications.
We are finalizing the Direct Project specifications, engaging with the states, organizations and vendors, and coordinating with the IHE profiles to expand the applicability and value of the Direct Project specifications into a wide variety of use cases.
These developments will be discussed in more depth by Arien Malec, Direct Project coordinator, during a Webinar about the Direct Project on March 21 hosted by the National eHealth Collaborative. For more information, and to register for this webinar, click here.
This has been an exciting year for the Direct Project, and I am encouraged by the quality and speed with which the Direct Project developed its work products and humbled by the community’s incredible work on the project. The Direct Project was started by ONC, but it has been made successful because of the active engagement and support of the Direct community. It is an excellent example of an open government initiative focusing on a specific challenge and working to resolve that challenge in an open and transparent process. If your organization would like more information on the Direct Project or would like to join the growing list of private, public, and government entities that are integrating Direct into their health IT systems, click here."
Congrats to Doug, Arien and all the stakeholders who have worked so hard to make this happen. As I've said since the days of HITSP, when we solve transport we'll see Metcalfe's law for healthcare - the value of healthcare information exchange will surpass the cost and we'll experience an interoperability tipping point.
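For those who want a feel for what a Direct push looks like in code, here's a bare-bones sketch in Python of sending a care summary from one Direct address to another through a HISP. I've deliberately omitted the S/MIME layer - real Direct messages must be signed and encrypted with certificates associated with the sender's and recipient's Direct addresses - and the hostnames, addresses, and file names are placeholders, not anything from the specifications.

# Rough sketch of a Direct-style push over SMTP. Real Direct traffic is S/MIME
# signed and encrypted end to end; that layer is omitted here, and all names
# below are placeholders.
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.application import MIMEApplication
from email.mime.text import MIMEText

def push_summary(ccd_path, sender, recipient, hisp_host="smtp.hisp.example.org"):
    msg = MIMEMultipart()
    msg["From"] = sender                   # e.g. drsmith@direct.hospital.example
    msg["To"] = recipient                  # e.g. referrals@direct.practice.example
    msg["Subject"] = "Transition of care summary"
    msg.attach(MIMEText("Care summary attached.", "plain"))

    with open(ccd_path, "rb") as f:
        ccd = MIMEApplication(f.read(), _subtype="xml")
    ccd.add_header("Content-Disposition", "attachment", filename="summary.xml")
    msg.attach(ccd)

    with smtplib.SMTP(hisp_host, 587) as smtp:
        smtp.starttls()                    # transport encryption to the HISP
        smtp.send_message(msg)             # a real HISP would also require authentication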
Friday, March 18, 2011
Cool Technology of the Week
As we've all watched the news about the Japanese earthquake, we've heard a great deal about tsunamis and their impact on the Japanese coast and the West Coast of the United States.
What technology is used to detect and track tsunamis? The tsunami warning network.
The system is a network of Deep-ocean Assessment and Reporting of Tsunamis (DART) stations: ocean-floor sensors that detect the change in pressure when a tsunami passes overhead and transmit a signal to a buoy on the surface, which in turn sends a signal to a satellite, which then relays it to a Tsunami Warning Center.
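Just as a toy illustration of the detection idea (NOAA's actual DART algorithm compares observed bottom pressure against predicted tides and is considerably more sophisticated), here's what flagging a pressure anomaly against a recent baseline might look like in a few lines of Python:

# Toy illustration only: flag a tsunami-like anomaly when bottom pressure
# deviates from a recent moving average by more than a threshold. The window
# and threshold values are arbitrary, not NOAA's.
from collections import deque

def detect_anomalies(pressure_readings, window=20, threshold=0.3):
    recent = deque(maxlen=window)
    alerts = []
    for i, reading in enumerate(pressure_readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(reading - baseline) > threshold:
                alerts.append(i)       # would trigger a report up to the buoy
        recent.append(reading)
    return alerts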
Martha Grabowski, PhD, is Vice-Chair of the Committee on Review of the Tsunami Warning and Forecast System at the National Academies, which recently issued a report that assesses the American preparedness effort.
Here is a link to Martha's recent interview about the tsunami caused by the Japanese earthquake.
Thirty-nine sensors monitor tsunamis across the Pacific so that every coastal resident can get an early warning and seek high ground. That's cool!
Thursday, March 17, 2011
Thoughts about the Japanese Earthquake
Just two weeks ago, while walking north of Tokyo on a sunny winter afternoon, I could have never imagined the destruction and pain inflicted upon a country I consider my second home.
My daughter begins her college major in Japanese language and culture at Tufts this Fall.
My wife creates art inspired by Japanese themes.
I'm writing tonight while drinking a cup of green tea from Shizuoka, surrounded by the delicate smoke of Japanese incense and listening to the resonant sounds of a Japanese flute, the Shakuhachi.
It's hard to reconcile the immersion of Japanese culture in my life with the reality of the loss of life and property in Japan.
Like many of you, I've watched the news and read the articles. I've contacted my friends and colleagues in Japan to check on their safety.
I've also reflected on the Japanese people's response to the crisis, which in many ways is unique to the special culture of the country.
128 million Japanese live in an area slightly smaller than California.
Despite hunger, thirst, and cold, there has been no looting. There has been no public violence.
The government announced the need for rolling blackouts to address energy shortages. The Japanese people conserved on their own and no blackouts were needed.
Japanese Prime Minister Naoto Kan addressed the nation Sunday night and said this is the most serious crisis since World War II, calling on people to come together using the phrase, “ittai,” which means to become one body.
The Japanese are a strong, resilient, and selfless people.
In this time of great sorrow, I will learn from their example. May my family and I (and all Americans) show the same solidarity the next time we have to face adversity.
Wednesday, March 16, 2011
Improving Massachusetts Post-Acute Care Transfers
In January 2011, Massachusetts was awarded two HIE Challenge Grants, Improving Massachusetts Post-Acute Care Transfers (IMPACT) and Massachusetts Department of Public Health Net (MDPHNet).
The major themes of IMPACT are:
*Reducing barriers to adoption of Clinical Document Architecture (CDA) Templates in Electronic Health Records (EHRs)
*Enabling facilities that lack EHRs to take advantage of Health Information Exchanges (HIEs)
*Facilitating communication with consumers.
As we journey toward accountable care nirvana, it's increasingly important that information follows the patient at all transfers of care. Nationally, the S&I Framework Transition of Care Use Case Workgroup is harmonizing the standards needed to support transitions. In Massachusetts, we're creating the necessary HIE capabilities.
During a recent IMPACT planning call, we discussed the need to incorporate transfer of care HIE into clinician workflows. A major barrier to HIE adoption to date has been the use of portals that are separate from the EHRs clinicians use at the point of care. It is true that Meaningful Use Stage 1 requires EHRs to produce care summaries (CCR or CCD), but it does not require them to be sent through a consistent regional or national transport infrastructure that is tightly coupled to existing care processes.
Here's a possible integrated workflow.
Doctors and nurses use EHRs to enter the necessary data to support transfers of care. All this data will be assembled in an enhanced CCD. That CCD will be sent to a skilled nursing facility (SNF) which may or may not have an EHR. The CCD will be incorporated into the SNF's EHR if one exists. If no EHR exists, the CCD will be viewable via a portal, renderable as fax, or transmitted by secure email.
The data elements in the ideal transfer of care summary are a superset of the typical CCD. CDA Templates should provide us the flexibility we need. For example, a complete transfer document should include patient risks (falls, restraints, elopement), personal belongings (glasses, hearing aids, dental appliances), and whether a healthcare proxy has been invoked. Keith Boone nicely summarized the gaps between the CCD and the transfer of care summary form developed in Massachusetts.
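As a thought experiment - and emphatically not valid CDA, which requires the HL7 CDA R2 schema and the templates Keith describes - here's a small Python sketch of the extra transfer-of-care content a typical CCD doesn't carry:

# Sketch only, not a conformant CDA/CCD document. It shows the additional
# transfer-of-care content (risks, belongings, proxy status) that CDA templates
# would need to represent. Element names are invented for illustration.
import xml.etree.ElementTree as ET

def build_transfer_summary(patient_name, risks, belongings, proxy_invoked):
    root = ET.Element("TransferSummary")
    ET.SubElement(root, "Patient").text = patient_name
    risks_el = ET.SubElement(root, "Risks")
    for risk in risks:                              # e.g. falls, elopement
        ET.SubElement(risks_el, "Risk").text = risk
    items_el = ET.SubElement(root, "PersonalBelongings")
    for item in belongings:                         # e.g. glasses, hearing aids
        ET.SubElement(items_el, "Item").text = item
    ET.SubElement(root, "HealthcareProxyInvoked").text = str(proxy_invoked)
    return ET.tostring(root, encoding="unicode")

print(build_transfer_summary("Test Patient", ["falls"], ["glasses", "dentures"], False))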
Of course, the workflow might be more complicated. Nurses and doctors may complete their portions in EHRs, but there may be additional data provided outside the EHR by case managers. There may be a need to send preliminary summary information to multiple SNFs to match the patient to the right level of care. The clinicians filling out the summary may not know which facility the patient will be discharged to. A case manager may finalize the patient placement, add additional summary information, then route the finished transfer of care document to the right facility.
Once the patient is at the facility, new care will be rendered. The SNF will need to document a summary of that care and a plan for additional care once the patient is discharged home. The SNF could use its EHR for such documentation, but since many SNFs are still using paper, they will need a portal which supports structured documentation and routing of summaries electronically to PCPs, homecare agencies, and families. Ideally, this portal would incorporate data sent from the original transferring hospital, so that the summary could be updated instead of starting from scratch.
Of course, all of these electronic transfers must be protected with appropriate security, auditing, and data integrity checking.
Once we document the workflow, we'll finalize the technical solution. It could involve secure transmission from hospitals to an intermediary health information services provider (HISP) that hosts a web-based application enabling case managers to monitor queues of soon-to-be-discharged patients and supporting access by appropriate candidate SNF facilities. The portal will track what has been completed and what still needs to be completed based on the ultimate type of destination for the document (e.g. SNF vs. home health vs. PCP vs. patient, etc.). Once the patient is matched with a SNF, the final step - transfer to the SNF's EHR - could occur securely. The portal would also send confirmation of the final transmission along with a copy of the complete transfer document back to the original transferring hospital's EHR.
As we drill down on the workflow, we'll find interesting questions. If transfer data are missing, someone will need to be notified that completion is necessary before it can be sent. How, and to whom, is this notification delivered? Different deficiencies require action by different personnel. Missing treatment or follow-up plans may be the responsibility of the Attending Physician (or Resident), while identifying the actual Home Health Agency or SNF may be the responsibility of a case manager. How can this be integrated seamlessly into EHR-based workflows without requiring customization of the EHRs? Should the CCD contain a listing of the entire care team (including case managers and temporary float nurses) from the sending institution? If state (or national) Provider Directories are created, will they include nurses and case managers? How will registration and authentication of all of these users take place? How will users be restricted to just seeing their patients and not anyone else's? Who will proactively manage the audit trails? Will it pass the "Boston Globe" test to have CCDs from across the state or region stored in a central portal location?
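One way to reason about the notification question is a simple completeness check that maps each missing element to the role responsible for supplying it. The field names and role assignments below are examples I made up, not a standard:

# Illustrative completeness check: map missing transfer-summary elements to the
# role that should be notified. Field names and role assignments are examples.
REQUIRED_FIELDS = {
    "follow_up_plan": "attending_physician",
    "medication_list": "attending_physician",
    "destination_facility": "case_manager",
    "home_health_agency": "case_manager",
    "nursing_assessment": "nurse",
}

def deficiencies(summary):
    """Return {role: [missing fields]} so each responsible role is notified once."""
    by_role = {}
    for field, role in REQUIRED_FIELDS.items():
        if not summary.get(field):
            by_role.setdefault(role, []).append(field)
    return by_role

# Example: a summary missing its destination would notify the case manager.
print(deficiencies({"follow_up_plan": "PT twice weekly", "medication_list": "see attached"}))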
The project will answer many workflow and policy questions. It's clear that having certified EHRs with the ability to send and receive CDA templates, and eventually reducing dependency on stand-alone web portals, will greatly simplify the workflow.
Tuesday, March 15, 2011
Performance Testing
One of my users recently asked me about the process of taking an application from build to live. The steps we take include:
*Functional Testing
*End-to-End Testing
*Performance Testing
*User Acceptance
*Cutover Planning including training, communications, and the technical details of transitioning from one application to another.
I was asked to explain the difference between end-to-end testing and performance testing.
End-to-end testing is done to make sure the application code does what is expected in terms of function. For example, if you look up a patient result, is it presented accurately? End-to-end testing could theoretically be done by one person entering one type of transaction after another.
Performance testing places the application under "load" to see if there are bottlenecks with the server, database, storage and middleware. The purpose is to avoid slow performance after go live.
Many times you can do performance tests using simulated input. Two typical software tools for doing this are HP's LoadRunner and Micro Focus' SilkPerformer.
Some vendors recommend manual load testing, i.e., putting all the staff on the new system for a day of work to see if infrastructure performance suffers.
Although manual testing is often the easiest thing to do, it may not find bottlenecks in transactional performance. Each transaction type and software module creates a different load on the infrastructure. Some transactions have minimal impact while others cause significant strain. Doing load testing right requires a representative mixture of transactions, including automated interfaces, data entry, reports, and others.
Our approach is generally a combination of manual and automated performance testing. We pre-load the databases with years of data. We use automated load testing tools to simulate heavy web site use. We run scripts that emulate interface activity. In the context of this real world simulation, we then let the users exercise the software fully.
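For the curious, here's a bare-bones sketch in Python of the automated side, using only the standard library. Commercial tools like LoadRunner and SilkPerformer add scripted transaction mixes, ramp-up profiles, think time, and reporting on top of this basic idea; the URL and concurrency numbers below are placeholders.

# Minimal load-generation sketch: fire a fixed number of concurrent requests at
# an endpoint and report latency percentiles. Real tools add scripted
# transaction mixes, ramp-up profiles, and think time.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def timed_request(url):
    start = time.perf_counter()
    with urlopen(url, timeout=30) as response:
        response.read()
    return time.perf_counter() - start

def run_load_test(url="https://test-app.example.org/results", users=50, requests_per_user=20):
    with ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(timed_request, url)
                   for _ in range(users * requests_per_user)]
        latencies = sorted(f.result() for f in futures)
    print("median %.3fs, 95th percentile %.3fs" %
          (statistics.median(latencies), latencies[int(0.95 * len(latencies))]))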
Of course, even such comprehensive testing can miss software flaws, such as queries against unindexed database tables or processes that become a rate-limiting step in application performance. Thus, it's also important to have tools that diagnose problems if slowdowns occur after go live (such as OpNet) and to have a strong working relationship with your software vendors so they can rapidly correct any flaws that appear once an application is in full production.
Monday, March 14, 2011
Frameworks for IT Management
In the next few years, the transition from fee for service to accountable care organizations/global payments is going to require significant IT change at a time when budgets will become increasingly constrained. We'll have the combination of Meaningful Use Stages 1/2/3, ICD10/5010, and healthcare reform all occurring at the same time.
IT organizations will be required to demonstrate their value, benchmark themselves against best practices, and justify their actions.
There are numerous frameworks that can support a standardized approach to project scope definition, resource allocation, and service provision.
Although you may not use these techniques now, you should be familiar with them as the pressure increases to absorb increasing demand in the face of decreasing supply.
Here's a brief overview of 3 leading frameworks.
Information Technology Infrastructure Library (ITIL)
ITIL grew out of work done by the UK Government's Central Computer and Telecommunications Agency in the 1980s to document best practices. Since then, ITIL has had three major revisions; the current version consists of 26 processes and functions documented in 5 volumes:
1. ITIL Service Strategy
2. ITIL Service Design
3. ITIL Service Transition
4. ITIL Service Operation
5. ITIL Continual Service Improvement
The primary focus of ITIL is to provide best practice definitions and criteria for operations management. As with any framework there is significant debate about the pros and cons of ITIL. As long as you keep in mind that ITIL is a set of best practices, to be adopted and adapted as best fits your local needs, it can be useful. ITIL does not aim to be comprehensive and universal - use it where it helps maintain your ongoing services.
Control Objectives for Information and related Technology (COBIT)
COBIT was first released in 1996 by the Information Systems Audit and Control Association (ISACA) and the IT Governance Institute (ITGI). COBIT has been used to evaluate security and controls during various audits of my IT organizations. The current version of COBIT has 34 high-level processes, covering 318 control objectives, categorized in four domains:
1. Planning and Organization
2. Acquisition and Implementation
3. Delivery and Support
4. Monitoring and Evaluation
COBIT focuses on the definition, implementation, auditing, measurement, and improvement of controls for specific processes that span the entire IT implementation life cycle.
Capability Maturity Model (CMM)
CMM was originally developed by Carnegie Mellon University researchers as a tool for objectively assessing the ability of government contractors to perform a contracted software project. Now it is applied more generally to any organization's software development processes. The predictability, effectiveness, and control of an organization's software development processes evolve over time through 5 stages:
1. Initial (chaotic, ad hoc, individual heroics) - the starting point for use of a new process.
2. Managed - the process is managed in accordance with agreed metrics.
3. Defined - the process is defined and confirmed as a standard business process.
4. Quantitatively Managed - the process is measured and controlled using quantitative data.
5. Optimizing - process management includes deliberate process optimization/improvement.
CMM provides a framework for measuring and transforming software development.
What's the elevator speech about these various techniques? They are complementary frameworks. COBIT systematically catalogs what an IT organization ought to be doing to implement appropriate controls and security. ITIL explains how to do it. CMM measures the sophistication of the processes used along the way.
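If you want to make that elevator speech tangible for your own shop, a lightweight self-assessment can be as simple as rating each COBIT domain's processes on the CMM scale. The domain and level names below come from the frameworks; the scores and the structure are just an example of how one might track it:

# Lightweight self-assessment sketch: rate each COBIT domain on the CMM
# maturity scale. The scores below are made-up examples.
CMM_LEVELS = {1: "Initial", 2: "Managed", 3: "Defined",
              4: "Quantitatively Managed", 5: "Optimizing"}

assessment = {
    "Planning and Organization": 3,
    "Acquisition and Implementation": 2,
    "Delivery and Support": 3,
    "Monitoring and Evaluation": 2,
}

for domain, level in assessment.items():
    print("%s: level %d (%s)" % (domain, level, CMM_LEVELS[level]))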
I'm very interested in hearing from the community - do your IT organizations use any aspect of these frameworks? Have they been helpful to you to document the resource requirements of the IT organization and give users a transparent look into the work you perform?
Sunday, March 13, 2011
The Japanese Earthquake
Having returned from Japan a few days ago, I am very close to many people there and am monitoring the tragedy closely. The Japan Society of Boston has provided the update below in an attempt to summarize the current situation and highlight available information resources:
Dear Japan Society members and friends,
As each day passes and the extent of the damage becomes more apparent, our hearts go out to all affected by this disaster. NHK is reporting that the death toll in Miyagi prefecture alone is expected to reach 10,000 people.
We are receiving many calls and emails from people trying to contact friends and family in Japan. Please take a moment to visit our website . We have many links to people finder services, as well as a forum for all to post thoughts, prayers, requests for information about people, and news updates. In addition there is information from the Japanese Government requesting that donations be made through the Red Cross.
One major concern for everyone is the damage to the nuclear power plants in Fukushima. Please note that many news outlets are describing this erroneously as a "nuclear blast." Though we share in the concern about a nuclear meltdown, it is important that we not sensationalize this already unfathomable disaster by calling it a "nuclear blast." The major concern at this time is that the cooling systems are malfunctioning, leading to a possible meltdown in the reactors. As of this writing, it is being reported that both incidents are rated 4 out of 7 on the International Nuclear Event Scale.
The Japan Society of Boston will continue to follow the situation in Japan closely and will keep you updated as news comes in.
Friday, March 11, 2011
Cool Technology of the Week
I've written previously about the SHARP grants and the Harvard project to create modular healthcare applications that run on a platform, analogous to the Apple AppStore.
The government has just announced the Smart Apps challenge.
You too could be the author of the next hot app for healthcare and be paid for your innovation.
Imagine all the energy we could harness if our most talented engineers wrote modular EHRs instead of "Angry Birds."
Here are the details of the challenge.
May the coding begin!
Thursday, March 10, 2011
Working with the Media
Throughout my adult life, I've had many opportunities to speak with television, radio, and print journalists. I may not always receive good press, but I almost always receive balanced press.
I've learned several lessons along the way
-It's important not to endorse any product or service. I'm always careful to present my experience in the context of a case study or objective observation. I avoid conflicts of interest by never accepting gifts, travel, or meals from vendors.
-It's important to speak as an individual and not as an organizational representative. I remove my badges and eliminate any organizational logos from the visual field. I emphasize that my comments are personal opinions and do not necessarily reflect the views of any corporation I work for.
-It's important to speak clearly, succinctly, and deliver an unambiguous message. Everyone should be able to understand a 30 second elevator speech about the technology I'm discussing.
-It's important to use personal stories, analogies, and lay language. I often describe my own experience with healthcare and the ways technology would improve my wellness or my family's care coordination.
-It's important to give honest answers. Occasionally, reporters have an agenda and try to put words in your mouth. State your beliefs without being led to a conclusion; redirect a bad question into a relevant, thoughtful answer.
In the recent iPad 2 launch presentation (I appear at 11 minutes 44 seconds), here are the points I conveyed:
I have watched clinicians (including myself) using tablet style devices. They enhance clinician productivity because of their portability (they fit in a white coat pocket), long battery life, and ease of disinfection (alcohol wipes on the screen do not damage the display). They untether clinicians from laptops and carts, are easy to read, and provide the high definition graphics needed for clinical imaging interpretation.
The nature of the form factor makes it easier than laptops for clinicians to share medical information with patients (such as explaining conditions by retrieving images), improving communications.
My quotes were
“Sometimes doctors are overwhelmed with data. What we’ve tried to do on the iPad is to give doctors at the point of care the tools they need at the exact moment the doctor can make the difference.”
“We’re finding with the iPad that doctors are spending more time with patients. In fact, doctors are engaging patients by showing them images, showing them data on the screen. So it’s empowered doctors to be more productive, and it’s also brought doctors and patients together.”
Some have called this an endorsement. I've followed my own guidelines and described my experience using the device and the behavior of other clinicians I've observed. I did not describe any plan to purchase the devices, nor have I ever received any free iPad products or services.
In a world of YouTube, social media, and blogging, everything I do and say should be considered public. Hopefully, by following my own moral compass and my guidelines for working with the media, I can share my experiences for the benefit of all without compromising my own objectivity.
Wednesday, March 9, 2011
Meaningful Use 2 and 3 Do It Yourself Presentation
Just as with Stage 1, it's likely that you'll be presenting the proposed Stage 2 and 3 Meaningful Use criteria to your stakeholders and boards.
Here's a comprehensive comparison of Stage 1, 2 and 3, but you'll likely want the Cliff's Notes version for your presentation.
Here's the presentation I'm using in my lectures which highlights the major changes.
Feel free to use it without attribution!
I hope it saves you time.
Tuesday, March 8, 2011
The Open Science Grid
This week, Harvard Medical School's Structural Biology Grid (SBGrid) group is hosting the Open Science Grid annual all hands meeting. Think of the Open Science Grid as a way to harness the unused computing cycles of high performance computing centers for the benefit of all - a kind of SETI@home for science. It's worth learning more about.
The Open Science Grid (OSG) is an open consortium of science and research communities, university and Department of Energy (DOE) laboratory IT facilities, and software developers. Together they have built and are now operating a broad distributed high throughput computing cyber infrastructure in the US. Staff are funded by the DOE SciDAC-2 program and NSF.
The OSG is designed to be effective for job runs of between one hour and a few days, jobs that can be check-pointed, jobs that require management of large scale data movement and storage, and ensembles of jobs that can effectively run across a large number of resources.
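To make the check-pointing idea concrete, here's a minimal sketch in Python - my own illustration, not OSG or Condor code - of a long-running job that periodically saves its state so it can be preempted on one resource and resumed on another. The checkpoint file name and the "work" being done are assumptions for the example.

import os
import pickle

CHECKPOINT = "job_state.pkl"  # hypothetical checkpoint file

def load_state():
    # Resume from the last checkpoint if one exists, otherwise start fresh
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)
    return {"next_item": 0, "partial_result": 0}

def save_state(state):
    # Write atomically so a preempted job never leaves a corrupt checkpoint
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, CHECKPOINT)

def run(total_items=1_000_000, checkpoint_every=10_000):
    state = load_state()
    for i in range(state["next_item"], total_items):
        state["partial_result"] += i * i       # stand-in for real work
        if (i + 1) % checkpoint_every == 0:
            state["next_item"] = i + 1
            save_state(state)                  # safe point to be preempted
    return state["partial_result"]

if __name__ == "__main__":
    print(run())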
On a typical day, the OSG supports 1.2 million CPU hours of computation, the launch of more than half a million jobs, and the transfer of more than a quarter of a petabyte of data, across more than sixty sites. Over the past year the Harvard Medical School SBGrid group has used more than 6 million CPU hours.
20% of the usage of OSG is non-physics – across more than five different scientific disciplines. The Large Hadron Collider (LHC) uses 50% and the existing Tevatron experiments at Fermilab, Laser Interferometer Gravitational Wave Observatory (LIGO), STAR, and other physics experiments use the final 30%.
The Open Science Grid provides the US contribution to the World Wide LHC Computing Grid – most recently presented by Bob Jones.
OSG provides an engagement effort that helps new users, ranging from individuals and small groups to large communities. It engages teams to help them adapt their software (and their culture) to use a distributed set of computing and storage resources which they don’t manage directly themselves.
The OSG is built on a set of underlying principles of distributed high throughput computing. Professor Miron Livny, the lead of the Condor project at the University of Wisconsin-Madison, serves as Technical Director.
OSG partners with the NSF TeraGrid and the new XD program. It has a long and productive history of collaboration with peer European projects, continuing with the European Grid Initiative (EGI-InSPIRE) and European Middleware Initiative (EMI). It works closely with ESNet and Internet2 to understand the networking needs of high performance computing communities and to test and integrate advanced networks.
The OSG brings distributed high throughput computing services into and across campuses themselves by working with groups of faculty and researchers to leverage local and remote resources. OSG is currently working with communities at Clemson, Nebraska, Notre Dame, Purdue and Wisconsin-Madison on a prototyping effort that includes enabling the formation of local partnerships and dynamic access to shared resources using campus identities.
Here's an architecture diagram that shows the scope of OSG. The image above illustrates the usage over the past few years.
OSG is a very worthwhile application of technology - a grid computing initiative that captures millions of unused CPU hours for general use.
Monday, March 7, 2011
The Major Themes of HIMSS 2011
I was in Japan during HIMSS this year, but I asked Jeff Blair, a nationally known informatician from the Lovelace Clinic Foundation, to summarize the key themes from the events he attended. He declared HIMSS 2011 to be the year of the healthcare information exchange with the following points:
"Advances
*There was a better understanding of the capabilities and limitations of NwHIN Direct and NwHIN Exchange. In particular, it seemed as if most attendees now look at NwHIN Direct as a near-term solution and at NwHIN Exchange as having the capabilities to address Stage 2 and Stage 3 of meaningful use.
*A lot of progress has been made during 2010 by independent HIE networks and state HIE networks to develop their strategies, plans, and resources.
*The federal government, including ONC and CMS, has really ramped up to move forward with all of the HITECH initiatives and they are pushing forward on all of these initiatives at the same time.
*Many HIE networks are reporting that they have been able to expand connections to health care provider stakeholders in their communities, and that community support for HIE network services has been growing as HIEs can deliver more services.
*The vendors that provide solutions for HIE networks have matured and are able to more clearly articulate the capabilities of their components, as well as the synergies their components have with other HIE components.
*A lot of progress has been made to define the requirements for provider directories.
*The HIMSS exhibit floor was massive and impressive. There are many new information technologies that have become available and it will be interesting to see how quickly health care can adopt these new technologies (including cloud computing, mobile health, Twitter, etc.) to address health care challenges.
Constraints
*There is greater awareness that the CDA/CCD is not at the level of plug-and-play compatibility, and that work needs to be done to tighten the constraints to limit optionality to get closer to plug-and-play.
*The resources of HIE networks are now stretched thin trying to participate in all of the ONC initiatives, including conferences, committees, communities of practice, workgroups, etc.
*As ONC and CMS roll out more and more funding opportunities, HIE networks are finding that the resources of their local provider partners are also running thin to participate and/or support these funding opportunities.
*It is now clear that the development of provider directories in each state will be major projects, not just an additional component of HIE networks.
*More specifically, it is now clear that provider directories will involve entity-level provider directories (ELPD) and individual level provider directories (ILPD); that the ELPD and the ILPD will need to interact with each other; that there will be many more users of the provider directory than just HIE networks, which means that there will be several different use cases that will need to be created and addressed; and finally, that existing sources of listings of providers at the national, state, and professional association levels will all be needed to create a complete provider directory to support the meaningful use initiatives for Stage 2 and Stage 3.
*Many state HIE networks have lost their State HIE Coordinators, public health supporters, or Medicaid supporters due to the turnover caused by the state elections in November of 2010. It may take several more months to re-establish relationships with new state administrators."
Thanks to Jeff for this thoughtful summary! I'll see you all at HIMSS next year.
"Advances
*There was a better understanding of the capabilities and limitations of NwHIN Direct and NwHIN Exchange. In particular, it seemed as if most attendees now look at NwHIN Direct as a near-term solution and at NwHIN Exchange as having the capabilities to address Stage 2 and Stage 3 of meaningful use.
*A lot of progress has been made during 2010 by independent HIE networks and state HIE networks to develop their strategies, plans, and resources.
*The federal government, including ONC and CMS, has really ramped up to move forward with all of the HITECH initiatives and they are pushing forward on all of these initiatives at the same time.
*Many HIE networks are reporting that they have been able to expand connections to health care provider stakeholders in their communities, and that community support for HIE network services has been growing as HIEs can deliver more services.
*The vendors that provide solutions for HIE networks have matured and are able to more clearly articulate the capabilities of their components, as well as the synergies their components have with other HIE components.
*A lot of progress has been made to define the requirements for provider directories.
*The HIMSS exhibit floor was massive and impressive. There are many new information technologies that have become available and it will be interesting to see how quickly health care can adopt these new technologies (including cloud computing, mobile health, Twitter, etc.) to address health care challenges.
Constraints
*There is greater awareness that the CDA/CCD is not at the level of plug-and-play compatibility, and that work needs to be done to tighten the constraints to limit optionality to get closer to plug-and-play.
*The resources of HIE networks are now stretched thin trying to participate in all of the ONC initiatives, including conferences, committees, communities of practice, workgroups, etc.
*As ONC and CMS roll out more and more funding opportunities, HIE networks are finding that the resources of their local provider partners are also running thin to participate and/or support these funding opportunities.
*It is now clear that the development of provider directories in each state will be major projects, not just an additional component of HIE networks.
*More specifically, it is now clear that provider directories will involve entity-level provider directories (ELPD) and individual level provider directories (ILPD); that the ELPD and the ILPD will need to interact with each other; that there will be many more users of the provider directory than just HIE networks, which means that there will be several different use cases that will need to be created and addressed; and finally, that existing sources of listings of providers at the national, state, and professional association levels will all be needed to create a complete provider directory to support the meaningful use initiatives for Stage 2 and Stage 3.
*Many state HIE networks have lost their State HIE Coordinators, public health supporters, or Medicaid supporters due to the turnover caused by the state elections in November of 2010. It may take several more months to re-establish relationships with new state administrators."
Thanks to Jeff for this thoughtful summary! I'll see you all at HIMSS next year.
Friday, March 4, 2011
Cool Technology of the Week
In a world focused on green energy, conservation, and working more efficiently, why do we run escalators when no one is using them?
In Japan, they do not.
Above is a video from the Narita airport near Tokyo which shows the escalators to the gates. They run at a very slow speed, just enough to overcome the inertia of starting them up. When a passenger walks near the escalator platform, the rate increases and the escalator runs at full speed until 30 seconds after the passenger leaves the escalator. Here's a link to another example from the Seoul airport.
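The control logic is simple enough to sketch in a few lines of Python - purely my own illustration of the behavior described above; the speeds, timeout, and sensor interface are assumptions.

IDLE_SPEED = 0.1      # just enough motion to overcome startup inertia
FULL_SPEED = 1.0
TIMEOUT_SECONDS = 30  # keep full speed this long after the last rider leaves

def escalator_speed(rider_detected, last_rider_time, now):
    # Run at full speed while riders are present and for 30 seconds afterward
    if rider_detected:
        last_rider_time = now
    speed = FULL_SPEED if now - last_rider_time <= TIMEOUT_SECONDS else IDLE_SPEED
    return speed, last_rider_time

# A rider steps on at t=0; by t=45 the escalator has dropped back to idle speed
speed, last_seen = escalator_speed(True, last_rider_time=-1000, now=0)
speed, last_seen = escalator_speed(False, last_seen, now=45)
print(speed)  # -> 0.1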
The end result is that energy is only expended when it's needed.
It's this kind of creative thinking - using energy only when it's needed - that defines our green power future.
On demand escalators - that's cool.
Thursday, March 3, 2011
A Perfect Weekend in Kyoto
After completing my meetings in Tokyo last week, I traveled to Kyoto with my family for the weekend. Kyoto is one of the world's great cities, with remarkable history, culture, and food (especially for vegans who eat the same foods as Zen monks). Here's where we walked on Sunday and Monday before returning to Boston on Tuesday.
On Sunday, we started the day with a great Japanese breakfast of Yudofu (boiled Tofu), pickles, rice, and tea at our ryokan in Central Kyoto, the Watazen. After breakfast, we walked south to the Buddhist temples of Higashi and Nishiki Honganji. We purchased coils of our favorite incense from Kungyoku-do, near Nishiki Honganji. From there, we walked east, crossing the Kamagawa river and passing Sanjusangen-do, the temple where samurai used to practice archery (you can still find the arrow holes in the beams). From there, we climbed Kiyomizu-yama on our way to the temple of Kiyomizu-dera, which is surrounded by pottery shops. We bought a pair of rustic tea cups, and a simple sake cup and bottle set made of fine clay at the Asahido pottery shop.
From there, we walked north through the park of Maruyama-koen and the shrine of Yasaka-jinja, then up Shijo-dori to buy incense at another great incense store adjacent to the Kyoto Craft Centre.
We walked back to the Kamagawa river and north to Sanjo-dori, passing our favorite rice cracker shop, Funahashi-Ya. We stopped for lunch at the conveyor belt sushi restaurant Musashi which has numerous vegetarian options.
Afterwards, my wife and daughter explored the shopping streets of Teramachi dori/Shinkyogoku, while I walked to north Kyoto and the temple of Ginkaku-ji. Just to the right is an unmarked path that leads up Daimonji, a 1500 foot peak overlooking Kyoto. At the top, I met a mountain biker named Yoshi, who gave me directions for a 5 mile ridge walk along the mountains between Kyoto and Lake Biwa, ending at the temple of Nanzenji, the starting point for the Philosopher's Walk. From there, I walked back to our ryokan and we went to dinner at our favorite tofu restaurant on Sanjo dori near the Kamagawa river. A great day with remarkable incense, pottery, and foods to take back to Boston.
On Monday after breakfast, we walked the Nishiki market street, with its wonderful pickles, tofu, yuba (tofu skin), and roasting chestnuts. We purchased dried yuba to prepare great stews back in Boston. From there, we returned to the Teramachi shopping street to visit our favorite tea shop, Horaido. The owner made us fine Gyokuro tea and provided precise instructions so we could make it at home. The key is to use 105°F water for the first cup, creating a sweet, concentrated, viscous tea. We purchased Gyokuro and Sencha tea, as well as some cherry bark tea caddies to store our tea. If you visit me in my office at Harvard or BIDMC, I'll brew you a cup.
From there, we walked to the Imperial Palace and the textile areas of northwest Kyoto. It's always great to find clothing dyed with indigo at shops like Aizen Kobo.
From there we walked east and south, making our way back to the Gion, the Geisha quarter with its wooden shop fronts, antiques, and plum blossoms blooming over the canals lined with tea houses (pictured above). We walked the antique shops of Shinmonzen, browsing in our favorite shop, Yagi, where I found several old Shakuhachi which the shop owner let me play. I purchased an old hammered incense scoop which I'll use for Koh-do, the incense ceremony.
At sunset, we wandered past the secretive lanes and alleys where Geisha and Maiko entertain their clients at chaya teahouses, with conversation and music. It was truly a magical moment.
We packed our treasures and readied for the commute home - 26 hours from point to point. Of all the cities in the world outside of the US, Kyoto is the one I am always reluctant to leave.
Wednesday, March 2, 2011
Freeing the Data
I'm keynoting this year's Intersystems Global Conference on the topic of "Freeing the Data" from the transactional systems we use today such as Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), Electronic Health Records (EHR), etc. As I've prepared my speech, I've given a lot of thought to the evolving data needs we have in our enterprises.
In healthcare and in many other industries, it's increasingly common for users to ask IT for tools and resources to look beyond the data we enter during the course of our daily work. For one patient, I know the diagnosis, but what treatments were given to the last 1000 similar patients? I know the sales today, but how do they vary over the week, the month, and the year? Can I predict future resource needs before they happen?
In the past, such analysis typically relied on structured data, exported from transactional systems into data marts using Extract/Transform/Load (ETL) utilities, followed by analysis with Online Analytical Processing (OLAP) or Business Intelligence (BI) tools.
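For readers who haven't lived through that era, a toy version of the classic pattern looks something like this in Python - a sketch only, using in-memory SQLite in place of a real transactional system and data mart, with made-up table names and rows.

import sqlite3

# Stand-in transactional system with a few seeded encounter rows (illustrative)
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE encounters (patient_id TEXT, diagnosis_code TEXT, visit_date TEXT)")
source.executemany("INSERT INTO encounters VALUES (?, ?, ?)", [
    ("p1", "250.00", "2010-03-01"),
    ("p2", "250.00", "2011-01-15"),
    ("p3", "401.9",  "2011-02-20"),
])

# Extract
rows = source.execute("SELECT patient_id, diagnosis_code, visit_date FROM encounters").fetchall()

# Transform: roll encounters up by diagnosis and year for the analysts
counts = {}
for patient_id, dx, visit_date in rows:
    key = (dx, visit_date[:4])
    counts[key] = counts.get(key, 0) + 1

# Load into a data mart table that OLAP/BI tools would query
mart = sqlite3.connect(":memory:")
mart.execute("CREATE TABLE dx_by_year (dx TEXT, year TEXT, n INTEGER)")
mart.executemany("INSERT INTO dx_by_year VALUES (?, ?, ?)",
                 [(dx, year, n) for (dx, year), n in counts.items()])
mart.commit()
print(mart.execute("SELECT * FROM dx_by_year ORDER BY dx").fetchall())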
In a world filled with highly scalable web search engines, increasingly capable natural language processing technologies, and practical examples of artificial intelligence/pattern recognition (think of IBM's Jeopardy-savvy Watson as a sophisticated data mining tool), there are novel approaches to freeing the data that go beyond a single database with pre-defined hypercube rollups. Here are my top 10 trends to watch as we increasingly free data from transactional systems.
1. Both structured and unstructured data will be important
In healthcare, the HITECH Act/Meaningful Use requires that clinicians document the smoking status of 50% of their patients. In the past, many EHRs did not have structured data elements to support this activity. Today's certified EHRs provide structured vocabularies and specific pulldowns/checkboxes for data entry, but what do we do about past data? Ideally, we'd use natural language processing, probability, and search to examine unstructured text in the patient record and figure out smoking status, including the context around the word smoking, such as "former", "active", "heavy", or "never".
Businesses will always have a combination of structured and unstructured data. Finding ways to leverage unstructured data will empower businesses to make the most of their information assets.
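A crude illustration of the smoking status idea, far short of real natural language processing, is a context-aware pattern match over narrative text. This is a sketch with made-up note text, not a production algorithm.

import re

QUALIFIERS = ["never", "former", "active", "heavy", "quit", "denies"]

def smoking_qualifiers(note_text):
    # Collect qualifier words that appear near each mention of smoking/tobacco
    found = set()
    for match in re.finditer(r"(smok\w*|tobacco)", note_text, re.IGNORECASE):
        window = note_text[max(0, match.start() - 40): match.end() + 40].lower()
        found.update(q for q in QUALIFIERS if q in window)
    return found or {"unknown"}

print(smoking_qualifiers("Patient is a former smoker, quit 10 years ago."))
# -> {'former', 'quit'}  (set order may vary)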
2. Inference is possible by parsing natural language
Watson on Jeopardy provided an important illustration of how natural language processing can really work. Watson does not understand the language and it is not conscious/sentient. Watson's programming enables it to assign probabilities to expressions. When asked "does he drink alcohol frequently?", finding the word "alcohol" associated with the word "excess" is more likely to imply a drinking problem than finding "alcohol" associated with "to clean his skin before injecting his insulin". Next generation Natural Language Processing tools will provide the technology to assign probabilities and infer meaning from context.
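Here is a toy version of that kind of context scoring in Python - my own illustration of the principle, not how Watson or any commercial NLP engine actually works. The context words and weights are invented.

# Words that raise or lower the probability that "alcohol" refers to drinking
RISK_CONTEXT = {"excess": 0.4, "daily": 0.3, "abuse": 0.5, "withdrawal": 0.4}
BENIGN_CONTEXT = {"swab": -0.5, "clean": -0.4, "wipe": -0.4, "injecting": -0.3}

def drinking_probability(sentence, prior=0.2):
    # Start from a prior and nudge it based on the words surrounding "alcohol"
    words = sentence.lower().split()
    if "alcohol" not in words:
        return None
    score = prior
    for w in words:
        score += RISK_CONTEXT.get(w, 0.0) + BENIGN_CONTEXT.get(w, 0.0)
    return min(max(score, 0.0), 1.0)

print(drinking_probability("He uses alcohol in excess most evenings"))                      # higher
print(drinking_probability("Uses an alcohol swab to clean skin before injecting insulin"))  # lower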
3. Data mining needs to go beyond single databases owned by a single organization.
If I want to ask questions about patient treatment and outcomes, I may need to query data from hundreds of hospitals to achieve statistical significance. Each of those hospitals may have different IT systems with different data structures and vocabularies. How can I query a collection of heterogeneous databases? Federation is possible by normalizing the queries through middleware. For example, data might be mapped to a common Resource Description Framework (RDF) exchange language using standardized SPARQL query tools. At Harvard, we've created a common web-based interface called SHRINE that queries all our hospital databases, providing aggregate de-identified answers to questions about diagnosis and treatment of millions of patients.
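To give a flavor of what that middleware layer does, here's a minimal sketch using the open-source rdflib Python library: records from two hospitals are normalized into one RDF graph and answered with a single SPARQL query. The namespace, predicates, and patients are invented for illustration; a production federation like SHRINE is far more involved.

from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/clinical/")  # hypothetical vocabulary
g = Graph()

# Normalize records from two (hypothetical) hospital systems into one graph
for hospital, patient, dx in [("A", "p1", "diabetes"),
                              ("B", "p2", "diabetes"),
                              ("B", "p3", "asthma")]:
    subject = URIRef(f"http://example.org/{hospital}/{patient}")
    g.add((subject, EX.diagnosis, Literal(dx)))

# One SPARQL query now spans both sources
results = g.query("""
    PREFIX ex: <http://example.org/clinical/>
    SELECT (COUNT(?p) AS ?n) WHERE { ?p ex:diagnosis "diabetes" }
""")
for row in results:
    print(row.n)  # -> 2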
4. Non-obvious associations will be increasingly important
Sometimes, it is not enough to query multiple databases. Data needs to be linked to external resources to produce novel information. For example, at Harvard, we've taken the address of each faculty member, examined every publication they have ever written, geo-encoded the location of every co-author, and created visualizations of productivity, impact, and influence based on the proximity of colleagues. We call this "social networking analysis".
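A tiny sketch of that kind of analysis, using the open-source networkx library with made-up co-authorship data (the names and the use of degree centrality are my own illustration, not our actual pipeline):

import networkx as nx

# Each edge is a (hypothetical) co-authorship between two faculty members
coauthorships = [("Smith", "Jones"), ("Smith", "Lee"),
                 ("Jones", "Lee"), ("Lee", "Garcia")]

G = nx.Graph()
G.add_edges_from(coauthorships)

# Degree centrality is one simple proxy for influence within the network
for name, score in sorted(nx.degree_centrality(G).items(),
                          key=lambda item: -item[1]):
    print(f"{name}: {score:.2f}")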
5. The President's Council of Advisors on Science and Technology (PCAST) Healthcare IT report offers several important directional themes that will accelerate "freeing the data".
The PCAST report suggests that we embrace the idea of universal exchange languages, metadata tagging with controlled vocabularies, privacy flagging, and search engine technology with probabilistic matching to transform transactional data sources into information, knowledge and wisdom. For example, imagine if all immunization data were normalized as it left transactional systems and pushed into state registries that were united by a federated search that included privacy protections. Suddenly every doctor could ensure that every person had up to date immunizations at every visit.
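In the spirit of that vision, a single self-describing data element might look something like this - a sketch only; the tag names and privacy flags are my own invention, not the PCAST specification.

# An immunization record carrying its own metadata and privacy flags
immunization = {
    "data": {"vaccine": "influenza", "date": "2011-02-15"},
    "metadata": {
        "vocabulary": "CVX",                     # assumed coded vaccine vocabulary
        "source": "Hospital A",
        "patient_hash": "opaque-identity-hash",  # identity handled elsewhere (see item 8)
    },
    "privacy": {"share_with_registries": True, "share_for_research": False},
}

def allowed_for(record, purpose_flag):
    # A federated registry search would honor the privacy flags before returning data
    return record["privacy"].get(purpose_flag, False)

print(allowed_for(immunization, "share_with_registries"))  # -> True
print(allowed_for(immunization, "share_for_research"))     # -> False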
6. Ontologies and data models will be important to support analytics
Part of creating middleware solutions that enable federation of data sources requires that we generally know what data is important in healthcare and how data elements relate to each other. For example, it's important to know that an allergy has a substance, a severity, a reaction, an observer, and an onset date. Every EHR may implement allergies differently, but by using a common detailed clinical model for data exchange and querying we can map heterogeneous data into comparable data.
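A detailed clinical model can be as simple as a shared class that every source gets mapped into. Here's a sketch in Python, with two invented source formats standing in for different EHRs.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Allergy:
    # The agreed-upon detailed clinical model
    substance: str
    severity: Optional[str] = None
    reaction: Optional[str] = None
    observer: Optional[str] = None
    onset_date: Optional[str] = None

def from_ehr_a(record):
    # Hypothetical EHR A stores allergies as flat key/value pairs
    return Allergy(substance=record["allergen"],
                   severity=record.get("sev"),
                   reaction=record.get("rxn"),
                   onset_date=record.get("onset"))

def from_ehr_b(record):
    # Hypothetical EHR B nests the details differently
    return Allergy(substance=record["agent"]["name"],
                   severity=record["details"].get("severity"),
                   observer=record["details"].get("recorded_by"))

a = from_ehr_a({"allergen": "penicillin", "sev": "severe", "rxn": "hives"})
b = from_ehr_b({"agent": {"name": "penicillin"},
                "details": {"severity": "moderate", "recorded_by": "Dr. Kim"}})
print(a.substance == b.substance)  # comparable once mapped -> True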
7. Mapping free text to controlled vocabularies will be possible and should be done as close to the source of data as possible.
Every industry has its jargon. Most clinicians do not wake up every morning thinking about SNOMED-CT concepts or ICD-10 codes. One way to leverage unstructured data is to turn it into structured data as it is entered. If a clinician types "Allergy to Penicillin", it could become SNOMED-CT concept 294513009 for Penicillins. As more controlled vocabularies are introduced in medicine and other industries, transforming text into controlled concepts for later searching will be increasingly important. Ideally, this will be done as the data is entered, so it can be checked for accuracy. If not at entry, then transformations should be done as close to the source systems as possible to ensure data integrity. With every transformation and exchange of data from the original source, there is increasing risk of loss of meaning and context.
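The simplest version of that transformation is a normalization step plus a terminology lookup at the point of entry. Here's a sketch; the lookup table holds only the concept cited above, and a real system would call a full terminology service instead.

# Illustrative lookup; a production system would query a terminology service
TERMINOLOGY = {
    "allergy to penicillin": "294513009",  # the SNOMED-CT concept cited above
}

def normalize(text):
    # Lower-case and collapse whitespace so near-identical entries match
    return " ".join(text.lower().split())

def to_concept(free_text):
    # Return the coded concept if we can, otherwise keep the original text
    concept = TERMINOLOGY.get(normalize(free_text))
    return concept if concept else free_text

print(to_concept("Allergy to Penicillin"))      # -> 294513009
print(to_concept("Reacts badly to shellfish"))  # unmapped, text preserved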
8. Linking identity among heterogeneous databases will be required for healthcare reform and novel business applications.
If a patient is seen in multiple locations, how can we combine their history so they get the maximum benefit of alerts, reminders, and decision support? Among the hospitals I oversee, we have persistent linkage of all medical record numbers between hospitals - a master patient index. Surescripts/RxHub does a real-time probabilistic match on name/gender/date of birth for over 150 million people. There are other interesting creative techniques such as those pioneered by Jeff Jonas for creating a unique hash of data for every person, then linking data based on that hash. For example, John, Jon, Jonathan, and Johnny are reduced to one common root name, John. "John" and the other demographic fields are then hashed using SHA-1. The hashes are compared between records to link similar hashes. In this way, records about a person can be aggregated without ever disclosing who the person really is - it's just hashes that are used to find common records.
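A stripped-down sketch of the hashing approach in Python - my own illustration of the general idea, not Jeff Jonas's actual algorithm or Surescripts' matching logic. The nickname table and demographics are invented.

import hashlib

NICKNAMES = {"jon": "john", "jonathan": "john", "johnny": "john"}  # illustrative

def identity_hash(first_name, last_name, dob, gender):
    # Reduce name variants to a common root, then hash the demographics
    root = NICKNAMES.get(first_name.lower(), first_name.lower())
    key = "|".join([root, last_name.lower(), dob, gender.lower()])
    return hashlib.sha1(key.encode("utf-8")).hexdigest()

# Records from two different systems link on the hash, never on the raw identity
h1 = identity_hash("Jonathan", "Doe", "1961-05-20", "M")
h2 = identity_hash("Johnny", "Doe", "1961-05-20", "M")
print(h1 == h2)  # -> True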
9. New tools will empower end users
All users, not just power users, want web-based or simple to use client server tools that allow data queries and visualizations without requiring a lot of expertise. The next generation of SQL Server and PowerPivot offer this kind of query power from the desktop. At BIDMC, we've created web-based parameterized queries in our Meaningful Use tools, we're implementing PowerPivot, and we're creating a powerful hospital-based visual query tool using I2B2 technologies.
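To illustrate what a web-based parameterized query means in practice, here's a tiny sketch in generic Python with SQLite - not our actual Meaningful Use or I2B2 code. The end user supplies only the parameters from a form; the SQL itself never changes. The table and database names are assumptions.

import sqlite3

def patients_with_diagnosis(conn, dx_code, start_date, end_date):
    # The SQL is fixed; users only choose the parameters from a web form
    query = """
        SELECT COUNT(DISTINCT patient_id)
        FROM encounters
        WHERE diagnosis_code = ?
          AND visit_date BETWEEN ? AND ?
    """
    return conn.execute(query, (dx_code, start_date, end_date)).fetchone()[0]

# Usage (assuming a local reporting copy of the data):
# conn = sqlite3.connect("reporting.db")
# print(patients_with_diagnosis(conn, "250.00", "2011-01-01", "2011-03-01"))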
10. Novel sources of data will be important
Today, patients and consumers are generating data from apps on smart phones, from wearable devices, and social networking sites. Novel approaches to creating knowledge and wisdom will source data from consumers as well as traditional corporate transactional systems.
Thus, as we all move toward "freeing the data", it will no longer be sufficient to use just structured transactional data entered by experts in a single organization, then mined by professional report writers. The speed of business and the need for enhanced quality and efficiency are pushing us toward near real time business intelligence and visualizations for all users. In a sense this mirrors the development of the web itself, evolving from expert HTML coders, to tools for content management for non-technical designated editors, to social networking where everyone is an author, publisher, and consumer.
"Freeing the data" is going to require new thinking about the way we approach application design and requirements. Just as security needs to be foundational, analytics need to be built in from the beginning.
I look forward to my keynote in a few weeks. Once I've delivered it, I'll post the presentation on my blog.