Showing posts with label healthcare IT risk. Show all posts

Tuesday, October 8, 2013

Quality and Safety Implications of Emergency Department Information Systems: ED EHR Systems Pose Serious Concerns, Report Says

A report "Quality and Safety Implications of Emergency Department Information Systems"
appeared in the Oct. 2013 issue of "Annals of Emergency Medicine."  It is available in full text at http://www.annemergmed.com/article/S0196-0644%2813%2900506-4/fulltext, or in PDF via the tab, free as of this writing.

First, a preamble:  I once tried to alert a hospital where I'd trained decades before, Abington Memorial Hospital (http://www.amh.org/), to impediments to safe care I'd noted in their EHRs, predominantly their ED EHR.  They did not listen.  In fact, their response to my concerns was characterized by apparent incompetence in conducting safety investigations.  For instance, to my written concern in an April 2010 letter to the CEO and CMO about the ED EHR that:

... I've also had to stop administration [to my mother] of an antibiotic (Levaquin) in the recent past in the ED that she has had an adverse reaction to (torn rotator cuff), despite my having told ED intake she was allergic to it. She relates that administration of Levaquin was then almost repeated on the floor until she herself refused it during that past admission.

This was the sworn testimony in May 2013 about the "investigation" that resulted, from the hospital's VP of Risk Management, Regina Sturgis:

A:      Deborah [hospital General Counsel] asked me to investigate the Levaquin issue which I did.
Q:      Did you do that on your own or did you delegate some of the --
A:      No.  I did it on my own.
Q:      Do you know whether any of the IT folks were ever brought in to look at the -- the EMR issues referenced in this letter?
A:     No, I do not.  I know that I was asked to look at the Levaquin because of my clinical background.
Q:      Okay.  Did you come up with any conclusions?
A:      Yes.
Q:      What was your conclusion?
A:      That she had been ordered Levaquin in the ETC [Emergency Trauma Center a.k.a. ED], that it had been discontinued about a very short period of time later, under a half an hour, and that she never received it.

So, the investigation of a complaint that family and then the patient herself had to stop administration of a drug, despite staff and the EHR having been informed of her allergy to it, consisted of confirming that the medication was never given.  No problem, the ED EHR is safe.

(I am not joking; that is the testimony given.  Imagine such an investigation and conclusion about, say, reported aircraft flaws, or, in the industry in which I was once a safety officer, public transit vehicle defects and dangers.)

However, when competent people investigate similar issues, the findings are concerning.  From Modern Healthcare (http://www.modernhealthcare.com/), a publication for healthcare executives, on the new Annals of Emergency Medicine article:

ED EHR systems pose serious concerns, report says

By Joseph Conn
Modern Healthcare

June 24, 2013
Electronic health-record systems used in emergency departments are beset with poor data displays, loaded with so many alerts warning of potential patient-safety issues that they can lead to user alert fatigue, and may be generating incorrect physician orders, according to a report by two emergency physicians' study groups.

Meanwhile, providers wanting to address these EHR issues are hampered by a lack of research and solid evidence of the extent of the problem with these systems, and by contract provisions with EHR vendors that stymie the free flow of information about system-linked safety concerns, the report authors say.

So, EDs across the country are rolling out this technology, often taking advantage of ARRA's HITECH incentives ... but there is a lack of research and solid evidence on the risks.  Allow me to opine - that's simply crazy.

The groups found that “poor data display is a serious problem with many of today's EDISs,” while “the sheer volume” of alerts that range from the “completely irrelevant to life threatening” [or lack of appropriate alerts to relevant, simple issues such as data input errors - ed.] can “dull the senses, leading to a failure to react to a truly important warning.” They also found that “an alarming number of clinicians are anecdotally reporting a substantial increase in the incidence of wrong order/wrong patient errors while using the computerized physician order entry component of information systems.”

The word "anecdote", as I have written, is being misused.  The reports are not "anecdotes."  They are risk management-relevant incident reports.  (See "From a Senior Clinician Down Under: Anecdotes and Medicine, We are Actually Talking About Two Different Things" at http://hcrenewal.blogspot.com/2011/08/from-senior-clinician-down-under.html.)

Two study groups from the American College of Emergency Physicians have recommended a program of systemic vigilance over electronic health-record systems used in emergency departments to improve patient safety and enhance quality of care.

ACEP workgroups on informatics and on quality improvement and patient safety published their findings in an article, “Quality and Safety Implications of Emergency Department Information Systems,” in the current issue of the Annals of Emergency Medicine.

Postmarket surveillance, a standard for decades in other healthcare sectors, has been absent from health IT due to a long-obsolete special regulatory accommodation afforded that industry.  This accommodation was initiated when systems were simple and merely advisory - not the comprehensive enterprise clinical resource and clinician command-and-control systems they are today.  Now, clinician investigators of the technology, such as the authors of this study, are realizing that continuing this accommodation is a mistake.

It follows in the wake of, and references, an Institute of Medicine report from 2011, “Health IT and Patient Safety: Building Safer Systems for Better Care.” That report concluded that “current market forces are not adequately addressing the potential risks associated with the use of health IT.” It also comes eight months after the New England Journal of Medicine published “Electronic Health Records and National Patient-Safety Goals,” which warned that recent evidence “has highlighted substantial and often unexpected risks resulting from the use of EHRs and other forms of health information technology.”

I note that if you frequent this blog, you likely read material similar to the statements quoted above here first, as authored by me, dating to the founding of this blog in 2004.

... “The rush to capitalize on the huge federal investment of $30 billion for the adoption of electronic medical records led to some unfortunate and unintended consequences, particularly in the unique emergency department environment,” said Dr. Heather L. Farley, the lead author of the report, in a news release. “The irreversible drive toward EDIS implementation should be accompanied by a constant focus on improvement and hazard prevention." Farley is assistant chairwoman of the Department of Emergency Medicine at Christiana Care Health System in Newark, Del.

Ironically, I note in Dr. Farley's statement some of my own advice, given to ED staff when I was Chief Medical Informatics Officer at Christiana Care from 1996 to 1998.  In that period I had advised Charles Reese IV, MD, Chair of Emergency Medicine, not to implement EHRs - or, at best, to implement document imaging systems (since ED charts are not that long or complex) rather than full field-based EHRs - due to the "unfortunate and unintended consequences" of bad health IT in such an environment, which I recognized even then.  It was only a few years ago that my advice was finally overturned.

The authors also report “(t)here are few consistent data on how commonly these errors occur, and few studies are actually focused on collecting evidence of these errors.” Meanwhile, “there is currently no mechanism in place to systematically allow, let alone encourage, users to provide feedback about ongoing safety issues or concerns” with EHRs in general, and EDISs specifically.

On its face, that is not a safety-conscious environment, and the rollout and use of such systems seems a fundamental violation of patients' rights, made worse by the fact that there is no informed consent process whatsoever for ED EHR use.

The workgroups came up with seven recommendations: appointing an emergency department “clinician champion,” creating within healthcare delivery organizations an EDIS performance improvement group and an ongoing review process, paying timely attention to EDIS-related patient-safety issues raised by the review process, disseminating to the public lessons learned from performance improvement efforts, distributing vendors' product updates in a timely manner to all EDIS users and removing the “hold harmless” and “learned intermediary” clauses from vendor contracts.

Many of these issues have been discussed on this blog.

“The learned intermediary doctrine implies that the end users (clinicians) are the medical experts and should be able to detect and overcome any fallibility or contributing factor of the product,” the authors said.

I have also pointed out the absurdity of such a "doctors are clairvoyant" attitude, e.g., at my 2011 post on basic common sense on IT adverse consequences at http://hcrenewal.blogspot.com/2011/04/common-sense-on-side-effects-lacking-in.html.

They conclude that the “lack of accountability for vendors through hold harmless clauses and the shifting of liability to the clinicians through the learned intermediary doctrine are significant and additional impairments to safety improvement. Electronic health records and EDISs are sufficiently complex that the physician and other users cannot be expected to anticipate unpredictable errors.”

That aligns with the work of Dr. Jon Patrick in Sydney, whose treatise "A study of an Enterprise Health information System" on the Cerner FirstNet ED EHR is available here: http://sydney.edu.au/engineering/it/~hitru/index.php?option=com_content&task=view&id=91&Itemid=146

Earlier this month, the Electronic Health Record Association, an EHR developers trade group affiliated with the Chicago-based Healthcare Information and Management Systems Society, announced the launch of a voluntary “code of conduct” in which adherents would agree to drop “gag clauses” in the contracts with their provider customers.

Great.  Per the wonderful 2007 article "The Denialists' Deck of Cards: An Illustrated Taxonomy of Rhetoric Used to Frustrate Consumer Protection Efforts" by Chris Jay Hoofnagle, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=962462, as of this writing free:

... At this point [of losing the argument], the denialist must propose "self regulation" to deal with the problem that doesn't exist. The cool thing about self regulation is that it cannot be enforced, and once the non-existent problem blows over, the denialist can simply scrap it! [20]

[20] In the runup to passage of bank privacy legislation, data brokers created a group called the "Individual Reference Services Group" that promptly disappeared after the legislation passed.

("Denialism" is the use of rhetorical techniques and predictable tactics to erect barriers to debate and consideration of any type of reform, regardless of the facts.)

IMO 'self regulation' of healthcare is, on its face, a deception.  There are simply too many conflicts of interest.

On use of "integrated" big systems:

“These systems do have glitches [indeed - see http://hcrenewal.blogspot.com/search/label/glitch - ed], but it can be plain and simple bad design that can lead to clinical errors,” Cozzens said.  But ED physicians, he said, are “having the enterprise systems forced upon them. To think you can take one system and adapt it to those different environments is totally wrong. That's why you see low physician satisfaction and the productivity is going down, all for the sacrifice of having an integrated system.”

In fact, so-called "best of breed" systems can be bad health IT as well.  See the aforementioned evaluation by Dr. Patrick in Australia.

Bad Health IT ("BHIT") is defined as IT that is ill-suited to purpose, hard to use, hard to customize, unreliable, loses data or provides incorrect data, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation. 


Through my own work, I've seen bad health IT result in patient harm and death.  It's just unfortunate that I got started in this line of work by being, in effect, shot out of a cannon.  That is, my own mother was a victim.

-- SS

Addendum 10/8/13:

From the article:

End-User Recommendation 4: EDIS-related patient safety concerns identified by the review process should be addressed in a timely manner by ED providers, the EDIS vendors, and hospital administration. Each of these processes should be performed in full transparency, specifically with openness, communication, and accountability. 

I'm not sure the aforementioned Levaquin near-accident "investigation" meets these standards.

-- SS

Monday, September 16, 2013

An Open Letter to David Bates, MD, Chair, ONC FDASIA Health IT Policy Committee on Recommendations Against Premarket Testing and Validation of Health IT

From http://www.healthit.gov/policy-researchers-implementers/federal-advisory-committees-facas/fdasia:

The Food and Drug Administration Safety Innovation Act (FDASIA) Health IT Policy Committee Workgroup is charged with providing expert input on issues and concepts identified by the Food and Drug Administration (FDA), Office of the National Coordinator for Health IT (ONC), and the Federal Communications Commission (FCC) to inform the development of a report on an appropriate, risk-based regulatory framework pertaining to health information technology including mobile medical applications that promotes innovation, protects patient safety, and avoids regulatory duplication.

My Open Letter to the Committee's chair speaks for itself:

From: Scot Silverstein
Date: Mon, Sep 16, 2013 at 9:39 AM
Subject: ONC FDASIA Health IT Policy Committee's recommendations on Premarket Surveillance
To: David Bates

Sept. 16, 2013

David Bates, Chair, ONC FDASIA Health IT Policy Committee
via email
   
Dear David,

I am disappointed (and in fact appalled) at the ONC FDASIA Health IT Policy Committee's recommendations that health IT including typical commercial EHR/CPOE systems not be subjected to a premarket testing and validation process.  I believe this recommendation is, quite frankly, negligent. [1]

As you know, my own mother was injured and then died as a result of EHR deficiencies, and nearly injured or killed again in the recuperation period from her initial injuries by more health IT problems in a second EHR used in her care.  In my legal consulting and from my colleagues, as well as from the literature, I hear about other injuries/deaths and many "near misses" as well.  That your recommendations came in the face of the recent ECRI Deep Dive study is even more appalling, with the latter's finding of 171 health IT-related incidents in 9 weeks from 36 member PSO hospitals, resulting in 8 injuries and 3 possible deaths, all reported voluntarily. [2]

It is my expert opinion that the issues that cause these outcomes would never have made it into production systems, had a reasonable, competent, unbiased premarket testing and validation process been in place.

Consequently, I have shared the FDASIA HIT Policy Committee's recommendations with the Plaintiff's Bar, and will use its recommendations in my presentations to various chapters of the American Association for Justice (the trial lawyers' association) - as well as to interested Defense attorneys so they may advise their clients accordingly.

I am also recommending that in any torts, individual or class, regarding EHR problems that would likely have been averted with competent premarket testing and validation, the FDASIA HIT Policy Committee members who agreed with the recommendation be considered possible defendants.

I am sorry it has come to this.

Please note I am also posting this message for public viewing at the Healthcare Renewal weblog of the Foundation for Integrity and Responsibility in Medicine (FIRM).

Sincerely,

Scot Silverstein, MD
Consultant/Independent Expert Witness in Healthcare Informatics
Adjunct Faculty, Drexel University, College of Computing and Informatics

Notes:


[1] FDA Law Blog, Recommendations of FDASIA Health IT Workgroup Accepted, September 11, 2013, available at http://www.fdalawblog.net/fda_law_blog_hyman_phelps/2013/09/recommendations-of-fdasia-health-it-workgroup-accepted.html: "Of particular interest is the recommendation that health IT should generally not be subject to FDA premarket requirements, with a few exceptions:  medical device accessories, high-risk clinical decision support, and higher risk software use cases."

[2] "Peering Underneath the Iceberg's Water Level: AMNews on the New ECRI 'Deep Dive' Study of Health IT Events", Feb. 28, 2013, available at http://hcrenewal.blogspot.com/2013/02/peering-underneath-icebergs-water-level.html.

-----------------------------------------------

Note: the following are listed on the linked site above as members of the committee:

Member List
  • David Bates, Chair, Brigham and Women’s Hospital
  • Patricia Brennan, University of Wisconsin-Madison
  • Geoff Clapp, Better
  • Todd Cooper, Breakthrough Solutions Foundry, Inc.
  • Meghan Dierks, Harvard Medical Faculty, Division of Clinical Informatics
  • Esther Dyson, EDventure Holdings
  • Richard Eaton, Medical Imaging & Technology Alliance
  • Anura Fernando, Underwriters Laboratories
  • Lauren Fifield, Practice Fusion, Inc.
  • Michael Flis, Roche Diagnostics
  • Elisabeth George, Philips Healthcare
  • Julian Goldman, Massachusetts General Hospital/ Partners Healthcare
  • T. Drew Hickerson, Happtique, Inc.
  • Jeffrey Jacques, Aetna
  • Robert Jarrin, Qualcomm Incorporated
  • Mo Kaushal, Aberdare Ventures/National Venture Capital Association
  • Keith Larsen, Intermountain Health
  • Mary Anne Leach, Children’s Hospital Colorado
  • Meg Marshall, Cerner Corporation
  • Mary Mastenbrook, Consumer
  • Jackie McCarthy, CTIA - The Wireless Association
  • Anna McCollister-Slipp, Galileo Analytics
  • Jonathan Potter, Application Developers Alliance
  • Jared Quoyeser, Intel Corporation
  • Martin Sepulveda, IBM
  • Joseph Smith, West Health
  • Paul Tang, Palo Alto Medical Foundation
  • Bradley Thompson, Epstein Becker Green, P.C.
  • Michael Swiernik, MobileHealthRx, Inc.
Federal Ex Officios
  • Jodi Daniel, ONC
  • Bakul Patel, FDA
  • Matthew Quinn, FCC

 -- SS

Thursday, September 5, 2013

Study Explains Errors Caused by EHR Default Values - With Only Four Reports of "Temporary" (By the Grace of God) Patient Harm

From Health Data Management and the Pennsylvania Patient Safety Authority:

Study Explains Errors Caused by EHR Default Value
Sept. 5, 2013

A new study analyzes errors related to “default values,” which are standardized medication order sets in electronic health records and computerized physician order entry systems.

The Pennsylvania Patient Safety Authority, an independent state agency, conducted the study. “Default values are often used to add standardization and efficiency to hospital information systems,” says Erin Sparnon, an analyst with the authority and study author. “For example, a healthy patient using a pain medication after surgery would receive a certain medication, dose and delivery of the medication already preset by the health care facility within the EHR system for that type of surgery.”

These presets are the default value, but safety issues can arise if the defaults are not appropriately used. Sparnon studied 324 verified safety reports, noting that 314, or 97 percent, resulted in no harm. Six others were reported as unsafe conditions that caused no harm and four reports caused temporary harm involving some level of intervention.

One might ask:  how many unreported or yet-to-be-reported EHR/CPOE cases involved, or will involve, permanent harm?

Sept. 9, 2013 Addendum:  We learn this from Healthcare IT News (http://www.healthcareitnews.com/news/ehr-adverse-events-data-cause-alarm):  "Sparnon said two of the reports involved temporary harm that required initial or prolonged hospitalization."  I note that hospitalization, especially prolonged hospitalization, exposes the patient to still more risk.

Regarding the "temporary harms," one of which involves "default times" as opposed to doses:

The four cases requiring intervention involved accepting a default dose of a muscle relaxant that was higher than the intended dose, giving an extra dose of morphine [keep playing with 'extra doses' of drugs like morphine enough, and you're going to kill someone  - ed.] because of an accepted default administration time that was too soon after the last dose, having a patient’s temperature spike after a default stop time automatically cancelled an antibiotic [do this enough, and you're going to get sepsis and septic shock to deal with - ed.] and rising sodium levels in a patient because confused wording made nurses believe that respiratory therapy was administering an ordered antidiuretic. [Apparently the 'default'- ed.]

More on the errors:

The most common types of errors in the study were wrong time (200), wrong dose (71) and inappropriate use of an automated stopping function (28). 

Any of these, especially the latter two, could have caused harm depending on degree ... and, to those Risk Management majors out there, eventually will.

 “Many of these reports also showed a source of erroneous data and the three most commonly reported sources were failure to change a default value, user-entered values being overwritten by the system and failure to completely enter information which caused the system to insert information into blank parameters,” Sparnon says. 
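The three error sources Sparnon describes can be sketched in a few lines of hypothetical code. This is an illustration of the failure pattern only - the order-set fields and merge logic below are invented, not any vendor's actual implementation:

```python
# Toy model of a CPOE order form with order-set defaults, illustrating the
# three reported error sources: (1) failure to change a default value,
# (2) user-entered values overwritten by the system, and (3) incomplete
# entry causing the system to back-fill blank parameters.

DEFAULTS = {"dose_mg": 10, "interval_hr": 4, "stop_after_hr": 72}

def build_order(user_entries, system_refresh=False):
    """Merge clinician entries over the order-set defaults."""
    order = dict(DEFAULTS)               # (1) unchanged defaults silently persist
    order.update(user_entries)
    if system_refresh:
        order = dict(DEFAULTS)           # (2) a refresh overwrites user-entered values
    for field, preset in DEFAULTS.items():
        order.setdefault(field, preset)  # (3) blank fields are back-filled with presets
    return order

# (1) Clinician intends 5 mg but never touches the dose field:
assert build_order({})["dose_mg"] == 10  # the default dose goes out, not 5 mg

# (2) Clinician enters 5 mg, but a system event re-applies the defaults:
assert build_order({"dose_mg": 5}, system_refresh=True)["dose_mg"] == 10

# (3) Clinician leaves the stop time blank; the system inserts one anyway:
assert build_order({"dose_mg": 5})["stop_after_hr"] == 72
```

In each case the order that leaves the system differs from the order the clinician believed they entered, with no error raised - which is precisely why such defects surface as incident reports rather than software faults.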

Hence my claim that the term "EHR" is anachronistic, and that these systems now are really cybernetic command-and-control mediators and regulators of care (via cybernetic proxy).

“There were also nine reports that showed a default needed to be updated to match current clinical practice.”

The need for a constant, rigorous updating process (which will in the real world likely always be 'behind'), among many others, is a factor that makes the idealistic belief/promise that "health IT will save money and increase safety" (let alone "revolutionize" medicine) unpersuasive.

I note that the "default values" risk is only one of many, many "features" of EHRs and other clinical IT that cause risk and error.  This issue is but one layer of a very, very large and multi-layered onion (cf. AHRQ Health IT Hazards Manager, http://healthit.ahrq.gov/sites/default/files/docs/citation/HealthITHazardManagerFinalReport.pdf, for example).

And this, at the same time that the HIT Policy Committee, a body of industry stakeholders who advise federal officials, on Sept. 4 adopted final recommendations on health IT risk consistent with an attitude of health IT exceptionalism that included:

"HIT should not be subject to FDA [or any - ed.] premarket requirements" and "Vendors should be required to list products which are considered to represent at least some risk if a non-burdensome approach can be identified."  

If not, no list? ... And what, exactly, is a "non-burdensome" approach, one might ask? 

(See www.healthdatamanagement.com/news/health-information-technology-regulation-fda-onc-fcc-46557-1.html)

 ------------------

Perhaps the penalty for health IT hyperenthusiasts**, short of the Biblical penalty of one of their loved ones suffering the fate of a guinea pig in a medical experiment, should be to be compelled to fly some third rate air carrier with a safety record of "we only had a few near-crashes last month."

-- SS

** [See definition of health IT hyperenthusiast at Doctors and EHRs: Reframing the 'Modernists v. Luddites' Canard to The Accurate 'Ardent Technophiles vs. Pragmatists' Reality' at http://hcrenewal.blogspot.com/2012/03/doctors-and-ehrs-reframing-modernists-v.html]
 

Wednesday, August 28, 2013

Calling Dr. Moe, Dr. Larry and Dr. Curly: Advocate Medical Breach of Four Million Patient Records, and No Encryption

At my Oct. 2011 post "Still More Electronic Medical Data Chaos, Pandemonium, Bedlam, Tumult and Maelstrom: But Don't Worry, Your Data is Secure" (http://hcrenewal.blogspot.com/2011/10/still-more-ehr-chaos-pandemonium-bedlam.html) I thought I'd seen the worst.

However, here is yet another post to add to the category of medical record privacy/confidentiality/security (http://hcrenewal.blogspot.com/search/label/medical%20record%20privacy):

Advocate Medical Breach: No Encryption?
Computer Theft Raises Questions About Unencrypted Devices
By Marianne Kolbasuk McGee, August 27, 2013.

The recent theft of four unencrypted desktop computers from a Chicago area physician group practice may result in the second biggest healthcare breach ever reported to federal regulators. But the bigger issue is: Why do breaches involving unencrypted computer devices still occur?

According to the Department of Health and Human Services' "wall of shame" website listing 646 breaches impacting 500 or more individuals since September 2009, more than half of the incidents involved lost or stolen unencrypted devices. Incidents involving data secured by encryption do not have to be reported to HHS.

... The four unencrypted but password-protected computers [passwords on PC's are bypassable by smart teenagers - ed.] stolen during a burglary in July from an office of Advocate Medical Group in Illinois may have exposed information of about 4 million patients, according to an Advocate spokesman.

4 million is about 1.3 percent of the entire U.S. population (about 313.9 million in 2012) ... on just four desktop computers.
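That back-of-the-envelope figure is easy to verify:

```python
# Share of the U.S. population exposed: 4 million records against the
# ~313.9 million 2012 population cited above.
records_exposed = 4_000_000
us_population_2012 = 313_900_000

share_pct = records_exposed / us_population_2012 * 100
print(round(share_pct, 1))  # → 1.3 (percent)
```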

Try that with paper ...

As to the subtitle of the article, "Computer Theft Raises Questions About Unencrypted Devices", I've written on that issue before.  I'd noted that such questions are remarkable considering both Mac OS X and Windows have built-in, readily available disk encryption, the latter for a few extra dollars in the "deluxe" editions (see http://en.wikipedia.org/wiki/FileVault and http://en.wikipedia.org/wiki/Bitlocker).

Perhaps the best explanation in 2013 for unencrypted desktop PC's containing millions of confidential medical records is this picture, symbolic of the apparent attitudes of corporate and IT management on health IT security:


Encryption?  We don't need no encryption.  We got triple protection already!


-- SS

Friday, August 23, 2013

A Good Way to Cybernetically Harm or Kill Emergency Department Patients ... Via An ED EHR "Glitch" That Mangles Prescriptions

Yet another healthcare IT "glitch" - that banal little word used for potentially life-threatening software defects.  (See the query link http://hcrenewal.blogspot.com/search/label/glitch for more examples.)

An EHR/command and control system (including ordering, results reporting, etc.)  for hospital Emergency Departments, Picis Pulsecheck, was recalled by FDA.

Reason?  "Notes associated with prescriptions are not printed to the prescription or to the patient chart."  The data apparently is not being sent to the printer or being stored for future visits.  Instead, data input by clinical personnel, in one of the most risk-prone medical settings, the Emergency Department, is simply going away.

This is reminiscent of the truncation of prescription drug "long acting" suffixes, apparently by a Siemens system, that led to thousands of prescription errors (perhaps tens of thousands) over more than a year's time.  I wrote about that matter, as reported by the news media, at "Lifespan (Rhode Island): Yet another health IT "glitch" affecting thousands - that, of course, caused no patient harm that they know of - yet" at http://hcrenewal.blogspot.com/2011/11/lifespan-rhode-island-yet-another.html
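The truncation failure mode is trivially easy to introduce in software. A hypothetical sketch (the drug name, field width, and function are invented for illustration - this is not the actual Siemens or Picis code):

```python
# Hypothetical sketch of silent suffix truncation: a fixed-width interface
# field drops the end of a drug name, with no error and no warning to the
# prescriber. The 20-character limit is an assumed value.

FIELD_WIDTH = 20

def to_fixed_width(drug_name, width=FIELD_WIDTH):
    # Silent truncation - anything past the field width is simply discarded.
    return drug_name[:width]

original = "diltiazem ER (extended release)"
transmitted = to_fixed_width(original)

print(transmitted)  # the suffix marking the long-acting form is lost
assert "extended release" not in transmitted
```

A safe interface would reject or flag over-length values rather than discard clinically critical modifiers; that a defect this simple can persist across versions is the point of the examples above.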

Regarding the current Picis recall, notes connected with prescriptions can be crucial to the pharmacist or the patient.  Loss of those notes - apparently due to a computer glitch, and in this case most likely without the prescribing clinician knowing about it - has likely been going on for some time now, since two software versions (5.2 and 5.3) are affected.

The solution for now?

"Consignees were provided with recommended actions until they receive the necessary update."

In other words, a workaround that adds more work for clinicians, who now must not only take care of patients but - in the unregulated health IT market, as if they didn't already have enough to do in the often-chaotic ED - babysit computer glitches as well, and pray they catch potential computer errors 100% of the time.

Below is the FDA recall notice from "Medical & Radiation Emitting Device Recalls", at http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfRes/res.cfm?ID=119832.

At this additional link we find that this FDA recall was "Voluntary: Firm Initiated."  They apparently informed the FDA of the "glitch."

My question is - how did the company become aware of this "glitch"?  Also, were any patients put in harm's way, or injured, as a result of the prescription data loss?




FDA Device Recall Notice (text below).



Class 2 Recall
ED PulseCheck

Date Posted: July 29, 2013
Recall Number: Z-1814-2013
Product: Picis ED Pulsecheck - EMR Software Application - 2125, Software Versions: 5.2 and 5.3. The application stores patient information in a database, and it may analyze and/or display the data in different formats for evaluation by healthcare professionals for informational purposes.
Code Information: Software Versions 5.2 and 5.3
Recalling Firm/Manufacturer: Picis Inc., 100 Quannapowitt Parkway, Suite 405, Wakefield, Massachusetts 01880
For Additional Information Contact: Support Representative, 781-557-3000
Reason for Recall: Notes associated with prescription are not printed to the prescription or to the patient chart.
Action: Initial customer notifications were sent via email on June 21, 2013 informing consignees of the recall and providing further instruction regarding the software solution. Consignees were provided with recommended actions until they receive the necessary update.
Quantity in Commerce: 35
Distribution: Nationwide Distribution, including the states of: AK, AR, AZ, CA, CO, DC, DE, FL, GA, ID, IN, MA, MD, MO, NH, NJ, OH, OR, SC, TN, WA, and WV.

Finally, I ask - how did this "glitch" escape the notice of the company before the software was put into production, not in just one but through two sequential versions?

I propose that the lack of health IT regulatory controls, due to the industry's special accommodation, makes thorough software testing less "desirable" to a company (largely due to its costs).

Compare that to, say, software regulation in the Federal Aviation Administration:


FAA Aircraft Software Approval Guidelines - available at http://www.faa.gov/documentLibrary/media/Order/8110.49%20Chg%201.pdf.  Click to access.

The FAA document begins:

"This order establishes procedures for evaluating and approving aircraft software and changes to appropriate approved aircraft software procedures."

Software regulation in other mission-critical industries like aviation and pharma makes the health IT industry and its lack of regulation look pathetic.


-- SS

Friday, August 9, 2013

A War on Patients: Panel Says EHRs Should Not Be Vetted Before Marketing and Deployment

"First, do harm - it's a learning experience, and injured or dead patients are just a bump in the road, anyway" - the apparent creed of the healthcare computing hyperenthusiasts

Joe Conn and Modern Healthcare published the following article:

Work group says OK to some HIT safety regs (link), Joe Conn, Modern Healthcare, Aug. 7, 2013

What is important is what safety regs the Workgroup said "no" to.  It comes as no surprise:
A federally chartered special work group with representatives from three federal agencies has submitted its draft recommendations on establishing a regulatory framework for health information technology. Chief among those recommendations is that health IT should not be subjected to pre-market federal regulation, but there were a few exceptions.

The exceptions are narrow, and are likely already covered as Class III medical devices by FDA (see http://www.fda.gov/MedicalDevices/DeviceRegulationandGuidance/Overview/ClassifyYourDevice/):

The exceptions under which there should be FDA regulation, according to the work group, include medical device accessories to be defined as such by the FDA; certain forms of “high risk” clinical decision support systems, such as “computer aided diagnostics,” also to be defined by the FDA; and some “higher risk software” use cases to be defined by the committee's own safety work group.

They did acknowledge the need for postmarket surveillance:
... The group also recommended: developing a federally supported, post-market surveillance system for health IT products “to ensure safety-related decision support is in place,” creating a process for gathering information on safety issues, aggregated at the federal level and establishing a public process for “customer rating of HIT to enhance transparency.”

Dr. David Bates [a professor at Harvard Medical School], chairman of the Food and Drug Administration Safety Innovation Act work group, presented the preliminary findings Wednesday at a meeting of HHS' Health Information Technology Policy Committee.

Let me translate this into plain English: health IT systems (and their upgrades and patches) are recommended to be free from pre-market regulation and regulatory vetting. Patients are to be the guinea pigs for testing of the software.

If patients are harmed or killed, they get the honor of being named as "postmarket surveillance learning cases" who gave their all for the betterment of healthcare information technology.  

(Without their consent, but who needs consent to test experimental and unvetted devices on guinea pigs?)

Bates did express some liability concerns:

Asked during a question and answer period following his presentation whether the committee had considered the liability implications of its recommendations, Bates said, “It's not something we discussed at length, but it's something we can discuss over the next month.”

I, on the other hand, as a legal consultant on health IT-related medical errors and evidence tampering, am considering liability issues.

Unfortunately, patients would rather be whole than in lawsuits (or dead). Also, sadly, it is physicians and nurses who will bear the brunt, if not all, of the liability for bad outcomes due to defective IT, such as described in two recent posts with vendor alerts regarding serious flaws in which medication and other orders are not retained.

A clarification for all those proletarians who lack Harvard educations, and for the Workgroup members as well. The manufacturer safety alerts above describe life-threatening, fundamental flaws: entered text that "disappears," apparently found in live-patient scenarios, and other "glitches" that did cause life-threatening errors, sometimes en masse involving thousands of patients, such as another apparent Siemens debacle at http://hcrenewal.blogspot.com/2011/11/lifespan-rhode-island-yet-another.html. Allow me to point out that these flaws would likely not have occurred if the systems had been vetted before being turned loose on patients.

Finally:  David and panel members, my mother and I thank you profusely. 

Oh wait...my mother can't thank you, she's dead from the toxic effects of un-premarket-vetted health IT on simple care processes at the very hospital where I performed my residency two decades ago.

She might have died a few times before she actually did, thanks to other IT "glitches" that cropped up during her recovery from the first one, but I was able (in one case, by the sheer happenstance of showing up at the right time) to discover, or provide staff with information to work around, additional unvetted-health-IT flaws before those did her in.

It's taken more than a decade for critical-thinking, unconflicted writers and researchers ("iconoclasts") to force cybernetics-over-all hyperenthusiasts (see here) like Bates and his panel members to own up to the risks of health IT at all, e.g. via sites like this blog and this teaching site. These panel members IMO have their heads buried in the sand.

Dr. Bates and his panel are, in my opinion, healthcare IT extremists, a stance reflecting in part the apparent belief that computers have more rights than patients - along with the other beliefs mentioned in this post: "Another Health IT 'Glitch' - Can Digital Disappearing Ink Kill Patients?" at http://hcrenewal.blogspot.com/2013/08/another-health-it-glitch-can.html.

-- SS

Monday, August 5, 2013

Another Health IT "Glitch" - Can Digital Disappearing Ink Kill Patients?

Yes, it can.

There's been yet another "glitch" in the world of health IT (see http://hcrenewal.blogspot.com/search/label/glitch for more examples).

"Glitch" is a banal term used by health IT extremists (those who have abandoned a rigorous scientific approach to these medical devices as well as basic patient protections, in favor of unwarranted and inappropriate overconfidence and hyper-enthusiasm).  The term is used to represent potentially injurious and lethal problems with health IT, usually related to inadequate software vetting and perhaps even "sweatshop floor in foreign country directly to production for U.S. hospital floors" development processes (this industry is entirely unregulated).

This from Siemens Healthcare:



Click to enlarge.  Text below.

Text of this "Safety Advisory Notification":

August 1, 2013

Safety Advisory Notification
Soarian® Clinicals Medication Reconciliation EV06736602

Dear Customer:

This notification is to inform you that Soarian Clinicals Medication Reconciliation 3.3 may not be operating properly in some cases.

Although this may affect only some customers, we are taking a conservative approach and are alerting you to this potential problem. As such, please forward this notification to appropriate personnel as soon as possible.
This letter is being sent as a precautionary measure as there have been no adverse events reported from customers.

They mean "no adverse events reported - yet."  And if such events had been reported, Siemens would most certainly not make them public.  (Why should they, when there are no regulations?)

When does this issue occur and what are the potential risks?
This issue occurs when a user moves a free text in-house order from the current and home medications side (left side) to the discharge medication side (right side), and then modifies the continued free text in-house order in discharge reconciliation prior to saving the discharge reconciliation list. After the modification of the continued free text medication order, the changes to the free text medication order are not recorded in the saved discharge medication list.  [In other words, the changes to medication orders the user just typed disappear into thin air.  I note that medication reconciliation failures are among the most common causes of medical error - ed.]

The health IT extremists would invoke the "Learned Intermediary" doctrine, which lays all blame for errors on the user. It seems the only way to avoid such liability would be for users to verify, after every "enter" or "save" action (or perhaps after every keystroke?), exactly what was saved or entered...

The "fix" to this "glitch" is not too far off from that:

Immediate steps you should take to avoid the potential risk of this issue:

To prevent this issue from occurring at your facility, instruct users not to modify continued free text in-house orders on the discharge medication list. Users may be instructed to enter free text in-house orders manually by selecting the add prescription action button and entering the order.

This is known as a "workaround."  Anyone who believes this edict can and will be 100% reliably followed in often chaotic medical environments, by users from medical students to nurses to physicians, is truly cavalier.

Steps that Siemens is taking to correct this complaint:

We are diligently working to develop a correction and will test and deliver it as soon as possible.

Perhaps FDA and Joint Commission need to inquire about exactly what testing and QC was done on the current code, testing that (if actually performed) did not detect this glaring and Siemens-admitted safety-risk "glitch."

-- SS

Tuesday, July 2, 2013

Is ONC's definition of "Significant EHR Risk" when body bags start to accumulate on the steps of the Capitol?

In a June 25, 2013 Bloomberg News article "Digital Health Records’ Risks Emerge as Deaths Blamed on Systems" by technology reporter Jordan Robertson (http://go.bloomberg.com/tech-blog/author/jrobertson40/), an EHR-harms case in which I am (unfortunately) intimately involved as substitute plaintiff is mentioned:

When Scot Silverstein’s 84-year-old mother, Betty, starting mixing up her words, he worried she was having a stroke. So he rushed her to Abington Memorial Hospital in Pennsylvania.

After she was admitted, Silverstein, who is a doctor, looked at his mother’s electronic health records, which are designed to make medical care safer by providing more information on patients than paper files do. He saw that Sotalol, which controls rapid heartbeats, was correctly listed as one of her medications.

Days later, when her heart condition flared up, he re-examined her records and was stunned to see that the drug was no longer listed, he said. His mom later suffered clotting, hemorrhaged and required emergency brain surgery. She died in 2011. Silverstein blames her death on problems with the hospital’s electronic medical records.

“I had the indignity of watching them put her in a body bag and put her in a hearse in my driveway,” said Silverstein, who has filed a wrongful-death lawsuit. “If paper records had been in place, unless someone had been using disappearing ink, this would not have happened.”

How can I say that?  Because I trained in this hospital and worked as resident Admitting Officer in that very ED pre-computer.  The many personnel in 2010 who were given the meds history by my mother and me directed it not to paper for others to see, but to /dev/null.

Why can I say that?  Because the hospital's Motion for Prior Restraint (censorship) against me was denied outright by the presiding judge just days before the Bloomberg article was published (http://en.wikipedia.org/wiki/Prior_restraint):

Prior restraint (also referred to as prior censorship or pre-publication censorship) is censorship imposed, usually by a government, on expression before the expression actually takes place. An alternative to prior restraint is to allow the expression to take place and to take appropriate action afterward, if the expression is found to violate the law, regulations, or other rules.

Prior restraint prevents the censored material from being heard or distributed at all; other measures provide sanctions only after the offending material has been communicated, such as suits for slander or libel. In some countries (e.g., United States, Argentina) prior restraint by the government is forbidden, subject to certain exceptions, by a constitution.

Prior restraint is often considered a particularly oppressive form of censorship in Anglo-American jurisprudence because it prevents the restricted material from being heard or distributed at all. Other forms of restrictions on expression (such as actions for libel or criminal libel, slander, defamation, and contempt of court) implement criminal or civil sanctions only after the offending material has been published. While such sanctions might lead to a chilling effect, legal commentators argue that at least such actions do not directly impoverish the marketplace of ideas. Prior restraint, on the other hand, takes an idea or material completely out of the marketplace. Thus it is often considered to be the most extreme form of censorship.

The First Amendment lives.

(I wonder if it irks the hospital that they cannot perform sham peer review upon me now that the censorship motion is denied.  Sham peer review is a common reaction by hospital executives to "disruptive" physicians, but I have not worked there since 1987 and I no longer practice medicine.)

In the Bloomberg story Mr. Robertson wrote:

... “So far, the evidence we have doesn’t suggest that health information technology is a significant factor in safety events,” said Jodi Daniel (http://www.healthit.gov/newsroom/jodi-daniel-jd-mph), director of ONC’s office of policy and planning. “That said, we’re very interested in understanding where there may be a correlation and how to mitigate risks that do occur.”

In my opinion this statement represents gross negligence by a government official.  Ms. Daniel is unarguably working for a government agency pushing this technology.   She makes the claim that "so far the evidence we have doesn't suggest significant risk" while surely being aware (or having the fiduciary responsibility to be aware) of the impediments to having such evidence.

From my March 2012 post "Doctors and EHRs: Reframing the 'Modernists v. Luddites' Canard to The Accurate 'Ardent Technophiles vs. Pragmatists' Reality" at http://hcrenewal.blogspot.com/2012/03/doctors-and-ehrs-reframing-modernists-v.html  (yes, this was more than a year ago):

... The Institute of Medicine of the National Academies noted this in their late 2011 study on EHR safety:


... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.

Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.[IOM (Institute of Medicine). 2012. Health IT and Patient Safety: Building Safer Systems for Better Care (PDF). Washington, DC: The National Academies Press, pg. S-2.]

Also in the IOM report:

… “For example, the number of patients who receive the correct medication in hospitals increases when these hospitals implement well-planned, robust computerized prescribing mechanisms and use barcoding systems. But even in these instances, the ability to generalize the results across the health care system may be limited. For other products— including electronic health records, which are being employed with more and more frequency— some studies find improvements in patient safety, while other studies find no effect.

More worrisome, some case reports suggest that poorly designed health IT can create new hazards in the already complex delivery of care. Although the magnitude of the risk associated with health IT is not known, some examples illustrate the concerns. Dosing errors, failure to detect life-threatening illnesses, and delaying treatment due to poor human–computer interactions or loss of data have led to serious injury and death.”


I also noted that the 'impediments to generating evidence' effectively rise to the level of legalized censorship, as observed by Koppel and Kreda regarding gag and hold-harmless clauses in their JAMA article "Health Care Information Technology Vendors' Hold Harmless Clause: Implications for Patients and Clinicians", JAMA 2009;301(12):1276-1278. doi: 10.1001/jama.2009.398.

FDA had similar findings about impediments to knowledge of health IT risks, see my Aug. 2010 post "Internal FDA memorandum of Feb. 23, 2010 to Jeffrey Shuren on HIT risks. Smoking gun?" at http://hcrenewal.blogspot.com/2010/08/smoking-gun-internal-fda-memorandum-of.html.

I also note this from amednews.com's coverage of the ECRI Deep Dive Study (http://hcrenewal.blogspot.com/2013/02/peering-underneath-icebergs-water-level.html):


... In spring 2012, a surgeon tried to electronically access a patient’s radiology study in the operating room but the computer would show only a blue screen. The patient’s time under anesthesia was extended while OR staff struggled to get the display to function properly. That is just one example of 171 health information technology-related problems reported [voluntarily] during a nine-week period [from 36 hospitals] to the ECRI Institute PSO, a patient safety organization in Plymouth Meeting, Pa., that works with health systems and hospital associations in Kentucky, Michigan, Ohio, Tennessee and elsewhere to analyze and prevent adverse events. Eight of the incidents reported involved patient harm, and three may have contributed to patient deaths, said the institute’s 48-page report, first made privately available to the PSO’s members and partners in December 2012. The report, shared with American Medical News in February, highlights how the health IT systems meant to make care safer and more efficient can sometimes expose patients to harm.


One wonders if Ms. Daniel's definition of "significant" is when body bags start to accumulate on the steps of the Capitol.

I also note she is not a clinician but a JD/MPH.

I am increasingly of the opinion that non-clinicians need to be removed from positions of health IT leadership at regional and national levels.

In large part many just don't seem to have the experience, insights and perhaps ethics necessary to understand the implications of their decisions.

At the very least, such people who never made it to medical school or nursing school need to be kept on a very short leash by those who did.

-- SS

Friday, April 19, 2013

SILVERSTEIN v. ABINGTON MEMORIAL HOSPITAL: MOTION TO PROHIBIT COMMENTARY ABOUT THIS LITIGATION TO ANY PUBLIC CONTEXT: Do computers have more rights than patients?

Herein is an issue of potential Internet censorship and/or attempted prior restraint of the rights of a citizen to express him/herself freely:

At my post "Hospital defense maliciousness, aided and abetted by attorneys who ignore the ABA and Pennsylvania's Ethical Rules of Conduct Regarding "Candor Towards the Tribunal"," I wrote about how a defense attorney, in a case in which I unfortunately am substitute plaintiff (one involving EHRs and the injury and death of my mother), violated the requirement under the lawyers' Code of Conduct to exhibit candor before the tribunal, and perhaps also 18 Pa.C.S. §4904, relating to unsworn falsification to authorities.

As also mentioned, the lawfirm was displeased, but did not respond to my offer to consider amending any factually erroneous assertions at that post.

Now here is their response:

4/19/2013  Motion  BY ABINGTON MEMORIAL HOSPITAL: MOTION TO PROHIBIT COMMENTARY ABOUT THIS LITIGATION TO ANY PUBLIC CONTEXT WITH MEMORANDUM OF LAW WITH SERVICE ON 04/19/2013

The Motion text is here in PDF (it is a public document available to anyone on the Montgomery County, PA Prothonotary website).

The court has yet to rule on this new motion and Substitute Plaintiff's (my) replies.

I will, of course, abide by the Court's decision.

First: I note that I have been writing about issues of court process, not the substance of the case itself.  I think citizens have a right to know about process in their courtrooms.  I am also Joe Public, exercising my rights to freedom of expression; I am not an attorney breaking some rule of case publicity.  I am not even the suit's initiator.  What rule(s) am I breaking, exactly?  I'd like to know from the Defense.

Further, in my opinion, considering that multiple authorities including the Institute of Medicine of the National Academies (link), FDA (link), Joint Commission (link), ECRI Institute (link), National Institute of Standards and Technology (link), AHRQ-Agency for Healthcare Research and Quality at HHS itself (link) and others have written about risks of health IT to patients and the need for far more study and data on the issue, in my view there is a compelling public interest in being informed about the progress of this lawsuit.

We have not reached the day yet, I hope, when computers have more rights than patients.

I note that if the involved lawfirm, Marshall Dennehey Warner Coleman Goggin, would stick to the Rules of Conduct for attorneys, it would seem they have nothing to fear.

I note the remarkable statement about "unjustified and malicious personal attacks against Moving Defendant and defense counsel" - i.e., my pointing out exactly what they did.  Namely, failing to provide the required Candor towards the Tribunal regarding the undisclosed 2008 Stroud v. AMH decision on COM's, which was known to them (same hospital, same counsel) at the time multiple frivolous contrary claims about COM's, made to harm my mother's case, were presented to the courts from 2010 until as recently as 2013.

In fact, that statement itself may be an example of legal misconduct - rendering false charges and certifying them in writing to a court as true.  I note that I didn't write the Rules of Professional Conduct for attorneys; attorneys did, including Rule 3.3: "Candor before the Tribunal" whose obvious violation and my pointing it out is certainly not an "unjustified and malicious personal attack."

As far as "fair trials" go - their stated concern in this latest filing - the defense should have thought about that before breaking the aforementioned Rule of Professional Conduct, causing numerous delays.  (I wonder how many med mal cases with proper paperwork are stalled more than 2 years before Discovery even begins - the case was filed 7/16/2010.)

It seems to me that my mother deserved to be alive at the time of her fair trial ("Justice delayed is justice denied.")  I would certainly like to know if the Defense thinks otherwise.


-- SS

Apr. 19, 2013 Addendum:

As noted at Wikipedia regarding what appears to be a Motion for Prior Restraint:

Prior restraint is often considered a particularly oppressive form of censorship in Anglo-American jurisprudence because it prevents the restricted material from being heard or distributed at all. Other forms of restrictions on expression (such as actions for libel or criminal libel, slander, defamation, and contempt of court) implement criminal or civil sanctions only after the offending material has been published. While such sanctions might lead to a chilling effect, legal commentators argue that at least such actions do not directly impoverish the marketplace of ideas. Prior restraint, on the other hand, takes an idea or material completely out of the marketplace. Thus it is often considered to be the most extreme form of censorship. The United States Supreme Court expressed this view in Nebraska Press Assn. v. Stuart by noting:

The thread running through all these cases is that prior restraints on speech and publication are the most serious and the least tolerable infringement on First Amendment rights. A criminal penalty or a judgment in a defamation case is subject to the whole panoply of protections afforded by deferring the impact of the judgment until all avenues of appellate review have been exhausted. Only after judgment has become final, correct or otherwise, does the law's sanction become fully operative.
A prior restraint, by contrast and by definition, has an immediate and irreversible sanction. If it can be said that a threat of criminal or civil sanctions after publication 'chills' speech, prior restraint 'freezes' it at least for the time.

Also, most of the early struggles for freedom of the press were against forms of prior restraint. Thus prior restraint came to be looked upon with a particular horror, and Anglo-American courts became particularly unwilling to approve it, when they might approve other forms of press restriction.

-- SS

Thursday, February 28, 2013

Peering Underneath the Iceberg's Water Level: AMNews on the New ECRI "Deep Dive" Study of Health IT "Events"

FDA's Center for Devices and Radiological Health director Jeffrey Shuren MD JD voiced the opinion a few years ago that what FDA knows about health IT risks is the "tip of the iceberg" due to systematic impediments to knowledge gathering and diffusion.   See links to source here and to the FDA Internal Memo on HIT risk - labeled "internal document not intended for public use" and unearthed by investigative reporter Fred Schulte several years ago - here (PDF).

At my Feb. 9, 2013 post "A New ECRI Institute Study On Health Information Technology-Related Events" I opined that a new ECRI study was beginning to peer beneath the waterline of Jeff Shuren's iceberg tip, at what may reside below it.  Iceberg tips, needless to say, are usually tiny compared to an iceberg's overall size.

Reporter Kevin O'Reilly at AMNews (amednews.com) has now written about that ECRI report.

The results of the report are concerning:

 Ways EHRs can lead to unintended safety problems

Wrong records and failures in data transfer impede physicians and harm patients, according to an analysis of health technology incidents.

By Kevin B. O'Reilly, amednews staff,
posted Feb. 25, 2013.

In spring 2012, a surgeon tried to electronically access a patient’s radiology study in the operating room but the computer would show only a blue screen. The patient’s time under anesthesia was extended while OR staff struggled to get the display to function properly.

That is just one example of 171 health information technology-related problems reported during a nine-week period to the ECRI Institute PSO, a patient safety organization in Plymouth Meeting, Pa., that works with health systems and hospital associations in Kentucky, Michigan, Ohio, Tennessee and elsewhere to analyze and prevent adverse events.

Eight of the incidents reported involved patient harm, and three may have contributed to patient deaths, said the institute’s 48-page report, first made privately available to the PSO’s members and partners in December 2012. The report, shared with American Medical News in February, highlights how the health IT systems meant to make care safer and more efficient can sometimes expose patients to harm.

 Mar. 1, 2013 addendum.  From ECRI, the denominator is this:


Participating facilities submitted health IT related events during the nine-week period starting April 16, 2012, and ending June 19, 2012. ECRI Institute PSO pulled additional health IT events that were submitted by facilities during the same nine-week period as part of their routine process of submitting event reports to ECRI Institute PSO’s reporting program. The PSO Deep Dive analysis consisted of 171 health IT-related events submitted by 36 healthcare facilities, primarily hospitals.   [I note that's 36 of 5,724 hospitals in the U.S. per data from the American Hospital Association (link), or appx. 0.6%.  A very crude correction factor in extrapolation would be about x 159 on the hospital count alone, not including the effects of the voluntary nature of the study, of non-hospital EHR users, etc.  Extrapolating from 9 weeks to a year, the figure becomes about x 1,000.  Accounting for the voluntary nature of the reporting (appx. 5% of cases reported, per Koppel), the corrective figure approaches x 20,000.  Extrapolation would of course be less crude if total beds, degree of participant EHR implementation/use, and numerous other factors were known, but the present reported numbers are a cause for concern - ed.]
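The bracketed back-of-envelope arithmetic above can be sketched in a few lines of Python. The inputs are the post's own figures (AHA hospital count, the 36 participating facilities, the nine-week window, and Koppel's ~5% voluntary-reporting estimate); the factors are illustrative only and inherit every stated limitation:

```python
# A sketch of the crude extrapolation factors in the bracketed note above.
# All inputs are the post's own figures; this is illustrative, not rigorous.
US_HOSPITALS = 5724        # AHA hospital count cited in the note
STUDY_HOSPITALS = 36       # facilities reporting to the ECRI PSO Deep Dive
STUDY_WEEKS = 9            # reporting window, April 16 - June 19, 2012
REPORTING_RATE = 0.05      # ~5% of events voluntarily reported, per Koppel

hospital_factor = US_HOSPITALS / STUDY_HOSPITALS   # ~159, hospital count alone
annual_factor = 52 / STUDY_WEEKS                   # ~5.8, nine weeks to a year
combined = hospital_factor * annual_factor         # ~919, the "about x 1,000"
with_underreporting = combined / REPORTING_RATE    # ~18,400, "approaches x 20,000"

print(round(hospital_factor), round(combined), round(with_underreporting))
```

None of this corrects for bed counts, degree of EHR use, or non-hospital settings, which is exactly why the note calls the factors "very crude."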

Sept. 2013 addendum: 

Health Leaders Media has more on the ECRI Deep Dive study at http://www.healthleadersmedia.com/print/TEC-290834/HIT-Errors-Tip-of-the-Iceberg-Says-ECRI:

HIT Errors 'Tip of the Iceberg,' Says ECRI
Cheryl Clark, for HealthLeaders Media , April 5, 2013

Healthcare systems' transitions from paper records to electronic ones are causing harm in so many serious ways that providers are only now beginning to understand the scope.

Computer programs truncated dosage fields, leading to morphine-caused respiratory arrest; lab test and transplant surgery records didn't talk to each other, leading to organ rejection and patient death; and an electronic systems' misinterpretation of the time "midnight" meant an infant received antibiotics one dangerous day too late.

These are among the 171 health information technology malfunctions and disconnects that caused or could have caused patient harm in a report to the ECRI Institute's Patient Safety Organization.

... The 36 hospitals that participated in the ECRI IT project are among the hospitals around the country for which ECRI serves as a Patient Safety Organization, or PSO.

The 171 events documented break down like this:
  • 53% involved a medication management system:
    • 25% involved a computerized order entry system
    • 15% involved an electronic medication administration record
    • 11% involved pharmacy systems
    • 2% involved automated dispensing systems
  • 17% were caused by clinical documentation systems
  • 13% were caused by lab information systems
  • 9% were caused by computers not functioning
  • 8% were caused by radiology or diagnostic imaging systems, including PACS
  • 1% were caused by clinical decision support systems
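As a quick arithmetic check on the breakdown above (using the percentages exactly as printed), the four medication-management subcategories do sum to the 53% headline figure, while the six top-level categories sum to 101%, an artifact of rounding:

```python
# Percentages as printed in the Health Leaders Media breakdown above.
med_mgmt_subcategories = [25, 15, 11, 2]   # CPOE, eMAR, pharmacy, dispensing
top_level = [53, 17, 13, 9, 8, 1]          # med mgmt, documentation, lab,
                                           # computer downtime, imaging, CDS

print(sum(med_mgmt_subcategories))  # 53, matching the headline figure
print(sum(top_level))               # 101, rounding pushes the total past 100
```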

Karen Zimmer, MD, medical director of the institute, says the reports of so many types of errors and harm got the staff's attention, in part because the program captured so many serious errors within just a nine-week project last spring.  The volume of errors in the voluntary reports was, she says, "an awareness raiser."

"If we're seeing this much under a voluntary reporting program, we know this is just the tip of the iceberg; we know these events are very much underreported."

As at the opening of this post, "tip of the iceberg" is a phrase also used by FDA CDRH director Jeffrey Shuren MD JD regarding safety issues with EHRs and other health IT.

Along those lines, at my April 2010 post "If The Benefits Of Healthcare IT Can Be Guesstimated, So Can And Should The Dangers" I proposed a "thought experiment" to theoretically extrapolate limited data on health IT risk to a national audience, taking into account factors that limited transparency and thus reduced known injury and fatality counts. The results were undesirable, to say the least - but it was a thought experiment only.

Using the current data, coming from a limited, voluntary set of information over 9 weeks, I opine that the results of an extrapolation to a national (or worldwide) level, in an environment of rapidly increasing adopters (many of whom are new to the technology), on an annual basis, not a mere 9 weeks - would not look pretty.

The institute’s report did not rate whether electronic systems were any less safe than the paper records they replaced. The report is intended to alert hospitals and health systems to the unintended consequences of electronic health records.

Ethically, this is really not relevant to the national rollout, especially with penalties beginning to accrue in a few years to non-adopters of HHS "Certified" technology.

As I've written on this blog, medical ethics generally do not condone experimentation without informed consent, especially when the experimental devices are of unknown risk. If the risks of the IT are not known, it really doesn't matter, ethically, what the safety of paper is.  "Hope" is not a valid justification for medical experimentation.  (See below for what a PubMed search reveals about the risks of paper records.)

The unspoken truth prevalent in healthcare today seems to be this:  the sacrifice of individual patients to a technology of unknown risk is OK, as long as - we hope -  it advances the greater good.    Perhaps that should be explicitly admitted by the HIT industry's hyper-enthusiast proponents who ignore the downsides, so the spin can be dropped and there can be clarity?

The leading cause of problems was general malfunctions [also known by the benign-sounding euphemism "glitches" - ed.]  responsible for 29% of incidents. For example, following a consultation about a patient’s wounds, a nurse at one hospital tried to enter instructions in the electronic record, but the system would not allow the nurse to type more than five characters in the comment field. Other times, medication label scanning functions failed, or an error message was incorrectly displayed every time a particular drug was ordered. One system failed to issue an alert when a pregnancy test was ordered for a male patient. [These 'general malfunctions' are thus not just computer bugs undetected due to inadequate pre-rollout testing, but also examples of design flaws due to designer-programmer-seller-buyer-implementer lack of due diligence, i.e.,  negligence - ed.]

A quarter of incidents were related to data output problems, such as retrieving the wrong patient record because the system does not ask the user to validate the patient identity before proceeding. This kind of problem led to incorrect medication orders and in one case an unnecessary chest x-ray. Twenty-four percent of incidents were linked to data-input mistakes. For example, one nurse recorded blood glucose results for the wrong patient due to typing the incorrect patient identification number to access the record.  [Many of these are likely due to what NIST has termed "use error" - user interface designs that will engender users to make errors of commission or omission - as opposed to "user error" i.e., carelessness - ed.]

Most of the remaining event reports were related to data-transfer failures, such as a case where a physician’s order to stop anticoagulant medication did not properly transfer to the pharmacy system. The patient received eight extra doses of the medication before it was stopped. [Due to outright software, hardware and/or network problems and defects - ed.]

I've been writing about such issues since 1998, not because I imagined them.  As a CMIO I saw them firsthand; as teacher and mentor I heard about them from colleagues; as a writer I heard about them via (usually unsolicited) emails from concerned clinicians; as an independent expert witness on health IT harms I've heard about them from Plaintiff's attorneys, but not from the Defense side of the Bar as yet.  Of course the reasons for that are understandable -  albeit disappointing.

In fact, robust studies of a serious issue - the actual risks of paper records in causing harm - and, further, whether any of those risks are remediable without spending hundreds of billions of dollars on IT, seem scarce.  I've asked the PA Patient Safety Authority about the possibility of using data in the Pennsylvania Patient Safety Reporting System (PA-PSRS) database, just as they did for EHR-related medical events, to determine the incidence of paper-related medical events.  They are pondering the issue.

As an aside, I note that it would be ironic if the relative risks of both IT and paper were not really robustly known.  (In a PubMed search on "risks of paper medical records", not much jumps out.)  IT hyper-enthusiasts will not even debate the question of whether a good paper system might be safer for patients in some clinical environments than bad health IT.

Considering the tremendous cost and unknown risk of today's health IT (and perhaps the unknown risk of paper, too), would it not make more sense, and be consistent with the medical Oath, to leave paper in place where it is currently used - and perhaps improve its performance - until we "get the IT right" in controlled, sequestered environments, prior to national rollout?

In other words, as I've asked before on these pages, should we not slow down the IT push and adhere to traditional (and hard-learned) cautions on medical research?

Even asking such questions brings forth logical fallacies such as straw arguments (e.g., UCSF's Bob Wachter in a recent discussion I initiated with several investigative reporters: "...where we part ways is your defense of paper and pencil. I understand advocacy, and you have every right to bang this particular drum"), ad hominem attacks, etc.

... It is not enough for physicians and other health care leaders to shop carefully for IT systems, the report said. Ensuring that systems such as computerized physician order entry and electronic health records work safely has to be a continuing concern, said Karen P. Zimmer, MD, MPH, medical director of the ECRI Institute PSO.

“Minimizing the unintended consequences of health IT systems and maximizing the poten­tial of health IT to improve patient safety should be an ongoing focus of every health care organization,” she said.

I recommended that clinicians take matters into their own hands if their leaders do not, as at the bottom of my post here.  This advice bears repeating:

... When a physician or other clinician observes health IT problems, defects, malfunctions, mission hostility (e.g., poor user interfaces), significant downtimes, lost data, erroneous data, misidentified data, and so forth ... and most certainly, patient 'close calls' or actual injuries ... they should (anonymously if necessary if in a hostile management setting):

(DISCLAIMER:  I am not responsible for any adverse outcomes if any organizational policies or existing laws are broken in doing any of the following.)


  • Inform their facility's senior management, if deemed safe and not likely to result in retaliation such as being slandered as a "disruptive physician" and/or being subjected to sham peer review (link).
  • Inform their personal and organizational insurance carriers, in writing. Insurance carriers do not enjoy paying out for preventable IT-related medical mistakes. They have begun to become aware of HIT risks. See, for example, the essay on Norcal Mutual Insurance Company's newsletter on HIT risks at this link. (Note - many medical malpractice insurance policies can be interpreted as requiring this reporting, observed occasional guest blogger Dr. Scott Monteith in a comment to me about this post.)
  • Inform the State Medical Society and local Medical Society of your locale.
  • Inform the appropriate Board of Health for your locale.
  • If applicable (and it often is), inform the Medicare Quality Improvement Organization (QIO) of your state or region. Example: in Pennsylvania, the QIO is "Quality Insights of PA."
  • Inform a personal attorney.
  • Inform local, state and national representatives such as congressional representatives. Sen. Grassley of Iowa is aware of these issues, for example.
  • As clinicians are often forced to use health IT, at their own risk even when "certified" (link), if a healthcare organization or HIT seller is sluggish or resistant in taking corrective actions, consider taking another risk (perhaps this is for the very daring or those near the end of their clinical career). Present your organization's management with a statement for them to sign to the effect of:
"We, the undersigned, do hereby acknowledge the concerns of [Dr. Jones] about care quality issues at [Mount St. Elsewhere Hospital] regarding EHR difficulties that were reported, namely [event A, event B, event C ... etc.]

We hereby indemnify [Dr. Jones] for malpractice liability regarding patient care errors that occur due to EHR issues beyond his/her control, but within the control of hospital management, including but not limited to: [system downtimes, lost orders, missing or erroneous data, etc.] that are known to pose risk to patients. We assume responsibility for any such malpractice.

With regard to health IT and its potential negative effects on care, Dr. Jones has provided us with the Joint Commission Sentinel Events Alert on Health IT at http://www.jointcommission.org/assets/1/18/SEA_42.PDF, the IOM report on HIT safety at http://www.modernhealthcare.com/Assets/pdf/CH76254118.PDF, and the FDA Internal Memorandum on H-IT Safety Issues at http://www.scribd.com/huffpostfund/d/33754943-Internal-FDA-Report-on-Adverse-Events-Involving-Health-Information-Technology.

CMO __________ (date, time)
CIO ___________ (date, time)
CMIO _________ (date, time)
General Counsel ___________ (date, time)
etc."
  • If the hospital or organizational management refuses to sign such a waiver (and they likely will!), note the refusal, with date and time of refusal, and file away with your attorney. It could come in handy if EHR-related med mal does occur.
  • As EHRs remain experimental, I note that indemnifications such as the above probably belong in medical staff contracts and bylaws wherever EHR use is coerced.

These recommendations still stand, although after this recent story, my caution about retaliation should be re-emphasized:

The Advisory Board Company
Feb. 14, 2013
Hospital Framed Physician; Planted a Gun

-- SS