Build a New EDIS, Advertise it in Annals for Free

As everyone who has switched from paper to electronic charting and ordering has witnessed, despite some improvements, many processes have become markedly less efficient.  And it doesn’t matter which Emergency Department information system you use; each vendor has its own special liabilities.  Standalone vendors have interoperability issues.  Integrated systems appear to have been designed as an afterthought to the inpatient system.  We have, begrudgingly, learned to tolerate our new electronic masters.

This study, in Annals of Emergency Medicine, describes the efforts of three authors to design an alternative to one of the vendor systems:  Cerner’s FirstNet product.  I have used this product.  I feel their pain.  And, I am in no way surprised these authors are able to design alternative, custom workflows that are faster (as measured in seconds) and more efficient (as measured in clicks) for their prototype system.  It is, essentially, a straw man comparator – as any thoughtful, user-centric, iterative design process could improve upon the current state of most EDIS.

With the outcome never in doubt, the results demonstrated are fundamentally unremarkable and of little scientific value.  And it all finally makes sense when the same sad, recurrent refrain rears its ugly head in the conflict-of-interest declaration:

Dr. Patrick and Mr. Besiso are employees of iCIMS, which is marketing the methodology described in this article.

Cheers to Annals for enabling these authors to use the pages of this journal as a vehicle to sell their consulting service.

“Efficiency Achievements From a User-Developed Real-Time Modifiable Clinical Information System”
http://www.ncbi.nlm.nih.gov/pubmed/24997563

6 thoughts on “Build a New EDIS, Advertise it in Annals for Free”

  1. It is sad that the poster who wrote the comment entitled "Build a New EDIS, Advertise it in Annals for Free" has missed the entire point of the article: new technology that permits truly user-centric and user-led design, with rapid turnover. The rapid prototyping-then-deployment approach works well. Before technology such as this, it was a lot harder to provide; see my case example at http://cci.drexel.edu/faculty/ssilverstein/cases/?loc=cases&sloc=Cardiology%20story where the vendor provided an unusable system and the IT department lacked both the competencies and the desire to remediate the problems. The solution came from a process fit for the needs (largely due to rapid changes implemented by dedicated programmers free of vendor and IT department attachments), but one that could have benefited greatly from the information technology the paper describes.

  2. Ah, I’m afraid I disagree with your assessment of my critique.

    As you point out, this article describes a “technology to permit truly user-centric and user-led design, with rapid turnover.” But what we basically have here is a consulting case study. My issue is with publishing this article as “science”, particularly when the authors stand to gain financially from its dissemination and seeming validation. And, as I state above, FirstNet is an inefficient and unusable comparator, so their results (time saved and clicks saved) were never in doubt.

    Don’t misunderstand me – this sort of responsive prototyping is fantastic, and their methodology is certainly of value. It just bothers me to see it in an academic journal.

  3. Ryan, you know that I usually agree with you 100%. And some of your concerns here are valid. But I think that you're over-reaching a little.

    First, in this specific paper, the first author does not have a COI. So in theory, at least, that mitigates some of the potential bias. Also, the measures are very "quant", and therefore less subjective. The analysis may certainly have been biased in ways that are not immediately obvious to the reader, but this is true of government-funded research as well.

    Second, it is of critical importance for us to do rigorous research on novel digital health technologies. And almost by definition, this will have to involve members of the software development team. I disagree that the outcomes of this study were "never in doubt." That's what many people would have said about EHRs in the beginning. And look where that landed us.

    Third, and most importantly, COI does not *always* mean the work is scientifically invalid. There are ways to mitigate risk. And all researchers, ultimately, have some COI, whether it be the desire to advance one's own career, the desire to keep funding, or a deeply held value set that supports one's research endeavors.

    By saying that NO studies that involve people with COI are EVER valid, you make these studies much less likely to happen. Which would be a big shame for the field.

  4. Thanks for the feedback, Megan.

    I still disagree regarding the inevitability of the results. This is analogous to any sort of process re-evaluation or redesign. If we took a critical eye to our own Emergency Department workflow, there is no doubt we’d be able to tweak a few processes to be more adaptable, more efficient, or less costly. And we see these published all the time, demonstrating that a specific new flow or process was superior to the prior one. Then we can look at our own processes, compare them to the authors’ base state, and see whether implementing that same process in our setting has value.

    But this article doesn’t describe a specific innovation. The “innovation” is just re-designing the EHR with a better understanding of their own specific workflow. I’ve spent enough time in Clinical Informatics, and in human factors before that, to know incremental improvement (especially over FirstNet) is almost assured. And there’s a publication bias built into this type of study: what do you think the chances of this publication would have been had they attempted to improve upon FirstNet and failed? We would not be reading about iCIMS’ ineffective methodology if it hadn’t demonstrated the results they wanted. They’d have refined their methods and tried again until they had success, and then we’d see that publication instead.

    Another question regarding the academic value of this paper is whether it can be independently reproduced. The only way to reproduce this work in another setting is to have iCIMS come to that facility and deploy their custom software tool, as it’s essentially impractical to recreate the tools necessary for their methodology on your own.

    I think their system of real-time clinical team-led design is _great_. But I don’t see the compelling academic value in this publication – and, adding in the two authors employed by iCIMS just leaves a bitter taste.

  5. That's fair. And I haven't used FirstNet so I can't comment there.

    It would be interesting to see a multi-center study of this type; doing so would require a potentially prohibitive amount of work with various hospitals' IT departments, though.

    And the publication bias issue is real! … but a topic for another day.

  6. Jon Patrick, the corresponding author, responded privately with these comments for public posting:
    “My previous work into clinical information systems (aka EMR), when I was an academic, led me onto a new pathway, which was to identify the nature of the software development methods that are needed to satisfy clinical applications. One of my students obtained his PhD tackling the problem. In the end I arrived at an approach that I thought would be of great assistance to clinical teams: essentially a method by which users can control the design of a system (our Clinical Team Led Design method) and change it in near real-time at will (our emergent Clinical Information Systems technology). So we created both methods and technology to address these problems.

    In 2012 I left the paid service of the University to start up the iCIMS company, as I believed that as a critic one should be able to show that purported solutions do work. It would be an inadequate description of our activities to portray them solely as commercial; we are proud of our academic heritage and do our best to be scientific in all our work, both engineering and evaluation. One of my colleagues was generous enough to compliment me on being able to serve firstly as critic and then as solutions provider.

    Crucially, we have not built yet another EDIS as you proposed, but rather we have built a tool which any clinical specialty can use to design their own custom solution, and we have designed well over 20 such systems, many of them in oncology.

    To me the most important statement of our work in the paper is the last 3 paragraphs: there we state the new way in which Clinical Information Systems need to be engineered and the consequences that has for all clinical specialties. We are happy that our first full-blown assessment has been done in an ED, but it could have been with one of many different specialties.”

    Many thanks to Jon for taking the time to respond and highlight aspects of his work he feels are valuable to share through academic publication!
