Duke Forge | February 12, 2019
By Eric Perakslis, PhD

In late January of this year, we welcomed health data expert Eric Perakslis, PhD, to Duke Forge as a Rubenstein Fellow. In Eric’s first blog post for Duke Forge, he shares some reflections prompted by an unexpected detour on his way to his first week at Duke.


Hi all!  I’m very excited to be starting my time at Duke and look forward to meeting my new colleagues. First, though, I’d like to share some recent experiences of mine that I hope will offer a window on my thoughts, experiences, and ideas about health data.

Last week, despite decades of experience snowboarding around the world, I took a nasty tumble on an icy slope of Whiteface Mountain, AKA “Ice-Face,” near Lake Placid, New York. I landed with a broken leg that required surgery, and during my inpatient stay that weekend I had plenty of time to talk myself into planning a snowboarding trip for next year. The care I received in the Adirondacks was remarkable. Having hosted two Olympics, Lake Placid is still a highly active Olympic training center and the local orthopedists are similarly world-class. After a few days of great care, I was off to home and recovery.

Several days later, however, I spiked a mild fever and had increased swelling, redness, and pain in my leg. My surgeon recommended that I head to the emergency room for a thorough checkup. Living just southeast of Boston in the little coastal town of Hingham, Massachusetts, I was lucky to have several choices. I opted for convenience and decided on a nearby suburban hospital. Five minutes after entering the facility, I was struck (again) by an observation that I’ve frequently made: Systems exist for institutions, but data should be for people.

(Before I dive in, I want to underscore that this is an observation and not a judgment. All institutions require systems to run their daily activities, ensure safety and compliance, pay their employees, and automate key functions. My purpose in this post is not to judge but to observe, understand, and optimize.)

On this Wednesday night at 19:00 hours, the systems of this emergency department were on clear display. I was checked in by a clerk who pulled my record and quickly asked why I was there. Twenty minutes later, I was called to have my vital measurements taken by a nice person who did not ask why I was there. Back I went to my wheelchair, which had no leg rests, struggling to keep my leg – which was not yet in a cast – elevated without risking having it kicked or stepped on as people moved about the crowded ER. As I waited, I tried to decipher the system of triage that was in use.

Across the United States and even in most low- and middle-income countries where I have worked, I typically see health professionals directly involved in triage, but I saw no evidence of that here. The first step in this hospital’s system was whatever note or code the clerk entered when I was wheeled in. The second step was more obvious: on the wall behind each vitals-collection station was a sheet with simple reference ranges for body temperature, blood pressure, heart rate, and respiration rate. There was also a pain chart, but I had not been asked about my level of pain. Presumably, if my vital measurements fell outside the ranges on that wall, a nurse would have been called?
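To make that wall-chart logic concrete, here is a minimal sketch of a threshold-based screen of this kind. The reference ranges and field names are illustrative assumptions on my part, not the hospital’s actual protocol:

```python
# Illustrative only: flag a patient for nurse review when any recorded vital
# sign falls outside a simple reference range. Ranges and field names are
# assumptions, not the hospital's actual values.

NORMAL_RANGES = {
    "temperature_f": (97.0, 99.5),
    "systolic_bp": (90, 140),
    "heart_rate": (60, 100),
    "respiration_rate": (12, 20),
}

def needs_nurse_review(vitals):
    """Return True if any recorded vital sign is outside its reference range."""
    for name, (low, high) in NORMAL_RANGES.items():
        value = vitals.get(name)
        if value is not None and not (low <= value <= high):
            return True
    return False

# Example: a low-grade fever with otherwise normal vitals triggers escalation.
print(needs_nurse_review({"temperature_f": 100.8, "heart_rate": 88}))  # True
```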

As another hour passed, I pondered this system while my leg throbbed. During the Ebola outbreak of 2014, I wrote triage apps that enabled low-literacy workers to triage patients as easily as taking a fast-food order at a McDonald’s. The apps used pictures for symptoms, were multiple choice, and were driven by the case definitions for Ebola, for the most frequently occurring illnesses that mimic Ebola, and for many of the other common local illnesses. And they were much, much better than what these people were using! (A rough sketch of how such a flow works follows below.)
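For comparison, here is a rough sketch of a multiple-choice, case-definition-driven triage flow of the kind those apps implemented. The questions, answer options, and decision rule below are invented for illustration; they are not the actual WHO case definitions or the apps’ real content:

```python
# Illustrative only: map a handful of yes/no, multiple-choice answers to a
# triage category using a simplified, made-up case definition.

QUESTIONS = [
    ("fever", "Does the patient have a fever?"),
    ("contact", "Contact with a sick or deceased person in the past 21 days?"),
    ("bleeding", "Any unexplained bleeding?"),
]

def classify(answers):
    """Map yes/no answers to a triage category per a simplified case definition."""
    fever = answers.get("fever") == "yes"
    exposure = answers.get("contact") == "yes" or answers.get("bleeding") == "yes"
    if fever and exposure:
        return "suspect case: isolate and refer"
    if fever:
        return "assess for common febrile illness"
    return "routine care"

# Example: fever plus a recent contact routes the patient to isolation.
print(classify({"fever": "yes", "contact": "yes", "bleeding": "no"}))
```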

How do systems like this happen in healthcare? Was there a critical staff shortage? A technology outage? Some unprecedented incident that had flooded this ER at this particular time? I had plenty of opportunity to ask these questions later, as I ended up spending the night. I asked seven people from the ER and four supporting functions (vascular lab, phlebotomy, x-ray, and physical therapy) and received the same answer from all: this is just a normal night, and these are our systems. Striking.

No less striking was the amazing care that I received during this visit. As I sat in the waiting room, I watched clinicians walking here and there, reading charts, and looking around – most quite busily. A few, though, made eye contact with almost everyone as they hustled about. It was one of these folks, a nurse, who approached me and asked why I was there. She pulled my chart and quickly returned with a chair to take me “back.” At that point, I had what felt like a traditional triage. She was quickly joined by a physician assistant who smiled, introduced herself to me, my daughter Sam, and my wife Lisa, and then efficiently removed my bandages so she could see what was going on. She then started an IV, gave me something for pain, and arranged an ultrasound to check for a deep venous thrombosis. Both the nurse and the PA were competent, caring professionals, and I struggled to understand the systems in which they were expected to do their jobs.

The process that I tried to parse in the waiting room provides a clear example of systems that are built by institutions, for institutions. Things tend to be exactly as they look, and this process was likely all about resource management: covering the most common and least technical interactions as well as possible using the lowest-cost resource, then gradually increasing labor costs only as a patient progresses through process stage-gates designed to filter and sort patients based on need.

I understand this and am not going to judge it. What I will judge, though, is the lost opportunity to arm dedicated workers with data that could not only help them do their individual jobs better, but also make the system as a whole more efficient. The truth is that the system I saw in action wasn’t actually being used by the experienced clinicians for whom it was designed. These folks were still doing the real triage, without the benefits or reduced burdens that the overall workflow was presumably supposed to provide.

The reasons for this were clear. The initial “reason for visit” recorded by the intake clerk was typically too high-level, unclear, or inaccurate. Vital signs were recorded only once, and could have been taken anywhere from 5 to 90 minutes after a patient arrived. This reveals a second simple truth about systems in complex institutions: the documented processes may or may not reflect how the work actually gets done.

So how can we improve upon this and similar situations? We need to start by remembering that while systems may be for institutions, data are for people. Yes, organizations and institutions use data, but seldom in the real-time, agile ways that people do. The process of “triage by walking around” that I witnessed relied upon real-time observation of the waiting rooms and ward – it was not informed by the data being recorded and transcribed into the EHR. Creating a better system would mean getting clinicians and other hospital staff better data, and getting those data to them faster. For data to be useful, they must be accurate, timely, trustworthy, and readily available when needed – a set of qualities sometimes referred to as “data liquidity.” Whatever the nomenclature used to describe it, effective institutions manage data well, whether their customer is buying a book online or looking for help with a baby’s cough.

In the right hands, with the right approach, data are the lifeblood of any institution in the internet age. The fact that in 2015, in the aftermath of a devastating epidemic, rural clinics in Sierra Leone had better automated triage, better data collection, and better use of less-skilled workers than the system I experienced in a wealthy suburban hospital outside Boston in 2019 demonstrates that technology and resources are not the limiting factors. Rather, what matters most is usually the combination of process, priority, and creativity.

Of course, data alone are not a panacea. We must protect our institutions from poor data usage that leads to misinformed decisions, poor-quality research, and unintended outcomes. Privacy and security are also essential responsibilities, especially when personalized digital health can amount to personal surveillance. I remember that when I was working to achieve Federal Information Security Management Act (FISMA) status at FDA, an endlessly circular debate was under way at the agency about data liquidity versus data security. When asked about the issue at an FDA “town hall” meeting, my answer was simply: “Why spend precious resources protecting data, unless those data are useful to somebody?” If the value of a particular set of data isn’t clear, those data can be archived. But expensive data – data that are actively spinning and flowing – must be useful (ideally for more than one thing), or resources are likely being wasted.

Six hours after I arrived at the ER, the attending said that things looked okay, but asked if I’d be willing to spend the night for observation and see an orthopedist in the morning, to which I agreed. Just then, an announcement came over the intercom: “It is 1:00 AM and our weekly Epic shutdown and upgrades will commence and be ongoing until 4:00 AM.”

Lying comfortably in a private corner of the ER with a real door, I asked my doc if I’d be spending the night in this room, or if I would be moved to a patient floor. He smiled and said, “I hope someone can get you that answer soon, but you may have to wait until after 4:00 AM to find out.”

True story.