Tracking the movements and minds of surgeons to improve performance
Stanford scientist Carla Pugh has spent years developing wearable technologies for surgeons. Her goal: Use data to improve surgical decision-making.
Deep in the halls of San Francisco’s Moscone Convention Center, surgeons gathered near the far corner of a conference’s exhibition space, attracted by a somewhat unusual scene: 10 identical tables, each topped by a tray of surgical tools, several bundles of sutures, a stopwatch and a section of pig intestine. An ice chest sat on the floor, its contents labeled in large lettering: OLD BOWEL.
Just outside the waist-high barriers of the display, Carla Pugh, MD, Ph.D., professor of surgery at the Stanford School of Medicine, mingled with curious passersby. “Would you like to know more about our study?” she asked. Pugh was at the world’s largest surgical conference, the American College of Surgeons Clinical Congress, which was held this year in late October, and she was taking advantage of the traffic.
“We’re recruiting surgeons for our research, which uses several types of sensors to measure a surgeon’s movement during an operation, their decision-making and their brain waves,” she told a group of interested participants.
The more adventurous of them stepped into the space and were presented with a scenario: A “patient”—a chunk of pig bowel—was in need of a small surgery. Unbeknownst to the volunteers, each piece of tissue had been lacerated in two places. It was their job to find the damage and repair it—all under the monitoring of several data-collecting sensors, while being timed.
The surgeons palpated and gently pulled at the edges of the pink, crinkled bowel, locating the tears. With their choice of suture and stitching technique, they mended the intestine to their satisfaction. At the end, they called for their time and stepped back to let the study’s assistants perform an evaluation of their work.
The goal of this surgical exercise was to collect metrics on a basic procedure—repairing a tear with sutures—and use the data to understand how specific motions, decisions and approaches correlated with the quality of the work. The event was one of the first showcases of a vision Pugh has been developing for decades: applying data, for the first time, to understand what it is that surgeons do and how they do it.
“In the field of surgery, there are no metrics to back up what it is that we do, or the range of tactics we employ to get positive surgical outcomes,” said Pugh, who is the director of Stanford Medicine’s Technology Enabled Clinical Improvement Center. “But I’m hoping to change that.”
Gearing up for data collection
Pugh’s research, part of a new multi-institutional collaboration called the Surgical Metrics Project, harvests data from audio and video recordings of surgeons and from wearable sensors that measure motion, brain waves and tactile pressure. She’s one of the first researchers to study surgical data analytics, a subspecialty that she guesses comprises only a few dozen experts.
When Pugh began pursuing this line of research, surgical wearables did not exist; in fact, the rise of the smart watch was still 10 years away. Her interest in the subject stemmed from an insight that struck her as a young scientist. “In medical school, I saw that technology had huge potential to facilitate medical education and training,” Pugh said. “And so I took a bit of a detour through my own training and ended up getting a Ph.D. in education and technology.”
One of her graduate classes focused on human-computer interactions and sensor technologies. It was then that she began to see how technology could enhance clinical performance. She became convinced that data-collecting sensors were the key to teaching hands-on skills in a way that a textbook, video or lecture never could.
Fast forward to today: With a suite of operation-friendly wearables, Pugh’s ideas are beginning to gain attention in the surgical community. In the past few months alone, Pugh and her team have collected data from hundreds of surgeons—the bulk of which came from conference-goers eager to donate their time and test their skills.
When it comes to the technology itself, Pugh said, there are two key elements: one is sleek, user-friendly wearables; the other is integrated data streams. The trick is to collect as much information as possible without impeding the natural pattern of a surgeon’s workflow.
As part of the study, surgeons first undergo a baseline electroencephalogram, which measures brain waves through wires encased in a brown, translucent sensor strip that sticks to their foreheads. The strip measures brain activity while the participants perform a handful of mundane mental tasks: about 10 minutes of listening to music, meditating and recalling certain melodies.
Then, they suit up for data collection. Each surgeon dons a special lab coat that holds a variety of wired sensors. Three motion sensors—so fine they fit under surgical gloves—poke out of the sleeves and are secured to the thumbs, index fingers and wrists with a piece of tape. Finally, Pugh sets up audio and video recordings, which run as the surgeon operates. The integrated approach to data collection not only shows how the surgeons’ hands move, but also how they talk through tricky parts of a procedure and how their brain waves spike or dip.
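For readers curious how such streams might be combined, the sketch below is purely illustrative, not the project’s actual pipeline: it aligns hypothetical timestamped motion and EEG recordings onto a single timeline. The file names, column names and tolerance are assumptions made up for the example.

```python
# Illustrative sketch only: one way timestamped sensor streams could be
# merged onto a common clock for later analysis. File names, column names
# and the 50 ms tolerance are assumptions, not the project's actual setup.
import pandas as pd

# Each stream carries its own measurements plus a timestamp column.
motion = pd.read_csv("motion.csv", parse_dates=["timestamp"])  # e.g., x, y, z per finger sensor
eeg = pd.read_csv("eeg.csv", parse_dates=["timestamp"])        # e.g., band power per channel

motion = motion.sort_values("timestamp")
eeg = eeg.sort_values("timestamp")

# Pair each motion sample with the nearest preceding EEG sample within a
# small tolerance, so hand movement and brain activity share one timeline.
merged = pd.merge_asof(
    motion,
    eeg,
    on="timestamp",
    direction="backward",
    tolerance=pd.Timedelta("50ms"),
)

print(merged.head())
```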
https://youtube.com/watch?v=GUQzBW7WfTc
The proof is in the data pattern
So far, the idea of surgical wearables has been met with mixed reactions, Pugh said. Mostly, there’s a sense of excitement and an eagerness to participate, she said. But there’s concern, too, that the sensors could be used to unfairly judge a surgeon’s skills during a difficult procedure. It’s true that the wearables could one day be used to test surgical skill, but to Pugh, it would be a mistake to limit the data to that purpose.
“To me, collecting surgical data is less about evaluating the skill of a surgeon and far more about quantifying what it took to take care of a specific patient,” Pugh said.
She gives an example: Patients in intensive care units often need a central line, a type of IV that can withdraw fluid or deliver medicine. But inserting a central line into the vein of a frail 90-year-old patient is very different from doing so in a morbidly obese patient, or in a patient who has already had multiple lines placed during previous care. “We all know the difference as practiced physicians, but there’s no data to show it,” Pugh said. “We walk around with more detailed data about our bank accounts than how we perform clinical procedures, which are 10 times more complex.”
Pugh and her team are still just getting off the starting blocks, but the data they’ve collected—through early pilot studies and at a handful of medical and surgical conferences—have already begun to reveal intriguing patterns.
Instead of parsing every dataset of a surgery, Pugh and her team look for overarching trends. The motion-tracking sensors feed visual data back to a computer, allowing the researchers to see movement patterns of a surgeon’s hands, including where they pause and where they spend more time.
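As a rough illustration of how pauses and dwell time might be pulled out of motion data like this, the sketch below runs on made-up, timestamped hand positions; the sampling rate, units, thresholds and grid size are invented for the example and are not study parameters.

```python
# Minimal sketch on made-up data: estimate how long a hand pauses and where
# it spends the most time, given timestamped (x, y) positions from a single
# motion sensor. All numbers here are illustrative, not study parameters.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 501)                    # 50 Hz for 10 seconds
x = np.cumsum(rng.normal(0.0, 0.05, t.size))       # random-walk stand-in for a hand path (cm)
y = np.cumsum(rng.normal(0.0, 0.05, t.size))

dt = np.diff(t)
speed = np.hypot(np.diff(x), np.diff(y)) / dt      # cm/s between consecutive samples

# Call any stretch slower than a chosen threshold a "pause".
PAUSE_SPEED = 0.5                                  # cm/s, arbitrary for the example
paused_seconds = dt[speed < PAUSE_SPEED].sum()

# Dwell map: total seconds spent in each cell of a coarse spatial grid,
# a crude proxy for "where the hands spend more time".
dwell, _, _ = np.histogram2d(x[:-1], y[:-1], bins=20, weights=dt)

print(f"time spent paused: {paused_seconds:.2f} s")
print(f"longest dwell in any one region: {dwell.max():.2f} s")
```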
“People would ask me, ‘Why would you want to measure surgical technique? Everyone operates so differently.’ But our data essentially shows the opposite. Whether surgeons use different instruments or add their own finesse to a procedure doesn’t really matter,” Pugh said. The overall movement patterns that emerge at the end are very similar, so long as there aren’t complications—such as abnormal patient anatomy or the rare surgical error.
Such data patterns can show where surgeries hit a snag. Take, for instance, a procedure whose completed movement pattern looks roughly like the body and wings of a butterfly. Surgeons who perform it without complications trace out that butterfly shape; those who don’t might produce a pattern with lopsided wings, or one with two bodies. “The motion sensors that track that surgeon’s fingers and hands produce a very visual result,” Pugh said. “And what’s even more interesting to see is that there doesn’t seem to be a correlation with instrument choice or whether the surgeon switched step 5 for step 6—it’s the patient’s anatomy that most accurately correlates to the end pattern.”
Big (data) dreams
The intertwining data streams from various wearables on the surgeon’s body can reveal quite a bit about the procedure and the patient on the table, but more than that, Pugh and her colleagues see it as a data-first approach to teaching, learning and improvement.
“The innovative research led by Dr. Pugh’s team will provide incredible data-informed insights into surgeon efficiency of motion, tactile pressure and cognitive load while performing a variety of medical and surgical tasks,” said Mary Hawn, MD, professor and chair of surgery. “These types of data could be used to identify when a surgeon has mastered a procedure and when there may be a deficit.”
Some of the wearable applications are still a ways off, Pugh said, as the technology is now only used for procedures on mannequins and tissue bits. But there is one wearable Pugh has tested in the operating room: the EEG sensor.
During two surgeries, a gallbladder removal and an appendectomy, Pugh volunteered to stick the brain-wave-reading sensor onto her own forehead. “First we just need to verify that it works in the OR and that the data comes in successfully,” Pugh said. And, so far, it does. Through the EEG data, Pugh’s team could see that the peaks of Pugh’s brain waves while operating corresponded with the most trying moments of the surgery, while lower-level activity coincided with routine surgical tasks, like suturing.
After a successful surgery, Pugh closed the patient and left the OR, forgetting to remove the long strip on her forehead. “My colleagues who are aware of my research saw the EEG sensor and immediately knew what I had been doing,” she said. Now, Pugh’s getting peppered with the same question: When can others test out the technology?