Before Dr. Sonia Ramamoorthy, chief of colon and rectal surgery at UC San Diego Health, took a scalpel to Larry Smarr, director of Calit2, she first took a virtual tour of his large intestine, a visualization so expansive it filled an entire room.
Then Smarr, Harry E. Gruber Professor of Computer Science and Engineering at UC San Diego, shared a more modest, life-sized 3D-printed model of his suspect organ. With sometimes chagrined colleagues watching, surgeon and patient scrutinized its colonic curves and convolutions, revealing previously undetected complexities and, perhaps, the future of surgery.
Turning a fairly routine procedure into a bold experiment, Smarr recently underwent a segmental colon resection (in which a portion of the large intestine is removed) prefaced by the creation of dramatic, data-driven computer models of his abdomen and affected organs. This allowed Ramamoorthy to essentially conduct the operation in advance—in her head—before ever drawing a drop of blood.
“Once I realized that I was actually going to go through surgery,” Smarr said, “I thought to myself, ‘Well, as long as I'm going to have this done, I'm blessed to have all this talented staff, all kinds of computers and visualization that most people don't have. Why don't I use my surgery as a translational pilot project to see if using the 3D visualization actually does make a positive difference for the surgeon?’"
The resection, from which Smarr has quickly and successfully recovered, is a natural extension of his earlier project of the “quantified self,” the idea that by knowing and understanding every aspect of one's physiology, down to the diversity of microbes living in the gut, a person can better manage his or her health and well-being.
“We're living through a digital transformation in medicine, and that means we can make use of more data about our bodies than we ever could before, including 3D versions of imaging technologies,” Smarr said. “The question is, how can we use that data to minimize the risk of surgery, to have better outcomes? That was the experiment that we set out to try to work on here.”
Ramamoorthy was equally enthusiastic about Smarr’s notion of “quantified surgery,” which she characterizes as a surgical parallel to precision medicine, the latter largely focused upon an individual’s genetic makeup and how treatment can be uniquely tailored to each patient’s disease or condition.
“We seek to personalize the process of surgical intervention,” she said. “Surgery is a step-by-step process that culminates in an operative procedure. After confirming the appropriate patient disease and patient readiness for surgery, surgeons are tasked with planning an operative intervention. They rely on various sources of information to develop their ‘flight plan’ for each patient, such as imaging studies like MRIs or CT scans, patient history, physical exam findings, lab tests and a surgeon’s knowledge and experience with surgical diseases.
“As a surgeon, you bring a lot of experience and knowledge to the table. You walk into the OR and you say, ‘There's nothing that I can't handle.’ Which is true. But, wouldn't you rather be prepared for what you're going to encounter in the moment? I mean, who goes into a football game without having studied the opponent's typical patterns and made a strategy for what they're going to do? We do that now, but if you don't know what you're going to encounter, how can you plan properly for how you are going to approach it?”
Ramamoorthy said the actual procedure proved considerably simpler thanks to the novel computer models, built from Smarr’s earlier medical scans by his colleague Jurgen P. Schulze, associate research scientist at Calit2’s Qualcomm Institute and associate adjunct professor in the Jacobs School of Engineering.
Surgeons oversee use of a four-armed da Vinci robot during Smarr’s sigmoid colon resection procedure.
It certainly didn’t look that way. Smarr went under the knife early on a sunny morning in late November 2016. The gleaming, sprawling operating room at Jacobs Medical Center was packed with surgeons, nurses, technicians and high-tech equipment, including a four-armed da Vinci robot that Ramamoorthy would use to perform the actual surgery and a wall-sized screen capable of displaying multiple images simultaneously. While Ramamoorthy used monitors at her robotic control station to guide the surgery, everyone else watched the wall, which provided Technicolor views of both Smarr’s virtual self and the inside of the real man, courtesy of laparoscopic cameras.
Ramamoorthy said the experience was gratifying, if a bit mind-blowing. She was able to switch back and forth between looking at Smarr’s now-familiar computer colon and the flesh-and-blood organ. “It was incredibly helpful,” she said.
The visualization technologies also allowed her to more effectively teach residents and nurses in the room, who could see everything she saw. “I could say, ‘You see this, guys? You understand what we are doing here? Do you see this plane? This is where you want to be. This is how we want to approach this.’”
During the November 2016 procedure, the medical team discusses Smarr’s ongoing operation in front of a primary viewing screen, which displays both live imagery and past visual medical data. A cameraman records the unprecedented operation.
Dr. William Sandborn, chief of the division of gastroenterology at UC San Diego Health and Smarr’s physician, came in to consult, as did Dr. Santiago Horgan, chief of minimally invasive surgery and director of the Center for the Future of Surgery at UC San Diego School of Medicine.
“They were in there sort of helping us with some of the integrated information,” said Ramamoorthy, “but in reality, we wouldn't necessarily need them physically in the operating room. There's a way for us in this day and age to bring them into the operating room remotely and virtually to get them to see what we are seeing, to get their input.”
Ramamoorthy said that while the experiment was an undeniable success, there is still much to learn and do. She and fellow surgeons will need time and training to adapt and exploit the new 3D tools.
“You're seeing three different views of the same thing: a 3D model, the actual patient in the operating room and an endoluminal picture, created by laparoscopic cameras inside the patient. Just processing all that in your head as a surgeon so you can decide your next move is a skill set that requires training. I think we have to get all of the technology into the OR so that we can start training our brains.”
Smarr’s goal is to get all of the technology into the operating room quickly, efficiently and cheaply. He acknowledges that his case—with his deep knowledge, connections and resources—made his particular procedure possible, but he believes the idea of quantified surgery can be readily translated to typical patients in typical hospitals.
Most patients generate vast amounts of health care data about themselves, most of which is used once, sparingly or not at all, he said. It’s a deep resource waiting to be fully exploited.
“The great thing about our economy these days is that the consumer world is driving the computing world, which means we can now use these new technologies in specialized areas like surgery or radiology. If it were left to surgery or radiology alone, they would not have the money to drive this technological development. So that's one of the things that I specialize in: understanding what changes in technology, driven by the mass market, are now going to be applicable to specialized areas.
“I see this as bringing more and more technology into the operating room to reduce risk, to shorten the operating time, and to have a better outcome for the patient. Now that we know this is valuable in surgery, what we're going to do is simplify the software, make a much better user interface, and then train up a number of our surgeons in the use of it, and get another 10 patients to refine the procedure, then 100 patients and so on ….”
By Scott LaFee
Visualizing the Future of Surgery was originally published on the University of California, San Diego website.