The term “robotic surgery” conjures images of machines performing heart transplants or surgeons in the Bahamas directing delicate procedures thousands of miles away via the Internet. These scenarios are mostly fantasy, according to Allison Okamura of the Johns Hopkins University Department of Mechanical Engineering. She says that the real potential of robotic surgery, or rather computer-enhanced surgery, is to make surgeons more present in the operating room, rather than less so.
Haptic systems are a particularly promising area of research in the field of robotics. Haptics involves making robotic surgical instruments more sensitive to human touch and, reciprocally, allowing robotic tools to convey tactile sensory data to the doctors who wield them. Okamura and her team have developed a haptic system that helps doctors gauge how much pressure their robotic instruments are applying to a given area. This sort of research will enable surgeons to better perform minimally invasive surgeries.
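The core idea behind this kind of force feedback can be sketched in a few lines: a force measured at the robot's tool tip is amplified so the surgeon can feel delicate contacts, then clamped to a safe limit before being rendered at the hand controller. This is only an illustrative sketch; the function name, scaling factor, and limits here are invented for clarity and do not describe Okamura's actual system.

```python
# Hypothetical sketch of a haptic feedback loop: the force sensed at the
# robot's tool tip is scaled up for the surgeon's hand controller, then
# clamped to the controller's safe output limit. All names and numbers
# are illustrative assumptions, not the real system's parameters.

def feedback_force(sensed_force_newtons: float,
                   scale: float = 2.0,
                   max_force_newtons: float = 5.0) -> float:
    """Scale a sensed tool-tip force for display at the master device,
    clamping it to the controller's safe output range."""
    amplified = sensed_force_newtons * scale
    return max(-max_force_newtons, min(max_force_newtons, amplified))

# A delicate 0.5 N tissue contact is rendered as 1.0 N at the surgeon's
# hand, while a hard 10 N collision is clamped to the 5 N safety limit.
print(feedback_force(0.5))   # 1.0
print(feedback_force(10.0))  # 5.0
```

Amplification like this is one way such a system could make faint tissue contacts perceptible without ever exerting dangerous forces on the surgeon's hand.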
“The advantages [of computer-enhanced surgery] don’t really have much to do with artificial intelligence or autonomous robots that do surgery by themselves. It’s really about enhancing the capability of the surgeon in a way that maintains the minimally invasive approach,” she says.
Another potential use of this technology is helping surgeons to better visualize the anatomical landscape during a procedure. Doctors have long used scopes and lenses to examine surgical areas prior to performing operations, and, of course, X-rays have been in use for more than a century. Within the next 10 years, 3-D scanning and high-resolution image projection will allow surgeons to “see” the area of operation in an entirely new way.
“For example,” says Okamura, “if someone is going in for a surgery for a liver tumor, that person has probably had some preoperative images, maybe a CT scan or an MRI, that have allowed the surgeon to acquire an image beforehand of what the tissue looks like. With the system that our center is developing, you can take those preoperative images and basically overlay them, visually, on top of the actual patient, so the surgeon would really know where to put his tools in order to reach the desired target. He wouldn’t be looking at the patient and looking up at a picture and looking back and forth, trying to do all of these complex transformations in his head. Instead, the surgeon can just look right at the target anatomy and have this advanced visualization. We call that augmented reality. So, instead of virtual reality, we’re combining virtual information, which is the preoperative imaging from the patient, and overlaying it onto the real world. So basically you’re giving the surgeon X-ray vision.”
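The “overlay” step Okamura describes amounts to a coordinate transformation: a target located in the preoperative CT or MRI frame must be mapped into the operating-room frame before it can be drawn on top of the live view of the patient. The sketch below shows the geometry of that mapping as a simple 2-D rigid transform (rotation plus translation); in a real system the transform would come from a registration procedure and would be 3-D, and the numbers here are made up.

```python
# Illustrative sketch of the augmented-reality overlay: a rigid transform
# maps a tumor location from preoperative-image coordinates into the
# operating-room frame so it can be drawn over the live patient view.
# The angle and offsets are invented for the example.

import math

def rigid_transform(point, angle_rad, translation):
    """Apply a 2-D rotation followed by a translation (a rigid transform)."""
    x, y = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    tx, ty = translation
    return (c * x - s * y + tx, s * x + c * y + ty)

# Tumor centroid at (10, 5) in CT coordinates; suppose registration found
# the patient rotated 90 degrees and offset by (100, 200) from the scan.
overlay_x, overlay_y = rigid_transform((10, 5), math.pi / 2, (100, 200))
print(overlay_x, overlay_y)  # approximately 95.0 210.0
```

Doing this arithmetic in software, rather than in the surgeon's head, is precisely the “X-ray vision” the quote describes.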
Surgical robots can also photograph, survey, and collect data in ways that humans cannot and give surgeons a better sense of how the operation went, after the fact. “When you do robot-assisted surgery, you’re already tracking the tools that are inside the patient,” says Okamura. “You can have force-sensors and other ways of examining force, and then you’re acquiring data at the same time that you’re doing the procedure, so you can be getting even more information that can be used for diagnosis or in scheduling post-op appointments. You can model tissue health based on the data the robot acquired during the operation. The hope is that it will also improve our knowledge about how the patient is doing.”
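The data-collection idea above can be made concrete with a minimal sketch: force samples logged during the procedure are summarized afterward for the post-op report. Everything here is hypothetical and invented for illustration; it only shows the shape of the logging-and-review workflow, not any real system's interface.

```python
# Hypothetical sketch of intra-operative data logging: time-stamped force
# samples from the tool sensor are recorded during the procedure and
# summarized afterward. Names and values are illustrative assumptions.

force_log = []  # list of (time_seconds, force_newtons) samples

def record(time_seconds: float, force_newtons: float) -> None:
    """Log one force sample acquired during the procedure."""
    force_log.append((time_seconds, force_newtons))

def peak_force() -> float:
    """Largest force applied during the procedure, for the post-op report."""
    return max(force for _, force in force_log)

record(0.0, 0.4)
record(0.5, 1.2)
record(1.0, 0.8)
print(peak_force())  # 1.2
```

A summary statistic like the peak applied force is the kind of quantity that simply is not available after conventional open surgery, which is the point Okamura is making.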
But don’t expect to read about a long-distance kidney transplant anytime soon. According to Okamura, robotic surgery tools work best at close quarters. The Internet is not yet a stable enough platform to safely allow procedures outside of the operating room. Also, from a surgical perspective, increasing a doctor’s dexterity, accuracy, and visualization is more important than allowing a surgeon to operate across continents.
“While it’s a nice dream that you would have these really long-distance surgeries,” she says, “I think the practical advantages of just doing [traditional] tele-operated surgeries are much greater.”
Technology won’t replace human surgeons in the near future. Rather, surgeons will use robotic instruments and wireless search-engine technology optimized for surgery as readily as any other tool. In the operating room of the future, information from patient-record databases, sensors, consulting physicians, and so on is constantly traveling back and forth, keeping the surgeon connected to the outside world even as she’s performing delicate, life-or-death procedures.
Says Okamura, “The surgeon is going to be surrounded at a glance with any information that he or she desires in order to accomplish the surgery. It’s really all about integrating data to give the surgeon not only super-human manipulation capabilities, but also superhuman knowledge. Basically anything that might be stored in the computer, the surgeon should be able to access.”
Allison Okamura, associate professor of mechanical engineering at Johns Hopkins University, demonstrates her lab’s scissors-based surgical simulator. She’s working to endow robotic surgical tools with a sense of touch.
Originally posted in THE FUTURIST March-April 2007