OREANDA-NEWS. June 20, 2016. When you’re headed into surgery, you want a doc with unmatched skill, proven experience and steady hands. Or maybe just a really good robot.

Researchers at the Children’s National Medical Center (CNMC), in Washington, D.C., are advancing this pursuit by having robot surgeons perform soft-tissue operations autonomously.

A Guiding “Hand”

Robot-assisted surgery can be traced back to the mid-1980s, when a team at the University of British Columbia developed a robot that assisted in an orthopedic procedure.

Three decades on, we’re still a long way from robots taking on more complex procedures without a surgeon being present. But the most recent advances, aided by GPUs, could make surgery safer, more accessible and less expensive.

“The goal was not to remove surgeons from the equation but to provide an intelligent option or solution to enhance their capacity and capability,” said Peter Kim, associate surgeon-in-chief at the CNMC and lead researcher on the project.

Even when robots are used today in relatively routine “soft surgery” procedures, a surgeon is typically directing every movement, meaning the robot has the same limits as the surgeon controlling it. Kim’s team has upped the ante by developing a robot that can autonomously perform more delicate procedures on soft tissues.

Robotic Surgery with Sub-Millimeter Accuracy

Kim and his team have put innovative technologies, including NVIDIA’s GeForce GTX TITAN GPU, to work in developing their Smart Tissue Autonomous Robot. STAR is an arm-like device that uses a 3D plenoptic camera and near-infrared imaging of markers placed on the tissue to pinpoint, with sub-millimeter accuracy, the location of the tissue it needs to manipulate.
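
As a rough illustration of the kind of tracking such a setup enables, the Python sketch below (not the team’s actual code) thresholds a near-infrared image, groups bright pixels into marker blobs, and back-projects each blob’s centroid through the depth map into a 3D position. The camera intrinsics and the detection threshold are assumed values.

```python
# Illustrative sketch only -- not STAR's implementation. The camera
# intrinsics and the marker-detection threshold are assumed values.
import numpy as np
from scipy import ndimage

FX, FY = 1400.0, 1400.0   # assumed focal lengths, in pixels
CX, CY = 640.0, 512.0     # assumed principal point, in pixels

def locate_markers(nir_image, depth_map, threshold=0.8):
    """Find bright near-infrared markers and return their 3D positions in mm."""
    # Group above-threshold pixels into connected blobs, one blob per marker.
    labels, n = ndimage.label(nir_image > threshold)
    centroids = ndimage.center_of_mass(nir_image, labels, list(range(1, n + 1)))
    points = []
    for v, u in centroids:
        z = depth_map[int(round(v)), int(round(u))]   # depth at the marker, in mm
        x = (u - CX) * z / FX                         # pinhole back-projection
        y = (v - CY) * z / FY
        points.append((x, y, z))
    return np.array(points)
```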

The researchers also incorporated laparoscopic suturing tools with added sensing capabilities so STAR can “feel” tension and pressure when in contact with tissue. Additionally, they programmed the robot with a consensus of how surgeons would best perform a complex surgical task.
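
Here is a minimal sketch of that force-feedback idea, assuming a hypothetical tool interface (read_force, pull, hold, stop) and made-up tension limits; the names and numbers are stand-ins, not values from the study.

```python
# Minimal sketch, not STAR's implementation. The tool interface and the
# tension limits below are hypothetical stand-ins, not values from the study.

SAFE_TENSION_N = 1.5   # assumed upper bound on suture tension, in newtons
MIN_TENSION_N = 0.3    # assumed lower bound for a snug stitch, in newtons

def tighten_suture(tool):
    """Pull the suture in small steps until it is snug but below the safety limit."""
    while True:
        force = tool.read_force()      # hypothetical call returning tension in newtons
        if force >= SAFE_TENSION_N:
            tool.stop()                # back off before the tissue is damaged
            return "released"
        if force >= MIN_TENSION_N:
            tool.hold()                # snug enough; lock the stitch in place
            return "locked"
        tool.pull(step_mm=0.5)         # otherwise keep tightening incrementally
```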

The team’s work was published recently in the journal Science Translational Medicine.

To test STAR’s capability, Kim, who is also professor of surgery at the George Washington University School of Medicine, opted to have it perform anastomosis (the suturing together of two tubular structures) on bowel segments in a pig.

The team chose anastomosis because it’s a relatively common yet complex surgical task performed more than a million times a year in the U.S. — and one that has never been performed by a machine without direct human control. It’s similar conceptually to repairing a garden hose — the goal is to make the sutures tight and regularly spaced so that there’s no leakage.
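
To make the garden-hose analogy concrete, a back-of-the-envelope Python sketch of evenly spacing stitches around a tubular structure follows; the diameter and stitch pitch are illustrative numbers, not values from the study.

```python
# Illustrative arithmetic only: evenly spaced suture sites around a lumen.
import math

def plan_sutures(diameter_mm, pitch_mm):
    """Return evenly spaced angles (in radians) for sutures around a lumen."""
    circumference = math.pi * diameter_mm
    n = max(4, math.ceil(circumference / pitch_mm))   # never fewer than a handful
    return [2 * math.pi * i / n for i in range(n)]

# Example with made-up numbers: a 15 mm lumen stitched every ~3 mm gives 16 sites.
angles = plan_sutures(diameter_mm=15.0, pitch_mm=3.0)
```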

A Breakout STAR

STAR didn’t just complete the first fully autonomous robotic anastomosis. Its movements were so consistent that the surgical outcome was superior to that achieved by experienced surgeons performing the same task.

“The sobering part was that it did not need a lot of intelligence to do that,” said Kim.

GPUs play a critical role in STAR. They speed up the processing of data from the plenoptic camera, which captures the direction as well as the intensity of the light arriving from the scene, allowing depth to be estimated across the image. This gives STAR positional awareness and the ability to track target tissues in real time.
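
Purely as an illustration of why this workload suits a GPU, the sketch below offloads the per-frame conversion of a plenoptic depth map into a 3D point cloud using CuPy; the camera intrinsics are assumed, and this is not the team’s pipeline (which ran on a GeForce GTX TITAN).

```python
# Illustrative only: per-frame back-projection of a dense depth map on the GPU
# with CuPy. The intrinsics are assumed; this is not STAR's actual pipeline.
import cupy as cp

FX, FY, CX, CY = 1400.0, 1400.0, 640.0, 512.0   # assumed camera intrinsics, pixels

def depth_to_point_cloud(depth_map):
    """Convert an HxW depth map (mm) into an HxWx3 point cloud on the GPU."""
    depth = cp.asarray(depth_map)                      # copy the frame to GPU memory
    h, w = depth.shape
    v, u = cp.meshgrid(cp.arange(h), cp.arange(w), indexing="ij")
    x = (u - CX) * depth / FX                          # every pixel is back-projected
    y = (v - CY) * depth / FY                          # in parallel on the GPU
    return cp.stack((x, y, depth), axis=-1)
```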

Kim believes STAR will eventually lead to a robotic surgeon with the perceptual capabilities and intelligence to perform optimal surgery of all types, anywhere and at any time, especially as more sensing, vision and cognitive data are fed to STAR and more powerful computing approaches, such as parallel processing, are applied.

“Very much like self-driving cars, we start with cruise control, self-parking, lane warning, autonomous stopping and, eventually, self-driving,” said Kim. “The goal is to save lives and prevent accidents. The motivation for intelligent surgery, including autonomous surgery, is no different.”