By 10 am, the robot has already carved out an entire human organ. The patient’s gall bladder, riddled with infection, is gone, and in a darkened operating room at Boston’s Beth Israel Deaconess Medical Centre, the machine is going back for seconds.
Its next target is a tumour, buried in the same woman’s kidney. So the bot remains perched, its arms fanned out over her like some great, hulking insect. Those arms twitch and wriggle, and inside the sleeping woman’s abdomen its tiny, dextrous manipulators slice and burn through fatty connective tissue, manoeuvring around veins, arteries and nerves, clearing a path to the cancer with a level of precision no human could ever hope to muster.
The robot, called the da Vinci Si Surgical System, isn’t performing a partial nephrectomy of its own volition. It is being controlled by two human surgeons seated at a pair of consoles in a corner of the OR. They take turns commanding the machine, using an array of foot switches and hand controls, to snip or sear through tissue. Their movements are replicated by the robot a few metres away and interpreted, too – the da Vinci smooths out the control signals, eliminating physiological tremors from the doctors. The robot enhances, even as it obeys.
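The smoothing and scaling the machine performs can be sketched in a few lines. This is an illustrative toy model, not Intuitive Surgical's actual control code; the scale factor and smoothing constant below are hypothetical values chosen for illustration.

```python
# Toy model of teleoperated motion scaling plus tremor filtering.
# NOT Intuitive Surgical's algorithm; SCALE and ALPHA are hypothetical.

SCALE = 0.2   # 5:1 motion scaling: 10 mm of hand travel -> 2 mm at the tip
ALPHA = 0.1   # smoothing factor of a simple exponential low-pass filter

def smooth_and_scale(hand_deltas, scale=SCALE, alpha=ALPHA):
    """Map raw surgeon hand displacements (mm) to instrument-tip displacements.

    An exponential moving average damps high-frequency jitter (physiological
    tremor sits around 8-12 Hz) while passing slower, deliberate motion through.
    """
    tip_deltas, filtered = [], 0.0
    for d in hand_deltas:
        filtered = alpha * d + (1 - alpha) * filtered  # low-pass filter
        tip_deltas.append(filtered * scale)            # then scale down
    return tip_deltas

# A steady 1.0 mm/step hand motion with +/-0.5 mm of superimposed tremor:
raw = [1.0 + (0.5 if i % 2 == 0 else -0.5) for i in range(50)]
tip = smooth_and_scale(raw)
```

With these numbers the tip settles near 0.2 mm per step, and the ±0.5 mm tremor collapses to a wobble of roughly a hundredth of a millimetre: the deliberate motion survives, the shake does not.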
This isn’t some experimental test or limited pilot programme – it’s simply how surgery is done today. The da Vinci system, which first reached hospitals 14 years ago, has become the most common surgical robot on the planet, with almost 2 500 units worldwide performing over 200 000 procedures a year.
Although the bot was initially used in urologic surgery, it’s since racked up so many procedures that the system’s manipulators now touch nearly every internal organ. Instead of encircling the patient in a tight, huddled pack, the nurses, technicians and surgeons in a da Vinci operating room each have a discrete station in a different part of the OR, and all the specialists are staring at screens showing the same intimate, zoomed-in contours of the patient’s insides. Over the past decade robots have reshuffled a work environment that had taken more than a millennium to perfect and have transformed an entire profession.
“The surgeon’s role is totally different now,” says Andrew Wagner, director of minimally invasive urologic surgery at Beth Israel and the attending surgeon for the procedure I watched. “A couple times a month, a patient will say, ‘Let me see your hands’. They want to see how steady they are. And I do it. I play along. What I don’t tell them is that it really doesn’t matter. Motion scaling removes that part of the job.”
According to Wagner and the growing number of surgeons and hospital administrators who’ve seen the da Vinci in action, the benefits of robot surgery come from the ability to synthesise human intelligence with machine-assisted precision. Like the majority of other surgical robots, the system specialises in minimally invasive procedures. Its slim, cable-driven manipulators fit into relatively small incisions, fully or partially removing organs more nimbly than the poles used in traditional laparoscopic surgery, and with less trauma than open surgery. There’s less violence done to the patient, which not only turns potentially massive scars into minor ones, but can also reduce the rate of complications and cut recovery times. Whereas open prostatectomies typically require a three-day hospital stay, a patient could go home within 24 hours of a robotic procedure, if not the same day. “We’ve moved from doing prostate and kidney surgery all open to 90 per cent robotic,” Wagner says.
Many of his patients are excited by the prospect of being operated on by a robot, but others are terrified. “The word robot is very misleading. They think I push a button and then walk out of the room,” Wagner says.
The robotic spectrum:
Robotic surgical systems now operate on almost every part of the body, from the spine to the major internal organs. And they’re coming to an OR near you.
Mazor Robotics Renaissance
The Renaissance, a robotic guidance system for spinal procedures, positions the surgeon’s drill during each step, combining pre-op scans with real-time X-rays to provide pinpoint accuracy.
Currently used on:
Spine
Future procedures:
It’s FDA-cleared for cranial surgery, though specific procedures haven’t been described in detail.
Mako Surgical RIO
The RIO is a single, super-accurate robotic arm on wheels. It can be fitted with various tools to resurface degraded or diseased joints or to position reconstructive implants.
Currently used on:
Lower-body joints, such as knees and hips
Future procedures:
Despite steady sales of the RIO, Mako Surgical has struggled financially. Before the robot can move to other parts of the body, Mako will have to solidify its role in lower-body-joint surgery.
Intuitive Surgical Da Vinci
The world’s most common surgical bot, the da Vinci system allows remote control of a 3D camera and up to three instrumented arms that enter the body through tiny (1- to 2-cm) incisions.
Currently used on:
Adrenal glands, colon, heart, gall bladder, kidney, prostate, spleen, stomach, throat, female reproductive system
Future procedures:
No new regions are planned, so any in-development techniques or procedures are essentially variations on current ones (or, at the very least, will involve the same organs).
Accuray CyberKnife
Whereas traditional radiation therapy bathes entire regions of a patient’s body with high-energy rays, the CyberKnife’s particle accelerator fires precise, pulsed beams at specific organs.
Currently used on:
Breast, female reproductive system, gastrointestinal tract, head and neck, intracranial, kidney, liver, lung, pancreas, prostate, spine
Future procedures:
Rather than expanding use to new areas of the body, the CyberKnife team wants to develop more focused beams and the ability to treat patients who previously might have been ruled out.
Eight years ago, the problem with the Renaissance surgical bot wasn’t underperformance – in fact, it was doing too much. Still in development at the time, the machine was designed to guide a drill to specific positions along a patient’s spine and, when given the go-ahead by the attending surgeon, bore away. But during preliminary testing, Israel-based Mazor Robotics found that orthopaedic surgeons wanted the robots to have less autonomy. They were fine with letting the robots aim, but they wanted to pull the trigger and retain that tactile feedback of the bit spinning through bone. So, Mazor CEO Ori Hadomi says, the company demoted its creation and removed the automatic drill.
The Renaissance bot is a relative newcomer to the OR, with just 18 systems in the US so far, but it’s not alone. The wide range of partially autonomous surgical robots currently deployed includes LASIK laser eye surgery systems, Mako Surgical’s joint-resurfacing RIO system, and the CyberKnife system, which doesn’t physically cut into patients, but hits them with precisely targeted doses of radiation. These are true robots – machines that are given specific orders but whose programming determines how to execute those tasks.
The CyberKnife, for example, isn’t simply a medical-grade particle accelerator with good aim. Its algorithms allow it to hit a moving target, adjusting its beam to accommodate a patient’s breathing or other involuntary movements, so its beam can be narrower and its sessions fewer. Someone who might have needed a six-week course of radiation from a traditional linear accelerator can spend as little as a week with the CyberKnife. And since the machine is hitting bull’s-eyes every session with an emitter that conforms to the irregular shape of a given organ, there are fewer side effects and less crippling discomfort.
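The pay-off of tracking a moving target can be seen in a toy model. This sketch is not Accuray's algorithm; the breathing period, tumour excursion and system latency are hypothetical numbers, and real respiratory motion is far less regular than a sine wave.

```python
import math

# Toy model of motion-compensated beam targeting. NOT Accuray's algorithm;
# PERIOD, AMPLITUDE and LATENCY are hypothetical illustration values.

PERIOD = 4.0     # seconds per breath
AMPLITUDE = 8.0  # mm of tumour excursion with respiration
LATENCY = 0.1    # seconds between position measurement and beam pulse

def tumour_position(t):
    """Tumour displacement (mm) from its mean position at time t (seconds)."""
    return AMPLITUDE * math.sin(2 * math.pi * t / PERIOD)

def aiming_error(t, tracking=True):
    """Distance (mm) between the beam's aim point and the tumour at time t.

    A fixed beam aims at the tumour's mean position; a tracking beam aims
    at the last measured position, taken LATENCY seconds earlier.
    """
    aim = tumour_position(t - LATENCY) if tracking else 0.0
    return abs(tumour_position(t) - aim)

times = [i * 0.01 for i in range(800)]  # two full breathing cycles
fixed_err = max(aiming_error(t, tracking=False) for t in times)
tracked_err = max(aiming_error(t, tracking=True) for t in times)
```

Even this naive "aim at the last measurement" scheme cuts the worst-case miss from the full 8 mm excursion to just over a millimetre; predicting ahead by the latency, as a real tracking system would, shrinks it further. That is what lets the beam stay narrow instead of bathing a margin wide enough to cover the whole breathing cycle.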
The CyberKnife might be saving patients who, with traditional treatments, would have been written off. Initial results from an ongoing study at the University of Pittsburgh indicate that CyberKnife treatments can turn pancreatic cancer patients, who often would have been considered inoperable, into viable surgical candidates. Dwight Heron, a professor at the university and an oncologist, says, “This is something we’ve never seen before.”
By shrinking the tumours with precisely targeted radiation before subjects go under the actual knife, surgeons can potentially remove one of the most lethal cancers. “We’re talking about patients where surgeons say, ‘We can’t take it out. Maybe you’ll live a year’. Now they’ll live potentially five years and may even be cured,” Heron says.
With demand for surgical systems on the rise, academic researchers are developing a second wave of medical robots, systems with even greater degrees of autonomy. In 2010, the Duke University Ultrasound Transducer Group demonstrated, using turkey breasts, a robot that could perform completely unassisted biopsies. Beyond testing its core capabilities on two women diagnosed with breast cancer, the machine hasn’t made it to broader human trials. But the group’s director, Stephen Smith, believes the technology could have a huge impact in the developing world.
“We envisage a mobile van with a mammography unit, a 3D scanner, a robot and a PC with AI software,” Smith says. “One technologist would do everything.”
At Carnegie Mellon University, robotics professor Howie Choset is still waiting for the FDA to clear the first commercial application of his snake-shaped surgical robot, a system called the Flex. The Massachusetts-based Medrobotics, which Choset co-founded in 2005, is presenting the flexible robot as a natural fit for ear, nose and throat procedures. Choset wants to do more, hoping to develop similar systems that not only compete with the da Vinci, but further redefine what surgery is and who can perform it. “This will disseminate medical care,” Choset says. “When we operated with surgical-snake robots on pigs, we had non-surgeons doing the jobs that surgeons used to do.” Users pilot the system with a simple joystick. And although the operator would need to know enough about human anatomy to avoid getting lost, the incisions would be considerably smaller and in areas that heal more readily – in orifices such as the mouth, for example.
Choset sees the first application for such a system on the battlefield, where a field medic and an intuitively controlled snake-bot could be someone’s sole hope for survival. “The number of hours it takes to master surgery with the da Vinci is about the same as it would take for an open procedure – 10 000 hours,” Choset says. Autonomy, along with the winding, flexible nature of a snake-bot, promises to exponentially reduce training time. “If you can play a video game, you can drive our robot,” he says.
It’s a hopeful take on the future of surgery, but back at Beth Israel, it’s coming apart at the seams.
For a relatively unhurried 90 minutes or so, Wagner and his partner have cleared away fatty tissue and located the urine and blood pathways to avoid, laying the groundwork for the tumour’s removal. Now the artery feeding into the kidney is clamped, and the clock has started. With no blood pumping in, the organ is effectively dying. Wagner has 20 minutes to not only get the cancer out, but also cauterise and suture the kidney’s worst wounds before the clamp comes off. Around 15 minutes in – or 5 minutes to go – blood arcs across the screen, just barely missing the camera.
“Whoa. Hello,” Wagner says. “We have a pumper. Watch out, Steve. Don’t let it hit me in the face.” While Wagner’s camera bobs and weaves, urology chief resident Steve Eyre uses an old-fashioned, nonrobotic pole to nudge this prodigious bleeder into a safer trajectory.
As the clamp comes off and the scrambling subsides, that’s when it sinks in. In this particular operating room and during this particular procedure, there’s no place for autonomy or lightly trained technicians. You need experts to look at that tumour.
You need Steve to reposition tissue and organs. You need multiple people making complex, sometimes urgent, decisions. And even if human judgment could somehow be distilled into code, our flesh is too unpredictable.
“Every patient is different,” says Catherine Mohr, director of medical research at Intuitive Surgical. In theory, a self-guided version of the da Vinci could be loaded with a general map of the region and the ability to find and remove the prostate. Half of the time the results might be suitable. But the other half, when the nerves don’t line up, you’d wind up with patients who were impotent, incontinent, or both.
It’s a problem of imaging, really. Hard tissue, such as bone, is easier to scan. But soft tissue is still a puzzle box, with vessels and nerves and plumbing that show up as ghosts or best guesses before surgery starts. So although systems such as the Renaissance and Mako’s RIO can follow a concrete game plan established ahead of time using clear X-rays, MRIs and ultrasounds – and reconfirmed during the procedure – the only guarantee with soft tissue is that it will be messy.
Without the ability to autonomously navigate soft tissue, self-guided bots can’t be trusted to go diving into organs. They still have roles to play, though, in more contained missions. When a robot like the RIO resurfaces a patient’s joint to better interface with a new implant, it’s functioning like a kind of surgical CNC machine, manufacturing a component within the body. The Renaissance system makes a compelling case for specific bursts of autonomy – in spinal procedures where a deviation of 2 millimetres can mean repeating the surgery or possibly even paralysis, a machine that’s accurate to within 1 millimetre is a clear benefit.
The future of surgery is not a linear path, with the da Vinci and its master–slave control signal laying the foundation for more autonomous systems, such as the CyberKnife and Renaissance. Automation and teleoperation are simply two halves of a collective, robotic solution. And barring some quantum leap in artificial intelligence, there’ll always be a time and place for human cognition.
What’s certain is that the era of human hands inside another body is drawing to a close. To Wagner, for procedures he tackles on a weekly basis, the human-only approach is already obsolete. “I’ve done something like 600 prostate surgeries. I’ve never done one open,” Wagner says. But he’s the one doing the surgery, not a machine alone. Still, he says, “In my mind there’s no need, really, to learn that operation the traditional way. We’ve proved that it’s just better for everyone with the robot.”
My doctor, the robot
America’s hard-to-please Food and Drug Administration (FDA) has given the green light to an autonomous telemedicine robot called RP-VITA, created in a joint project by iRobot and InTouch Health. The robot is already making its rounds of hospital corridors in the US and elsewhere, allowing doctors and other medical staff to remotely interact with their patients.
According to its builders, the robot was designed to transform acute healthcare delivery by extending the ability of physicians, nurses and other medical staff to provide the best possible patient care at the lowest possible cost. It couples InTouch Health’s industry-leading remote presence capabilities with state-of-the-art navigational technology to create what’s claimed to be the world’s most advanced and easy-to-use remote presence robot.
“The RP-VITA raises the bar for overseeing patient care remotely and allows me to proactively control a situation as if I were there,” says Dr Jason Knight, assistant clinical professor at the University of California, Irvine. “The robot is so easy to use that I can forget about the technology and just focus on the clinical needs at hand.”
The robot’s compelling features include enhanced navigation, which lets it manage driving tasks itself so the healthcare professional can put more focus on patient care, and state-of-the-art mapping and obstacle-avoidance technologies that allow safe, fast and highly flexible movement in a clinical environment.
An autonomous navigation capability allows a remote clinician or bedside nurse to send the robot to a target destination with a single click, enabling a number of breakthrough clinical applications. It will also have real-time access to important clinical data, enabling a range of new workflow improvements for physicians, nurses and other medical staff. For example, the RP-VITA can be integrated with live patient data from the electronic medical record, and is equipped with the ability to connect with diagnostic devices such as otoscopes and ultrasound. It comes equipped with the latest electronic stethoscope.
An iPad user interface enables quick and easy navigation to anywhere the robot needs to go, as well as interaction with the patient, family and care team.