
AI Takes Robotic Cancer Surgery to the Next Level

City of Hope is exploring new technologies to more effectively root out tumors and complement surgical expertise

Robotic surgery has changed the game in treating numerous solid tumors over the last 20-plus years. 

With physicians at computer consoles guiding robotic instruments that work through tiny incisions, operations are more precise and gentler than traditional open procedures. This reduces side effects and healing time, and some patients return home on the same day.

Yuman Fong, M.D.

Surgeons at City of Hope® are not only cultivating expertise to offer patients such minimally invasive options but also pushing the envelope to improve robotic surgery. Physician-scientists are tapping advances in artificial intelligence (AI) to see cancer more clearly, minimize complications and make surgeons more effective.

“In our department, we’re doing a lot of work with different robotic companies,” said Yuman Fong, M.D., Sangiacomo Family Chair in Surgical Oncology and professor and chair of surgery at City of Hope. “Surgeons look for new ways of seeing tumors, nerves, blood vessels and organs. We’re even collaborating with basic scientists to automate the most straightforward portions of surgery so that doctors remain fresh and vigilant for important portions.”

AI is already in the operating room, providing virtual guardrails to avoid damaging healthy tissue. The next generation of algorithmic tools may just include innovations dreamed up at City of Hope.

Identifying Cancer Cells During Surgery

In robotic surgery, the device’s camera already offers a clearer field of vision than what’s accessible in open surgery. Mustafa Raoof, M.D., M.S., assistant professor of surgery and of cancer genetics and epigenetics, believes the view can get even better. 

He spearheads an ambitious collaboration with City of Hope data scientists and imaging experts from the University of California campuses in San Diego and Riverside. One issue on his mind is the wait for biopsy results after a tumor is removed.

Mustafa Raoof, M.D.

“The major question is, ‘Has the surgery been adequate?’” Dr. Raoof said. “We envision an operating room with real-time diagnosis of tissue during surgery.”

The project aims to visualize tumors in three ways: Machine learning for computer vision will find cancer cells lit up with fluorescent labels. Ultrasound through a balloon probe will better detect 3D shapes. And an imaging technique called Raman spectroscopy will open a window into the chemical composition of tissue by picking up light outside the visible spectrum. 

AI will tie all three together.

“One of the key areas where AI is particularly beneficial is interpreting multimodal data collected by the robot’s diagnostic modules,” said Nasim Eftekhari, M.S., executive director of Applied AI and Data Science at City of Hope. “For example, machine learning can analyze data from endoscopic cameras to detect and classify abnormalities based on subtle visual cues difficult for the human eye to discern. This not only increases the precision of surgery but also allows surgeons to focus on critical tasks while the AI handles complex data processing and pattern recognition.”
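What “tying the data together” can look like in code is easiest to see in a stripped-down sketch. The Python below is purely illustrative and assumes nothing about City of Hope’s actual system: made-up per-region features from the fluorescence camera, the balloon ultrasound probe and Raman spectroscopy are concatenated and passed to a single classifier that scores each tissue region. Every variable name, feature count and number is a placeholder assumption.

    # Purely illustrative multimodal fusion sketch; all data here is simulated
    # and every name is a placeholder, not part of the research described above.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_regions = 200  # simulated tissue regions with known labels, for training only

    # Hypothetical per-region feature vectors from each imaging modality.
    fluorescence = rng.normal(size=(n_regions, 8))   # intensity statistics of labeled cells
    ultrasound = rng.normal(size=(n_regions, 5))     # 3D shape descriptors from the balloon probe
    raman = rng.normal(size=(n_regions, 16))         # binned spectral peaks (chemical signature)
    labels = rng.integers(0, 2, size=n_regions)      # 1 = tumor, 0 = healthy (made up)

    # The simplest way to "tie all three together": concatenate the modality
    # features and train one classifier on the joint vector.
    X = np.hstack([fluorescence, ultrasound, raman])
    clf = LogisticRegression(max_iter=1000).fit(X, labels)

    # During surgery, the same features for a new region would yield a tumor score.
    new_region = rng.normal(size=(1, 8 + 5 + 16))
    prob_tumor = clf.predict_proba(new_region)[0, 1]
    print(f"Estimated probability this region is tumor: {prob_tumor:.2f}")

A real system would rely on far richer models and validated clinical data; the point is only that one model can weigh all three signals at once.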

Dr. Raoof predicts the technology could help minimize damage to tissue not affected by cancer.

“Surgeries where we now remove an entire organ could become surgeries where we take only the tumor,” he said. “There are many situations where organ preservation leads to faster recovery and better quality of life without compromising cancer outcomes.”

Shining a Light on Cancer

Parallel investigations in molecular imaging advance the surgical field while also providing training data for algorithms.

By developing new technologies and leading clinical research, Thinzar Lwin, M.D., M.S., assistant clinical professor of surgery, is advancing the use of fluorescent and radiolabeled markers to make tumor cells visible during surgery. Images of labeled cancer cells from her research also feed AI efforts.

Thinzar Lwin, M.D., M.S.

The origins of her mission reach back to her time in medical school, when she would ask oncologic surgeons how they could be sure the tissue they took out was cancerous.

“They’d answer, ‘Well, I’ve learned to recognize it because of all the training I’ve done,’” Dr. Lwin said. “When I asked what the specific characteristics were — texture or color or hardness — I never got a satisfactory answer. I’ve found that no one really had a good answer.”

A couple of approaches for lighting up cancer cells have already gained Food and Drug Administration approval. They have strong advantages — detecting lesions otherwise missed by surgeons in one-third to half of cases — as well as limitations.

“It’s like looking at text and seeing highlighted words jump out at you,” Dr. Lwin said. “But we don’t blindly trust that signal. We think, ‘Does this make sense with what we expect in the patient?’”

She wants to improve on today’s standard of care by finding targeting molecules that bind more specifically to cancer. In turn, algorithms will help detect those markers.

“Our goal is to come up with the holy grail of molecular design for the least false positives, the least false negatives and the highest accuracy rate,” she said. “With AI, we could provide something more binary than just human judgment.”
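Her yardsticks translate directly into code. The short Python example below is hypothetical and uses made-up numbers rather than anything from Dr. Lwin’s research; it simply tallies the false positives, false negatives and accuracy she describes by comparing marker-based calls against pathology results.

    # Hypothetical example of the error terms above; the data is invented.
    def detection_metrics(predicted, actual):
        """Count errors and compute accuracy for binary tumor/no-tumor calls."""
        tp = sum(p and a for p, a in zip(predicted, actual))             # tumor correctly flagged
        tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
        fp = sum(p and (not a) for p, a in zip(predicted, actual))       # healthy tissue flagged as tumor
        fn = sum((not p) and a for p, a in zip(predicted, actual))       # tumor the marker missed
        return {
            "false_positives": fp,
            "false_negatives": fn,
            "sensitivity": tp / (tp + fn),       # share of real tumor that was caught
            "specificity": tn / (tn + fp),       # share of healthy tissue left alone
            "accuracy": (tp + tn) / len(actual),
        }

    # Made-up calls for ten tissue samples (True = tumor), checked against pathology.
    marker_calls = [True, True, False, True, False, False, True, False, True, False]
    pathology = [True, True, False, False, False, False, True, True, True, False]
    print(detection_metrics(marker_calls, pathology))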

Easing the Surgeon’s Burden

Dr. Fong wrote the book on robotic surgery. One day, a computer read that book.

Thus began a partnership with researchers at the University of Verona in Italy, who are developing AI that guides certain surgical tasks by reading written instructions, a rare approach. The idea is not to replace surgeons but to make their job easier. Oversight is a must.

“Tying what the computer does to reality is very important,” Dr. Fong said. “The human expertise has to be there to double-check.” 

Nasim Eftekhari, M.S.

The closest analogy is the autopilot function on a commercial airliner. Reducing the fatigue that surgeons sometimes experience during complex procedures should lead to more effective robotic operations.

“Computers don’t get tired or have lapses in attention,” Dr. Fong said. “A computer just gets the job done, from beginning to end. The most essential portions will always be done by your skillful surgeon.”

One challenge is the difference between how computers and people process information. In his textbook, Dr. Fong describes the steps for a repeated task once, trusting readers to repeat them wherever needed. The AI needs every repetition spelled out, step by step.
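A toy sketch makes that gap concrete. The Python below assumes nothing about the Verona group’s actual software; it just unrolls a once-stated, repeated instruction into the explicit step list an instruction-following algorithm would need.

    # Toy illustration only: expanding a textbook-style repeated instruction
    # into explicit steps. The instruction text and count are invented.
    def expand_for_algorithm(step_template, repetitions):
        """Turn a once-stated, repeated instruction into an explicit ordered list."""
        return [f"Step {i + 1}: {step_template}" for i in range(repetitions)]

    # Textbook phrasing: stated once, the reader repeats it as needed.
    textbook_line = "place an interrupted suture, then advance 1 cm"

    # What the algorithm consumes: every repetition spelled out.
    for step in expand_for_algorithm(textbook_line, repetitions=5):
        print(step)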

“We’re trying to figure out whether our textbooks should look different,” he said. “There may be one version for trainees, another for experts and a third for AI. Now that computers are out there reading my work, maybe I should write for them.”