Digital Surgery With Touch Feedback Could Improve Medical Training

Combining the sense of touch with 3-D computer models of organs, researchers at Rensselaer Polytechnic Institute are developing a new approach to training surgeons, much as pilots learn to fly on flight simulators. With collaborators at Harvard Medical School, Albany Medical Center, and the Massachusetts Institute of Technology, the team is developing a virtual simulator that will allow surgeons to touch, feel, and manipulate computer-generated organs with actual tool handles used in minimally invasive surgery.

“The most important single factor that determines the success of a surgical procedure is the skill of the surgeon,” said Suvranu De, assistant professor of mechanical, aerospace, and nuclear engineering and director of the Advanced Computational Research Lab at Rensselaer. It is therefore not surprising, he notes, that more people die each year from medical errors in hospitals than from motor vehicle accidents, breast cancer, or AIDS, according to a 2000 report by the Institute of Medicine.

De and his colleagues at Rensselaer are seeking to improve surgical training by developing a new type of virtual simulator. Based on the science of haptics — the study of sensing through touch — the new simulator will provide an immersive environment for surgeons to touch, feel, and manipulate computer-generated 3-D tissues and organs with tool handles used in actual surgery. Such a simulator could standardize the assessment of surgical skills and avert the need for cadavers and animals currently used in training, according to De.

“The sense of touch plays a fundamental role in the performance of a surgeon,” De said. “This is not a video game. People’s lives are at stake, so when training surgeons, you better be doing it well.”

In a paper published in the June/July issue of the journal Presence, the researchers describe their new computational technique, and beginning in the summer of 2006 the work will be supported by a $1.4 million, four-year grant from the National Institutes of Health (NIH). This funding will extend the original three-year exploratory NIH grant De received in 2004 to support the initial phases of the research.

Surgical simulators — even more than flight simulators — are based on intense computation. To program the realism of touch feedback from a surgical probe navigating through soft tissue, the researchers must develop efficient computer models that perform 30 times faster than real-time graphics, solving complex sets of partial differential equations about a thousand times a second, De said.
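One way to picture this division of labor is a dual-rate loop: force updates running at roughly 1 kHz while the graphics refresh at roughly 30 Hz. The sketch below is purely illustrative; the two rates come from the description above, but the function names and loop structure are assumptions, not the team's actual software.

```python
import time

HAPTIC_HZ = 1000    # touch feedback updated roughly a thousand times per second
GRAPHICS_HZ = 30    # visual rendering only needs about 30 frames per second

def haptic_step(dt):
    """Placeholder: solve the tissue-deformation equations near the tool tip
    and return the reaction force to send to the haptic device."""
    return (0.0, 0.0, 0.0)

def render_frame():
    """Placeholder: redraw the 3-D organ model."""
    pass

def simulation_loop(duration_s=1.0):
    haptic_dt = 1.0 / HAPTIC_HZ
    steps_per_frame = HAPTIC_HZ // GRAPHICS_HZ   # ~33 force updates per rendered frame
    deadline = time.perf_counter() + duration_s
    step = 0
    while time.perf_counter() < deadline:
        haptic_step(haptic_dt)                   # runs at ~1 kHz
        if step % steps_per_frame == 0:
            render_frame()                       # runs at ~30 Hz
        step += 1
        time.sleep(haptic_dt)                    # crude pacing; a real loop would compensate for drift

simulation_loop(duration_s=0.1)
```

In a real simulator the physics step is the expensive part; the sketch only shows how the fast force loop and the slower rendering loop interleave.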

The major challenge for current technologies is the simulation of soft biological tissues, according to De. Such tissues are heterogeneous and viscoelastic, meaning they exhibit characteristics of both solids and liquids, much like chewing gum or Silly Putty. And surgical procedures such as cutting and cauterizing are almost impossible to simulate with traditional techniques.
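As a concrete illustration of this "both solid and liquid" behavior, a standard textbook idealization is the Kelvin-Voigt element: a spring (elastic, solid-like) in parallel with a damper (viscous, fluid-like). The snippet below shows only that generic model with made-up parameters; it is not the constitutive model the Rensselaer group uses.

```python
def kelvin_voigt_force(stiffness, damping, displacement, velocity):
    """Reaction force of a Kelvin-Voigt element: an elastic spring (solid-like)
    in parallel with a viscous damper (fluid-like). SI units assumed:
    stiffness in N/m, damping in N*s/m, displacement in m, velocity in m/s."""
    return stiffness * displacement + damping * velocity

# Illustrative (made-up) parameters: pressing 5 mm into tissue at 10 mm/s.
force_newtons = kelvin_voigt_force(stiffness=300.0, damping=2.0,
                                   displacement=0.005, velocity=0.01)
print(f"reaction force ~ {force_newtons:.3f} N")   # ~1.52 N
```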

To overcome these barriers, De’s group has developed a new computational tool called the Point-Associated Finite Field (PAFF) approach, which models human tissue as a collection of particles with distinct, overlapping zones of influence that produce coordinated, elastic movements. Each particle is represented by a single point in space, and its relationship to nearby points is governed by the equations of physics. The localized points migrate along with the tip of the virtual instrument, much like a roving swarm of bees.
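The published PAFF formulation is considerably more sophisticated, but the basic idea of blending overlapping zones of influence around points near the tool tip can be caricatured as a weighted, mesh-free force sum. Everything in the sketch below (the kernel shape, parameter names, and stiffness value) is invented for illustration and should not be read as the actual PAFF equations.

```python
import numpy as np

def influence_weight(distance, radius):
    """Compactly supported kernel: a particle only influences the tool
    within its zone of influence of the given radius."""
    q = distance / radius
    return np.where(q < 1.0, (1.0 - q) ** 2, 0.0)

def tool_tip_force(tool_pos, rest_positions, displacements, stiffness, radius=0.01):
    """Blend the elastic response of nearby particles into a single reaction force.
    rest_positions: (N, 3) particle locations; displacements: (N, 3) how far each has moved."""
    distances = np.linalg.norm(rest_positions - tool_pos, axis=1)
    weights = influence_weight(distances, radius)
    if weights.sum() == 0.0:
        return np.zeros(3)                       # no particle in range of the tool
    weights = weights / weights.sum()            # normalized, overlapping zones of influence
    return -stiffness * (weights[:, None] * displacements).sum(axis=0)

# Example: three particles within 1 cm of the tool tip, one far outside its reach.
rest = np.array([[0.002, 0.0, 0.0], [0.0, 0.003, 0.0], [0.0, 0.0, 0.004], [0.02, 0.0, 0.0]])
disp = np.full_like(rest, 0.001)                 # each particle displaced 1 mm along every axis
print(tool_tip_force(np.zeros(3), rest, disp, stiffness=200.0))
```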

This method enables the program to rapidly perform hundreds of thousands of calculations for real-time touch feedback, making it superior to other approaches, according to the researchers. “Our approach is physics-based,” De said. “The technologies that are currently available for surgical simulation are mostly graphical renderings of organs, and surgeons are not very happy with them.” And the same physics-based technology can be used to model blood flow and the generation of smoke during cauterization, which is often used to burn tissue and stop hemorrhaging.

The researchers are currently using video images of actual surgical procedures to enhance the visual realism of their computer-generated scenarios, and they are performing experiments on human cadavers to evaluate the mechanical properties of human organs. These experiments are taking place at Albany Medical Center in collaboration with Tejinder Paul Singh and Leon Martino, and also at Connecticut-based U.S. Surgical, a manufacturer of wound closure products and advanced surgical devices.

The team also plans to develop a prototype technology that will be tested by surgeons and surgical residents at the Carl J. Shapiro Simulation and Skills Center at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School. Researchers at the Human Performance Institute at the University of Texas, Arlington, will assist the team in the validation process.

After developing a successful prototype, De hopes to apply the model to a much wider class of medical procedures. “The grand vision,” he said, “is to develop a palpable human — a giant database of human anatomy that provides real-time interactivity for a variety of uses, from teaching anatomy to evaluating injuries in a variety of scenarios. In the long run, a better simulator could even help in the design of new surgical tools and techniques.”

Daniel B. Jones, associate professor of surgery at Harvard Medical School, is a co-principal investigator for the new NIH grant. Other significant collaborators include Badri Roysam, professor of electrical, computer, and systems engineering at Rensselaer; George V. Kondraske, professor of electrical and biomedical engineering at the University of Texas, Arlington; and Jon J. Anton, chief technology officer of Medical Education Technologies, Inc. in Sarasota, Fla.

Yi-Je Lim, a former doctoral student at Rensselaer who is now at Energid Technologies in Cambridge, Mass., is corresponding author of the Presence paper. Two researchers affiliated with the Laboratory for Human and Machine Haptics at MIT also contributed to the paper: Mandayam A. Srinivasan, a senior research scientist in mechanical engineering at MIT; and Muniyandi Manivannan, assistant professor of biomedical engineering at the Indian Institute of Technology.
