Laird Harrison

October 25, 2017

SAN DIEGO — Brain imaging can show which surgical trainees have learned the motor skills necessary for their profession, researchers say.

The technique, functional near-infrared spectroscopy (fNIRS), would be more accurate than current tests of technical skills used for certification exams, said Arun Nemani, MS, a researcher at Rensselaer Polytechnic Institute in Troy, New York.

"If we use metrics in the brain, we can classify trained and untrained subjects very accurately," he told Medscape Medical News. "We don't depend on performance metrics such as time."

Dr Nemani presented his findings here at the American College of Surgeons (ACS) Clinical Congress 2017.

Technical skills are essential to the work of surgeons, and students practice these skills on simulators. But most surgical specialty board exams only require that candidates demonstrate their knowledge through oral and written tests, said coauthor Suvranu De, ScD, a Rensselaer engineering professor.

In one exception, the American Board of Surgery requires that candidates pass the Fundamentals of Laparoscopic Skills test, and for those planning to practice endoscopy, the Fundamentals of Endoscopic Skills test. These tests rate surgeons based on their speed and errors while operating physical simulators.

However, it is hard to tell how accurate these tests are, Dr Nemani said. Because there is no gold standard against which to compare them, they have been evaluated by comparing the scores of people who have been trained on simulators with those of people who have not been trained.

fNIRS uses infrared probes placed on the outside of the subject's head to measure hemoglobin in regions of the brain involved in motor skills, including the prefrontal cortex, the primary motor cortex, and the supplementary motor area. Some areas show more activity in experts, whereas others show more activity in novices, Dr Nemani said.

The scan is noninvasive, penetrates about 1.5 cm, is specific to white and gray matter, and can be used while the surgeons are demonstrating their skills, he said.

To measure its accuracy, Dr Nemani and colleagues compared the brain activity of eight subjects trained on the Fundamentals of Laparoscopic Skills (which uses a physical simulator), six trained on the Virtual Basic Laparoscopic Skill Trainer (which uses a virtual reality simulator), and five untrained subjects. The subjects were assessed while they cut a circle in cadaveric tissue.

The medical students who practiced on the physical simulator completed the task in an average of 7.9 minutes (standard deviation, 3.3 minutes). Those who had used the virtual simulator performed the task in an average of 13.05 minutes (standard deviation, 2.6 minutes), vs an average of 15.5 minutes (standard deviation, 5.6 minutes) for the group that had no practice.

Although these differences in mean times were statistically significant (P < .05), the distribution of the scores showed that many untrained individuals completed the tasks quickly and many trained individuals completed the tasks slowly, Dr Nemani said.

Specifically, the scores of the surgeons trained on the Fundamentals of Laparoscopic Skills system overlapped 14% to 20% with those of untrained subjects, and the scores of those trained on the virtual simulator overlapped 20% to 41% with those of untrained subjects. The overlap indicates the probability that trained surgeons will be misclassified as untrained, or vice versa, Dr Nemani said.
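As a rough, hypothetical illustration of what an overlap-based misclassification estimate looks like (this is not the study's actual method, which was computed from the observed score distributions), one common measure is the overlapping coefficient: fit a normal distribution to each group's completion times and numerically integrate the minimum of the two density curves. The sketch below uses the reported summary statistics for the physical-simulator group (7.9 ± 3.3 minutes) and the untrained group (15.5 ± 5.6 minutes):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and std dev sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def overlap_coefficient(mu1, s1, mu2, s2, lo=-5.0, hi=40.0, n=100_000):
    """Approximate the overlapping coefficient of two normal distributions
    by midpoint-rule integration of min(pdf1, pdf2) over [lo, hi]."""
    step = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * step
        total += min(normal_pdf(x, mu1, s1), normal_pdf(x, mu2, s2)) * step
    return total

# Reported completion times (minutes): trained 7.9 +/- 3.3 vs untrained 15.5 +/- 5.6
ovl = overlap_coefficient(7.9, 3.3, 15.5, 5.6)
print(f"overlap ~ {ovl:.1%}")
```

Because this sketch assumes normality and uses only the published means and standard deviations, it produces a larger overlap than the 14% to 20% reported from the study's empirical score distributions; it is meant only to show the general idea of overlap as a misclassification probability.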

One reason for the overlap is that speed is not an adequate marker for proficiency in surgery. "The individual test-taker must be either skilled or not," Dr De told Medscape Medical News. "The current test fails that. The reason is you're looking at one metric. We want to look at the brain and see who is proficient and who is not."

When the fNIRS measurement of brain activity was used to score the students, there was only a 2.2% to 2.7% overlap for the Fundamentals of Laparoscopic Skills group and an 8.9% to 9.1% overlap for the virtual-simulator group, indicating that fNIRS is more accurate in distinguishing the trained from the untrained, he said.

"The ideal would be to have zero overlap, so when a new user comes in we could say whether that person is trained or untrained," Dr Nemani said.

The fNIRS test might be used to assess practicing surgeons as well as medical students, he added. It "can potentially be used to assess surgical motor skill decay," he said in response to a question from the audience. "For example, when surgeons return from active military duty and haven't practiced in some time, fNIRS can potentially quantify these drops in motor skills level."

As well as motor skills, in the future, fNIRS might look at activity in other regions of the brain that could measure different key attributes of surgeons, such as judgment and the ability to perform under stress, Dr Nemani said.

Session moderator Sasha Adams, MD, an assistant professor of surgery at the University of Texas in Houston, called the new technology "fascinating." "We've done a lot with trying to assess different kinds of coaching," she told Medscape Medical News. "It's very difficult to measure."

She said she would like to see more data before passing judgment on whether the system could be used for board certifications.

The authors and Dr Adams have disclosed no relevant financial relationships.

American College of Surgeons (ACS) Clinical Congress 2017: Presented October 24, 2017.

