Artificial Intelligence in Plastic Surgery

Current Applications, Future Directions, and Ethical Implications

Tyler Jarvis, BS; Danielle Thornburg, MD; Alanna M. Rebecca, MD, MBA; Chad M. Teven, MD


Plast Reconstr Surg Glob Open. 2020;8(10):e3200 


Discussion

Big Data

Recent advances in technology combined with evidence-based medicine have brought ever-increasing amounts of data, which have become the focus of a great deal of AI research. Big data research is centered on large databases produced by investigators and clinicians as well as large aggregates of smaller data sets within electronic health records.[22] Such databases include the National Surgical Quality Improvement Program,[32] the Tracking Operations and Outcomes for Plastic Surgeons Program, and CosmetAssure, which collect outcomes data for adult surgical cases, plastic surgery procedures, and cosmetic procedures, respectively.

Big data analytics using AI models such as ANNs can assist investigators in efficiently analyzing the massive amount of information contained within these databases to answer clinically significant questions.[23] Esbroeck et al demonstrated the use of ML in determining relationships between current procedural terminology codes and postoperative complications to produce a procedural risk score.[33] The authors trained a support vector machine on National Surgical Quality Improvement Program data to estimate procedural complexity and risk, yielding scores comparable to other established measures of procedural complexity.[33]
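To make the general technique concrete, the following is a minimal sketch, not the authors' published pipeline: a linear support vector machine is trained on one-hot encoded procedure codes to predict complications, and its decision function is read as a procedure-level risk score. All codes and labels below are synthetic.

```python
# Illustrative sketch only: a linear SVM trained on one-hot encoded procedure
# codes to predict postoperative complications; its decision function serves
# as a procedure-level risk score. Data here are synthetic.
import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical CPT codes and complication labels (1 = any complication).
cpt_codes = rng.choice(["19357", "15734", "14301", "21175"], size=500).reshape(-1, 1)
complication = rng.integers(0, 2, size=500)

encoder = OneHotEncoder(handle_unknown="ignore")
X = encoder.fit_transform(cpt_codes)

svm = LinearSVC(C=1.0)
svm.fit(X, complication)

# Signed distance from the separating hyperplane as a risk/complexity score.
new_cases = encoder.transform([["19357"], ["21175"]])
risk_scores = svm.decision_function(new_cases)
print(risk_scores)
```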

Machine Learning

Applications of ML in plastic surgery include predictive models that use the pattern-recognition abilities of ANNs to help surgeons make preoperative and postoperative decisions. In the early 2000s, Yeong et al developed a model that used data obtained from a portable reflective spectrophotometer to determine burn depth and healing time, with an average accuracy of 86%.[26] The ANN was able to differentiate between burns that would heal before or after 14 days, with accuracies of 96% and 75%, respectively.[26] More recently, an application was developed to monitor postoperative free flap viability based on skin color assessed via photographs taken with a Samsung Galaxy S2 smartphone.[27] Photographs of subjects' hands under different degrees of venous and arterial occlusion were used to train the application, which was able to accurately assess the vascular status of new subjects with a sensitivity and specificity of 94% and 98%, respectively.[27]
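As a rough illustration of this kind of model (not the published burn-depth or flap-monitoring applications), the sketch below trains a small feed-forward network on synthetic color/reflectance features to separate wounds predicted to heal within 14 days from those that are not.

```python
# Minimal sketch, assuming synthetic data: a small feed-forward network
# classifying wounds by predicted healing time from color/reflectance features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

# Hypothetical features: mean reflectance (or RGB intensity) in four bands.
X = rng.normal(size=(400, 4))
# Hypothetical label: 1 = heals within 14 days, 0 = does not.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

ann = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=1)
ann.fit(X_train, y_train)
print("held-out accuracy:", ann.score(X_test, y_test))
```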

Another ML model was developed to predict the outcomes of various nerve grafts with an accuracy exceeding 90%, demonstrating the potential utility of ML models in peripheral nerve surgery.[28] The researchers trained an ANN using over 30 variables identified from experimental records of nerve graft studies in rats.[28] Categories of variables included the biomaterials used for the graft wall and filling, extracellular matrix proteins, growth factors, scaffold type, and surface.[28]
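A hedged sketch of how such a model might be assembled follows; the experimental descriptors, outcome labels, and preprocessing choices are hypothetical stand-ins for the variables described above, not the authors' data or architecture.

```python
# Sketch under stated assumptions: one-hot encode categorical experimental
# descriptors, pass numeric variables through, and fit a small ANN to a
# hypothetical graft-success label.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical records distilled from rat nerve-graft experiments.
records = pd.DataFrame({
    "wall_material": ["collagen", "silicone", "collagen", "PLGA"],
    "filler": ["none", "collagen gel", "fibrin", "none"],
    "growth_factor": ["NGF", "none", "GDNF", "NGF"],
    "gap_mm": [10, 15, 10, 20],
    "success": [1, 0, 1, 0],  # hypothetical outcome label
})

features = records.drop(columns="success")
prep = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"),
      ["wall_material", "filler", "growth_factor"])],
    remainder="passthrough",  # keep the numeric gap length as-is
)
model = Pipeline([
    ("prep", prep),
    ("ann", MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
])
model.fit(features, records["success"])
```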

In the near future, ML systems are likely to facilitate early diagnosis of a wide range of conditions by streamlining the extraction and analysis of clinical data. An ML algorithm was trained to diagnose distinct types of craniosynostosis from CT images at a level comparable to that of a trained radiologist, with a sensitivity of 92.7% and a specificity of 98.9%.[29] Recognizing the subjectivity of the mental visualizations surgeons typically rely on to correct such malformations, the authors hope their model will assist surgical planning by characterizing dysmorphology more precisely.[29]
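For readers unfamiliar with the metrics quoted throughout this section, the snippet below shows how sensitivity and specificity are derived from a confusion matrix of predicted versus true diagnoses; the labels are synthetic (1 = craniosynostosis present, 0 = absent).

```python
# Computing sensitivity and specificity from a binary confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0])  # synthetic ground truth
y_pred = np.array([1, 1, 0, 0, 0, 0, 0, 1, 0, 1])  # synthetic model output

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)  # proportion of affected scans correctly flagged
specificity = tn / (tn + fp)  # proportion of normal scans correctly cleared
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```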

Another supervised ML model was demonstrated to aid surgical planning via automated diagnosis and simulation.[8] Investigators trained a 3D morphable model, a statistical model of face shape and quality, using databases containing 10,000 three-dimensional face scans of both healthy volunteers and orthognathic surgical patients.[8] Three separate models were produced: a global model containing the faces of healthy volunteers and preoperative patients, a preoperative patient model, and a postoperative patient model.[8] The models were able to differentiate orthognathic patients from healthy volunteers, as well as predict patient-specific postoperative face shape using regression analysis.[8] The authors propose that their model can help surgeons in the objective assessment of preoperative and postoperative aesthetics, improving patient education and surgical planning.[8]
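A loose sketch of the underlying idea, assuming synthetic data rather than the authors' face scans: a PCA-based shape space is fit to flattened 3D vertex coordinates, and a regression maps preoperative shape coefficients to postoperative ones.

```python
# Rough sketch, not the authors' pipeline: a PCA "morphable model" over
# flattened 3D vertex coordinates, with a linear regression from preoperative
# to postoperative shape coefficients. All shapes below are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_patients, n_vertices = 200, 500

# Hypothetical paired scans: each face is a flattened (x, y, z) vertex array.
pre_shapes = rng.normal(size=(n_patients, n_vertices * 3))
post_shapes = pre_shapes + rng.normal(scale=0.1, size=pre_shapes.shape)

# The morphable model: a low-dimensional shape space learned from scans.
shape_model = PCA(n_components=20).fit(np.vstack([pre_shapes, post_shapes]))
pre_coeffs = shape_model.transform(pre_shapes)
post_coeffs = shape_model.transform(post_shapes)

# Regression from preoperative to postoperative shape coefficients.
reg = LinearRegression().fit(pre_coeffs, post_coeffs)

# Predict a new patient's postoperative face shape and map back to vertices.
new_pre = shape_model.transform(pre_shapes[:1])
predicted_post = shape_model.inverse_transform(reg.predict(new_pre))
```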

Deep Learning

Plastic surgery research is well suited for DL applications due to the abundance of unstructured, visual data collected via widely available technologies.[9] Aesthetic surgeons routinely collect before-and-after images for procedures, which are subsequently made publicly available, creating a large source of data. Using a publicly available repository containing over 18,000 before-and-after rhinoplasty photographs, a neural network correctly classified rhinoplasty status in 85% of the tested images, performing at a level of sensitivity and specificity equivalent to that of ENT and plastic surgery residents and attendings.[9]
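The following is an illustrative sketch, not the published network: a standard convolutional backbone is fine-tuned to label photographs as pre- or postrhinoplasty. Real use would rely on pretrained ImageNet weights and labeled photographs; random tensors stand in for images here so the snippet runs standalone.

```python
# Illustrative fine-tuning sketch (torchvision >= 0.13); data are random tensors.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=None)          # weights="IMAGENET1K_V1" in practice
model.fc = nn.Linear(model.fc.in_features, 2)  # 2 classes: pre / post rhinoplasty

images = torch.randn(8, 3, 224, 224)           # stand-in for photo batches
labels = torch.randint(0, 2, (8,))             # stand-in for rhinoplasty status

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for _ in range(3):                             # a few toy training steps
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```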

Another DL application, developed by Phillips et al, was used to identify melanoma in images of biopsied lesions taken with smartphones fitted with dermoscopic lens attachments.[10] The Deep Ensemble for Recognition of Melanoma ANN is capable of learning features of malignant melanoma directly from the data provided, rather than from features preset by the investigators.[11] When the application was tested using published dermoscopic images, it assessed the likelihood of melanoma at a level of accuracy similar to that of clinicians.[10] The study highlights the potential for a DL application to help clinicians in the secondary prevention of skin cancer and to serve as a diagnostic and decision-making tool.[10]
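A minimal sketch of the ensembling idea behind such "deep ensemble" classifiers, not the authors' trained system: per-image melanoma probabilities from several independently initialized networks are averaged. The member networks here are untrained stand-ins and the image batch is random.

```python
# Ensemble averaging of melanoma probabilities from several member networks.
import torch
import torch.nn as nn
from torchvision import models   # torchvision >= 0.13

def make_member():
    net = models.resnet18(weights=None)
    net.fc = nn.Linear(net.fc.in_features, 2)  # benign vs. melanoma
    return net.eval()

ensemble = [make_member() for _ in range(3)]
dermoscopic_batch = torch.randn(4, 3, 224, 224)  # stand-in for lesion photos

with torch.no_grad():
    probs = torch.stack(
        [torch.softmax(m(dermoscopic_batch), dim=1) for m in ensemble]
    ).mean(dim=0)

melanoma_likelihood = probs[:, 1]  # ensemble-averaged melanoma probability
print(melanoma_likelihood)
```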

Natural Language Processing

Over the last 20 years, we have seen a boom in the regular use of NLP algorithms in applications such as spelling and grammar checking in word processors, autocorrect, and predictive text messaging.[12] Just 10 years ago, Mayo Clinic researchers developed the Text Analysis and Knowledge Extraction System, an NLP framework that can extract data from free text within electronic health records.[13] Recent applications of NLP in plastic surgery stem from growing interest in analyzing public opinion of the field and from the abundance of data within the Twitter microblogging platform.[14]

Aesthetic plastic surgeons have increasingly used social media platforms to market their practices and provide patient education. To gauge public perception of plastic surgery, Mustafa and colleagues used an NLP technique called hedonometrics to analyze tweets regarding plastic surgery posted between 2012 and 2016.[15] Hedonometrics uses an algorithm to quantify the happiness expressed in text.[16] The underlying data set contains over 10,000 words whose happiness was rated by workers on Amazon Mechanical Turk.[15,16] The authors developed word-shift graphs to summarize the major shifts in overall word happiness and the words that caused them.[15] After analyzing 1,037,146 relevant tweets, they found that the term "plastic" was the most popular and had the lowest positivity score.[16] The terms "aesthetic," "cosmetic," and "reconstructive" were less popular, though more positively regarded.[16] A similar study by Cognovi Labs and Duke University used supervised ML technology to examine commonly hashtagged words associated with plastic surgery.[24] The authors suggested that their results might inform decisions to use the title of aesthetic or cosmetic surgeon and emphasized the potential for such applications to influence marketing strategies and the public perception of plastic surgery.[16,24]
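A toy illustration of the hedonometric scoring step: each word carries a crowd-rated happiness score (the real lexicon contains roughly 10,000 rated words), and a tweet's score is the mean over its scored words. The miniature lexicon below is hypothetical.

```python
# Toy hedonometrics: average crowd-rated word happiness over a tweet.
# The scores in this mini-lexicon are hypothetical illustrations.
happiness = {"beautiful": 7.9, "happy": 8.3, "surgery": 4.2,
             "plastic": 4.9, "pain": 2.1, "results": 6.0}

def tweet_happiness(text: str) -> float:
    words = [w.strip(".,!?").lower() for w in text.split()]
    scores = [happiness[w] for w in words if w in happiness]
    return sum(scores) / len(scores) if scores else float("nan")

print(tweet_happiness("So happy with my plastic surgery results!"))
```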

Recognizing the potential utility of a smartphone in addressing frequently asked questions in plastic surgery clinics, Boczar et al developed a smartphone application trained to provide answers to questions within 10 chosen topics of frequent concern to preoperative patients.[25] Participants asked the application questions from each of the chosen topics in their own words.[25] After a training period, the application provided an appropriate answer 92.3% of the time, with participants determining the answer to be accurate 83.3% of the time.[25] The authors anticipate the integration of the technology into clinical practice to improve patient support and free up time for surgeons.[25]
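One simple way such an application could match free-text questions to prepared answers, offered only as a hedged sketch rather than Boczar et al's implementation, is TF-IDF similarity between the patient's question and a bank of example questions; the questions and answers below are hypothetical.

```python
# Matching a free-text patient question to a fixed FAQ topic by TF-IDF similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

faq = {  # hypothetical example questions mapped to hypothetical answers
    "When can I shower after surgery?": "Most patients may shower after 48 hours.",
    "How should I manage pain at home?": "Take the prescribed analgesics as directed.",
    "When are my stitches removed?": "Sutures are typically removed at 7 to 10 days.",
}

vectorizer = TfidfVectorizer()
faq_matrix = vectorizer.fit_transform(faq.keys())

def answer(question: str) -> str:
    sims = cosine_similarity(vectorizer.transform([question]), faq_matrix)
    best = sims.argmax()          # index of the most similar example question
    return list(faq.values())[best]

print(answer("Is it okay to take a shower two days after my operation?"))
```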

Facial Recognition Technology

The commercial use of facial recognition technology is becoming commonplace, as an increasing number of people use smartphones that employ it. Facial recognition combines pattern-recognition models, image analysis, and deep neural networks to take unique biometric measurements that are used to interpret facial characteristics.[17] One model was able to classify facial beauty in patients relative to postoperative target features, which may be beneficial in estimating patient satisfaction and setting appropriate expectations before surgery.[19] Further applications show promise in diagnosing developmental disorders that express characteristic facies and in assessing the success of facial feminization surgery.[17,18]

Acknowledging the desire of male-to-female transgender patients to be seen and treated as their identified gender, Chen et al trained several neural networks to identify gender based on facial features.[18] Images of patients who underwent male-to-female facial feminization surgery were taken along with cisgender male and cisgender female control images.[18] Four convolutional neural networks from Amazon, IBM, Microsoft, and Face++ were trained using these images to classify gender.[18] The control male images were correctly classified 100% of the time, while control females were correctly identified 98% of the time.[18] Preoperative images of the transgender patients were correctly classified as female 53% of the time, while postoperative images were correctly classified 98% of the time.[18] The authors point out that the software successfully provided an objective measure of the efficacy of the procedures, filling a recognized gap in the outcome measures of gender confirmation surgery.[18]
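As an example of how one of these commercial services exposes a gender attribute, the snippet below queries Amazon Rekognition's detect_faces operation via boto3; it assumes valid AWS credentials, and the image file name is hypothetical. It illustrates the querying step only, not the study's full workflow.

```python
# Hedged example: querying Amazon Rekognition's built-in face attributes.
# Requires configured AWS credentials; "postop_photo.jpg" is a hypothetical file.
import boto3

client = boto3.client("rekognition")

with open("postop_photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request the full attribute set, including Gender
    )

for face in response["FaceDetails"]:
    gender = face["Gender"]
    print(gender["Value"], round(gender["Confidence"], 1))
```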

Future Directions

In early 2019, the American Artificial Intelligence Initiative was enacted, establishing a national strategy to make the United States a leader in AI. It was followed by a proposed $142.2 billion federal investment in research and development for the 2021 fiscal year.[21] Later that year, the FDA proposed a regulatory framework for the use of AI-based software as a medical device, citing the potential for such technology to transform healthcare by taking advantage of the big data generated through the everyday delivery of modern healthcare.[34] With increasing national awareness of the potential AI offers, it is becoming ever more important to demonstrate its applications in healthcare.[23]

Plastic surgery is one of many specialties with the capacity to use AI to its full potential. The plastic surgeon's usual cognitive tasks of diagnosis, case planning, and perioperative assessment could be streamlined by thinking machines, allowing for increased productivity and patient satisfaction.[23,25] The deductive reasoning that drives surgical decision-making depends on the surgeon's ability to produce an adequate list of differential diagnoses, to consider the best tests to establish the diagnosis, and to develop a plan to address it using educated judgment and heuristic techniques.[35] AI-powered decision-making tools show tremendous potential to improve surgical outcomes by augmenting these processes via automated data acquisition, predictive analytics, and appropriate integration with human surgical intuition.[34]

Ethical Considerations

Kohli and Geis named 3 domains of ethical issues regarding the use of AI: algorithms, data, and practices (Table 3).[37] They raise the issue of informed consent and the need for data use agreements on behalf of data providers and third-party data aggregators. Quality assurance of the data used in AI algorithms, particularly when the intent is to augment patient treatment decisions, is also of concern. The data sets used to train ML systems must be representative of the populations for which the systems are intended. Providers must make efforts to optimize the data and algorithms to best benefit their patients.[30]

The use of AI should not serve as a replacement for the shared decision-making process that is essential to ideal patient care. Providers taking advantage of this technology must be careful not to allow a biased view to disturb shared decision-making, considering the limitations imposed on such technology by its data sets.[31] An example of such bias may include the use of facial recognition systems in aesthetic practices. The use of data sets developed in Western or wealthy Eastern countries may lead to biased suggestions that can marginalize values and perceptions of beauty in other cultures.[31]

An intuitive concern regarding the implementation of AI is that the patient–physician relationship will become compromised. Some speculate that AI may completely replace certain specialties, though many physicians anticipate the opposite effect. In breast-imaging radiology practices, physicians are expected to accurately diagnose lesions, relate imaging features to prognostic data, and relay available treatments to patients, all while considering patients' personal preferences.[38] These conversations between doctors and patients are both sensitive and complex, yet there is very little time for them.[38] With AI automating repetitive tasks such as screenings, examination reporting, and breast density assessments, more time becomes available for rapport building and the facilitation of these complex interactions.[38]

Krittanawong points out that AI is unable to engage in such high-level conversation with patients, nor can it build the trust and sense of empathy required to establish the therapeutic alliance that is integral to the patient–physician relationship and to positive outcomes.[39] Among the primary goals of implementing AI in healthcare is the reduction of physician burnout through the automation of routine clinical tasks, thereby enabling physicians to spend more time on the more sophisticated and humane aspects of their practices. Many experts would agree that the purpose of AI in healthcare is to optimize a physician's practice, rather than to replace it.[40]
