Automated Pain Detection in Facial Videos of Children using Human-Assisted Transfer Learning

Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity provides sensitive and specific information about pain, and computer vision algorithms have been developed to automatically detect the Facial Action Units (AUs) defined by the Facial Action Coding System (FACS). Our prior work used automatically detected facial AUs from computer vision to develop classifiers that distinguish pain from no-pain conditions. However, applying pain/no-pain classifiers based on automated AU codings across different environmental domains resulted in diminished performance. In contrast, classifiers based on manually coded AUs showed less environment-dependent variability in performance. To improve classification performance in the current work, we applied transfer learning: we trained another machine learning model to map automated AU codings to a subspace of manual AU codings, enabling more robust pain recognition when only automatically coded AUs are available for the test data. With this transfer learning method, we improved the Area under the ROC Curve (AUC) on independent data (new participants) from our target data domain from 0.69 to 0.72.
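The two-stage idea in the abstract can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: all array shapes, the Ridge-regression mapper, and the logistic-regression pain classifier are assumptions chosen for clarity. Stage one learns a mapping from automated AU codings into a manual-AU subspace; stage two trains a pain/no-pain classifier on the mapped features, so only automated codings are needed at test time.

```python
# Hypothetical sketch: map automated AU codings to a manual-AU subspace,
# then classify pain vs. no-pain in that subspace. Data here is synthetic.
import numpy as np
from sklearn.linear_model import Ridge, LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_train, n_test, n_auto, n_manual = 200, 100, 20, 8  # assumed sizes

# Synthetic "manual" AU codings and a noisy "automated" version of them.
W_true = rng.normal(size=(n_manual, n_auto))
manual_train = rng.random((n_train, n_manual))
auto_train = manual_train @ W_true + 0.1 * rng.normal(size=(n_train, n_auto))
y_train = (manual_train[:, 0] + manual_train[:, 1] > 1.0).astype(int)

manual_test = rng.random((n_test, n_manual))
auto_test = manual_test @ W_true + 0.1 * rng.normal(size=(n_test, n_auto))
y_test = (manual_test[:, 0] + manual_test[:, 1] > 1.0).astype(int)

# Stage 1: regression maps automated codings into the manual-AU subspace.
mapper = Ridge(alpha=1.0).fit(auto_train, manual_train)

# Stage 2: pain/no-pain classifier trained on the mapped features.
clf = LogisticRegression(max_iter=1000).fit(mapper.predict(auto_train), y_train)

# At test time only automated AU codings are available.
scores = clf.predict_proba(mapper.predict(auto_test))[:, 1]
auc = roc_auc_score(y_test, scores)
print(f"AUC on held-out synthetic data: {auc:.2f}")
```

On real data the manual codings would come from trained FACS coders on the training set only, which is why the paper describes the approach as human-assisted transfer learning.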