Researchers at the Massachusetts Institute of Technology (MIT) have recently developed a machine learning algorithm that can predict how much pain a human being is in by looking at an image. The new system is called “DeepFaceLIFT.”
The algorithm was trained on videos of people showing signs of pain and discomfort (don't worry, it was not for sadistic purposes). From these, it learned what human beings look like when in pain: their facial micro-expressions and other subtleties. It then combined those visual cues with self-reported pain scores to estimate how much pain a person is experiencing.
DeepFaceLIFT can also be tuned to a person's sex, age, and skin complexion. And as it turns out, it is considerably more accurate than previous research systems attempting the same task.
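To make the idea concrete, here is a toy sketch of the pipeline the article describes: per-frame facial-cue scores are pooled into a video-level pain estimate, which is then calibrated with personal attributes. Every name, weight, and threshold below is invented for illustration; this is not the actual MIT implementation.

```python
# Toy sketch (not the real DeepFaceLIFT): pool per-frame facial pain-cue
# scores into a 0-10 VAS-style estimate, then apply a crude calibration
# based on a personal profile. All numbers are illustrative assumptions.

from dataclasses import dataclass
from statistics import mean


@dataclass
class Profile:
    age: int
    sex: str          # e.g. "f" or "m"
    complexion: str   # e.g. "light", "dark"


def frame_scores_to_vas(frame_scores: list[float], profile: Profile) -> float:
    """Turn per-frame pain-cue scores (each 0..1) into a 0-10 pain estimate.

    The pooling step averages the frame scores; the calibration step
    applies a hypothetical per-profile offset (here: older faces are
    assumed to show pain cues less strongly, so nudge the estimate up).
    """
    base = 10 * mean(frame_scores)              # video-level pooling
    offset = 0.5 if profile.age >= 65 else 0.0  # invented calibration rule
    return min(10.0, base + offset)


# Example: three frames with mild pain cues, elderly patient.
estimate = frame_scores_to_vas(
    [0.2, 0.4, 0.3],
    Profile(age=70, sex="f", complexion="light"),
)
```

The real system reportedly learns both stages from data rather than using hand-set rules, but the two-step shape (frame-level cue detection, then personalized score regression) is the part this sketch is meant to convey.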
The technology has numerous potential applications, including helping clinicians make pain reporting more objective. This matters because self-reported pain is not always reliable – some patients exaggerate or understate how much pain they are in.
Although the algorithm is still under development, the researchers hope to turn it into a mobile app that physicians can use easily.
Source:
Digital Trends (https://www.digitaltrends.com/cool-tech/mit-pain-predicting-algorithm/)