Researchers in the UK have conducted trials using software capable of detecting intricate details of emotions that remain hidden to the human eye.
The software, which uses an ‘artificial net’ to map key features of the face, can evaluate the intensity of multiple different facial expressions simultaneously.
The teams at University of Bristol and Manchester Metropolitan University worked with Bristol’s Children of the 90s study participants to see how well computational methods could capture authentic human emotions amidst everyday family life.
This included the use of videos taken at home which were captured by headcams worn by babies during interactions with their parents.
The research, published in Frontiers, shows that scientists can use machine learning techniques to accurately predict human judgements of parent facial expressions based on the computers’ decisions.
Lead author Romana Burgess, a PhD student at the University of Bristol’s School of Electrical, Electronic and Mechanical Engineering, said:
“Humans experience complicated emotions – the algorithms tell us that someone can be 5 per cent sad or 10 per cent happy, for example.
“Using computational methods to detect facial expressions from video data can be very accurate, when the videos are of high quality and represent optimal conditions – for instance, when videos are recorded in rooms with good lighting, when participants are sat face-on with the camera, and when glasses or long hair are kept from blocking the face.
“We were intrigued by their performance in the chaotic, real-world settings of family homes.
“The software detected a face in around 25 per cent of the videos taken in real-world conditions, reflecting the difficulty of evaluating faces in these kinds of dynamic interactions.”
The researchers used data from the Children of the 90s health study – also known as Avon Longitudinal Study of Parents and Children (ALSPAC).
Parents were invited to attend a clinic at the University of Bristol when their babies were six months old.
At the clinic, as a part of the ERC MHINT Headcam Study, parents were given two wearable headcams to take home and use during interactions with their babies.
Both parents and infants wore the headcams during feeding and play interactions.
The team then used ‘automated facial coding’ software to computationally analyse parents’ facial expressions in the videos, and had human coders analyse the facial expressions in the same videos.
The researchers quantified how frequently the software was able to detect the face in the video, and evaluated how often the humans and the software agreed on facial expressions.
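Agreement between human coders and software is commonly quantified with a chance-corrected statistic such as Cohen’s kappa. The sketch below is illustrative only (the labels and data are invented, not the study’s), assuming each coder assigns one expression label per video frame:

```python
from collections import Counter

def cohens_kappa(human, software):
    """Chance-corrected agreement between two coders' label sequences."""
    assert len(human) == len(software) and human
    n = len(human)
    # Raw proportion of frames where the two coders agree.
    observed = sum(h == s for h, s in zip(human, software)) / n
    # Agreement expected by chance, from each coder's label frequencies.
    ph, ps = Counter(human), Counter(software)
    expected = sum(ph[label] * ps.get(label, 0) for label in ph) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative labels only (not study data):
human_codes    = ["happy", "happy", "sad", "neutral", "happy", "sad"]
software_codes = ["happy", "neutral", "sad", "neutral", "happy", "happy"]
print(round(cohens_kappa(human_codes, software_codes), 2))  # → 0.48
```

A kappa near 1 indicates strong agreement beyond chance; values near 0 mean the coders agree no more often than random labelling would predict.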
Finally, the scientists used machine learning to predict human judgements based on the computers’ decisions.
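One simple way to predict a human coder’s judgement from the software’s per-emotion intensity scores is a nearest-centroid classifier. This is a hypothetical sketch, not the study’s actual pipeline; the emotion names, intensity values, and function names are all invented for illustration:

```python
def nearest_centroid_fit(X, y):
    """Average the software's intensity vectors for each human-assigned label."""
    sums, counts = {}, {}
    for x, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(x))
        for i, value in enumerate(x):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def nearest_centroid_predict(centroids, x):
    """Pick the label whose centroid is closest in squared distance."""
    return min(centroids,
               key=lambda label: sum((a - b) ** 2 for a, b in zip(centroids[label], x)))

# Each row: software-estimated [happy, sad, surprise] intensities (0-1), invented.
X_train = [[0.90, 0.05, 0.10], [0.80, 0.10, 0.20],
           [0.10, 0.70, 0.10], [0.05, 0.90, 0.20]]
y_train = ["happy", "happy", "sad", "sad"]

model = nearest_centroid_fit(X_train, y_train)
print(nearest_centroid_predict(model, [0.85, 0.10, 0.15]))  # → happy
```

In practice a richer model (e.g. regularised logistic regression over many facial action units) would likely be used, but the idea is the same: learn a mapping from the software’s continuous intensity outputs to the discrete labels humans assign.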
Burgess said: “Deploying automated facial analysis in the parents’ home environment could change how we detect early signs of mood or mental health disorders, such as postnatal depression.
“For instance, we might expect parents with depression to show more sad expressions and fewer happy facial expressions.”
Professor Rebecca Pearson from Manchester Metropolitan University, co-author and PI of the ERC project explained:
“These conditions could be better understood through subtle nuances in parents’ facial expressions, providing early intervention opportunities that were once unimaginable.
“For example, most parents will try to ‘mask’ their own distress and appear ‘ok’ to those around them.
“More subtle combinations can be picked up by the software, including expressions that are a mix of sadness and joy or that change quickly.”
The team now plan to explore the use of automated facial coding in the home environment as a tool for understanding mood, mental health disorders, and parent-infant interactions.
This could help to pioneer a new era of health monitoring, bringing innovative science directly into the home.
Burgess concluded: “Our research used wearable headcams to capture genuine, unscripted emotions in everyday parent-infant interactions.
“Together with the use of cutting-edge computational techniques, this means we can uncover hidden details that were previously unattainable by the human eye, changing how we understand parents’ real emotions during interactions with their babies.”
Headcam data is now being collected from teenagers, with the aim of using the same methods to understand complex teen emotions at home.
Professor Nic Timpson, Principal Investigator for Children of the 90s, said:
“Bristol’s families have been involved for decades in important health research and here they are pioneering new ways of studying mental health using this real-life headcam footage.”
Image: Romana Burgess