Research  Crossing the uncanny valley: new technology for lifelike facial expressions in androids

C C
https://resou.osaka-u.ac.jp/en

PRESS RELEASE: Even if an android's appearance is so realistic that it could be mistaken for a human in a photograph, watching it move in person can feel unsettling. It can smile, frown, or display a range of other familiar expressions, but it is hard to detect a consistent emotional state behind those expressions, leaving you unsure of what it is truly feeling and creating a sense of unease.

Until now, robots that can move many parts of their face, such as androids, have displayed facial expressions over extended periods using a 'patchwork method'. This method involves preparing multiple pre-arranged action scenarios so that unnatural facial movements are excluded, then switching between these scenarios as needed.

However, this poses practical challenges, such as preparing complex action scenarios beforehand, minimizing noticeable unnatural movements during transitions, and fine-tuning movements to subtly control the expressions conveyed.

In this study, lead author Hisashi Ishihara and his research group developed a dynamic facial expression synthesis technology using “waveform movements,” which represent the various gestures that make up facial motion, such as “breathing,” “blinking,” and “yawning,” as individual waves. These waves are propagated to the related facial areas and overlaid to generate complex facial movements in real time. This method eliminates the need to prepare complex and diverse action data while also avoiding noticeable movement transitions.
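To make the idea concrete, here is a minimal sketch (in Python) of how such waveform overlaying could work: each gesture is a simple periodic wave that is propagated to related actuators with a gain and a delay, and all waves are summed in real time. The gesture names, actuator layout, and parameters below are illustrative assumptions, not details of the Osaka University system.

import math

# Illustrative sketch only: gestures, actuators, and numbers are assumptions.
# Each gesture is a periodic wave: amplitude, frequency (Hz), phase offset.
GESTURES = {
    "breathing": {"amplitude": 0.3, "freq_hz": 0.25, "phase": 0.0},
    "blinking":  {"amplitude": 1.0, "freq_hz": 0.1,  "phase": 1.5},
}

# How strongly (gain) and how late (delay, seconds) each wave reaches each
# facial actuator: propagation to "related facial areas".
PROPAGATION = {
    "breathing": {"chest": (1.0, 0.0), "shoulders": (0.6, 0.1), "brow": (0.2, 0.2)},
    "blinking":  {"eyelid_left": (1.0, 0.0), "eyelid_right": (1.0, 0.02)},
}

def actuator_commands(t):
    """Overlay all gesture waves at time t (seconds) into per-actuator commands."""
    commands = {}
    for name, wave in GESTURES.items():
        for actuator, (gain, delay) in PROPAGATION[name].items():
            value = wave["amplitude"] * math.sin(
                2 * math.pi * wave["freq_hz"] * (t - delay) + wave["phase"]
            )
            # Superposition: complex motion emerges from summed simple waves.
            commands[actuator] = commands.get(actuator, 0.0) + gain * value
    return commands

# Example: sample the synthesized motion at 20 Hz for one second.
if __name__ == "__main__":
    for step in range(20):
        print(actuator_commands(step / 20.0))

Because every command is just a running sum of simple waves, there is no switching between pre-scripted scenarios and therefore no visible seam between one expression and the next.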

Furthermore, by introducing “waveform modulation,” which adjusts the individual waveforms based on the robot's internal state, changes in internal conditions, such as mood, can be instantly reflected as variations in facial movements.
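Continuing the same illustrative sketch, waveform modulation could be approximated by scaling each gesture's wave parameters according to an internal mood state. The mood variables and scaling rules below are assumptions made for illustration, not the published method.

# Illustrative sketch only: mood parameters and scaling rules are assumptions.
def modulate(wave, mood):
    """Return a copy of a gesture wave scaled by the robot's internal state.

    Here a higher 'arousal' speeds gestures up and a lower 'valence' damps
    their amplitude, so a change of mood shows up immediately in the ongoing
    facial motion without switching action scenarios.
    """
    return {
        "amplitude": wave["amplitude"] * (0.5 + 0.5 * mood["valence"]),
        "freq_hz":   wave["freq_hz"] * (0.5 + mood["arousal"]),
        "phase":     wave["phase"],
    }

# Example: a sleepy, low-energy mood slows breathing and shrinks its amplitude.
sleepy = {"valence": 0.4, "arousal": 0.2}
breathing = {"amplitude": 0.3, "freq_hz": 0.25, "phase": 0.0}
print(modulate(breathing, sleepy))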

“Advancing this research in dynamic facial expression synthesis will enable robots capable of complex facial movements to exhibit more lively expressions and convey mood changes that respond to their surrounding circumstances, including interactions with humans,” says senior author Koichi Osuka. “This could greatly enrich emotional communication between humans and robots."

Ishihara adds, “Rather than creating superficial movements, further development of a system in which internal emotions are reflected in every detail of an android's actions could lead to the creation of androids perceived as having a heart."

By enabling emotions to be adjusted and expressed adaptively, this technology is expected to significantly enhance the value of communication robots, allowing them to exchange information with humans in a more natural, humanlike manner.

Click here for video: https://www.eurekalert.org/multimedia/1054266