American scientists develop a robot that can mimic human facial expressions

by worldysnews

Although robots have made strides in verbal communication thanks to advanced software such as ChatGPT, their ability to express emotions on their faces still lags far behind. Researchers at Columbia University believe their Emo robot is a significant advance in nonverbal communication between humans and robots.


The new Emo robot, described in a study published in the journal Science Robotics, can predict human facial expressions and mimic them simultaneously, even anticipating an upcoming smile about 0.84 seconds before it happens.

In addition, Emo can express six basic emotions: anger, disgust, fear, joy, sadness and surprise, as well as a range of more nuanced reactions. That is thanks to artificial muscles made from cables and motors: Emo expresses an emotion by pulling these artificial muscles at specific points on its face.
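The article gives no implementation details beyond the cable-and-motor "muscles", but the idea can be illustrated with a minimal sketch: a table mapping each named emotion to target positions for a few face actuators. All actuator names and values below are hypothetical placeholders, not Emo's real configuration.

```python
# Minimal sketch (not Columbia's code) of the idea described above: each named
# emotion maps to target positions for the cable-and-motor "muscles" at specific
# points on the face. Actuator names and values are illustrative placeholders.

EXPRESSION_TARGETS = {
    # fraction of full travel for a handful of hypothetical face actuators
    "joy":      {"brow_left": 0.2, "brow_right": 0.2, "mouth_corner_left": 0.9, "mouth_corner_right": 0.9},
    "sadness":  {"brow_left": 0.7, "brow_right": 0.7, "mouth_corner_left": 0.1, "mouth_corner_right": 0.1},
    "surprise": {"brow_left": 1.0, "brow_right": 1.0, "jaw": 0.8},
}

def command_expression(emotion: str, send_motor_command) -> None:
    """Send each actuator its target position for the requested emotion."""
    targets = EXPRESSION_TARGETS.get(emotion)
    if targets is None:
        raise ValueError(f"unknown emotion: {emotion}")
    for actuator, position in targets.items():
        send_motor_command(actuator, position)

if __name__ == "__main__":
    # Stand-in for a real motor driver: just print what would be sent.
    command_expression("joy", lambda name, pos: print(f"{name} -> {pos:.2f}"))
```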

The study’s lead author said the scientists faced two main challenges: building a mechanically expressive face and deciding when to produce natural, well-timed expressions.

“The main goal of the Emo project is to develop a robot face that can enhance human-robot interaction,” said Yuhang Hu, a PhD student and lead author of the study. “The project focuses on improving nonverbal conversational skills, which are crucial to making those interactions feel natural and engaging. As robots become more advanced and complex, the demand for visual interaction is also increasing.”

The robot is shaped like a human head, uses 26 actuators to produce a variety of facial expressions, and is covered with silicone skin. It is also equipped with high-resolution cameras in its eyes for lifelike interaction and eye contact, which are essential for nonverbal communication.

The research team also used artificial intelligence (AI) software to predict human facial expressions and generate corresponding robotic expressions: “Emo addresses these challenges with 26 motors, soft silicone skin and camera-equipped eyes, so it can carry out nonverbal communication such as eye contact and facial expressions. Emo runs several AI models, including detecting human faces, controlling the facial actuators to mimic expressions, and even predicting human facial expressions. This allows Emo to interact in a way that feels timely and authentic.”
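A rough sketch of the perception-to-mimicry loop described in that quote might look like the following. The class names and interfaces are assumptions made for illustration only; Emo's actual face-detection, expression-prediction and actuator-control models are not published here.

```python
# Illustrative sketch of the perception-to-mimicry loop described above:
# detect a face, predict the expression the person is about to make, then
# drive the robot's actuators to match. The classes are stand-in stubs,
# not Emo's published AI models.
from typing import Optional
import numpy as np

class FaceDetector:
    def detect(self, frame: np.ndarray) -> Optional[np.ndarray]:
        """Return a cropped face region, or None if no face is found (stub)."""
        return frame  # placeholder: pretend the whole frame is the face

class ExpressionPredictor:
    def predict_next(self, face: np.ndarray) -> np.ndarray:
        """Predict facial-landmark positions a fraction of a second ahead (stub)."""
        return np.random.default_rng().random(40)

class InverseFaceModel:
    N_ACTUATORS = 26
    def landmarks_to_motors(self, landmarks: np.ndarray) -> np.ndarray:
        """Map predicted landmarks to commands for the 26 actuators (stub)."""
        projection = np.random.default_rng(0).random((self.N_ACTUATORS, landmarks.size))
        return np.clip(projection @ landmarks / landmarks.size, 0.0, 1.0)

def mimicry_step(frame, detector, predictor, inverse_model):
    """One camera frame in, one set of motor commands out (or None if no face)."""
    face = detector.detect(frame)
    if face is None:
        return None
    predicted_landmarks = predictor.predict_next(face)  # anticipate the expression
    return inverse_model.landmarks_to_motors(predicted_landmarks)

if __name__ == "__main__":
    frame = np.zeros((480, 640, 3))
    commands = mimicry_step(frame, FaceDetector(), ExpressionPredictor(), InverseFaceModel())
    print("motor commands:", commands.round(2))
```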

The robot is trained through a process called “self-modelling”, in which Emo makes random movements in front of a camera and learns the correlation between its facial expressions and motor commands. After observing videos of human expressions, Emo can anticipate people’s expressions by picking up on the small changes that precede, for example, a smile.
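As a toy illustration of this “self-modelling” idea, the sketch below has a simulated robot issue random motor commands, record the facial landmarks a simulated camera would observe, and fit a simple least-squares inverse model from landmarks back to motor commands. Everything here (the linear face model, the numpy-based fitting, the landmark count) is an assumption for illustration, not the published training procedure.

```python
# Toy "self-modelling" loop: babble random motor commands, observe the
# resulting facial landmarks, then fit an inverse model so that an observed
# expression can be reproduced. The camera and face are simulated.
import numpy as np

rng = np.random.default_rng(0)
N_MOTORS, N_LANDMARKS, N_SAMPLES = 26, 40, 500

# Simulated "face": an unknown linear map from motor commands to landmarks.
true_face = rng.normal(size=(N_LANDMARKS, N_MOTORS))
def observe_landmarks(motor_cmd):
    return true_face @ motor_cmd + rng.normal(scale=0.01, size=N_LANDMARKS)

# 1) Motor babbling: random commands plus the landmarks the camera sees.
commands = rng.uniform(0.0, 1.0, size=(N_SAMPLES, N_MOTORS))
landmarks = np.array([observe_landmarks(c) for c in commands])

# 2) Fit an inverse model (landmarks -> commands) by least squares.
inverse_model, *_ = np.linalg.lstsq(landmarks, commands, rcond=None)

# 3) Use it: given the landmarks of a target expression, recover motor commands.
target_cmd = rng.uniform(0.0, 1.0, size=N_MOTORS)
target_landmarks = observe_landmarks(target_cmd)
recovered_cmd = target_landmarks @ inverse_model
print("command recovery error:", np.abs(recovered_cmd - target_cmd).max().round(3))
```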

The research team says this work marks a shift in human-robot interaction (HRI), allowing robots to take human expressions into account during interactions, improving interaction quality and building trust.

The team also plans to give Emo verbal abilities. “Our next step involves integrating verbal communication capabilities,” Yuhang Hu said. “This will allow Emo to engage in more complex and natural conversations.”
