The Jerusalem Post

WATCH: Robot learns to smile after researchers successfully attach skin to its face - study

 
Engineered skin tissue attached to a solid surface, smiling. (Credit: Shoji Takeuchi, Haruka Oda, Minghao Nie, Michio Kawai, via Cell Press)

Researchers at the University of Tokyo developed a method to integrate engineered skin tissue with humanoid robots, enhancing mobility, self-healing, sensors, and realism.

A team of researchers led by Prof. Shoji Takeuchi at the University of Tokyo has developed a method to integrate engineered skin tissue with humanoid robots, according to new research published in late June by Cell Press.

By mimicking human skin tissue, including special pore-like holes, the researchers successfully attached engineered skin to these humanoid robots, offering potential advantages such as improved mobility, self-healing, embedded sensors, and a more realistic appearance for robotic platforms.

While researching ways to engineer skin to heal itself, the team decided to attempt incorporating skin with robotics to improve its properties and capabilities. “During previous research on a finger-shaped robot covered in engineered skin tissue that we grew in our lab, I felt the need for better adhesion between the robotic features and the subcutaneous structure of the skin,” said Prof. Takeuchi.

Previous methods of attaching skin to solid surfaces used mini anchors or hooks, which limited where the skin could be applied and risked damaging it during movement. In the newly developed method, the researchers instead used small perforations to attach skin tissue to surfaces of almost any shape.


After making the holes, the researchers applied a special collagen gel and plasma treatment to fill them, securely bonding the skin to the surface.

 (A) Conceptual illustration of a perforation-type anchor inspired by skin ligaments to cover robots with skin equivalents. (B) Process of the tissue fixation using perforation-type anchors. (credit: Shoji Takeuchi, Haruka Oda, Minghao Nie, Michio Kawai, via Cell Press)

Further, the team plans to embed sensors in the skin to improve the robots' environmental awareness and give more context to their interactions with their surroundings.

Human-like robots and advances in medical research

Moreover, Prof. Takeuchi emphasized that in addition to finding a solution, his team has “identified new challenges,” such as the need for wrinkles and thicker skin to achieve a more human-like appearance. Movement, and not just the materials used, is also important, Prof. Takeuchi noted.

He added that the team has already begun work on adding advanced actuators, or artificial muscles, to produce realistic facial expressions. The ultimate goal is to create robots that can heal themselves, sense their surroundings, and perform tasks with human-like skill, a goal he described as difficult but exciting.


Beyond furthering the fields of robotics and biometrics, the research team aims to advance medical research by introducing a face-on-a-chip concept, similar to the organ-on-a-chip technology used in drug development, that could transform research in skin aging, cosmetics, surgery, and more.
They stressed that “applying this knowledge to recreate expressions [such as a smile] on a chip could find applications in the cosmetics industry and the orthopedic surgery industry.”
Finally, the researchers explained that “examining the correlation between facial muscle contractions and resulting facial expression can offer insights into the physiological aspects of emotion, leading to new exploration in the treatment of diseases, such as facial paralysis surgery.”
