The android robot developed as a prototype is named "Repliee R1". To make its appearance closely resemble a human, a mold was taken of a girl, and a type of silicone was carefully chosen to give the skin a human-like feel. The android has the appearance of a five-year-old Japanese girl. The prototype has nine DOFs in the head (one for the mouth, five for the eyes, and three for the neck) and many free joints for posing. The motors (actuators) are all embedded inside the body.
The android's touch sensors are strain-rate force sensors. The sensing mechanism is similar to that of humans, in that touch strength is detected while the skin is deforming. Four sensors are placed under the skin of the left arm, and these four sensors alone can measure touch strength over the entire surface of the left arm. These tactile sensors enable various forms of touch communication.
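As a rough illustration of how a strain-rate sensor relates to touch strength, the sketch below integrates the rectified strain rate over the window in which the skin is deforming. This is only a minimal model for intuition: the function name, the sampling rate, and the activity threshold are assumptions, not the actual signal processing used on the robot.

```python
import numpy as np

def touch_strength(strain, dt=0.001, threshold=0.05):
    """Estimate touch strength from a skin-strain trace (hypothetical model).

    A strain-rate sensor responds only while the skin is deforming, so we
    differentiate the strain signal, keep the samples where the magnitude of
    the rate exceeds a noise threshold, and integrate over that window.
    """
    rate = np.gradient(strain, dt)        # strain rate (1/s)
    active = np.abs(rate) > threshold     # samples where skin is deforming
    return float(np.sum(np.abs(rate[active])) * dt)

# Simulated press: skin strain ramps up to 2% over 0.3 s, then holds steady.
t = np.arange(0.0, 1.0, 0.001)
strain = np.clip(t, 0.0, 0.3) / 0.3 * 0.02
print(touch_strength(strain))  # integrates only over the deformation phase
```

Because the sensor measures the *rate* of deformation, a sustained but static press produces no further signal once the skin stops moving, which matches the human-like behavior described above.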
In the future, the researchers plan to implement an android with the same number of joints as a human, along with vision sensors, auditory sensors, and tactile sensors covering the whole body. For a quantitative evaluation of the interaction, they investigated the eye motion of people during conversation with the android. A semi-unconscious behavior such as eye motion can reveal facts that do not appear in qualitative evaluations such as questionnaires.
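One common way to turn eye-motion recordings into a quantitative measure is to label each video frame with the region the subject is looking at and report the share of time spent on each region. The sketch below shows this aggregation step; the region labels and the helper name are illustrative assumptions, not the actual coding scheme from the study.

```python
from collections import Counter

def fixation_shares(gaze_samples):
    """Summarize gaze data as the fraction of samples on each region.

    gaze_samples: list of region labels (e.g. 'eyes', 'mouth', 'other'),
    one per recorded frame of the conversation (hypothetical coding).
    """
    counts = Counter(gaze_samples)
    total = sum(counts.values())
    return {region: n / total for region, n in counts.items()}

# Hypothetical trace: the subject looks mostly at the interlocutor's eyes.
trace = ['eyes'] * 60 + ['mouth'] * 30 + ['other'] * 10
print(fixation_shares(trace))  # {'eyes': 0.6, 'mouth': 0.3, 'other': 0.1}
```

Comparing such distributions across interlocutor conditions is one way differences that a questionnaire would miss could surface in the data.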
To test the prediction, three types of interlocutor were prepared: (A1) a human girl; (A2) the android with mouth, eye, and neck motions; and (A3) the android without motion. The conversation was held in a small room, where an experimenter behind a curtain controlled the android's reactions and a loudspeaker played the android's pre-recorded voice. A2 moved its mouth while talking and sometimes moved its neck and blinked, whereas A3 remained stationary while talking, neither blinking nor moving its neck.