- Research Article
- Open Access
Development of sense of self-location based on somatosensory feedback from finger tips for extra robotic thumb control
© The Author(s) 2019
- Received: 16 December 2018
- Accepted: 11 June 2019
- Published: 17 June 2019
Recently, wearable extra robotic limbs that aim to enhance the functionality and capability of human operators as extra arms or fingers have become an active research subject among robotics researchers. Improving the operability of extra robotic limbs is essential for human operators, and one approach is to induce robotic embodiment. In this paper, we focus on the update of the sense of self-location, which is a key aspect of embodiment and contributes to the update of body representation, and we elucidate the dominant factors that induce the embodiment of an extra robotic thumb (ERT). Experiments are conducted to compare the performance of a reaching task with the ERT under three separately given conditions, two of which are somatosensory feedback: (1) tactile and position feedback from the fingertips and (2) tactile and position feedback from the face, and one of which is non-somatosensory: (3) auditory feedback. As a result, we confirmed that somatosensory feedback from the fingertips contributes strongly to the update of the sense of self-location.
- Body representation
- Sense of self-location
- Somatosensory feedback
- Extra robotic thumb
In recent years, wearable extra robotic limbs that aim to enhance the functionality and capability of human operators as extra arms or fingers have become increasingly important research subjects among robotics researchers [1–4]. Extra thumbs are mounted on the left hand of a user and assist the user in holding or manipulating objects. Moreover, supernumerary robotic limbs (SRLs) are attached to the shoulder of an operator to assist in the execution of overhead tasks. In addition, Abdi et al. tried using supernumerary arms for surgical operation in a virtual reality environment.
Operability is one of the factors that determine the value of such extra robotic limbs: without visual feedback it is hard to perceive the location of the extra limbs precisely, which makes manipulating the artefact a challenging task for operators. Many efforts have been made to improve the operability of extra robotic limbs. Data-driven latent space impedance control and Bio-Artificial Synergies algorithms have been developed for supernumerary robotic fingers [8, 9]. In Wu et al., the control method rejects human-induced disturbances by controlling the impedance of the extra fingers' joints in the latent space, which enabled single-handed object manipulation such as opening the cap of a bottle. In Bio-Artificial Synergies, on the other hand, the control law was extracted from grasping experiment data by applying Partial Least Squares regression, enabling the robotic fingers to share the task load with the human fingers. In addition, an object-based mapping algorithm has been proposed to control an extra robotic finger by interpreting the grasping motion of the entire hand. A control interface for extra robotic fingers using EMG signals was also proposed by Leigh et al. For supernumerary robotic limbs that support the user as extra arms, a control method using torso-generated muscle activation signals was proposed by Parietti et al. Sasaki et al. also proposed the MetaLimbs system, which is manipulated using the operator's two legs.
One approach for improving the operability of extra robotic limbs is to induce robotic embodiment [4, 5]. In general, embodiment is the sense that emerges when an artefact's properties are processed as if they were properties of one's own biological body, and it is composed of three main aspects: the sense of self-location, the sense of ownership and the sense of agency.
Since feedforward control is reported to be used in human motion planning, as described in Fig. 1 [17–20], the update of body representation refers to updating the controllers and predictors in Fig. 1 by comparing and adjusting the predicted state and the estimated actual state. Thus, once the embodiment of an extra robotic limb is induced and the body representation is updated, the kinematics and dynamics of the artificial extra limb will be included in the feedforward control mechanism used by the brain and become available for the operator's motion planning. Hence, the user will be able to operate the extra limb as part of their own body.
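As a toy illustration (not the paper's model), the predictor update in Fig. 1 can be sketched as a forward model whose prediction error drives learning: the internal estimate of the limb's dynamics is adjusted until predicted and observed states agree. All dynamics and gains below are illustrative assumptions.

```python
# Sketch of the feedforward scheme in Fig. 1: a forward model (predictor)
# estimates the sensory outcome of a motor command; the mismatch between the
# predicted state and the estimated actual state drives an update of the
# internal model. The scalar dynamics here are illustrative assumptions.

class ForwardModel:
    def __init__(self, gain_estimate=1.0, learning_rate=0.1):
        self.gain_estimate = gain_estimate   # internal belief about limb dynamics
        self.learning_rate = learning_rate

    def predict(self, motor_command):
        # Predicted state if the internal model were correct
        return self.gain_estimate * motor_command

    def update(self, motor_command, observed_state):
        # Prediction error between estimated actual and predicted state
        error = observed_state - self.predict(motor_command)
        # Adjust the predictor so future predictions match the real limb
        self.gain_estimate += self.learning_rate * error * motor_command
        return error

# Toy "real limb" with true gain 1.5, standing in for an extra robotic thumb
# whose kinematics differ from the biological thumb the brain already models.
true_gain = 1.5
model = ForwardModel()
for _ in range(200):
    command = 1.0
    observed = true_gain * command
    model.update(command, observed)

print(round(model.gain_estimate, 3))  # converges toward 1.5
```

Once the internal estimate converges, the limb's behaviour is predictable to the controller, which is the sense in which the updated body representation supports feedforward motion planning.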
This requires the proper update of the body representation according to the current situation.
Methods to induce the embodiment of extra robotic limbs have been studied in various ways. In one study, an experiment similar to the rubber hand illusion was carried out using virtual hands in a virtual reality environment. Tactile stimulation was applied to the participant's real hand while visual stimulation was synchronously applied to the virtual hand, which resulted in a drift of ownership towards the virtual hand. Abdi et al. proposed a foot-controlled virtual hand with visual and force feedback. That work sought to improve the embodiment of an endoscope through a set of training sessions that started with simple practice and became more complicated in successive stages.
Our previous research attempted to embody the extra robotic thumb (ERT) by using electrical stimulation and by facilitating a body representation shift. First, using electrical stimulation as tactile feedback, better performance in a bolt-picking task was reported. This tactile feedback contributed to the acquisition of the sense of self-location, the sense of agency and the sense of ownership, which are the three main aspects of embodiment. Subsequently, research focusing on evaluating the level of embodiment and elucidating the embodiment process was carried out. That study confirmed that the updated body representation affects the motion planning of the subjects [5, 22].
However, the previous studies did not clarify which kind of feedback information is the dominant factor in the sensation updates that contribute to robotic embodiment. Therefore, the purpose of this study is to identify the feedback information that most efficiently facilitates the update of the sense of self-location. To investigate the dominant feedback information for robotic embodiment, the extra robotic thumb (ERT) developed in our previous study is chosen as the embodiment target. Before carrying out the experiment, a hypothesis is formed: somatosensory feedback from the fingers facilitates the update of the sense of self-location in the most efficient way and is a dominant factor enforcing embodiment of the ERT. We verify this hypothesis by carrying out a reaching-task experiment and comparing task performance under different kinds of separately given feedback: (1) tactile and position feedback from a fingertip, (2) tactile and position feedback from the face, and (3) auditory feedback. Analysis of the experimental results shows the highest task performance when somatosensory feedback from the fingertips is used, verifying the hypothesis.
This paper is organized as follows. "Extra robotic thumb" section introduces the design and mechanism of the extra robotic thumb (ERT). "Sense of self-location update" section explains the meaning of the sense of self-location update and provides evidence for its plasticity. "Experiments" section presents the method and experiments used to investigate the dominant factor contributing to the update of the sense of self-location, together with analysis and discussion of the experimental results. In "Conclusion and future work" section, the summary and future vision are presented.
The robotic thumb has three joints and is the same size as a human finger. The robotic finger is produced by 3D printing using white ABS material. The joints are driven by three servo motors (DS318); each joint has a range of motion from 0° to 110°, which gives a movable range sufficient to touch all five fingers. A 3-axis force sensor (Optoforce, OMD-20-SE-40N) is mounted on the tip of the robotic finger to measure contact force. The measured contact force is sent to an electrical stimulation device to present tactile feedback to the operator.
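The mapping from measured contact force to stimulation intensity is not specified in detail here, but a minimal sketch might look like the following. The 10 mA ceiling matches the maximum current stated for the stimulation; the linear law and the use of the sensor's 40 N full scale are illustrative assumptions.

```python
# Hypothetical linear mapping from measured fingertip contact force to
# stimulation current. The 10 mA ceiling matches the maximum current stated
# in the paper; the linear law and the 40 N full scale (the range of the
# OMD-20-SE-40N sensor) are illustrative assumptions, not the actual mapping.

MAX_CURRENT_MA = 10.0   # safety ceiling for surface electrical stimulation
SENSOR_RANGE_N = 40.0   # assumed full scale of the 3-axis force sensor

def force_to_current(force_n):
    """Map a contact force [N] to a stimulation current [mA], clamped."""
    force_n = max(0.0, min(force_n, SENSOR_RANGE_N))
    return MAX_CURRENT_MA * force_n / SENSOR_RANGE_N

print(force_to_current(20.0))  # 5.0 mA at half scale
print(force_to_current(80.0))  # 10.0 mA, clamped at the ceiling
```

Clamping at both ends keeps the stimulation within a safe, monotonic range regardless of sensor noise or overload.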
The control interface is mounted on the posterior side of the right hand and wrist. This interface captures the thumb motion by forward kinematics, using angle information measured by rotary encoders installed at each joint. In total, four rotary encoders (Murata, SV03) are installed, forming a four-link mechanism that follows the thumb motion without interference. Surface electrical stimulation has been reported to be a reliable way of feeding back tactile information. The electrodes are attached to the tip of the right thumb. The stimulation frequency is 50 Hz, which effectively stimulates the tactile receptors under the skin, and a maximum current of 10 mA is applied.
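The forward-kinematics step can be sketched as a planar chain sum over the four encoder angles. This is a simplified two-dimensional sketch under assumed link lengths; the real interface tracks the thumb in three dimensions.

```python
import math

# Minimal planar forward-kinematics sketch for the four-link sensing
# mechanism: each rotary encoder supplies one joint angle, and the fingertip
# position is the chained sum of link vectors. The link lengths are
# illustrative assumptions; the actual mechanism follows the thumb in 3-D.

LINK_LENGTHS_MM = [30.0, 25.0, 20.0, 15.0]  # assumed link lengths

def fingertip_position(joint_angles_rad):
    """Chain the four links; each joint rotates the remaining chain."""
    x = y = 0.0
    heading = 0.0
    for length, angle in zip(LINK_LENGTHS_MM, joint_angles_rad):
        heading += angle                 # cumulative joint rotation
        x += length * math.cos(heading)  # advance along current heading
        y += length * math.sin(heading)
    return x, y

# A fully extended chain lies along the x-axis
x, y = fingertip_position([0.0, 0.0, 0.0, 0.0])
print(round(x, 1), round(y, 1))  # 90.0 0.0
```

With all joint angles at zero the fingertip sits at the summed link length, which gives a quick sanity check on the encoder-to-pose pipeline.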
This paper focuses on the dominant factors that contribute to the update of the sense of self-location. The sense of self-location is one of the key aspects of embodiment, as described in the introduction. It is the sense of the coordinate system through which one perceives the location of one's own body parts and performs motor control. The update of the sense of self-location leads to the update of body representation. Humans have various coordinate systems, such as retinal, head-centered, body-centered, arm-centered and world-centered coordinates, for motion planning and trajectory planning. These coordinate systems are switched unintentionally when humans perform various tasks. For example, one can reach for a cup without seeing one's hand. Therefore, from the perspective of robotic operation, if the operator is able to recognize the coordinates of the robotic hand, operability will improve.
Studies on the update of the sense of self-location are widely carried out in the area of neuroscience [28, 29]. These studies are based on modified versions of the rubber hand illusion, a famous experiment on embodiment [30, 31]. In the rubber hand illusion experiment, the subjects were positioned with their left hand hidden, and a lifelike rubber left hand was placed in front of them. The experimenters stroked both the hidden left hand and the visible rubber hand with a paintbrush. The experiment showed that when the two hands were stroked synchronously and in the same direction, the subjects began to recognise the rubber hand as their own hand. This illusion is the result of sensory fusion of visual input (the rubber hand) and simultaneous tactile input from the real hand. Based on this illusion, Lenggenhager et al. carried out an experiment that made participants mislocalize themselves in a virtual reality environment. Another experiment made participants feel that they had a larger belly; this was achieved by having the participants prod their real belly with a rod that had a virtual counterpart visible in VR. These studies provide strong evidence that the sense of self-location can be updated and modified.
Method to investigate the dominant factors
Tactile and position feedback from fingers.
Tactile and position feedback from the face.
Another form of tactile and position feedback comes from the operator's face. This is a representative case of using feedback from a body part with a relatively high resolution of two-point discrimination, the tactile ability to discern that two nearby objects touching the skin are truly two distinct points. The last form of feedback is auditory. In this case, no body part is touched to generate tactile sensing. To learn how tactile feedback performs compared with non-tactile feedback, it is important to choose auditory feedback, which carries no tactile information. In this research, the experimental result under auditory feedback is treated as the baseline for the results under the other two somatosensory feedback conditions. This paper compares the performance of the ERT on a reaching task when the kinds of sensory feedback listed above are separately given.
The somatosensory feedback from fingers facilitates the sense of self-location update in the most efficient way and is a dominant factor which enforces embodiment of the ERT.
The position of the ERT can be perceived through tactile sensing from the fingertips and face when they are touched by the ERT. Although the auditory feedback device does not provide tactile feedback, auditory feedback is given when buttons are pressed. Three buttons are installed on the plate of the auditory feedback device; each button emits a different pitch to inform the subjects of the location their finger is touching. In this experiment, the subjects are not allowed to use vision to determine the location of the touching finger, and their eyes are covered by an eye mask. This setting eliminates sensory inputs other than somatosensory feedback.
The monitor presents the reaching target (A, B, or C).
The subject controls the ERT to touch the target displayed.
Repeat steps 1 and 2 for 30 s.
The monitor presents the reaching target.
The subject uses the left thumb to touch the presented target.
Repeat steps 1 and 2 for 30 s.
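The scoring of one timed set, presenting targets and counting reaches until the 30 s elapse, can be sketched as follows. The per-reach durations are simulated stand-ins; the paper does not specify individual reach times, only the per-set reaching counts.

```python
# Hedged sketch of one 30 s reaching set: a target (A, B, or C) is presented,
# the subject touches it, and the cycle repeats until time runs out. The
# touch durations below are simulated stand-ins for real trials.

def run_reaching_set(touch_times_s, set_duration_s=30.0):
    """Count successful reaches completed within one timed set."""
    elapsed = 0.0
    count = 0
    for t in touch_times_s:              # one entry per presented target
        if elapsed + t > set_duration_s:  # this reach would not finish in time
            break
        elapsed += t
        count += 1
    return count

# e.g. a subject needing ~2.5 s per reach completes 12 reaches in 30 s
print(run_reaching_set([2.5] * 20))  # 12
```

The per-set count is the performance measure compared across the three feedback conditions.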
Results and discussion
After saturation of the control learning is confirmed, the saturated results are compared with the saturated results obtained with the subjects' own left thumbs. Here, the saturated result of each subject is defined as the mean value of the reaching counts in that subject's last three sets. Since the subjects have already learned to control their left thumbs over their lifetimes, we consider the reaching counts with their own left thumbs to be each subject's best performance. Thus, by comparing these results, we can show how close the learned control of the ERT has come to the control of the subjects' own left thumbs.
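The saturated-result definition above reduces to a simple mean over the last three sets, for example:

```python
# The saturated result defined in the text: the mean reaching count over a
# subject's last three sets. The example counts are illustrative, not data
# from the paper.

def saturated_result(reach_counts_per_set):
    """Mean of the final three sets' reaching counts."""
    last_three = reach_counts_per_set[-3:]
    return sum(last_three) / len(last_three)

# Example: performance climbs across sets, then plateaus
counts = [5, 7, 9, 11, 12, 12, 12]
print(saturated_result(counts))  # 12.0
```

Averaging only the plateau sets avoids penalising the early learning phase when comparing the ERT against the subject's own thumb.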
The purpose of this research is to elucidate the dominant factors and process that facilitate the embodiment of an extra robotic thumb (ERT). The experiment compared reaching-task performance under three separately given conditions: controlling the ERT (1) to touch three fingertips, (2) to touch three positions on the face, and (3) to push three buttons on an auditory feedback device. The reaching-task performance when controlling the ERT with tactile feedback from the fingertips is close to the performance achieved with the subjects' own thumbs. Moreover, the task performance with tactile feedback from the fingertips is the highest of the three conditions, while auditory feedback gives the lowest performance. This result suggests that tactile feedback from the fingertips facilitated the update of the sense of self-location most effectively among the three kinds of sensory feedback (two somatosensory and one non-somatosensory). The results also indicate that somatosensory feedback from the face facilitates the update of the sense of self-location. Since the fingertips have the highest resolution of two-point discrimination and the face has a relatively high resolution, the results further suggest that somatosensory feedback from body parts with relatively high two-point discrimination resolution can facilitate the update of the sense of self-location. This research confirmed that somatosensory feedback plays an important role in the update of the sense of self-location, as the auditory-feedback comparison shows inefficient control learning of the ERT. In addition, this result provides a quantitative evaluation of how efficiently the three kinds of sensory feedback facilitate the update of the sense of self-location in robotic embodiment.
Since the goal of robotic embodiment is to improve the operability of wearable extra robotic limbs, experiments evaluating the improvement of operability will be carried out in future work. The ERT used in this paper is the same size as a human finger, but not all extra robotic limbs match the size of human limbs. Thus, research elucidating the dominant factors that facilitate the embodiment of an extra robotic thumb larger than a human thumb will also be carried out in future work.
This work was partially supported by JSPS KAKENHI, Grant-in-Aid for Scientific Research on Innovative Areas "Understanding brain plasticity on body representations to promote their adaptive functions" (Grant Number 17H05906).
YH and TI initiated this research, designed and performed the experiments. YNZ performed the data analysis, interpretation of the experimental results and wrote the manuscript with the help and review from TA. All authors read and approved the final manuscript.
The authors declare that they have no competing interests.
Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
- Parietti F, Asada HH (2016) Supernumerary robotic limbs for human body support. IEEE Trans Robot 32(2):301–311
- Prattichizzo D, Malvezzi M, Hussain I, Salvietti G (2014) The sixth-finger: a modular extra-finger to enhance human hand capabilities. In: Proceedings of the 23rd IEEE international symposium on robot and human interactive communication, Edinburgh, Scotland, UK, pp 993–998
- Hussain I, Salvietti G, Spagnoletti G, Prattichizzo D (2016) The soft-sixthfinger: a wearable EMG controlled robotic extra-finger for grasp compensation in chronic stroke patients. IEEE Robot Autom Lett 1(2):1000–1006
- Sobajima M, Sato Y, Xufeng W, Hasegawa Y (2015) Improvement of operability of extra robotic thumb using tactile feedback by electrical stimulation. In: Proceedings of IEEE international symposium on micro-nanomechatronics and human science (MHS), pp 1–3
- Shikida H, Hasegawa Y (2016) Hand space change after use of extra robotic thumb. In: Proceedings of IEEE international symposium on micro-nanomechatronics and human science (MHS), pp 1–4
- Parietti F, Chan KC, Hunter B, Asada HH (2015) Design and control of supernumerary robotic limbs for balance augmentation. In: Proceedings of IEEE international conference on robotics and automation (ICRA), pp 5010–5017
- Abdi E (2017) Supernumerary robotic arm for three-handed surgical application: behavioral study and design of human–machine interface. Ph.D. thesis, EPFL, Lausanne
- Wu FY, Asada HH (2018) Decoupled motion control of wearable robot for rejecting human induced disturbances. In: Proceedings of IEEE international conference on robotics and automation (ICRA), pp 4103–4110
- Wu FY, Asada HH (2015) "Hold-and-manipulate" with a single hand being assisted by wearable extra fingers. In: Proceedings of IEEE international conference on robotics and automation (ICRA), pp 6205–6212
- Leigh S-W, Maes P (2016) Body integrated programmable joints interface. In: Proceedings of the 2016 ACM CHI conference on human factors in computing systems, pp 6053–6057
- Parietti F, Asada HH (2017) Independent, voluntary control of extra robotic limbs. In: Proceedings of IEEE international conference on robotics and automation (ICRA), pp 5954–5961
- Sasaki T, Saraiji M, Fernando CL, Minamizawa K, Inami M (2017) MetaLimbs: multiple arms interaction metamorphism. In: Proceedings of ACM SIGGRAPH 2017 emerging technologies, p 16
- Kilteni K, Groten R, Slater M (2012) The sense of embodiment in virtual reality. Presence Teleoper Virtual Environ 21(4):373–387
- Lenggenhager B, Mouthon M, Blanke O (2009) Spatial aspects of bodily self-consciousness. Conscious Cogn 18(1):110–117
- Gallagher S (2000) Philosophical conceptions of the self: implications for cognitive science. Trends Cogn Sci 4(1):14–21
- Blanke O, Metzinger T (2009) Full-body illusions and minimal phenomenal selfhood. Trends Cogn Sci 13(1):7–13
- Bridgeman B, Hendry D, Stark L (1975) Failure to detect displacement of the visual world during saccadic eye movements. Vis Res 15(6):719–722
- Kawato M (1990) Feedback-error-learning neural network for supervised motor learning. In: Eckmiller R (ed) Advanced neural computers. Elsevier, Amsterdam, pp 365–372
- Wolpert DM, Miall RC, Kawato M (1998) Internal models in the cerebellum. Trends Cogn Sci 2(9):338–347
- Blakemore S-J, Wolpert DM, Frith CD (2002) Abnormalities in the awareness of action. Trends Cogn Sci 6(6):237–242
- Slater M, Pérez Marcos D, Ehrsson H, Sanchez-Vives MV (2008) Towards a digital body: the virtual arm illusion. Front Hum Neurosci 2:6
- Zhu Y, Shikida H, Aoyama T, Hasegawa Y (2018) Evaluating shifted body representation and modified body schema using extra robotic thumb. In: Proceedings of IEEE international conference on cyborg and bionic systems (CBS), pp 282–285
- Segura Meraz N, Sobajima M, Aoyama T, Hasegawa Y (2018) Modification of body schema by use of extra robotic thumb. ROBOMECH J 5(1):3. https://doi.org/10.1186/s40648-018-0100-3
- Rochat P, Striano T (2000) Perceived self in infancy. Infant Behav Dev 23(3):513–530
- Hasegawa Y, Ozawa K (2014) Pseudo-somatosensory feedback about joint's angle using electrode array. In: Proceedings of IEEE/SICE international symposium on system integration (SII), pp 644–649
- Purves D, Augustine G, Fitzpatrick D (2001) Mechanoreceptors specialized to receive tactile information. In: Neuroscience, 2nd edn. Sinauer Associates, Sunderland. https://www.ncbi.nlm.nih.gov/books/NBK10895/
- Andersen RA, Snyder LH, Li C-S, Stricanne B (1993) Coordinate transformations in the representation of spatial information. Curr Opin Neurobiol 3(2):171–176
- Lenggenhager B, Tadi T, Metzinger T, Blanke O (2007) Video ergo sum: manipulating bodily self-consciousness. Science 317(5841):1096–1099
- Normand J-M, Giannopoulos E, Spanlang B, Slater M (2011) Multisensory stimulation can induce an illusion of larger belly size in immersive virtual reality. PLoS ONE 6(1):e16128
- Botvinick M, Cohen J (1998) Rubber hands 'feel' touch that eyes see. Nature 391(6669):756
- Arata J, Hattori M, Ichikawa S, Sakaguchi M (2014) Robotically enhanced rubber hand illusion. IEEE Trans Haptics 7(4):526–532
- Johnson KO, Phillips JR (1981) Tactile spatial resolution. I. Two-point discrimination, gap detection, grating resolution, and letter recognition. J Neurophysiol 46(6):1177–1192