  • Research Article
  • Open Access

A wearable soft robot for movement assistance on eyelid closure

ROBOMECH Journal 2018, 5:30

https://doi.org/10.1186/s40648-018-0126-6

  • Received: 10 April 2018
  • Accepted: 29 November 2018
  • Published:

Abstract

We present a facial wearable robot to support eyelid movement for patients with hemifacial paralysis. People with facial paralysis are unable to blink, which leads to dry eyes and can cause permanent damage to the cornea. To address this issue, we developed a robotic system that supports eyelid movement on the paralyzed side based on voluntary eyelid movement on the healthy side. The robot has a novel mechanism for supporting eyelid movement, made from soft material, which we call the eyelid gating mechanism (ELGM). We defined the requirements for eyelid manipulation, introduced design rules for the ELGM based on these requirements, and conducted a deformation analysis. As a result, we could control the deformation so that it is tailored to eyelid movement with a simple rotational input at two hinged ends. This system provides appropriate support for eyelid movement in a non-invasive and gentle manner, based on voluntary eyelid movement on the healthy side. In this paper, we present a performance evaluation of the developed ELGM and the wearable facial robot. We confirm that the ELGM could aid eyelid closure in 8 out of 10 patients. Further, we find that the time lag between eyelid closure detection and completion of closure support was \(236.9 \pm 19.4\) ms.

Keywords

  • Wearable robotics
  • Soft robotics
  • Facial paralysis

Introduction

Wearable robots for supporting human movement have been developed worldwide and are widely reported in the literature [1]. These robotic technologies are used in fields such as medical welfare and the military. Medical welfare applications have gained considerable attention in recent years; in this field, the purpose of the robot is mainly to provide physical therapy, enable semi-automation of therapeutic tasks, and improve movement reproducibility. The robot suit HAL (CYBERDYNE Inc.) is one of the pioneering works; it supports human gait function based on the wearer's intention using surface myoelectric potentials [2]. At present, wearable robots mainly support articulation of body joints, and only a few technologies support other body movements such as facial movement [3]. Supporting body movements other than joint articulation is important for further advancing the field of wearable robotics.

The human face comprises mimetic muscles that control facial skin and tissue rather than joints. People with facial paralysis lose control of these facial muscles, primarily on one side of the face [4]. The incidence of facial paralysis is reported to be 4 per 10,000 per annum overall. These patients suffer severe impairments in many ways, including the inability to close an eyelid, make facial expressions, and purse the lips [5]. The inability to close an eyelid presents health problems. A perpetually open eye is at risk from exposure to debris and dryness, which can result in chronic irritation and pain. In addition, an inability to blink may lead to permanent corneal damage from ulceration or infection. Aside from the functional impairment in blinking and other facial movements, facial paralysis is also a major psychological barrier, and patients tend to avoid social involvement or try to conceal their faces while interacting with other people. As a result, facial paralysis can significantly affect patients' quality of life.

There are some engineering approaches to support facial movements based on the healthy side. Senders et al. [6, 7] proposed an implantable device to reanimate an eye blink based on a blink on the healthy side. This device used an electroactive polymer artificial muscle that pulls a sling attached to the upper eyelid. Moreover, Xin et al. [8] proposed a functional electrical stimulation method to restore an eye blink based on myoelectric signals of the orbicularis oculi on the healthy side. We also proposed a wearable robot to support movement of the mouth corner in order to reconstruct facial expressions [3]. Based on detection of myoelectric signals of the mimetic muscles on the healthy side, this robot provides non-invasive physical support by pulling the facial skin.

In this study, we developed a facial wearable robot that non-invasively supports eyelid closure on the paralyzed side based on eyelid closure on the healthy side, as shown in Fig. 1 [9]. This robot has a mechanism for supporting eyelid movement made from a soft material, which we call the eyelid gating mechanism (ELGM). The ELGM deforms with a simple rotational actuation input, and the deformation is customized to a particular eyelid movement. The robot is designed for daily life support. The design and development of the robot are presented in this study, and the performance of the proposed ELGM was tested on patients with facial paralysis. The ELGM could support eyelid closure in 8 out of 10 participants. The time lag between eye closure on the supported and non-supported sides was found to be \(236.9 \pm 19.4\) ms.
Fig. 1

Facial wearable robot. 1. This robot detects eyelid movement on the healthy side. 2. Eyelid movement detection triggers the support. 3. Rotational actuation input from a motor is transmitted to the ELGM via a pulley-wire system to manipulate the eyelid on the paralyzed side. The size of the robot is \(155 \times 205 \times 25\) mm, and its weight is 90.1 g [9]

Requirements

Here, we describe the requirements for eyelid manipulation based on the physiological characteristics of eyelid movement. Humans blink continuously, and the eye blink movement is repetitive and quick: blinks are reported to occur about once every 4 s, each lasting approximately 334 ms [10, 11]. Figure 2 [12] shows the structure of the human eyelid. Blinking mainly uses the orbicularis oculi and the levator muscle of the upper eyelid. The orbicularis oculi is a sphincter muscle for eyelid closure, and the levator muscle of the upper eyelid is a striated muscle for eyelid opening. Facial paralysis is a facial nerve disease, and only the orbicularis oculi is innervated by the facial nerve; the levator muscle of the upper eyelid is innervated by the oculomotor nerve. Therefore, patients with facial paralysis cannot control the orbicularis oculi, and they cannot close their eyelids.
Fig. 2

Eyelid structure. 1. Orbicularis oculi (sphincter muscle); closes the eyelids. 2. Levator muscle of the upper eyelid (striated muscle); opens the eyelids. 3. Tarsal plate (connective tissue); contributes to eyelid form and support. 4. Locus of the tarsal plate; set as a requirement for eyelid manipulation

Eyelid motion comprises movement of the tarsal plate, which is manipulated by the orbicularis oculi and levator muscles in the upper eyelid. The tarsal plate is a dense connective tissue and supports the eyelid form as a frame. In other words, the tarsal plate must be manipulated in order to support eyelid movement. We focused on the locus of the tarsal plate, which runs vertically when viewed from the front side and forms a circular arc along the spherical surface of the eye ball when viewed from the right side, as shown in Fig. 2. Thus, the movement locus of the tarsal plate is three dimensional. We defined these motions as a requirement for eyelid manipulation. The following requirements should also be met:
  1. Gentle and safe manipulation.
  2. Light weight and ease of daily use.
  3. 3D movement along the locus of the tarsal plate.
  4. Responsiveness and robustness for eyelid movement, which is repetitive and quick.


Eyelid gating mechanism (ELGM)

Based on these requirements, we propose a mechanism to support eyelid closure, which we call the eyelid gating mechanism (ELGM). This mechanism transforms a simple rotational actuation into three-dimensional, complex motion through the deformation of soft material. Because the mechanism can use a rigid actuator, such as a rotational motor, as a power source, it can provide responsive and robust movement. Moreover, the end effector attached to the eyelid is made of soft material, so the mechanism conforms well to the eyelid and does not overload or strain the eye.

Design rule of ELGM

The current ELGM design is shown in Fig. 3. The ELGM is composed of a support part, two handle parts, and a fixed base. The support part is made of soft material, and its knob can hook onto the eyelid. The handle parts are made of a rigid material, and each has a hinged end. These hinged ends are fixed on an eyeglass-type frame such that their rotational axes cross, and they must be located at the tail and inner corner of the eye, respectively. The mechanism was designed in SolidWorks and printed using VeroClear (rigid material) and Agilus30 (soft material) on an Objet500 Connex2 3D printer (Stratasys Ltd.). In this case, the Shore hardness of the support part is A50. The support and handle parts were printed as a single piece.
Fig. 3

Eyelid gating mechanism; the white part is the support part made of soft material, and the yellow parts are the handle parts made of rigid material. The red line shows the edge of the knob attached to the eyelid. The dot is the center point of the edge

This ELGM was designed following the design rules shown in Fig. 4. Figure 4a shows the design parameters for the support and handle parts. L is the distance between the two hinged ends. The curved line of the support part follows a circular arc, and the handle parts are placed on extensions of the center line of that circle; their inclination is \(\psi\), and their length is l. The red line shows the initial ELGM posture, which corresponds to the opening phase of the eyelid, and the blue line shows the deformed posture corresponding to the eyelid closure phase. The parameter H corresponds to the width between the upper and lower eyelids and is determined by L, l, and \(\psi\). Figure 4b shows the fixation parameters: \(\phi\) is the angle between the rotation axes of the handle parts, \(L'\) is the distance between the fixed positions of the hinged ends, and \(\theta\) is the rotational input angle.
Fig. 4

ELGM design rules. a ELGM design parameters. L is the length between two hinged ends, and H is the height between the red and blue lines. The red line shows the initial posture of the ELGM which corresponds to the opening phase of the eyelid, and the blue line shows the deformed posture of the ELGM, which corresponds to the eyelid closure phase of the ELGM. H refers to the width between the upper and lower eyelids, l is the length of the handle part of the ELGM, and \(\psi\) is the inclination of the handle. b ELGM fixed on an eyeglass-type frame. \(\phi\) is the angle between the rotation axes on the handle parts, \(L'\) is the width between the fixed ends, and \(\theta\) is the rotational input angle transmitted by the pulley mechanism to deform the ELGM

In this study, we set the value of each parameter in Fig. 4 based on an article reporting the anthropometry of Asian eyelids [13]. According to this article, the width between the upper and lower eyelids is 7.5 mm to 9.0 mm, and the width between the tail and inner corners of the eye is 20 mm to 30 mm. Based on these values, we set the parameters to 9 mm < H < 10 mm, L = 42 mm, and \(\psi = 40^{\circ }\).

Deformation analysis of the ELGM

We conducted a deformation analysis using the parameters l, \(\phi\), and \(L'\) as variables, with non-linear finite element analysis software from ANSYS Inc. In the simulation, we defined the ELGM material as a hyperelastic material, which is similar to the actual material properties of the ELGM. We calculated the deformation at each rotational input angle \(\theta\) under forced displacement. The contact between the handle and fixation parts was defined as frictionless; all other contacts were bonded. The ELGM mesh had approximately 2000 nodes.

The calculation results are shown in Figs. 5 and 6. Figure 5 shows an overview of the deformation analysis results, where each colored line is related to the red line in Fig. 3, and the black line shows the locus of the black dot in Fig. 3. We extracted the deformation loci along the Y-Z direction, which are shown as black lines in Fig. 5 with respect to each parameter. The extracted results are indicated as a − 1, b − 1, and c − 1 in Fig. 6. We also calculated the curvature of these loci with a least-squares approach with respect to each parameter, which are indicated as a − 2, b − 2, and c − 2 in Fig. 6. Moreover, we measured the maximum deformation along the Y direction for each parameter, and the results are indicated as a − 3, b − 3, and c − 3 in Fig. 6.
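As an illustration of the least-squares curvature calculation, the sketch below fits a circle to a 2-D deformation locus with the algebraic (Kåsa-style) method and returns the curvature 1/R. The paper does not specify the exact fitting procedure, so the function name and formulation are our own.

```python
import numpy as np

def locus_curvature(y, z):
    """Least-squares circle fit to a 2-D locus, returning curvature 1/R.

    Solves the linear system for a, b, c in
        y^2 + z^2 + a*y + b*z + c = 0,
    from which the radius is R = sqrt(a^2/4 + b^2/4 - c).
    """
    y = np.asarray(y, dtype=float)
    z = np.asarray(z, dtype=float)
    A = np.column_stack([y, z, np.ones_like(y)])
    rhs = -(y ** 2 + z ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(a ** 2 / 4 + b ** 2 / 4 - c)
    return 1.0 / radius
```

Applied to the loci in Fig. 6 (a − 1, b − 1, c − 1), this yields the curvature trends plotted in a − 2, b − 2, and c − 2.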
Fig. 5

ELGM deformation; each line shows the part in contact with the eyelid shown in Fig. 3. These are related to rotational displacements of 0, 10, 30 and 50°. The black line shows the locus of the black dot in Fig. 3

Fig. 6

Results from the calculation of the ELGM deformation. a − 1, b − 1, and c − 1 show the deformation loci in Y–Z direction, which is indicated with a black line in Fig. 5. a − 2, b − 2, and c − 2 show how the deformation curvature shifts with respect to each parameter. Each curvature was calculated with a least-squares approach. a − 3, b − 3, and c − 3 show the maximum deformation in the Y direction with respect to each parameter

In terms of the handle length l, a longer handle contributes to less deformation along the Y direction, and the curvature has a local maximum when l is approximately 5 mm. A larger fixation angle \(\phi\) contributes to larger curvature during deformation and does not affect deformation in the Y direction. A shorter fixation distance \(L'\) between the handle parts shifts the circular-arc deformation along the Y direction. Because of the nature of this mechanism, the fixation parts must be placed closer to the face than the knob attached to the eyelid, and it becomes more difficult to keep the knob attached to the eyelid when deformation in the Y direction is large; the fixation parts and the ELGM deformation loci must be close in order to maintain contact with the eyelid. Based on the results of this deformation analysis and exploratory prototyping, we prepared a prototype with l = 5 mm, \(\phi\) = 50°, and \(L'\) = 38 mm for use in the following experiments.

Facial wearable robot

We fabricated the facial wearable robot as shown in Fig. 1. The robot contains a lithium-ion battery, a controller, a sensor, an actuator, and the ELGM. It uses an Arduino Micro as the controller, a servomotor as the actuator, and an IR reflective optical sensor to detect eyelid closure. These parts were installed on an eyeglass-type frame. The eyelid movement detection method and the eyeglass-type frame are introduced in the following sections.

Eyelid movement detection

This facial wearable robot supports eyelid movement on the paralyzed side based on movement on the healthy side. As blinking is typically symmetric, we consider that healthy eyelid movement may serve as an appropriate trigger for eyelid movement support on the paralyzed side. Thus, we decided to detect eyelid movement on the healthy side as a trigger for motion support on the paralyzed side. There are many ways to detect eyelid closure [14–17]. Among them, we focused on eye wink detection with a reflective optical sensor [18]. An eye wink is a voluntary, strong eye closure motion that generates a large skin displacement at the tail of the eye, which a reflective optical sensor can measure. Compared to other detection methods, a reflective optical sensor is contactless and easy to use. The wearable facial robot can also support eyelid movement based on the wearer's intention, because a wink motion detected by the optical sensor is voluntary.

A reflective optical sensor (Vishay Intertechnology Inc.) was implemented in the current prototype and installed toward the tail of the eye on the healthy side. The skin displacement signal was acquired every 10 ms and differentiated; the obtained waveform is shown in Fig. 7. A positive value indicates eyelid closure, and a negative value indicates eyelid opening. In this work, we focus on both eyelid opening and closure movements, and the robot supports both. Voluntary eyelid motion detection is based on a variable threshold with a simple calibration system. With this system, the wearer can set the threshold easily by voluntarily closing and opening his/her eyelid in response to a cue, such as a blinking LED or an applied vibration. In this calibration method, the robot records the maximum and minimum values and sets these values multiplied by 0.7 as the detection thresholds. These characteristics allow for user-friendly operation, and very little time is required to become familiar with the robot.
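The calibration and detection logic just described can be sketched as follows. The function and variable names are our own (the actual firmware is not published); the sketch differentiates 10-ms samples and compares the derivative against thresholds set at 0.7 times the calibration extrema.

```python
import numpy as np

def differentiate(raw, dt_ms=10.0):
    """Difference successive sensor samples acquired every 10 ms."""
    return np.diff(np.asarray(raw, dtype=float)) / dt_ms

def calibrate(raw_calibration):
    """Thresholds at 0.7 x the max/min of the differentiated signal
    recorded while the wearer winks on cue."""
    d = differentiate(raw_calibration)
    return 0.7 * d.max(), 0.7 * d.min()

def classify(d, close_thresh, open_thresh):
    """A derivative above the positive threshold means eyelid closure;
    below the negative threshold, eyelid opening."""
    if d > close_thresh:
        return "close"
    if d < open_thresh:
        return "open"
    return "none"
```

In use, `calibrate` would run once after the cue sequence, and each new differentiated sample would then be passed to `classify` to trigger the servomotor.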
Fig. 7

Differentiated signal from a reflective optical sensor installed toward the tail of the eye during voluntary eyelid closure. The blue line shows the differentiated signals due to voluntary eye closure (dark gray) and light eyelid closure (light gray)

Eyeglass-type frame

The hinged ends of the ELGM must be located at the tail and inner corner of the eye. To meet this requirement, an eyeglass-type frame was designed to fit the facial surface. First, the facial surface was scanned with a 3D scanner (Artec 3D M), and the data were exported to SolidWorks. On the scanned surface, the tail and inner corner of the eye were located relative to the bridge of the nose, where the nose pads of the frame rest, and the eyeglass-type frame was designed accordingly in SolidWorks. The frame was manufactured with a 3D printer using VeroClear (rigid material) and TangoBlackPlus (soft material), similar to the ELGM. At the hinges of the frame, TangoBlackPlus was used so that the robot can be adjusted and fixed on the user's head. Owing to this precise design based on the scanned facial surface and the flexibility of the ELGM, the mechanism could be fixed on the eyelid without any tape or glue and remained in place by friction. As a result, putting on or removing the robot takes very little time.

Evaluation

Evaluation of ELGM

Setting

In this experiment, the performance of the ELGM was evaluated in patients with facial paralysis. As mentioned above, patients with facial paralysis can open their eyelids with the levator muscle of the upper eyelid, which is innervated by the oculomotor nerve, but they cannot close their eyelids. Therefore, we evaluated whether the ELGM can support eyelid closure in patients with facial paralysis.

Ten Japanese subjects (average age: \(46.2 \pm 15.6\) years; 4 male, 6 female) with complete acquired facial paralysis participated in the experiment. Information on the participants is shown in Table 1. Informed written consent was obtained from all participants. Since facial topology differs between participants, we designed a chin stand and a manually operated ELGM for this experiment, in which the position of the ELGM can be adjusted by the experimenter to fit each participant's face. The participants were asked to place and fix their head on the chin stand on which the ELGM was implemented, as shown in Fig. 8, and then to relax all facial muscles. First, the participants were asked to blink five times in a row. The participants then put on the ELGM, and the experimenter actuated the mechanism manually five times in a row. As mentioned above, no tape or glue was used to manipulate the eyelid in this experiment. A potentiometer was installed at each of the ELGM hinged ends to track their rotation angles, and the movements were recorded with a camera at a frame rate of 60 fps placed in front of the participants. From the obtained videos, we extracted images of the closure phase at the moments when the potentiometers showed maximum values. The distance between the upper and lower eyelids was measured manually by three coders unrelated to this work, and we calculated the average values and standard errors. Afterwards, we asked the participants to rate the level of stress they experienced from the eyelid support provided by the ELGM on a 5-point scale (5: no stress, 1: significant stress).
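The averages and standard errors over the three coders' measurements can be computed as below. This is a generic sketch, not the authors' analysis script; the function name is ours.

```python
import numpy as np

def mean_and_standard_error(measurements):
    """Mean and standard error (sample std / sqrt(n)) of one eyelid
    distance measured independently by several coders."""
    v = np.asarray(measurements, dtype=float)
    return v.mean(), v.std(ddof=1) / np.sqrt(v.size)
```

The standard errors returned here correspond to the error bars reported in Fig. 9.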
Table 1

Properties of subjects

No.    1    2    3    4    5    6    7    8    9    10
Sex    M    M    F    F    F    M    M    F    F    F
Age    11   23   37   48   48   53   54   51   56   63
Side   R    L    L    L    R    R    R    R    L    L

Fig. 8

Experimental setup. Subjects fixed their head on a chin stand on which the ELGM was implemented. The experimenter rotated the ELGM manually to manipulate the subject's eyelid. We used a camera at a rate of 60 fps, and pictures were extracted from the obtained videos when the potentiometer showed a maximum value

To evaluate how wide the eye opens, we calculated the ratio of the distance between the upper and lower eyelids to the same distance when the eyelid is fully open. We call this the palpebral aperture ratio (PAR): the PAR is 0% when the eyelid is fully closed and 100% when it is fully open.
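The PAR definition above amounts to a one-line computation; a minimal sketch (function and parameter names are ours):

```python
def palpebral_aperture_ratio(aperture_mm, full_open_mm):
    """PAR in percent: the measured upper-to-lower eyelid distance as a
    fraction of the fully open distance (100% open, 0% closed)."""
    return 100.0 * aperture_mm / full_open_mm
```

For example, an eyelid half closed relative to a 9.0 mm full aperture gives a PAR of 50%.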

Results

The results of this experiment are shown in Fig. 9. The graph shows the PAR change for each participant: the blue bars show the PAR without the robot, and the red bars show the PAR with the robot. According to the blue bars, nearly all participants could not close their eyelids below a PAR of 40% on their own. The red bars indicate improved eyelid closure: the ELGM was able to help the patients close their eyelids, except for participants 2 and 7. Averaging the questionnaire results gives a stress rating of \(4.45 \pm 0.89\).
Fig. 9

ELGM evaluation results for patients with facial paralysis. The graphs show the change in PAR for each subject. The blue bars show the PAR without the robot, and the red bars show the improved eyelid closure with the robot. The PAR is 0% when the eyelid is fully closed and 100% when it is fully open. Error bars indicate standard errors

Evaluation of the facial wearable robot

Setting

This experiment was conducted to evaluate the responsiveness of the robot. To exclude the effect of the wearer's own blinking on the eyelid movement supported by the ELGM, two healthy subjects participated in the experiment: one for eyelid movement detection, and the other for eyelid movement support with the ELGM.

Subject A wore the eyeglass-type frame with the ELGM on the right side, and subject B wore an eyeglass-type frame with a reflective optical sensor on the left side. Subject B calibrated the threshold for eye closure/opening detection. Following calibration, he was asked to close and open his left eye, which triggers the system to support eyelid motion on the right side of subject A. These movements were recorded with a camera at a frame rate of 60 fps. The same procedure was repeated for five sessions. We measured the distance between the upper and lower eyelids manually from the obtained images and calculated the PAR for each frame.

Results

The results of this experiment are shown in Figs. 10, 11, and 12. Figure 10 shows a snapshot series of the functioning system. Picture (a) shows detection of eye closure, (b) the completion of eyelid closure support, (c) detection of eye opening, and (d) the completion of eyelid opening support; these pictures correspond to the points indicated with a, b, c, and d in Fig. 11, respectively. As mentioned above, the procedure was repeated five times, and Fig. 11 shows the first of the five measurements. Here, the PAR is 100% when the eyelid is open and 0% when it is closed. The red line shows the change in PAR with support, and the blue broken line shows the change in PAR without support. The average values of “A” and “B” were calculated and are shown in Fig. 12. Value “A” is the elapsed time between eyelid closure detection and completion of eyelid closure support, \(236.9 \pm 19.4\) ms. Value “B” is the elapsed time between eyelid opening detection and completion of eyelid opening support, \(278.7 \pm 12.4\) ms.
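Values “A” and “B” can be read off the 60-fps PAR traces as a frame count between detection and completion, converted to milliseconds. The sketch below shows one way to do this; the completion threshold and all names are our assumptions, not taken from the paper.

```python
FRAME_MS = 1000.0 / 60.0  # duration of one video frame at 60 fps

def support_latency_ms(par_series, detect_frame, done_par=5.0):
    """Elapsed time from the detection frame to the first frame whose
    PAR has fallen to the 'closed' level (done_par is an assumed
    completion threshold, in percent)."""
    for i in range(detect_frame, len(par_series)):
        if par_series[i] <= done_par:
            return (i - detect_frame) * FRAME_MS
    return None  # support never completed within the recording
```

The opening latency (“B”) would be computed symmetrically, searching for the first frame whose PAR rises back to the open level.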
Fig. 10

Demonstration of the presented robot. a Eye closure detection, b completion of eyelid closure support, c detection of eye opening, and d completion of eyelid opening support. Each picture corresponds to the points in Fig. 11

Fig. 11

Changes in the palpebral aperture ratio (PAR) over time. The red line shows the PAR change with support, and the blue broken line shows the PAR change without support. Dots correspond to the states in Fig. 10. The average values of “A” and “B” were calculated and are shown in Fig. 12

Fig. 12

Obtained experimental values. “Eyelid closure procedure” is the time from eyelid closure detection to completion of eyelid closure, which is shown as the average value of “A” in Fig. 11. “Eyelid Opening Procedure” is the time from eye opening detection to completion of eyelid opening, which is shown as the average value of “B” in Fig. 11

Discussion

According to Fig. 9, the ELGM could support eyelid closure without any tape or glue, except in the case of two participants. Moreover, according to Figs. 11 and 12, the robot can support eyelid movement based on voluntary eyelid opening and closure on the healthy side of the face. From these results, one can conclude that this robot can provide repetitive and gentle eyelid closure support based on the wearer's intention.

For participants 2 and 7, the ELGM could not completely support eyelid closure. Our observation is that the prepared ELGM could not match the eyelid shape of these subjects. The deformation analysis of the ELGM gave us insight into the deformation trend, but we are still not able to quantify the movement or shape of individual eyelids, so it is difficult to make a suitable ELGM model for every wearer. Ideally, the ELGM would be designed based on the eyelid shape of each wearer.

Regarding support latency, this robot can support eyelid opening and closure movements within approximately 300 and 250 ms, respectively. We believe this latency is fast enough to reduce the risk of exposure to debris and dryness. However, as mentioned earlier, facial paralysis is also a major psychological barrier because of facial asymmetry, and patients tend to avoid social involvement. To address this psychological problem, the latency should be shorter than 33 ms to prevent the perception of asymmetry by a social partner [19]. The robot's design should also be improved so that the motion it provides better resembles natural eyelid movement.

Conclusion

In this paper, we proposed a novel approach for supporting opening and closure motion of the human eye with a wearable soft robot. This robot has a novel mechanism that uses deformation of a soft material to manipulate the eyelid, which we call ELGM. We defined the requirements for eyelid manipulation, and we introduced ELGM design rules based on these requirements. We also conducted a deformation analysis. As a result, we could tailor deformation to eyelid movement with a simple rotational input at two hinged ends. This system provides appropriate support for eyelid movement in a non-invasive and gentle manner based on voluntary eyelid movement on the healthy side. We evaluated the ELGM and wearable facial robot, and we confirmed that this robot can support eyelid closure gently and repeatedly, based on the wearer’s intention.

We designed this robot with respect to safety and user-friendliness. The eyeglass-type frame is based on the wearer's facial surface, which was scanned with a 3D scanner. Because the hinged ends of the ELGM are fixed in appropriate positions on this frame, and the frame has the same shape as ordinary eyeglasses, the device is easy to put on and remove. This design helps ensure safety. We also implemented a calibration method that sets the threshold for eyelid movement detection with one wink motion, so users can operate the robot easily.

In the future, we plan to develop this wearable robot into a piece of rehabilitation equipment for patients with facial paralysis. The developed robot may support rehabilitation and help patients recover from paralysis faster, as well as prevent the development of aftereffects such as synkinesis.

Declarations

Authors’ contributions

YK designed the facial wearable robot, conducted the system integration and testing. NM recruited the participants and provided advice on the experimental protocol and future directions. KS conceived the idea of the robot, supervised the experiments and data analysis, and advised on future works and proof-reading of the paper. All authors read and approved the final manuscript.

Acknowledgements

The authors express their deep gratitude to all the participants for their generous contribution to this work.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

There is no publicly available data for this work.

Funding

This work was supported in part by JSPS KAKENHI Grant Number 17H06290.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors’ Affiliations

(1)
Ph.D Program in Empowerment Informatics, School of Integrative and Global Majors, University of Tsukuba, 1-1-1 Tenodai, Tsukuba, Japan
(2)
Department of Otolaryngology, Osaka Police Hospital, 10-31 Kitayama-cho, Tennoji-ku, Osaka, Japan
(3)
Faculty of Engineering, System and Information Engineering, University of Tsukuba, 1-1-1 Tenodai, Tsukuba, Japan

References

  1. Xie H, Li X, Li W, Li X (2014) The proceeding of the research on human exoskeleton. In: International conference on logistics engineering, management and computer science, pp 752–756
  2. Suzuki K, Mito G, Kawamoto H, Hasegawa Y, Sankai Y (2007) Intention-based walking support for paraplegia patients with robot suit HAL. Adv Robot 21(12):1441–1469. https://doi.org/10.1163/156855307781746061
  3. Jayatilake D, Isezaki T, Teramoto Y, Eguchi K, Suzuki K (2014) Robot assisted physiotherapy to support rehabilitation of facial paralysis. IEEE Trans Neural Syst Rehabil Eng 22(3):644–653
  4. Williamson IG, Whelan TR (1996) The clinical problem of Bell's palsy: is treatment with steroids effective? Br J Gen Pract 46(413):743–747
  5. Finsterer J (2008) Management of peripheral facial nerve palsy. Eur Arch Oto-Rhino-Laryngol 265(7):743–752
  6. Tollefson TT, Senders CW (2007) Restoration of eyelid closure in facial paralysis using artificial muscle: preliminary cadaveric analysis. Laryngoscope 117(11):1907–1911
  7. Senders CW, Tollefson TT (2010) Force requirements for artificial muscle to create an eyelid blink with eyelid sling. Arch Facial Plast Surg 12(1):30–36
  8. Xin Y, Jun J, Simin D, Steve GS, Qing X, Guoxing W (2013) A blink restoration system with contralateral artifact blanking. IEEE Trans Biomed Circ Syst 7:140–148
  9. Kozaki Y, Suzuki K (2017) A facial wearable robot with eyelid gating mechanism for supporting eye blink. In: Annual international conference on intelligent robots and systems, pp 1812–1817
  10. VanderWerf F, Brassinga P, Reits D, Aramideh M, Ongerboer de Visser B (2003) Eyelid movements: behavioral studies of blinking in humans under different stimulus conditions. J Neurophysiol 89:2784–2796
  11. Leigh RJ, David SZ (2015) The neurology of eye movements. Oxford University Press, Oxford
  12. Drake R, Vogl AW, Mitchell AW (2009) Gray's anatomy for students. Churchill Livingstone, London
  13. Park DH, Choi WS, Yoon SH, Song CH (2008) Anthropometry of Asian eyelids by age. Plast Reconstruct Surg 121(4):1405–1413
  14. Ishimaru S, Kunze K, Jens W, Andreas D, Paul L, Andreas B (2014) In the blink of an eye—combining head motion and eye blink frequency for activity recognition with Google Glass. In: Proceedings of the 5th augmented human international conference, pp 15–18
  15. Kanoh S, Ichi-Nohe S, Shioya S, Inoue K, Kawashima R (2015) Development of an eyewear to measure eye and body movements. In: Annual international conference of the IEEE engineering in medicine and biology society, pp 2267–2270
  16. Frigerio A, Hadlock TA, Murray EH, Heaton JT (2014) Infrared-based blink-detecting glasses for facial pacing. JAMA Facial Plast Surg 16(3):211–218
  17. Ozawa M, Sampei K, Cortes C, Ogawa M, Oikawa A, Miki N (2014) Wearable line-of-sight detection system using micro-fabricated transparent optical sensors on eyeglasses. Sens Actuat A Phys 205:208–214
  18. Robin S, Everett C, Anne L, Zofia L (1990) The eye wink control interface: using the computer to provide the severely disabled with increased flexibility and comfort. In: Proceedings of third annual IEEE symposium on computer-based medical systems. IEEE, New York, pp 105–116
  19. Kim SW, Heller ES, Hohman MH, Hadlock TA, Heaton JT (2013) Detection and perceptual impact of side-to-side facial movement asymmetry. JAMA Facial Plast Surg 15(6):411–416

Copyright

© The Author(s) 2018
