About the Authors:
Julián Espinosa
Contributed equally to this work with: Julián Espinosa, Begoña Domenech, Carmen Vázquez, Jorge Pérez, David Mas
Roles Conceptualization, Investigation, Methodology, Software, Writing – original draft
* E-mail: [email protected]
Affiliations Department of Optics, Pharmacology and Anatomy, University of Alicante, Alicante, Spain, University Institute of Physics Applied to Sciences and Technologies, University of Alicante, Alicante, Spain
ORCID http://orcid.org/0000-0001-6817-3117
Begoña Domenech
Contributed equally to this work with: Julián Espinosa, Begoña Domenech, Carmen Vázquez, Jorge Pérez, David Mas
Roles Conceptualization, Formal analysis, Investigation, Methodology, Visualization, Writing – review & editing
Affiliations Department of Optics, Pharmacology and Anatomy, University of Alicante, Alicante, Spain, University Institute of Physics Applied to Sciences and Technologies, University of Alicante, Alicante, Spain
Carmen Vázquez
Contributed equally to this work with: Julián Espinosa, Begoña Domenech, Carmen Vázquez, Jorge Pérez, David Mas
Roles Conceptualization, Methodology, Supervision, Visualization, Writing – review & editing
Affiliations Department of Optics, Pharmacology and Anatomy, University of Alicante, Alicante, Spain, University Institute of Physics Applied to Sciences and Technologies, University of Alicante, Alicante, Spain
Jorge Pérez
Contributed equally to this work with: Julián Espinosa, Begoña Domenech, Carmen Vázquez, Jorge Pérez, David Mas
Roles Conceptualization, Data curation, Formal analysis, Software, Writing – review & editing
Affiliations Department of Optics, Pharmacology and Anatomy, University of Alicante, Alicante, Spain, University Institute of Physics Applied to Sciences and Technologies, University of Alicante, Alicante, Spain
David Mas
Contributed equally to this work with: Julián Espinosa, Begoña Domenech, Carmen Vázquez, Jorge Pérez, David Mas
Roles Investigation, Methodology, Project administration, Software, Writing – original draft
Affiliations Department of Optics, Pharmacology and Anatomy, University of Alicante, Alicante, Spain, University Institute of Physics Applied to Sciences and Technologies, University of Alicante, Alicante, Spain
Introduction
Eye blinking is one of the fastest human reflexes [1]. A blink is a temporary closure of both eyes involving movements of the upper and lower eyelids. The role of blinks is fundamentally to keep the eye hydrated, allowing the tear film to spread over the ocular surface [2,3], and to protect against foreign objects [4,5]. It is a normal, simply observable and easily accessible phenomenon that reflects central nervous activation processes without voluntary manipulation. Eyelid movements require simple neural commands and few active forces, so their analysis may reveal any abnormality and show whether it derives from a muscular or a neural disorder [4,6]. In all types of blinks, i.e. spontaneous, reflex and voluntary blinks, the movement of the upper eyelid results from three active forces (the orbicularis oculi -OO- muscle, the levator palpebrae -LP- muscle and Mueller's muscle) and a passive force produced by the mechanical arrangement of the eyelid [4,7]. The tonic activity of the LP holds the upper eyelid against passive downward forces. The eyelid drops due to the inhibition of the LP and the activation of the OO muscle. Then, it opens again when the OO muscle activity has turned off and the LP has returned to its tonic activity.
In the past four decades, there have been a large number of longitudinal studies involving the eye blink. Environmental conditions, age and gender variations in blink rate have been reported [8–11]. In healthy subjects, blinking frequency decreases when subjects are conducting tasks with high cognitive and visual demands [12,13]. Esteban [14] established blink reflex evaluation as an essential tool for the diagnosis of, and pathophysiological insight into, an important number of human neurological disorders. The spontaneous eye blink is also considered a suitable ocular indicator for fatigue diagnostics [15,16] and drowsiness measurement [17,18]. On the other hand, Shultz et al. [19] showed that measures of blink inhibition timing can serve as precise markers of perceived stimulus salience and as useful quantifiers of atypical processing of social affective signals in toddlers. Another recent application is human biometrics for authentication purposes. Abo-Zahhad et al. [20] achieved a high recognition rate (up to 97.3%) from blink waveforms of 25 subjects extracted from electro-oculogram (EOG) signals.
Traditionally, the eye blink was assessed by procedures requiring the application of electrodes to monitor the OO electromyographic activity and obtain the EOG signal [20–26], or the use of the direct magnetic search coil technique [27–30]. Nevertheless, contact-free recording procedures such as photo or video, which permit a quantitative assessment of eye movement during blinking without interfering with the subject, have also been used [2,3,13,17,18,31–47].
The most evaluated blink properties are the rate and the duration, because of their relationship with mental states such as fatigue, lapses of attention and stress. The start and the end of the blink are usually considered interdependent, and they are determined through the definition of pre-calibrated threshold variables. Indeed, an objective method to determine the end of a blink has not been reported [30]. Other blink features, like amplitudes and speeds, are also assessed in the literature but, to our knowledge, a thorough report that gathers and analyzes all the physical magnitudes related to the kinematics and dynamics of the process has not yet been published.
An accurate evaluation of the blinking process through video recording requires high-speed video. In normal-speed videos (60 fps), the difference in the position of the eyelid between two frames may be too large to track it precisely. Hence, Bernard et al. [31], with an eye-tracking system, and Corthout et al. [32], with a high-speed Kodak camera, video-monitored eye blinks with a temporal resolution of 2 ms.
Some years ago, some of the authors of this work presented a non-invasive technique aimed at measuring some of the blinking dynamic features at high speed, also with 2 ms temporal resolution [33]. Lid displacement was monitored by studying the saturation of the frames in the sequence, which allowed a quantitative analysis of eyelid location at any instant. In a later work [48], the authors proposed an analytical model of eye blinking including lid movement and ocular retraction.
In this paper, we have refined the technique, and the blink action is thoroughly described through the analysis of different physical magnitudes directly related to muscle action. In a recorded sequence of a subject blinking, the eyelid position is directly related to changes in the reflected light. From the variation of the position in time, its first and second derivatives, and their product, we have obtained a set of features describing this physiological phenomenon. As a result, the average values obtained for some of the eye blink features agree with those reported in the literature [6,44,45]. Others, related to muscular dynamics (power, work and impulse), are reported here for the first time.
Technological advances in recent years have enabled the development of new biometric identification systems [20,49,50] based on human physical or physiological characteristics that can be studied using digital image processing. Therefore, among the wide field of applications where the analysis of blink biomechanics could be of interest, we have evaluated the performance of the extracted blink features to accurately authenticate subjects through biometrics.
The paper is structured as follows. In the next section, we describe the subjects that participated in the experiment, the experimental setup and the method used to characterize the blinking, and we define different physical features related to the blinking kinematics and dynamics. The third section deals with the average results of the blinking of the subjects under study. There, we also introduce an application of the procedure to biometric authentication using different classification algorithms and sets of blinking data. The proposal is tested on a reduced number of subjects in order to check its viability. Finally, in the discussion and conclusions section, we discuss the pros and cons of using the method for biometric authentication and present the main conclusions.
Subjects and methods
Our method was tested on 26 subjects (13 females and 13 males, aged from 21 to 62 years, mean 38 ± 14). Students and staff from the department were recruited as participants without compensation. No subject was discarded from the study. Video sequences were recorded using a commercial camera (GoPro HERO 3+) working at 240 frames per second. We adhered to the tenets of the Declaration of Helsinki during this study. All participants were informed about the nature and purpose of the study and all of them provided written informed consent. Experiments were conducted in winter of 2016 with the approval of the "Comité de Ética de la Universidad de Alicante. Nº Expediente UA-2016-04-11". Subjects rested their head on a chinrest and the camera was placed in front of their faces at a distance of 30 cm. A halogen lamp was used to illuminate the scene. Subjects were asked to blink naturally during each sequence, which lasted 20 seconds. Seven sequences per subject were recorded.
An image processing algorithm based on the difference in light reflection between the eyelid and the open eye (the pupil, the iris and the sclera) [38] has been implemented. Visible light, as well as infrared radiation, is absorbed considerably more by the pupil and the iris than by the eyelid [43]. Some of the authors of this work showed that the variation in the mean intensity value of the blinking image provides direct information about the eyelid position [33]. Thus, by selecting an appropriate region of interest (ROI) around the eye, one can observe that the dispersed energy is a direct function of the closure status of the eye. Similarly, Lee et al. [39] obtained the cumulative difference of the number of black pixels of an eye region using an adaptive threshold in successive images in order to determine the state of the eye (open or closed).
In this work, first, a rectangular ROI around each eye was selected. This was done by hand in the first frame of each sequence in order to make the algorithm computationally lighter, while in the following frames the selection is automatic. The energy contained in each region was computed in all frames of all the sequences. The amount of light intensity reflected by the eye is almost constant when the eyelid is open. When the eyelid closes, the reflected light changes and so does the intensity registered by the camera. Therefore, blinks appear as fast increases and decreases of the light intensity recorded by the camera. This variation is directly related to the variation of the eyelid position. Fig 1 shows an example of the variation in time of the sum of the intensity of the pixels of one ROI in a registered sequence.
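As an illustration of this step, the sketch below computes the per-frame ROI intensity sum from a recorded video. It assumes OpenCV is available; the file name and the ROI coordinates (x, y, w, h) are hypothetical placeholders, since the actual ROI was selected by hand on the first frame.

```python
# Sketch of the per-frame ROI intensity computation described above.
import cv2
import numpy as np

def roi_intensity_signal(video_path, x, y, w, h):
    """Return the sum of grayscale pixel intensities inside the ROI for every frame."""
    cap = cv2.VideoCapture(video_path)
    signal = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        roi = gray[y:y + h, x:x + w]       # hand-picked rectangle around the eye
        signal.append(float(roi.sum()))    # "energy" of the ROI in this frame
    cap.release()
    return np.asarray(signal)

# Example (placeholder file and coordinates):
# intensity = roi_intensity_signal("blink_sequence.mp4", x=120, y=80, w=160, h=90)
```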
[Figure omitted. See PDF.]
Fig 1. Reflected intensity vs. time.
Variation in time of the sum of the intensity of the pixels computed for an example sequence. The intensity remains almost constant in time. Peaks correspond to the eyelid closed.
https://doi.org/10.1371/journal.pone.0196125.g001
In order to locate and isolate each blink within the sequence, we used a noise-tolerant peak-finding algorithm. Peaks represent the instant when the eyelid is completely closed. Each peak is used as a reference to extract blinks by cropping the sequence from 0.25 s (60 frames) before to 0.46 s (111 frames) after the peak maximum, which corresponds to the eyelid completely closed. Blinks occurring so frequently that they overlap within this interval were discarded. Previous works in the literature measured a mean inter-blink interval of 5.97 s for normal subjects versus 2.56 s for dry eye subjects [11] and defined blinks as eyelid closures with a duration of 50 to 500 ms [18]. With the imposed limits on blink and inter-blink durations, we discarded incomplete and/or double blinks and considered the whole range of closure durations for normal subjects.
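A minimal sketch of this isolation step is given below, using SciPy's noise-tolerant peak finder. The prominence and distance thresholds are illustrative assumptions, not the settings used in the study; the 60/111-frame window follows the text.

```python
# Sketch of blink isolation around intensity peaks (eyelid fully closed).
import numpy as np
from scipy.signal import find_peaks

FPS = 240
PRE, POST = 60, 111  # 0.25 s before and 0.46 s after the peak

def isolate_blinks(intensity):
    peaks, _ = find_peaks(intensity,
                          prominence=0.3 * intensity.std(),  # noise tolerance (assumed)
                          distance=PRE + POST)               # reject overlapping blinks
    blinks = []
    for p in peaks:
        if p - PRE >= 0 and p + POST < len(intensity):       # keep complete windows only
            blinks.append(intensity[p - PRE:p + POST + 1])
    return blinks
```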
Next, the curve of the intensity data (Ii) versus time (ti) of each isolated blink was fitted using a smoothing spline s. The smoothing spline is the solution to the minimization problem shown in (1):

$$ s = \underset{s}{\arg\min}\;\left[\, p\sum_{i=1}^{N}\big(I_i - s(t_i)\big)^2 + (1-p)\int \left(\frac{d^{2}s}{dt^{2}}\right)^{2} dt \,\right] \tag{1} $$

where p is the smoothing parameter, set to 0.99996. The smoothed data are directly related to the position of the lid. Therefore, the approximated derivatives of s(ti), computed following (2) and (4), can be identified with the velocity, v(ti), and the acceleration of the lid, a(ti):

$$ v(t_i) \approx \frac{s(t_{i+1}) - s(t_{i-1})}{2T} \tag{2} \qquad \hat{v}(t_i) = \frac{v(t_i)}{\max_i \left|v(t_i)\right|} \tag{3} $$

$$ a(t_i) \approx \frac{s(t_{i+1}) - 2\,s(t_i) + s(t_{i-1})}{T^{2}} \tag{4} \qquad \hat{a}(t_i) = \frac{a(t_i)}{\max_i \left|a(t_i)\right|} \tag{5} $$

$\hat{v}(t_i)$ and $\hat{a}(t_i)$ are, respectively, the velocity and acceleration normalized to the maximum of their absolute value. T is the time interval between each one of the N intensity samples. In our case, T = 1/240 s, i.e. the inverse of the frame rate of the camera.
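The following sketch illustrates Eqs (1)-(5) under two assumptions: the third-party csaps package is used for the smoothing spline (its smooth parameter is taken to follow the MATLAB csaps convention, matching p = 0.99996), and the derivatives are approximated with central differences via numpy.gradient.

```python
# Sketch of Eqs (1)-(5): smoothing-spline fit and finite-difference derivatives.
import numpy as np
from csaps import csaps  # third-party smoothing cubic spline (assumed available)

def blink_kinematics(intensity, fps=240, p=0.99996):
    T = 1.0 / fps
    t = np.arange(len(intensity)) * T
    s = csaps(t, intensity, t, smooth=p)   # smoothed eyelid-position proxy, Eq (1)
    v = np.gradient(s, T)                  # central-difference velocity, Eq (2)
    a = np.gradient(v, T)                  # acceleration, Eq (4)
    v_n = v / np.max(np.abs(v))            # normalized velocity, Eq (3)
    a_n = a / np.max(np.abs(a))            # normalized acceleration, Eq (5)
    return s, v_n, a_n
```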
In Fig 2, we represent the data corresponding to the first blink of the sequence presented in Fig 1. The intensity data (black crosses) are normalized to unity. The black line represents the smoothing spline s(ti) computed for the intensity following (1). The normalized velocity and acceleration are represented by the green and red lines, respectively.
[Figure omitted. See PDF.]
Fig 2. Normalized data of intensity, smoothing spline, velocity and acceleration for a blink.
Data correspond to the first blink of the sequence of Fig 1. The data of intensity are the black crosses, the black line represents the smoothing spline, the data of velocity and acceleration are plotted in green and red lines, respectively.
https://doi.org/10.1371/journal.pone.0196125.g002
Up to this point, although we have defined a temporal window to isolate each eye blink from the raw signal, the blink duration has still not been defined. As stated in the Introduction, the definition of the start and end of the phenomenon is approached in different ways in the literature; however, none leads to a categorical criterion [30]. The start of the eye blink can be defined as the time when the velocity of the eyelid is zero before the displacement starts. However, this moment is difficult to define from the curves computed above. In Fig 2, one can see that both velocity and acceleration vary around zero before increasing their value, while the position of the eyelid seems not to change.
The product of the velocity and the acceleration gives the power per unit of mass, Eq (6). The power is the rate of doing work. It provides information about the work developed by a force, the lid muscles force, per unit of time in the blinking process.

$$ \frac{P(t_i)}{m} = v(t_i)\,a(t_i) \tag{6} $$

$$ \hat{P}(t_i) = \frac{v(t_i)\,a(t_i)}{\max_i \left|v(t_i)\,a(t_i)\right|} \tag{7} $$
Fig 3 shows the normalized power, $\hat{P}$, computed following (7) for the example blink. When the eye is open (eyelid retracted), the sum of the activity of the muscles that take part in the blinking process is null. The muscles do not develop power, so the power curve before and after the blink is zero. Therefore, the start and end of the blink can be defined simply by determining when the curve becomes different from zero and when it returns to zero, respectively. The normalized power can also be used to characterize other blink dynamic and kinematic features. Continuing with Fig 3, once the blink has started, i.e. after the first vertical black line, the curve intersects the zero line three times during the blink (black dashed vertical lines). These moments are when the velocity or the acceleration of the eyelid is zero. Additionally, the power curve shows two local maxima (blue dashed lines) and two local minima (red dashed lines). The timing of the physiological process is as follows: a few hundredths of a second after the blink has started, the total power developed by the muscles is maximum at time t1P in the closure phase (1st blue-dashed line). Next, at t2P, the eyelid muscles stop working, the power turns zero and the eyelid reaches its maximum closing velocity (1st black-dashed line). After that, the eyelid starts braking and the power is developed with the opposite sign. There is a moment (t3P) when the curve reaches its minimum, which corresponds to the maximum power developed to brake the closure of the eyelid (1st red-dashed line). Then, the power decreases in absolute value until it returns to zero (2nd black-dashed line). This moment (t4P) corresponds to the eye being closed, when the closure phase ends and the opening phase starts.
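The timing features described above can be extracted from the normalized power curve by locating its zero crossings and local extrema, for instance as in the sketch below. The eps threshold that decides when the power is considered different from zero is an illustrative assumption.

```python
# Sketch of timing-feature extraction from the normalized power curve:
# the active interval gives the start/end of the blink, zero crossings give the
# t2P/t4P/t6P-like instants, and local extrema give the maxima/minima discussed above.
import numpy as np
from scipy.signal import find_peaks

def power_timing_features(p_n, fps=240, eps=1e-3):
    T = 1.0 / fps
    active = np.abs(p_n) > eps                                  # power meaningfully non-zero
    start = int(np.argmax(active))
    end = len(p_n) - 1 - int(np.argmax(active[::-1]))
    seg = p_n[start:end + 1]
    zero_cross = np.where(np.diff(np.sign(seg)) != 0)[0] + start  # v or a equals zero
    maxima, _ = find_peaks(seg)                                  # driving-power peaks
    minima, _ = find_peaks(-seg)                                 # braking-power peaks
    to_time = lambda idx: np.asarray(idx) * T
    return {"start": start * T, "end": end * T,
            "zeros": to_time(zero_cross),
            "maxima": to_time(maxima + start),
            "minima": to_time(minima + start)}
```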
[Figure omitted. See PDF.]
Fig 3. Normalized power per unit of mass developed by the eyelid muscles for the example blink.
Black, blue and red dashed lines represent feature times obtained from zero-line intersections and local maxima and minima.
https://doi.org/10.1371/journal.pone.0196125.g003
The shape of the curve in the opening phase is similar to that in the closure phase. The total power developed by the muscles reaches a local maximum at t5P (2nd blue-dashed line), which happens when the eyelid is in the upward phase. Then, the power diminishes until it is zero at t6P and the eyelid reaches a maximum velocity (3rd black-dashed line). After that, the sign of the power changes as the eyelid opening brakes, and the curve reaches a local minimum (2nd red-dashed line) at t7P. At that moment, the eye is still not completely open. Finally, the power decreases in absolute value until zero (t8P), when the eyelid is again retracted, the muscle forces are balanced and the blink is finished (2nd black line).
The values of the normalized power at the instants described above can be of interest to characterize the blinking. Therefore, we obtained the absolute values of the two local maxima and the two local minima of the normalized power. Moreover, zeros of the acceleration are local maxima and minima of the velocity. However, the local maxima and minima of the acceleration do not match those of the power per unit of mass. For example, in Fig 4, we represent the normalized velocity and acceleration computed for the above case with the time features previously obtained. We have shifted the time scale so that the blink starts at time equal to zero. One can see that the zeros and the local maxima and minima of the velocity (green line) have already been characterized, whereas the local maxima and minima of the acceleration provide new time features.
[Figure omitted. See PDF.]
Fig 4. Normalized velocity and acceleration computed for the example blink.
Velocity and acceleration are plotted in green and red lines, respectively. Local maxima and minima of the acceleration are used to define three new features (t1a, t2a and t3a).
https://doi.org/10.1371/journal.pone.0196125.g004
Chronologically, t1a is the time after the blink begins when the eyelid, in the closure phase, reaches a maximum of acceleration. Next, after the maximum in the developed power and after reaching a maximum velocity, the total force brakes the eyelid (a change in the sign of the acceleration). This braking force reaches its maximum at t2a, before the eye closes. Then, in the opening phase, the dynamics are similar. The sum of forces accelerates the lid up to a maximum at t3a. Later, that force diminishes and probably reaches a local minimum, which corresponds to the time when the eyelid's braking acceleration in the upward phase is maximum, just before the blink stops. However, contrary to what happens in the power curve, this braking phase does not appear clearly in the acceleration graphs, so that local minimum of acceleration cannot be defined.
By proceeding with an analysis similar to that performed with the power, we obtained the absolute values of the local peaks of the normalized acceleration and velocity at these instants.
Other magnitudes that we used to analyze the dynamics of the blinking were the work and the mechanical impulse developed by the eyelid muscles. The work done by those muscles is defined as the integral of the power developed by them (P(t) = dW(t)/dt). Therefore, the area under the normalized power curve is related to the work developed by the muscles in a given period of time. Following Eq (8), four new features are defined, related to the work performed from 0 to t2P, from t2P to t4P, from t4P to t6P, and from t6P to t8P (see Fig 5A).

$$ \hat{W}_{[t_a,\,t_b]} = \int_{t_a}^{t_b} \hat{P}(t)\, dt \tag{8} $$
[Figure omitted. See PDF.]
Fig 5. Normalized power per unit of mass and normalized acceleration.
a) Normalized power per unit of mass. Gray areas represent the work developed by muscles between each zero-line intersection. b) Normalized acceleration. Gray areas represent the impulse of the eyelid at different stages.
https://doi.org/10.1371/journal.pone.0196125.g005
Similarly, the area under the curve of the acceleration in a time interval is represented by J, the mechanical impulse per unit of mass developed by the muscles in that period of time:

$$ \frac{J}{m} = \frac{1}{m}\int_{t_a}^{t_b} F(t)\, dt = \int_{t_a}^{t_b} a(t)\, dt \tag{9} $$

$$ \hat{J}_{[t_a,\,t_b]} = \int_{t_a}^{t_b} \hat{a}(t)\, dt \tag{10} $$
We have computed a magnitude proportional to the mechanical impulse developed by the eyelid muscles in three different intervals following Eq (10): from the start to t2P, from t2P to t6P, and from t6P to the end (see Fig 5B).
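The work-like and impulse-like features of Eqs (8)-(10) reduce to numerical integrals of the normalized power and acceleration between the detected timing instants. A sketch under the assumption that those instants are available as frame indices:

```python
# Sketch of Eqs (8)-(10): numerical integration of normalized power and acceleration.
import numpy as np
from scipy.integrate import trapezoid

def work_features(p_n, idx, fps=240):
    """idx = [i_start, i_2P, i_4P, i_6P, i_8P]: frame indices bounding the four work intervals."""
    T = 1.0 / fps
    return [trapezoid(p_n[a:b + 1], dx=T) for a, b in zip(idx[:-1], idx[1:])]

def impulse_features(a_n, idx, fps=240):
    """idx = [i_start, i_2P, i_6P, i_end]: frame indices bounding the three impulse intervals."""
    T = 1.0 / fps
    return [trapezoid(a_n[a:b + 1], dx=T) for a, b in zip(idx[:-1], idx[1:])]
```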
Finally, from the analysis of the displacement curve s(ti), we have defined two more features that characterize the blinking: the full width at half maximum (fwhm) of the curve of the eyelid displacement in time, w, and the relation between the mean velocities in the closure and opening phases (S), given by Eq (11). In Fig 6, we represent both features for the blink that we are using as an example.
$$ S = \frac{\tan\theta_1}{\tan\theta_2} \tag{11} $$

where θ1 and θ2 are the slopes in Fig 6 that describe the mean velocities in the downward and upward phases.
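A possible implementation of these two displacement-based features is sketched below. The mean closure and opening velocities are estimated here from the half-maximum crossings of the normalized displacement curve, which is an assumption of this sketch rather than the authors' exact procedure.

```python
# Sketch of the two displacement-based features: fwhm (w) and the closure/opening
# mean-velocity ratio S of Eq (11), estimated from the half-maximum crossings.
import numpy as np

def fwhm_and_s(s_curve, fps=240):
    T = 1.0 / fps
    d = (s_curve - s_curve.min()) / (s_curve.max() - s_curve.min())  # normalized displacement
    i_max = int(np.argmax(d))
    above = np.where(d >= 0.5)[0]
    i_rise, i_fall = int(above[0]), int(above[-1])     # half-maximum crossings
    w = (i_fall - i_rise) * T                          # full width at half maximum
    # Rough mean slopes of the rising (closure) and falling (opening) edges.
    v_close = 0.5 / (max(i_max - i_rise, 1) * T)
    v_open = 0.5 / (max(i_fall - i_max, 1) * T)
    return w, v_close / v_open                         # S, as in Eq (11)
```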
[Figure omitted. See PDF.]
Fig 6. Displacement of the eyelid.
θ1 and θ2 describe the mean velocities in the downward and upward phases, and w stands for the fwhm of the displacement.
https://doi.org/10.1371/journal.pone.0196125.g006
Results
We applied the analysis just explained to the recorded video sequences and obtained 3251 eye blinks, ranging from 74 to 191 blinks per subject. The difference in the number of blinks per subject was due to losses in the processing of the signals (overlapping or incomplete blinks) and to the subjects' different blink rates. However, all subjects retained at least 74 trials, so 74 random trials were selected from each participant to keep the size of each participant's data set uniform (http://hdl.handle.net/10045/74398). Thus, we reduced the number of blinks, resulting in a set of 1924 blinks (74 blinks for each of the 26 subjects). The blink features defined above were computed and grouped for each blink into a feature vector with components x_{j,k,f}, where j = 1,…,n indexes subjects, k = 1,…,b trials and f = 1,…,F features, following the order stated in Table 1. There, we also present the resulting mean (12) and standard deviation (13) of all the features of the set of blinks under study.
$$ \bar{x}_f = \frac{1}{n\,b}\sum_{j=1}^{n}\sum_{k=1}^{b} x_{j,k,f} \tag{12} $$

$$ \sigma_f = \sqrt{\frac{1}{n\,b-1}\sum_{j=1}^{n}\sum_{k=1}^{b}\big(x_{j,k,f}-\bar{x}_f\big)^{2}} \tag{13} $$
[Figure omitted. See PDF.]
Table 1. Features computed for the set of 1924 blinks.
https://doi.org/10.1371/journal.pone.0196125.t001
From Table 1, we can see that the closure is faster than the opening. The eyelid is closed at around 0.15 s and spends around 0.25 s in the opening phase. Thus, the closure is almost twice as fast as the opening phase. This agrees with the work of Schellini et al. [44]. To blink, the nervous system turns off the tonically active LP and the OO muscle rapidly lowers the upper eyelid. To raise the eyelid again, the OO activity ceases and the LP activity, which consists of raising and holding the eyelid up, resumes [3,7]. Regarding the absolute values of the developed powers, those of the closure phase are larger than those of the opening phase. The same happens with the absolute values of the acceleration. This reveals differences in the mechanics of the two processes, which can be thoroughly evaluated from the analysis of the work features: in the closure phase, the muscles develop more work than in the opening one.
The analysis of the maximum eyelid velocities for both the closure and the opening phase revealed that the eyelid was always faster in the closure than in the opening phase, in agreement with previous works [6,45]. Furthermore, Niida et al. [45] reported that, in the closure of the spontaneous blink, the lid velocity shows a two-phase distribution: an initial flat phase with small displacement and a second phase with a steep, large displacement. This is easily visible in the example in Fig 7. From the start to t1a (the moment of maximum acceleration), over about 5 ms, the displacement is small. Then, during approximately the next 3 ms and until the maximum closing velocity is reached at t2P, there is a large displacement (around half of the total distance covered by the eyelid).
[Figure omitted. See PDF.]
Fig 7. Eyelid displacement (black), normalized velocity (green) and acceleration (red) for the example blink.
https://doi.org/10.1371/journal.pone.0196125.g007
The eye blinking waveform can be used as a biometric identifier, and so can features extracted from it [20]. Thus, we propose using the obtained blinking features for biometric authentication. In order to test this possibility, we first evaluated the discriminative ability of the extracted features. The coefficient of variation (CV), defined as the ratio of the standard deviation to the mean, permits assessing the inter- and intra-subject variability. A feature with a low inter-subject CV shows poor variability and therefore poor discriminative ability. In Fig 8A), we represent the CV of all the features computed for the 74 blinks of each subject (blue to yellow bars) and for the whole set of blinks (black bars). The graph shows that the coefficients of variation of the 10th, 21st and 26th features are always minimal (inter- and intra-subject) and near zero. They correspond, respectively, to the maximum absolute value of the power braking the eyelid in the closure phase, the maximum absolute value of the acceleration braking the eyelid in the closure phase, and the maximum velocity of the eyelid, also in the closure phase. Their mean values in Table 1 are all close to 1, the maximum possible. They are maxima in around 96% of all the measurements.
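A brief sketch of this CV screening, assuming the blink features are stored in an array X of shape (subjects, trials, features):

```python
# Sketch of the coefficient-of-variation screening: CV = std / mean per feature,
# computed per subject (intra) and over the pooled set of blinks (inter).
import numpy as np

def coefficient_of_variation(X):
    intra_cv = X.std(axis=1) / X.mean(axis=1)             # one CV per subject and feature
    pooled = X.reshape(-1, X.shape[-1])                   # all blinks stacked together
    inter_cv = pooled.std(axis=0) / pooled.mean(axis=0)   # one CV per feature
    return intra_cv, inter_cv
```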
[Figure omitted. See PDF.]
Fig 8. Coefficient of variation and boxplots of all the features.
a) Coefficient of variation of all features for the blinks of each subject (color bars) and all the blinks (black bars). b) Boxplots of the features computed following Eq (9).
https://doi.org/10.1371/journal.pone.0196125.g008
Another way to visualize the discriminative skills of the features is through boxplots. In Fig 8B), we represent the boxplots computed for all features following Eq (14):

$$ \tilde{x}_{j,k,f} = x_{j,k,f} - \bar{x}_f \tag{14} $$

Thus, mean feature values correspond to the zero line. Boxes are plotted in blue, the bottom and top of each box are the first and third quartiles, and the red band inside the box is the second quartile (the median). The spacing between the different parts of the boxes indicates the degree of dispersion (spread) and skewness in the data. The narrower the boxplot, the less scattered the data. One can see that, for the 10th, 21st and 26th features, the upper and lower quartiles coincide with the mode and the average and there are no whiskers. Therefore, we discarded those features and did not use them in the biometric classification.
Different classifiers, such as Linear and Quadratic Discriminant Analysis (LDA and QDA) [51,52], K-Nearest Neighbor (KNN) [53] and Classification Tree (CT) [54], have been tested for the proposed system. A classifier assigns a new observation to a class from the training set. In our case, given a vector of features of a blink, each of the above classifiers assigns that vector to one of the 26 subjects under study. The assignment can be correct or not, so the performance of each classifier was evaluated through the correct identification rate (CIR), that is, the number of correctly classified samples divided by the total number of classified samples. The validation was performed through non-exhaustive cross-validation, specifically 10-fold cross-validation. Each set of data was proportionally partitioned into 10 disjoint subsets or folds: 9 folds were used for training and the remaining fold was used for evaluation. The process was repeated 10 times, leaving a different fold out for evaluation each time.
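The classifier comparison can be reproduced along the following lines with scikit-learn; the specific hyperparameters (e.g. k = 5 for KNN) are assumptions, as the study does not report them.

```python
# Sketch of the classifier comparison with 10-fold cross-validation.
# X: (n_samples, n_features) matrix of blink features; y: subject labels (assumed built).
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

def compare_classifiers(X, y):
    models = {"LDA": LinearDiscriminantAnalysis(),
              "QDA": QuadraticDiscriminantAnalysis(),
              "KNN": KNeighborsClassifier(n_neighbors=5),
              "CT": DecisionTreeClassifier()}
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    # Correct identification rate = mean accuracy over the 10 folds.
    return {name: cross_val_score(m, X, y, cv=cv, scoring="accuracy").mean()
            for name, m in models.items()}
```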
We evaluated through 10-fold cross-validation the performance of the proposed features and the above classifiers over five sets of data: the original set with 1924 blinks (74 blinks for each of the 26 subjects) and four additional sets that were synthetically generated using a bootstrapping procedure similar to that of Armstrong et al. [21]. It consisted of generating 100 blinks for each participant. Each blink was constructed as the arithmetic mean of β blinks randomly selected each time from that participant's 74 trials, with β = 3, 5, 10 and 25 for each set (named β-mean). This way, we constructed 100 unique vectors of β indices ranging from 1 to 74; each vector determined the β blinks from a subject used to compute each averaged blink. After this resampling, 100 different blink feature vectors were available from each participant, forming a set with a total of 2600 blink vectors for each β-mean set.
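A sketch of this β-mean bootstrapping is given below; it draws β distinct trials per synthetic blink but does not enforce uniqueness across the 100 index vectors, which the original procedure does.

```python
# Sketch of the beta-mean bootstrapping: 100 synthetic blinks per subject, each the
# average of beta randomly chosen real trials. X has the assumed shape (subjects, 74, features).
import numpy as np

def beta_mean_set(X, beta, n_synthetic=100, seed=0):
    rng = np.random.default_rng(seed)
    n_subjects, n_trials, n_features = X.shape
    out, labels = [], []
    for j in range(n_subjects):
        for _ in range(n_synthetic):
            pick = rng.choice(n_trials, size=beta, replace=False)  # beta distinct trials
            out.append(X[j, pick].mean(axis=0))                    # averaged feature vector
            labels.append(j)
    return np.asarray(out), np.asarray(labels)

# Example: X10, y10 = beta_mean_set(X, beta=10)   # the "10-mean" set
```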
The different classifiers and sets can be compared through the computed CIR in Table 2. One can see that LDA provides better results than any other classifier for any set of data. We have validated this classifier using leave-one-out cross-validation, i.e. using one observation as the validation set and the remaining observations as the training set, testing all possible partitions. The leave-one-out results agree with those obtained with 10-fold cross-validation, so this non-exhaustive cross-validation method provides accurate results.
[Figure omitted. See PDF.]
Table 2. Correct identification rateA.
https://doi.org/10.1371/journal.pone.0196125.t002
The proposed method provided similar or better results than previous works. In the work of Abo-Zahhad et al. [20], with a training set composed of 50 blinks per subject from 25 subjects and taking averages of 25 blinks as test samples, the authors achieved a correct identification rate of around 97% in the best case. Here, if we take averages of 25 blinks to construct the training set and bootstrap to get 100 blinks per subject, we reach a CIR of 99.7% for the LDA. For the 10-mean set and the LDA classifier, the resulting CIR is slightly lower (96.5%), but it takes averages of only 10 blinks.
In an attempt to determine the discriminative skills of all features when they are used for biometric authentication, we have recursively tested the LDA algorithm. Starting from an empty feature set, candidate feature subsets were created by adding each of the features that had not yet been selected. For each candidate feature subset, 10-fold cross-validation was performed by repeatedly calling the LDA algorithm with different training and test subsets. Each time the LDA was called, the number of misclassified observations, i.e. the loss, was computed. Then, the candidate feature subset that minimized the loss was chosen. The process continued until adding more features did not decrease the loss.
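This greedy forward selection is close to what scikit-learn's SequentialFeatureSelector implements; the sketch below uses it with LDA and 10-fold cross-validation, with the stopping tolerance as an assumed stand-in for continuing only while the loss decreases.

```python
# Sketch of greedy forward feature selection with LDA and 10-fold cross-validation.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector

def select_features(X, y):
    selector = SequentialFeatureSelector(LinearDiscriminantAnalysis(),
                                         n_features_to_select="auto",
                                         tol=1e-4,          # stop when improvement is negligible
                                         direction="forward",
                                         scoring="accuracy",
                                         cv=10)
    selector.fit(X, y)
    return selector.get_support(indices=True)   # indices of the selected blink features
```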
We recursively computed the feature subsets selected for the trial sets used above (original, 3-mean, 5-mean, 10-mean and 25-mean) 60 times. Thus, we got 300 vectors of selected features. In Fig 9, we represent the selection rate of the features. One can observe that almost all features are selected at least around half of the time, and only 3 of them (f = 5, 6 and 7) are selected around 30% of the time or less, so they could be discarded as differentiating features. Conversely, some features are almost always selected, which means that they are good discriminant features. We could set a selection-rate threshold to define the best ones and use them in a new biometric identification system based on their combination with other characteristics from face recognition, fingerprint, iris, etc.
[Figure omitted. See PDF.]
Fig 9. Rate of selection obtained for each feature.
Percentage of selection of each blink feature obtained after recursively computing 60 times the feature subset that minimized the number of misclassified observations.
https://doi.org/10.1371/journal.pone.0196125.g009
Discussion and conclusions
We have applied different discriminant classifiers using different sets of vectors of blink features as an example of application of the presented method. Linear Discriminant Analysis provided the best correct identification rates for all the analyzed sets. We have constructed four different sets of blink vectors by computing new vectors from averages of vectors computed from real measurements. The resulting CIR improves with the number of samples used to compute the mean, because the intra-subject average vector removes noise while retaining the discriminant ability of the features. Nevertheless, the 25-mean case may not be suitable for some practical applications, since the identification process would first require obtaining at least 25 blinks from the subject. Taking into account the normal spontaneous inter-blink duration [55], that would imply recording blinking for more than 2 minutes. On the other hand, the system could be designed to authenticate using voluntary blinks; then the recording time would drastically decrease. In any case, the 10-mean case provided good results and the recording time is reduced to around 35 seconds.
Eye blinking is a physiological act related to intrinsic physical characteristics of the human body, so it cannot be easily forged. This fact represents an advantage of its use for biometric authentication compared to other methods with lower security [20]. Moreover, our method is computationally less complex than authentication systems based on image processing of fingerprints, faces or irises, which handle large amounts of data. Another advantage is that the authentication can be performed remotely and without the subject's awareness. For instance, it can be used to identify or double-check a person in front of an ATM or a cell phone, or to prevent access to restricted contents on TV or computers. The use of blinking features for biometric authentication may also present some drawbacks. On the one hand, it is well known that fatigue is associated with increased blink frequency. Moreover, average individual blinking rates increase with age [9,10] and those rates are correlated with dopamine levels in human and nonhuman primates [56]. Note that, in all cases, the blink feature that varies is the frequency, or the blink duration in the case of assessing fatigue. All the features presented in this work are obtained from one single blink, so the rate variation is not a problem. Of course, the variation of the blink latency will affect the performance of the biometric authentication and further analysis of this question should be done, but it is out of our scope. On the other hand, the video sequences in this work were recorded in laboratory conditions, with approximately constant illumination and capture distance. In a real application, those conditions will probably vary and degrade the performance of the method. Fortunately, some solutions could be applied to the registered signal in order to reduce the noise introduced by those changes. For example, in each frame, we could subtract the background light from the registered signal, since it contains all variations due to changes in light or capture distance. The background light could be estimated as the energy of a background region. Another possible solution would be to apply Independent Component Analysis to the eye blink signal [57]. The validation of all these hypotheses, together with the evaluation of the authentication performance of the blink features combined with other biometric characteristics (fingerprint, iris, face, …), remains as future work.
In this work, we have proposed obtaining distinctive subject features from a video sequence of blinking taken with a widely available high-speed sports camera. We based our analysis on the fact that the change in the light reflected by the eyelid when it moves appears as changes in the registered intensity. The features extracted from the data describe the biomechanics of the blinking process and provide information about the time, speed, acceleration, mechanical impulse, work and power developed by the muscles participating in the process. To our knowledge, kinematic parameters (position, speed, acceleration and the instants of time derived from them) are commonly used in the literature. However, the parameters related to the work, the impulse and the power developed by the muscles, and the times derived from them, are originally proposed in this work. Note that these include a categorical criterion to define the start and the end of the blink. Furthermore, results have shown that the power and the acceleration are maximal in absolute value when braking the eyelid in the closure phase, and that the maximum eyelid velocity is also reached in the closure phase. The method can be applied to deepen research into the blinking process and its relationship with fatigue, drowsiness, neurological diagnosis, etc. We have used the blinking features for biometric identification, reaching a correct identification rate of up to 99.7%.
Citation: Espinosa J, Domenech B, Vázquez C, Pérez J, Mas D (2018) Blinking characterization from high speed video records. Application to biometric authentication. PLoS ONE 13(5): e0196125. https://doi.org/10.1371/journal.pone.0196125
1. Blount WP. Studies of the Movements of the Eyelids of Animals: Blinking. Q J Exp Physiol. 1927;18: 111–125.
2. Doane MG. Interaction of Eyelids and Tears in Corneal Wetting and the Dynamics of the Normal Human Eyeblink. Am J Ophthalmol. 1980;89: 507–516. pmid:7369314
3. Tsubota K. Tear dynamics and dry eye. Prog Retin Eye Res. 1998;17: 565–596. pmid:9777650
4. Evinger C, Manning KA, Sibony PA. Eyelid movements. Mechanisms and normal data. Invest Ophthalmol Vis Sci. 1991;32: 387–400. pmid:1993591
5. Evinger C. A Brain Stem Reflex in the Blink of an Eye. Physiology. 1995;10: 147–153.
6. Sun WS, Baker RS, Chuke JC, Rouholiman BR, Hasan SA, Gaza W, et al. Age-related changes in human blinks. Passive and active changes in eyelid kinematics. Invest Ophthalmol Vis Sci. 1997;38: 92–99. pmid:9008634
7. Bour LJ, Aramideh M, Visser BWOD. Neurophysiological Aspects of Eye and Eyelid Movements During Blinking in Humans. J Neurophysiol. 2000;83: 166–176. pmid:10634863
8. Sforza C, Rango M, Galante D, Bresolin N, Ferrario VF. Spontaneous blinking in healthy persons: an optoelectronic study of eyelid motion. Ophthalmic Physiol Opt. 2008;28: 345–353. pmid:18565090
9. Bacher LF, Smotherman WP. Spontaneous eye blinking in human infants: A review. Dev Psychobiol. 2004;44: 95–102. pmid:14994260
10. Zametkin AJ, Stevens JR, Pittman R. Ontogeny of spontaneous blinking and of habituation of the blink reflex. Ann Neurol. 1979;5: 453–457. pmid:223495
11. Johnston PR, Rodriguez J, Lane KJ, Ousler G, Abelson MB. The interblink interval in normal and dry eye subjects. Clin Ophthalmol Auckl NZ. 2013;7: 253–259. pmid:23403736
12. Argilés M, Cardona G, Pérez-Cabré E, Rodríguez M. Blink Rate and Incomplete Blinks in Six Different Controlled Hard-Copy and Electronic Reading Conditions. Investig Opthalmology Vis Sci. 2015;56: 6679. pmid:26517404
13. Morcego B, Argilés M, Cabrerizo M, Cardona G, Pérez R, Pérez-Cabré E, et al. Blinking supervision in a working environment. J Biomed Opt. 2016;21: 025005–025005. pmid:26836209
14. Esteban A. A neurophysiological approach to brainstem reflexes. Blink reflex. Neurophysiol Clin Clin Neurophysiol. 1999;29: 7–38.
15. Stern JA, Boyer D, Schroeder D. Blink rate: a possible measure of fatigue. Hum Factors. 1994;36: 285–297. pmid:8070793
16. Schleicher R, Galley N, Briest S, Galley L. Blinks and saccades as indicators of fatigue in sleepiness warnings: looking tired? Ergonomics. 2008;51: 982–1010. pmid:18568959
17. Sugiyama K, Nakano T, Yamamoto S, Ishihara T, Fujii H, Akutsu E. Method of detecting drowsiness level by utilizing blinking duration. JSAE Rev. 1996;17: 159–163.
18. Caffier PP, Erdmann U, Ullsperger P. Experimental evaluation of eye-blink parameters as a drowsiness measure. Eur J Appl Physiol. 2003;89: 319–325. pmid:12736840
19. Shultz S, Klin A, Jones W. Inhibition of eye blinking reveals subjective perceptions of stimulus salience. Proc Natl Acad Sci. 2011;108: 21270–21275. pmid:22160686
20. Abo-Zahhad M, Ahmed SM, Abbas SN. A Novel Biometric Approach for Human Identification and Verification Using Eye Blinking Signal. IEEE Signal Process Lett. 2015;22: 876–880.
21. Armstrong BC, Ruiz-Blondet MV, Khalifian N, Kurtz KJ, Jin Z, Laszlo S. Brainprint: Assessing the uniqueness, collectability, and permanence of a novel method for ERP biometrics. Neurocomputing. 2015;166: 59–67.
22. Denney D, Denney C. The eye blink electro-oculogram. Br J Ophthalmol. 1984;68: 225–228. pmid:6704357
23. Visser BWOD, Goor C. Electromyographic and reflex study in idiopathic and symptomatic trigeminal neuralgias: latency of the jaw and blink reflexes. J Neurol Neurosurg Psychiatry. 1974;37: 1225–1230. pmid:4457616
24. Abo-Zahhad M, Ahmed SM, Abbas SN. A new multi-level approach to EEG based human authentication using eye blinking. Pattern Recognit Lett. 2016;82: 216–225.
25. Roy RN, Charbonnier S, Bonnet S. Eye blink characterization from frontal EEG electrodes using source separation and pattern recognition algorithms. Biomed Signal Process Control. 2014;14: 256–264.
26. Hsieh C-S, Tai C-C. An Improved and Portable Eye-Blink Duration Detection System to Warn of Driver Fatigue. Instrum Sci Technol. 2013;41: 429–444.
27. Schlag J, Merker B, Schlag-Rey M. Comparison of EOG and search coil techniques in long-term measurements of eye position in alert monkey and cat. Vision Res. 1983;23: 1025–1030. pmid:6649419
28. Guitton D, Simard R, Codère F. Upper eyelid movements measured with a search coil during blinks and vertical saccades. Invest Ophthalmol Vis Sci. 1991;32: 3298–3305. pmid:1748560
29. Stava MW, Huffman MD, Baker RS, Epstein AD, Porter JD. Conjugacy of spontaneous blinks in man: eyelid kinematics exhibit bilateral symmetry. Invest Ophthalmol Vis Sci. 1994;35: 3966–3971. pmid:7928197
30. VanderWerf F, Brassinga P, Reits D, Aramideh M, Visser BO de. Eyelid Movements: Behavioral Studies of Blinking in Humans Under Different Stimulus Conditions. J Neurophysiol. 2003;89: 2784–2796. pmid:12612018
31. Bernard F, Deuter CE, Gemmar P, Schachinger H. Eyelid contour detection and tracking for startle research related eye-blink measurements from high-speed video records. Comput Methods Programs Biomed. 2013;112: 22–37. pmid:23880079
32. Corthout E, Hallett M, Cowey A. TMS-induced blinking assessed with high-speed video: optical disruption of visual perception. Exp Brain Res. 2011;210: 243–250. pmid:21431430
33. Mas D, Domenech B, Espinosa J, Pérez J, Hernández C, Illueca C. Noninvasive measurement of eye retraction during blinking. Opt Lett. 2010;35: 1884–1886. pmid:20517450
34. Gittins J, Martin K, Sheldrick JH. A line imaging system for measurement of eyelid movements. Physiol Meas. 1995;16: 303. pmid:8599697
35. Somia NN, Rash GS, Epstein EE, Wachowiak M, Sundine MJ, Stremel RW, et al. A computer analysis of reflex eyelid motion in normal subjects and in facial neuropathy. Clin Biomech. 2000;15: 766–771.
36. Choi SH, Park KS, Sung MW, Kim KH. Dynamic and quantitative evaluation of eyelid motion using image analysis. Med Biol Eng Comput. 2003;41: 146–150. pmid:12691434
37. Casse G, Sauvage J-P, Adenis J-P, Robert P-Y. Videonystagmography to assess blinking. Graefes Arch Clin Exp Ophthalmol. 2007;245: 1789–1796. pmid:17598124
38. Mitelman R, Joshua M, Adler A, Bergman H. A noninvasive, fast and inexpensive tool for the detection of eye open/closed state in primates. J Neurosci Methods. 2009;178: 350–356. pmid:19126413
39. Lee WO, Lee EC, Park KR. Blink detection robust to various facial poses. J Neurosci Methods. 2010;193: 356–372. pmid:20826183
40. Jiang X, Tien G, Huang D, Zheng B, Atkins MS. Capturing and evaluating blinks from video-based eyetrackers. Behav Res Methods. 2012;45: 656–663. pmid:23271154
41. Juhola M, Zhang Y, Rasku J. Biometric verification of a subject through eye movements. Comput Biol Med. 2013;43: 42–50. pmid:23177205
42. Kwon K-A, Shipley RJ, Edirisinghe M, Ezra DG, Rose G, Best SM, et al. High-speed camera characterization of voluntary eye blinking kinematics. J R Soc Interface. 2013;10: 20130227. pmid:23760297
43. Durkin M, Prescott L, Jonet CJ, Frank E, Niggel M, Powell DA. Photoresistive Measurement of the Pavlovian Conditioned Eyelid Response in Human Subjects. Psychophysiology. 1990;27: 599–603. pmid:2274623
44. Schellini SA, Hoyama E, Cruz AAV, Padovani CR. Spontaneous Eye Blink Analysis in the Normal Individual. Orbit. 2005;24: 239–242. pmid:16354632
45. Niida T, Mukuno K, Ishikawa S. Quantitative measurement of upper eyelid movements. Jpn J Ophthalmol. 1987;31: 255–264. pmid:3669426
46. Portello JK, Rosenfield M, Chu CA. Blink rate, incomplete blinks and computer vision syndrome. Optom Vis Sci Off Publ Am Acad Optom. 2013;90: 482–487. pmid:23538437
47. Mak FHW, Harker A, Kwon K-A, Edirisinghe M, Rose GE, Murta F, et al. Analysis of blink dynamics in patients with blepharoptosis. J R Soc Interface. 2016;13: 20150932. pmid:26962027
48. Perez J, Espinosa J, Domenech B, Mas D, Illueca C. Blinking kinematics description through non-invasive measurement. J Mod Opt. 2011;58: 1857–1863.
49. Fang J-S, Hao Q, Brady DJ, Guenther BD, Hsu KY. A pyroelectric infrared biometric system for real-time walker recognition by use of a maximum likelihood principal components estimation (MLPCE) method. Opt Express. 2007;15: 3271–3284. pmid:19532568
50. Uzair M, Mahmood A, Shafait F, Nansen C, Mian A. Is spectral reflectance of the face a reliable biometric? Opt Express. 2015;23: 15160–15173. pmid:26193499
51. Fisher RA. The Use of Multiple Measurements in Taxonomic Problems. 1936; Available: https://digital.library.adelaide.edu.au/dspace/handle/2440/15227
52. Narsky I, Porter FC. Linear and Quadratic Discriminant Analysis, Logistic Regression, and Partial Least Squares Regression. Statistical Analysis Techniques in Particle Physics. Wiley-VCH Verlag GmbH & Co. KGaA; 2013. pp. 221–249. https://doi.org/10.1002/9783527677320.ch11
53. Larose DT. k-Nearest Neighbor Algorithm. Discovering Knowledge in Data. John Wiley & Sons, Inc.; 2004. pp. 90–106. https://doi.org/10.1002/0471687545.ch5
54. Breiman L, Friedman J, Stone CJ, Olshen RA. Classification and Regression Trees. Edición: 1. Boca Raton: Chapman and Hall/CRC; 1984.
55. Bentivoglio AR, Bressman SB, Cassetta E, Carretta D, Tonali P, Albanese A. Analysis of blink rate patterns in normal subjects. Mov Disord. 1997;12: 1028–1034. pmid:9399231
56. Karson CN. Spontaneous Eye-Blink Rates and Dopaminergic Systems. Brain. 1983;106: 643–653. pmid:6640274
57. Zhang C, Wu X, Zhang L, He X, Lv Z. Simultaneous detection of blink and heart rate using multi-channel ICA from smart phone videos. Biomed Signal Process Control. 2017;33: 189–200.
© 2018 Espinosa et al. This is an open access article distributed under the terms of the Creative Commons Attribution License: http://creativecommons.org/licenses/by/4.0/ (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Abstract
The evaluation of eye blinking has been used for the diagnosis of neurological disorders and fatigue. Despite the extensive literature, no objective method has been established to analyze its kinematic and dynamic behavior. A non-contact technique is presented, based on high-speed recording of the light reflected by the eyelid during the blinking process and off-line processing of the sequence. It allows the start and end of a blink to be determined objectively, besides obtaining different physical magnitudes: position, speed and eyelid acceleration, as well as the power, work and mechanical impulse developed by the muscles involved in the physiological process. The parameters derived from these magnitudes provide a unique set of features that can be used for biometric authentication. This possibility has been tested on a limited number of subjects with a correct identification rate of up to 99.7%, thus showing the potential application of the method.