Eye-Tracker: ML Approach for SNA Activation Monitoring

This work falls within the scope of ocular-tracking technology. We carried out a comparative analysis of the signal recorded by two different devices, a fixed and a mobile eye-tracker, and studied the correlation between autonomic nervous system (SNA) activity and the pupil dilation signal elicited by the emotional stimuli of the IAPS images.

This work falls within the scope of ocular-tracking technology, which is used to record the size and the movements of the pupils. The aims are essentially two: a comparative analysis of the signal recorded by two different devices, a fixed and a mobile eye-tracker, and the study of the correlation between autonomic nervous system (SNA) activity and the pupil dilation signal elicited by the stimuli of the IAPS images.

Pupil dilation is known to be directly correlated with the level of light exposure of the eye and with the activity of the SNA; in particular, it is controlled by the combined action of the parasympathetic and sympathetic systems. Thanks to the precision of the newest oculometry devices, and with the use of appropriate software for data manipulation, it becomes interesting to investigate whether the pupil signal can be interpreted as a proxy for SNA activity. By inspecting the correlation between pupil dilation and SNA activity, the aim was to corroborate the hypothesis shared in the literature.

The final goal of this approach would be a machine-learning algorithm able to detect in real time the SNA behaviour in the recorded signal of the subject, providing a metric for the arousal expressed by the subject in response to different stimuli. We divided the dataset into all possible pairwise combinations of labels (low vs medium, medium vs high, low vs high arousal) to allow a more intuitive interpretation of the decision boundaries during visualization. The accuracy metrics for the three pairwise models are presented below.
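As an illustration of the pairwise set-up described above, the following is a minimal sketch, not the pipeline actually used in this work, of how the three binary arousal comparisons could be built and evaluated from pre-extracted pupillometric features. The feature matrix, the arousal labels coded as "low"/"medium"/"high", and the choice of a support-vector classifier are all assumptions made for the sake of the example.

```python
from itertools import combinations

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: one row of pupillometric features per trial
# (e.g. mean dilation, peak dilation, latency) and one arousal label per trial.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))                        # placeholder feature matrix
y = rng.choice(["low", "medium", "high"], size=300)  # placeholder arousal labels

# Build every pairwise combination of arousal labels
# (low vs medium, medium vs high, low vs high) and evaluate a binary model on each.
for class_a, class_b in combinations(["low", "medium", "high"], 2):
    mask = np.isin(y, [class_a, class_b])
    X_pair, y_pair = X[mask], y[mask]

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(model, X_pair, y_pair, cv=5, scoring="accuracy")
    print(f"{class_a} vs {class_b}: accuracy = {scores.mean():.2f} +/- {scores.std():.2f}")
```

Reducing the three-class arousal problem to binary comparisons keeps each decision boundary two-sided, which is what makes the visual inspection mentioned above easier to interpret.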

We can conclude by stating that our results open the way to innovative approaches in brain-computer interface (BCI) design. The proposed approach achieved reasonably good accuracy in discriminating the emotional state of a subject based only on pupillometric data, which can be regarded as one of the least invasive ways to extract information about the autonomic nervous system, even remotely. Advances in computer vision will likely provide even more promising results for BCI systems based on the analysis of a subject's eye. This type of approach can be used in several applications, from rehabilitation to marketing, as well as in new ways of controlling our electronic devices. Much still needs to be explored in future developments: first of all, the experimental set-up should be designed to elicit an emotional response in a more immersive way than simply presenting passive images (e.g. videos, sounds). The main limitation of this approach is the attempt to validate an inter-subject model: BCI calibration and model parameter tuning are usually performed on the same subject across different sessions, owing to the high variability of neural patterns; nevertheless, the performance of this model does not appear to rule out the inter-subject alternative.
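With respect to the inter-subject validation discussed above, a natural way to test whether a single model transfers across subjects is leave-one-subject-out cross-validation. The snippet below is a hedged sketch of that scheme, again with placeholder data and a hypothetical subject-identifier array; it is not taken from the actual experiment.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: pupillometric features, binary arousal labels,
# and the subject each trial belongs to (all assumed for this sketch).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = rng.choice(["low", "high"], size=200)
subjects = rng.integers(0, 10, size=200)   # 10 hypothetical subjects

# Leave-one-subject-out: each fold trains on nine subjects and tests on the
# held-out one, so the reported accuracy reflects inter-subject generalization.
logo = LeaveOneGroupOut()
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, groups=subjects, cv=logo, scoring="accuracy")
print(f"Inter-subject accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```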