Posts Tagged Feature extraction

[Abstract + References] Self-paced movement intention recognition from EEG signals during upper limb robot-assisted rehabilitation

Abstract

Currently, one of the challenges in EEG-based brain-computer interfaces (BCI) for neurorehabilitation is recognizing the intention to perform different movements of the same limb. This would allow finer control of neurorehabilitation and motor-recovery devices by end-users [1]. To address this issue, we assess the feasibility of recognizing two self-paced movement intentions of the right upper limb, plus a rest state, from EEG signals recorded during robot-assisted rehabilitation therapy. In addition, the work proposes the use of Multi-CSP features and deep learning classifiers to recognize movement intentions of the same limb. The results showed a peak performance greater than 80% using novel classification models implemented in a multiclass classification scenario. On the basis of these results, the decoding of movement intention could potentially be used to develop more natural and intuitive robot-assisted neurorehabilitation therapies.
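
The abstract does not spell out the feature pipeline, so here is a minimal sketch of the general idea it describes: one-vs-rest ("multi") CSP log-variance features feeding a small neural-network classifier for the three classes (two movement intentions plus rest). The epoch shapes, channel count, filter pairs, and network architecture below are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch: one-vs-rest CSP ("Multi-CSP") features + a small classifier.
import numpy as np
from scipy.linalg import eigh
from sklearn.neural_network import MLPClassifier

def _avg_cov(epochs):
    # Trace-normalized average spatial covariance of a set of epochs.
    return np.mean([e @ e.T / np.trace(e @ e.T) for e in epochs], axis=0)

def csp_filters(class_epochs, rest_epochs, n_pairs=3):
    """Spatial filters separating one class from the rest (binary CSP)."""
    C1, C2 = _avg_cov(class_epochs), _avg_cov(rest_epochs)
    vals, vecs = eigh(C1, C1 + C2)                 # generalized eigenvalue problem
    order = np.argsort(vals)
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, picks].T                        # (2 * n_pairs, n_channels)

def multi_csp_features(epochs, labels, n_pairs=3):
    """Log-variance features from one-vs-rest CSP filter banks."""
    banks = [csp_filters(epochs[labels == c], epochs[labels != c], n_pairs)
             for c in np.unique(labels)]
    return np.array([np.concatenate([np.log(np.var(W @ e, axis=1)) for W in banks])
                     for e in epochs])

# Illustrative run with random data standing in for EEG epochs.
rng = np.random.default_rng(0)
X = rng.standard_normal((90, 16, 512))             # 90 trials, 16 channels, 512 samples
y = rng.integers(0, 3, 90)                         # two movement intentions + rest
F = multi_csp_features(X, y)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000).fit(F, y)
```
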
1. S. R. Soekadar, N. Birbaumer, M. W. Slutzky, and L. G. Cohen, “Brain machine interfaces in neurorehabilitation of stroke,” Neurobiology of Disease, vol. 83, pp. 172–179, 2015.

2. P. Ofner, A. Schwarz, J. Pereira, and G. R. Müller-Putz, “Upper limb movements can be decoded from the time-domain of low-frequency EEG,” PLoS One, vol. 12, no. 8, p. e0182578, Aug 2017.

3. F. Shiman, E. Lopez-Larraz, A. Sarasola-Sanz, N. Irastorza-Landa, M. Spüler, N. Birbaumer, and A. Ramos-Murguialday, “Classification of different reaching movements from the same limb using EEG,” Journal of Neural Engineering, vol. 14, no. 4, p. 046018, 2017.

4. J. Pereira, A. I. Sburlea, and G. R. Müller-Putz, “EEG patterns of self-paced movement imaginations towards externally-cued and internally-selected targets,” Scientific Reports, vol. 8, no. 1, p. 13394, 2018.

5. R. Vega, T. Sajed, K. W. Mathewson, K. Khare, P. M. Pilarski, R. Greiner, G. Sanchez-Ante, and J. M. Antelis, “Assessment of feature selection and classification methods for recognizing motor imagery tasks from electroencephalographic signals,” Artif. Intell. Research, vol. 6, no. 1, p. 37, 2017.

6. I. Figueroa-Garcia et al., “Platform for the study of virtual task-oriented motion and its evaluation by EEG and EMG biopotentials,” in 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Aug 2014, pp. 1174–1177.

7. B. Graimann and G. Pfurtscheller, “Quantification and visualization of event-related changes in oscillatory brain activity in the time-frequency domain,” in Event-Related Dynamics of Brain Oscillations, ser. Progress in Brain Research, C. Neuper and W. Klimesch, Eds. Elsevier, 2006, vol. 159, pp. 79–97.

8. G. Pfurtscheller and F. L. da Silva, “Event-related EEG/MEG synchronization and desynchronization: basic principles,” Clinical Neurophysiology, vol. 110, no. 11, pp. 1842–1857, 1999.

9. G. Dornhege, B. Blankertz, G. Curio, and K. Müller, “Boosting bit rates in noninvasive EEG single-trial classifications by feature combination and multiclass paradigms,” IEEE Transactions on Biomedical Engineering, vol. 51, no. 6, pp. 993–1002, 2004.

10. X. Yong and C. Menon, “EEG classification of different imaginary movements within the same limb,” PLOS ONE, vol. 10, no. 4, pp. 1–24, 2015.

11. L. G. Hernandez, O. M. Mozos, J. M. Ferrandez, and J. M. Antelis, “EEG-based detection of braking intention under different car driving conditions,” Frontiers in Neuroinformatics, vol. 12, p. 29, 2018. [Online]. Available: https://www.frontiersin.org/article/10.3389/fninf.2018.00029

12. L. G. Hernandez and J. M. Antelis, “A comparison of deep neural network algorithms for recognition of EEG motor imagery signals,” in Pattern Recognition, 2018, pp. 126–134.

13. M. Abadi et al., “TensorFlow: Large-scale machine learning on heterogeneous systems,” 2015, software available from tensorflow.org. [Online]. Available: https://www.tensorflow.org/

via Self-paced movement intention recognition from EEG signals during upper limb robot-assisted rehabilitation – IEEE Conference Publication


[Abstract] EMG Feature Extractions for Upper-Limb Functional Movement During Rehabilitation

Abstract

Rehabilitation is an important treatment for post-stroke patients to regain their muscle strength and motor coordination, as well as to retrain their nervous system. Electromyography (EMG) has been used by researchers to enhance conventional rehabilitation methods as a tool to monitor muscle electrical activity; however, the EMG signal is highly stochastic in nature and contains noise. Special techniques are still needed for processing the EMG signal to make it useful and effective for both researchers and patients. Feature extraction is among the signal processing techniques involved, and the best method for a specific EMG study needs to be applied. In this work, nine feature extraction techniques are applied to EMG signals recorded from subjects performing an upper-limb rehabilitation activity based on a suggested movement sequence pattern. Three healthy subjects performed the experiment with three trials each, and EMG data were recorded from their biceps and deltoid muscles. The applied features for every trial of each subject were analyzed statistically using Student’s t-test for the significance of their p-values. The results were then totaled and compared across the nine features, and the autoregressive (AR) coefficients gave the best result, consistent across each subject’s data. This feature will be used later in our future research work on upper-limb virtual reality rehabilitation.
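
As a rough illustration of the feature that performed best in this study, the sketch below estimates autoregressive (AR) coefficients from windowed EMG using the Yule-Walker equations. The model order, window length, and synthetic signal are assumptions for demonstration only, not the study's actual settings.

```python
# Minimal sketch: 4th-order AR coefficients per EMG window via Yule-Walker.
import numpy as np
from scipy.linalg import toeplitz, solve

def ar_coefficients(emg_window, order=4):
    """Yule-Walker AR coefficients of a single EMG window."""
    x = emg_window - np.mean(emg_window)
    # Biased autocorrelation at lags 0..order.
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
    return solve(toeplitz(r[:order]), r[1:order + 1])    # a_1 ... a_p

# Example: one 4-coefficient feature vector per 256-sample window (synthetic data).
rng = np.random.default_rng(1)
emg = rng.standard_normal(2048)            # stand-in for a biceps channel
windows = emg.reshape(-1, 256)
features = np.array([ar_coefficients(w) for w in windows])
print(features.shape)                      # (8, 4)
```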

via EMG Feature Extractions for Upper-Limb Functional Movement During Rehabilitation – IEEE Conference Publication


[Abstract + References] Design of Isometric and Isotonic Soft Hand for Rehabilitation Combining with Noninvasive Brain Machine Interface

Abstract

Compared with traditional approaches to hand rehabilitation, such as simple trainers and rigid artificial auxiliaries, this paper presents an isometric and isotonic soft hand for rehabilitation, supported by soft-robotics theory, that aims to satisfy more comprehensive rehabilitation requirements. Salient features of the device are the ability to achieve higher and controllable stiffness for both isometric and isotonic contraction. We then analyze active control of isometric and isotonic movement through the electroencephalograph (EEG) signal. This paper focuses on three issues. The first is using silicone rubber to build a soft finger that can continuously stretch and bend to fit the basic actions of the fingers. The second is changing the stiffness of the finger through coordination between a variable-stiffness cavity and an actuating cavity. The last is to classify different EEG states based on isometric and isotonic contraction using common spatial pattern (CSP) feature extraction and support vector machine (SVM) classification methods. On this basis, an EEG-based manipulator control system was set up.
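
The abstract names the CSP-plus-SVM stage without implementation detail; a minimal sketch using MNE's CSP and scikit-learn, with synthetic data standing in for the recorded epochs and an assumed label coding, might look like this.

```python
# Minimal sketch of a CSP + SVM classification stage for two EEG states.
import numpy as np
from mne.decoding import CSP
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for EEG epochs: (n_trials, n_channels, n_samples).
rng = np.random.default_rng(42)
X = rng.standard_normal((80, 32, 500))
y = rng.integers(0, 2, 80)        # assumed coding: 0 = isometric, 1 = isotonic

clf = make_pipeline(CSP(n_components=4, log=True), SVC(kernel="rbf", C=1.0))
print(cross_val_score(clf, X, y, cv=5).mean())   # chance level on random data
```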

 

I. Introduction

In recent years, stroke has become one of the major health problems that significantly affect the daily life of the elderly, and hand rehabilitation has been introduced as an auxiliary treatment. Although various kinds of mechanical devices for hand rehabilitation have been developed, deficiencies still exist in current rigid rehabilitation hands, such as insufficient degrees of freedom, complexity, unsafe operation, excessive weight, discomfort, and poor fit. Therefore, with the growth of the aging population, there is a strong need to develop new devices that satisfy comprehensive rehabilitation requirements. Meanwhile, inspired by mollusks in nature, soft robots are made of soft materials that can withstand large strains. The soft robot is a new type of continuum robot with high flexibility and environmental adaptability, and it has broad application prospects in fields such as military detection, search and rescue, and medical applications.

References

1. J Zhang, H Wang, J Tang et al., “Modeling and design of a soft pneumatic finger for hand rehabilitation [C]”, IEEE International Conference on Information and Automation, pp. 2460-2465, 2015.

2. H Godaba, J Li, Y Wang et al., “A Soft Jellyfish Robot Driven by a Dielectric Elastomer Actuator [J]”, IEEE Robotics & Automation Letters, vol. 1, no. 2, pp. 624-631, 2016.

3. Y Yang, Y. Chen, “Novel design and 3D printing of variable stiffness robotic fingers based on shape memory polymer [C]”, IEEE International Conference on Biomedical Robotics and Biomechatronics, pp. 195-200, 2016.

4. M Wehner, R L Truby, D J Fitzgerald et al., “An integrated design and fabrication strategy for entirely soft autonomous robots [J]”, Nature, vol. 536, no. 7617, pp. 451, 2016.

5. P Polygerinos, Z Wang, K C Galloway et al., “Soft robotic glove for combined assistance and at-home rehabilitation [J]”, Robotics & Autonomous Systems, vol. 73, no. C, pp. 135-143, 2014.

6. M Tian, Y Xiao, X Wang et al., “Design and Experimental Research of Pneumatic Soft Humanoid Robot Hand [M]”, in Robot Intelligence Technology and Applications 4, Springer International Publishing, 2017.

7. K Y Hong, J H Lim, F Nasrallah et al., “A soft exoskeleton for hand assistive and rehabilitation application using pneumatic actuators with variable stiffness [C]”, IEEE International Conference on Robotics and Automation, pp. 4967-4972, 2015.

8. J.R Wolpaw, N Birbaumer, WJ Heetderks, DJ Mcfarland, PH Peckham, G Schalk et al., “Brain-computer interface technology: a review of the first international meeting”, IEEE Transactions on Rehabilitation Engineering: A Publication of the IEEE Engineering in Medicine & Biology Society, vol. 8, no. 2, pp. 164, 2000.

9. C Ethier, ER Oby, MJ Bauman, LE. Miller, “Restoration of grasp following paralysis through brain-controlled stimulation of muscles”, Nature, vol. 485, no. 7398, pp. 368, 2012.

10. JL Collinger, B Wodlinger, JE Downey, W Wang, EC Tyler-Kabara, DJ Weber et al., “High-performance neuroprosthetic control by an individual with tetraplegia”, Lancet, vol. 381, no. 9866, pp. 557-564, 2013.

11. UA Qidwai, M. Shakir, Fuzzy Classification-Based Control of Wheelchair Using EEG Data to Assist People with Disabilities, vol. 7666, pp. 458-467, 2012.

12. UA Qidwai, M. Shakir, Fuzzy Classification-Based Control of Wheelchair Using EEG Data to Assist People with Disabilities, vol. 7666, pp. 458-467, 2012.

13. D Broetz, C Braun, C Weber, S.R Soekadar, A Caria, N. Birbaumer, “Combination of brain-computer interface training and goal-directed physical therapy in chronic stroke: a case report”, Neurorehabilitation & Neural Repair, vol. 24, no. 7, pp. 674, 2010.

14. BH. Dobkin, “Brain-computer interface technology as a tool to augment plasticity and outcomes for neurological rehabilitation”, Journal of Physiology, vol. 579, no. Pt 3, pp. 637, 2007.

15. S.R Soekadar, N Birbaumer, LG. Cohen, Brain-Computer Interfaces in the Rehabilitation of Stroke and Neurotrauma, Japan:Springer, 2011.

16. LR Hochberg, D Bacher, B Jarosiewicz, NY Masse, JD Simeral, J Vogel et al., “Reach and grasp by people with tetraplegia using a neurally controlled robotic arm”, Nature, vol. 485, no. 7398, pp. 372-375, 2012.

17. S R Soekadar, M Witkowski, C Gómez et al., “Hybrid EEG/EOG-based brain/neural hand exoskeleton restores fully independent daily living activities after quadriplegia [J]”, Science Robotics, vol. 1, no. 1, p. eaag3296, 2016.

18. L B. Rosenberg, “Force feedback interface having isotonic and isometric functionality”, US Patent 5825308 A [P], 1998.

19. L B. Rosenberg, “Isotonic-isometric haptic feedback interface”, US Patent 7102541 [P], 2006.

20. J T Gwin, D P. Ferris, “An EEG-based study of discrete isometric and isotonic human lower limb muscle contractions [J]”, Journal of NeuroEngineering and Rehabilitation, vol. 9, no. 1, p. 35, 2012.

21. S Bouisset, F Goubel, B. Maton, “[Isometric isotonic contraction and anisotonic isometric contraction: an electromyographic comparison] [J]”, Electromyography & Clinical Neurophysiology, vol. 13, no. 5, pp. 525, 1973.

 

via Design of Isometric and Isotonic Soft Hand for Rehabilitation Combining with Noninvasive Brain Machine Interface – IEEE Conference Publication


[Abstract] Focal onset seizure prediction using convolutional networks

Abstract:

Objective: This work investigates the hypothesis that focal seizures can be predicted using scalp electroencephalogram (EEG) data. Our first aim is to learn features that distinguish between the interictal and preictal regions. The second aim is to define a prediction horizon in which the prediction is as accurate and as early as possible, clearly two competing objectives.
Methods: Convolutional filters on the wavelet transformation of the EEG signal are used to define and learn quantitative signatures for each period: interictal, preictal, and ictal. The optimal seizure prediction horizon is also learned from the data as opposed to making an a priori assumption.
Results: Computational solutions to the optimization problem indicate a ten-minute seizure prediction horizon. This result is verified by measuring Kullback-Leibler divergence on the distributions of the automatically extracted features.
Conclusions: The results on the EEG database of 204 recordings demonstrate that
  1. the preictal phase transition occurs approximately ten minutes before seizure onset, and
  2. the prediction results on the test set are promising, with a sensitivity of 87.8% and a low false prediction rate of 0.142 FP/h.
Our results significantly outperform a random predictor and other seizure prediction algorithms.
Significance: We demonstrate that a robust set of features can be learned from scalp EEG that characterize the preictal state of focal seizures.
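
The paper's exact network and wavelet settings are not given in the abstract; the sketch below only illustrates the two ingredients it mentions, wavelet-based features per EEG window and a Kullback-Leibler divergence between the interictal and preictal feature distributions. The wavelet choice, scales, binning, and synthetic data are assumptions.

```python
# Minimal sketch: wavelet energy features per window and KL divergence
# between interictal and preictal feature distributions.
import numpy as np
import pywt
from scipy.stats import entropy

def scalogram_energy(window, fs=256, scales=np.arange(1, 33)):
    """Mean wavelet energy per scale for one EEG window."""
    coeffs, _ = pywt.cwt(window, scales, "morl", sampling_period=1.0 / fs)
    return (np.abs(coeffs) ** 2).mean(axis=1)

def kl_between(feat_a, feat_b, bins=32):
    """KL divergence between histograms of one feature across two groups."""
    lo, hi = min(feat_a.min(), feat_b.min()), max(feat_a.max(), feat_b.max())
    p, _ = np.histogram(feat_a, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(feat_b, bins=bins, range=(lo, hi), density=True)
    return entropy(p + 1e-12, q + 1e-12)

# Synthetic stand-ins for interictal and preictal EEG windows (10 s at 256 Hz).
rng = np.random.default_rng(2)
interictal = np.array([scalogram_energy(rng.standard_normal(2560)) for _ in range(50)])
preictal = np.array([scalogram_energy(1.5 * rng.standard_normal(2560)) for _ in range(50)])
print(kl_between(interictal[:, 0], preictal[:, 0]))
```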

via Focal onset seizure prediction using convolutional networks – IEEE Journals & Magazine


[ARTICLE] Brain Computer Interface issues on hand movement – Full Text

January 2018

Abstract

This paper focuses on Brain Computer Interface (BCI) applications and their issues. Further, an attempt was made to classify left- and right-hand movements after removing artifacts from the acquired signals of the various hand movements.

 

1. Introduction

The Brain Computer Interface (BCI) involves a combination of the brain and a device, both sharing an interface that enables a communication channel between the brain and an object to be controlled externally. The human brain has innumerable neurons which are connected to each other for the transmission of impulses. When an electrode chip is implanted into the brain via a surgical procedure, the electrical signals produced by the neurons are transmitted to a computer, which then translates the signals into data. These data are interpreted to control a computer device. In 2013, Lebedev successfully coupled the brains of two rats using an interface to enable direct sharing of information (Pais-Vieira et al., 2013). Minute fluctuations in voltage between neurons are measured and the signals are amplified to produce graphs. While invasive BCIs rely on direct implantation into the grey matter of the brain by neurosurgery to produce the highest-quality signals, non-invasive BCIs make use of techniques like electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI). EEG techniques involve placing electrodes on the scalp together with a conductive gel or paste. Many systems use electrodes attached to separate wires. Over the years, BCI has been instrumental in developing intelligent relaxation devices, providing enhanced control of devices like wheelchairs and vehicles, controlling robots and computer cursors, and providing an additional channel of control in computer games. Bionic eyes have been known to restore sight for people with vision loss (Krishnaveni et al., 2012).

Consider the case of motor imagery, which refers to a mental process wherein an individual rehearses an action; thus, a mental representation of movement exists without any actual body movement. Imagination efficiency is hard to control. Controlling EEG therefore enables an individual to communicate despite the inability to control voluntary muscles: the interface substitutes for nerves and muscles, and the signals are incorporated into hardware and software to be translated into physical actions. EEG-based BCIs can record and classify EEG changes produced by different types of motor imagery, such as imagination of right- and left-hand activity, and consequently use motor imagery as a means to enhance motor function and motor learning. Motor imagery has made a significant contribution in the fields of neurological rehabilitation, cognitive neuroscience and cognitive psychology. Clinical applications have benefited greatly from it, ranging from enhancing mobility and locomotion to reducing neuropathic pain (Malouin and Richards, 2013). Analysis and interpretation of the data are challenging because EEG signals are vulnerable to varying fluctuations, often termed noise. Various strategies have been devised for the prevention and removal of noise. In this paper, we apply a Butterworth filter to eliminate noise from the signals and enhance data quality. Besides, we concentrate on feature extraction to transform raw signals into informative features and make use of a Support Vector Machine for this purpose. Feature extraction also contributes significantly to image processing.
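
As a concrete illustration of the Butterworth filtering step mentioned above, the sketch below applies a zero-phase band-pass filter to one EEG channel. The 8-30 Hz band, filter order, and sampling rate are common motor-imagery choices assumed here, not values taken from the paper.

```python
# Minimal sketch: zero-phase Butterworth band-pass filtering of one EEG channel.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, fs=250.0, low=8.0, high=30.0, order=4):
    """Butterworth band-pass; filtfilt applies it forward and backward (no phase lag)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

raw = np.random.default_rng(3).standard_normal(2500)   # stand-in for a C3 channel, 10 s at 250 Hz
clean = bandpass(raw)
```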

The step-by-step process involved in a Brain Computer Interface system is shown in Fig. 1. The signal is acquired through various means, such as invasive (ECoG, neurosurgery) and non-invasive (EEG, fMRI, MEG) techniques. Channel selection is one of the important considerations, since most of the EEG channels represent redundant information (Sleight et al., 2009).

Figure 1. Process involved in brain computing interface system.

Fig. 2 shows the EEG channel placement on the human scalp. Each scalp electrode is located over a specific brain centre. In 2001, Pfurtscheller (Wolpaw, 2002) identified that much of the neural activity related to fist movements is found in channels C3, C4 and Cz, as shown in Fig. 2B. F7 relates to rational activities, Fz to intentional and motivational data, P3, P4 and Pz to perception and differentiation, T3 and T4 to emotional processes, T5 and T6 to memory functions, and O1 and O2 to visualization.

Figure 2. EEG channel placements on the human scalp.

In order to remove noise from the acquired signal, any suitable filtering technique may be adopted. The extracted features then move on to the classification phase. […]

Continue —-> Brain Computer Interface issues on hand movement – ScienceDirect


[Abstract] A survey on sEMG control strategies of wearable hand exoskeleton for rehabilitation

Abstract:

Surface electromyographic (sEMG) signals are one of the most commonly used control sources for hand rehabilitation exoskeletons. Owing to their non-invasiveness, convenient collection and safety, sEMG signals suit the particular physiological state of hemiplegic patients and directly reflect human neuromuscular activity. Through collection, analysis and processing, sEMG signals corresponding to the identified target movement model are translated into robot movement control instructions and fed into the hand rehabilitation exoskeleton controller, so that the patient’s hand can finally be guided to perform a similar action. In this paper, the recent key technologies of sEMG-based control for hand rehabilitation robots are reviewed. A summary of the control principles and the sEMG signal processing methods employed by hand rehabilitation exoskeletons is then presented. Finally, suitable processing methods for multi-channel sEMG signals for the control of hand rehabilitation exoskeletons are tentatively put forward, and their practical application in hand exoskeleton control is also discussed.
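
The processing chain summarized in this abstract (windowing, feature extraction, classification, command mapping) can be sketched minimally as follows; the channel count, window size, time-domain features, and the command mapping are illustrative assumptions rather than any surveyed system's actual design.

```python
# Minimal sketch of an sEMG pattern-recognition control chain.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window):
    """Time-domain features per channel; window shape: (n_channels, n_samples)."""
    mav = np.mean(np.abs(window), axis=1)                 # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=1))           # root mean square
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)  # waveform length
    return np.concatenate([mav, rms, wl])

# Synthetic stand-in: 120 windows of 4-channel sEMG, 200 samples each.
rng = np.random.default_rng(4)
X = np.array([td_features(rng.standard_normal((4, 200))) for _ in range(120)])
y = rng.integers(0, 3, 120)                               # e.g. open, close, rest

clf = LinearDiscriminantAnalysis().fit(X, y)
command = {0: "OPEN", 1: "CLOSE", 2: "HOLD"}[int(clf.predict(X[:1])[0])]
print(command)                                            # would drive the exoskeleton
```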

Source: A survey on sEMG control strategies of wearable hand exoskeleton for rehabilitation – IEEE Xplore Document


[Abstract] On the use of wearable sensors to enhance motion intention detection for a contralaterally controlled FES system.

During the last years, there has been considerable progress in motor learning and functional recovery after the occurrence of a brain lesion. Rehabilitation of motor function has been associated with motor learning that occurs during repetitive, frequent and intensive training.

Contralaterally controlled functional electrical stimulation (CCFES) is a new therapy designed to improve the recovery of paretic limbs after stroke. It can provide repetitive training-based therapies and has been developed to control upper- and lower-limb movements in response to the user’s intention.

Electromyography (EMG) signals directly reflect human motion intention, so they can be used as input information to control a CCFES system. Implementing EMG-based pattern recognition is not easy because of several difficulties, among them that the activity level of each muscle for a certain motion differs between people. Inertial Measurement Units (IMUs) are wearable sensors used to gather movement data. IMUs could provide valuable kinematic information in an EMG-based pattern recognition process to improve classification.

This work describes the use of IMUs to improve the detection of motion intention from EMG data. Results show that the myoelectric algorithm using information from IMUs classified seven upper-limb movements better than the algorithm using only EMG data.
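
A minimal sketch of the EMG-plus-IMU fusion idea, concatenating EMG time-domain features with simple IMU statistics before classification, is given below. Sensor counts, window length, features, and the seven-class labeling are assumptions for illustration only.

```python
# Minimal sketch: fusing EMG and IMU features for movement classification.
import numpy as np
from sklearn.svm import SVC

def fused_features(emg_win, imu_win):
    """Concatenate EMG time-domain features with IMU window statistics."""
    emg_f = np.concatenate([np.mean(np.abs(emg_win), axis=1),        # MAV per channel
                            np.sqrt(np.mean(emg_win ** 2, axis=1))])  # RMS per channel
    imu_f = np.concatenate([imu_win.mean(axis=1), imu_win.std(axis=1)])
    return np.concatenate([emg_f, imu_f])

# Synthetic stand-in: 6 EMG channels (200 samples) + 6 IMU channels (50 samples).
rng = np.random.default_rng(5)
X = np.array([fused_features(rng.standard_normal((6, 200)),
                             rng.standard_normal((6, 50)))
              for _ in range(140)])
y = rng.integers(0, 7, 140)                 # seven upper-limb movements (assumed)
clf = SVC(kernel="rbf").fit(X, y)
```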

Source: IEEE Xplore Abstract (Abstract) – On the use of wearable sensors to enhance motion intention detection for a contralaterally controlle…
