Posts Tagged Feature extraction
[Abstract] A Method for Self-Service Rehabilitation Training of Human Lower Limbs – IEEE Conference Publication
[Abstract + References] Self-paced movement intention recognition from EEG signals during upper limb robot-assisted rehabilitation
[Abstract + References] Design of Isometric and Isotonic Soft Hand for Rehabilitation Combining with Noninvasive Brain Machine Interface
In recent years, stroke has become one of the major health problems that significantly affect the daily life of the elderly, and hand rehabilitation is introduced as an auxiliary treatment. Though various kinds of mechanical devices for hand rehabilitation have been developed, current rigid rehabilitation hands still have deficiencies such as insufficient degrees of freedom, mechanical complexity, safety risks, excessive weight, discomfort, and poor fit. Therefore, with the growth of the aging population, new devices are highly needed to satisfy comprehensive rehabilitation requirements. Meanwhile, inspired by mollusks in nature, soft robots are made of soft materials that can withstand large strains. They are a new type of continuum robot with high flexibility and environmental adaptability, and have broad application prospects in military detection, search and rescue, medical applications, and other fields.
- the preictal phase transition occurs approximately ten minutes before seizure onset, and
- the prediction results on the test set are promising, with a sensitivity of 87.8% and a low false prediction rate of 0.142 FP/h.
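The two reported metrics above can be made concrete with a short sketch. This is a hedged illustration of how seizure-prediction sensitivity and false-prediction rate (FP/h) are conventionally defined; the counts and recording hours below are made-up numbers chosen only to land near the reported values, not the paper's actual data.

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Fraction of actual seizures that were correctly predicted."""
    return true_positives / (true_positives + false_negatives)

def false_prediction_rate(false_positives: int, interictal_hours: float) -> float:
    """False alarms raised per hour of seizure-free (interictal) recording."""
    return false_positives / interictal_hours

# Illustrative counts (assumed): 36 of 41 seizures predicted,
# 20 false alarms over 141 hours of interictal recording.
sens = sensitivity(36, 5)                 # ~0.878, i.e. 87.8%
fpr = false_prediction_rate(20, 141.0)    # ~0.142 FP/h
```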
This paper focuses on the Brain Computer Interface (BCI) application and its issues. It then attempts to classify left- and right-hand movements after removing artifacts from the acquired signals of the various hand movements.
The Brain Computer Interface (BCI) involves a combination of the brain and a device sharing an interface, enabling a communication channel between the brain and an object to be controlled externally. The human brain has innumerable neurons connected to each other for the transmission of impulses. When an electrode chip is implanted in the brain surgically, the electrical signals produced by the neurons are transmitted to a computer, which translates the signals into data. These data are interpreted to control a computer device. In 2013, Lebedev successfully coupled the brains of two rats using an interface to enable direct sharing of information (Pais-Vieira et al., 2013). Minute fluctuations in voltage between neurons are measured and the signals are amplified to produce graphs. While invasive BCIs rely on direct implantation into the grey matter of the brain by neurosurgery to produce the highest-quality signals, non-invasive BCIs make use of techniques like Electroencephalography (EEG), Magnetoencephalography (MEG) and functional Magnetic Resonance Imaging (fMRI). EEG involves placing electrodes on the scalp with a conductive gel or paste; many systems use electrodes attached to separate wires. Over the years, BCI has been instrumental in developing intelligent relaxation devices, providing enhanced control of devices like wheelchairs and vehicles, controlling robots and computer cursors, and providing an additional channel of control in computer games. Bionic eyes have been known to restore sight for people with vision loss (Krishnaveni et al., 2012).
Consider the case of motor imagery, a mental process in which an individual rehearses an action: a mental representation of movement exists without any actual body movement. Imagination efficiency is hard to control, but controlling EEG enables an individual to communicate despite being unable to control voluntary muscles. The interface substitutes for nerves and muscles, and the signals are incorporated into hardware and software to be translated into physical actions. EEG-based BCIs can record and classify EEG changes arising from different types of motor imagery, such as the imagination of right- and left-hand movement; motor imagery consequently serves as a means to enhance motor function and motor learning. It has made a significant contribution in the fields of neurological rehabilitation, cognitive neuroscience and cognitive psychology. Clinical applications have benefited greatly from motor imagery, from enhancing mobility and locomotion to reducing neuropathic pain (Malouin and Richards, 2013). Analysis and interpretation of the data are challenging because EEG signals are vulnerable to varying fluctuations, often termed noise, and various strategies have been devised to prevent and remove it. In this paper, we apply a Butterworth filter to eliminate noise from the signals and enhance data quality. We then focus on feature extraction to transform the raw signals into informative features, using a Support Vector Machine for classification. Feature extraction also contributes significantly in image processing.
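The pipeline described above (Butterworth filtering for noise removal, then a feature fed to an SVM) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the sampling rate, the 8–30 Hz band, the band-power feature, and the synthetic two-class data are all assumptions made for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 250.0  # sampling rate in Hz (assumed)

def bandpass(signal, low=8.0, high=30.0, order=4):
    """Zero-phase Butterworth band-pass covering the mu/beta band
    commonly used for motor imagery (assumed band edges)."""
    b, a = butter(order, [low / (FS / 2), high / (FS / 2)], btype="band")
    return filtfilt(b, a, signal)

def band_power(signal):
    """Single scalar feature: mean power of the filtered signal."""
    return np.mean(bandpass(signal) ** 2)

# Synthetic two-class trials: one class carries stronger 10 Hz (mu-band) energy.
rng = np.random.default_rng(0)
def make_trial(strong):
    t = np.arange(0, 2, 1 / FS)
    amp = 2.0 if strong else 0.5
    return amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

X = np.array([[band_power(make_trial(i < 40))] for i in range(80)])
y = np.array([0] * 40 + [1] * 40)  # 0 = "left", 1 = "right" (labels assumed)

clf = SVC(kernel="rbf").fit(X, y)
```

A real system would of course extract richer features (e.g. per-channel band powers from C3/C4) and evaluate on held-out trials rather than training data.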
Fig. 2 shows the EEG channel placement on the human scalp. Each scalp electrode is located over a particular functional area of the brain. In 2001, Pfurtscheller (Wolpaw, 2002) identified that much of the neural activity related to fist movements is found in channels C3, C4 and Cz, as shown in Fig. 2 B. F7 relates to rational activities, Fz to intentional and motivational data; P3, P4 and Pz cover perception and differentiation; T3 and T4 relate to emotional processes; T5 and T6 serve memory functions; and O1 and O2 contain visualization data.
[Abstract] On the use of wearable sensors to enhance motion intention detection for a contralaterally controlled FES system.
In recent years, there has been significant progress in understanding motor learning and functional recovery after a brain lesion. Rehabilitation of motor function has been associated with the motor learning that occurs during repetitive, frequent and intensive training.
Contralaterally controlled functional electrical stimulation (CCFES) is a new therapy designed to improve the recovery of paretic limbs after stroke. It can provide repetitive training-based therapy and has been developed to control upper- and lower-limb movements in response to the user's intention.
Electromyography (EMG) signals directly reflect human motion intention, so they can be used as input to control a CCFES system. EMG-based pattern recognition is not easy to implement, in part because the activity level of each muscle for a given motion differs between individuals. Inertial Measurement Units (IMUs) are wearable sensors used to gather movement data; they can provide valuable kinematic information to improve classification in an EMG-based pattern recognition process.
This work describes the use of IMUs to improve the detection of motion intention from EMG data. Results show that a myoelectric algorithm using information from IMUs classified seven upper-limb movements better than an algorithm using EMG data alone.
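The fusion idea above can be sketched in a few lines: kinematic features from the IMUs are simply concatenated with the EMG features before classification. This is a hedged illustration under assumed conditions, not the paper's pipeline; the feature choices (mean absolute value for EMG, angular-velocity statistics for IMU), the class structure, and the synthetic data are all invented for the example.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_per_class, n_classes = 30, 7  # seven upper-limb movements, as in the abstract

def emg_features(cls):
    # Assumed EMG feature: mean absolute value per channel (4 channels).
    # Class means are close together, mimicking inter-subject overlap.
    return rng.normal(cls * 0.3, 1.0, size=4)

def imu_features(cls):
    # Assumed IMU feature: mean angular velocity per axis (3 axes),
    # more separable between movements than the EMG features.
    return rng.normal(cls * 1.0, 0.2, size=3)

y = np.repeat(np.arange(n_classes), n_per_class)
X_emg = np.array([emg_features(c) for c in y])
X_fused = np.array([np.concatenate([emg_features(c), imu_features(c)]) for c in y])

acc_emg = SVC().fit(X_emg, y).score(X_emg, y)
acc_fused = SVC().fit(X_fused, y).score(X_fused, y)
# With this synthetic data, the fused feature vector separates
# the seven classes better than EMG alone.
```

The design point the abstract makes is exactly this: when EMG activation levels vary across people, the added kinematic dimensions give the classifier a more person-invariant signal to separate the movements.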