Brain-Controlled Motor
The brain is "hardwired" with connections made by billions of neurons that generate electrical impulses whenever they are stimulated. These electrical patterns are called brain waves. Neurons act like the wires and gates in a computer, gathering and transmitting electrochemical signals over distances as far as several feet. The brain encodes information not by relying on single neurons, but by spreading it across large populations of neurons and by rapidly adapting to new circumstances.
Motor
neurons carry signals from the central nervous system to the
muscles, skin and glands of the body, while sensory neurons carry signals from
those outer parts of the body to the central nervous system. Receptors sense things like chemicals,
light, and sound and encode this information into electrochemical signals
transmitted by the sensory neurons. Interneurons tie everything together by connecting the various neurons within the brain and spinal cord. The part of the brain that controls motor skills is located at the rear of the frontal lobe.
How does this communication happen? Muscles
in the body's limbs contain embedded sensors called muscle spindles that
measure the length and speed of the muscles as they stretch and contract as you
move. Other sensors in the skin respond to stretching and pressure. Even when paralysis or disease blocks the pathway that carries movement commands, the brain still generates the neural signals; they are simply not delivered to the arms, hands and legs.
A technique called neurofeedback uses sensors attached to the scalp to translate brain waves into information a person can learn from. The sensors register the different frequencies of the signals produced in the brain. Changes in these brain-wave patterns indicate whether someone is concentrating or suppressing impulses, or whether he is relaxed or tense.
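As a concrete illustration, here is a minimal Python sketch (assuming NumPy and SciPy are available) of how such frequency information can be extracted: it estimates power in the classic alpha and beta bands from one scalp channel. The sampling rate, band limits and the random stand-in signal are illustrative assumptions, not details of any particular neurofeedback product.

```python
# Minimal sketch: estimate EEG band power from one scalp channel, the kind
# of measure a neurofeedback system derives from brain waves. The signal,
# sampling rate and band limits here are illustrative assumptions.
import numpy as np
from scipy.signal import welch

fs = 256                               # sampling rate in Hz (assumed)
eeg = np.random.randn(10 * fs)         # stand-in for 10 s of one EEG channel

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)   # power spectral density

def band_power(lo, hi):
    """Total PSD between lo and hi Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

alpha = band_power(8, 13)    # alpha waves dominate when relaxed
beta = band_power(13, 30)    # beta waves dominate during concentration

# A simple relaxation index: more alpha relative to beta suggests relaxation.
print("alpha/beta ratio:", alpha / beta)
```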
NEUROPROSTHETIC DEVICE:
A neuroprosthetic device known as BrainGate converts brain activity into computer commands. A sensor is implanted on the brain, and its electrodes are wired to a pedestal on the scalp. From there, a fiber-optic cable carries the brain-activity data to a nearby computer.
PRINCIPLE:
"The principle of operation of the BrainGate Neural
Interface System is that with intact brain function, neural signals are
generated even though they are not sent to the arms, hands and legs. These
signals are interpreted by the System and a cursor is shown to the user on a
computer screen that provides an alternate "BrainGate pathway". The
user can use that cursor to control the computer, just as a mouse is
used."
BrainGate is a brain implant system developed by the
bio-tech company Cyberkinetics in 2003 in
conjunction with the Department of Neuroscience at Brown University. The device was designed to help
those who have lost control of their limbs, or other bodily functions, such as
patients with amyotrophic lateral
sclerosis (ALS) or spinal cord injury. The computer chip, which is implanted into the brain, monitors brain activity in the patient and converts the intention of the user into computer commands.
NEURO CHIP:
Currently the chip uses 100 hair-thin electrodes that 'hear' neurons firing in specific areas of the brain, for example, the area that controls arm movement. The activity is translated into electrical signals, which are then decoded by software that can move either a robotic arm or a computer cursor. According to Cyberkinetics' website, three patients have been implanted with the BrainGate system. The company has confirmed that one patient (Matt Nagle) has a spinal cord injury, whilst another has advanced ALS.
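To illustrate the decoding idea only (BrainGate's actual decoder is proprietary and more elaborate), the following Python sketch maps binned spike counts from an array of roughly 100 electrodes to a two-dimensional cursor velocity with a plain linear model. The data are synthetic, and the dimensions and the ridge-regression choice are assumptions.

```python
# Hedged sketch of spike decoding: fit a linear map from binned spike
# counts to 2-D cursor velocity on synthetic "calibration" data, then
# decode a new bin. Dimensions and the Ridge model are assumptions.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_bins, n_electrodes = 2000, 96        # 100 ms bins, channel count (assumed)
true_w = rng.normal(size=(n_electrodes, 2))

spikes = rng.poisson(5.0, size=(n_bins, n_electrodes))  # counts per bin
velocity = spikes @ true_w + rng.normal(scale=2.0, size=(n_bins, 2))

decoder = Ridge(alpha=1.0).fit(spikes, velocity)        # calibration phase

new_counts = rng.poisson(5.0, size=(1, n_electrodes))   # one fresh bin
vx, vy = decoder.predict(new_counts)[0]                 # decoded velocity
print(f"move cursor by ({vx:.1f}, {vy:.1f})")
```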
In addition to real-time analysis of neuron patterns to relay movement, the BrainGate array is also capable of recording electrical data for later analysis. One potential use of this feature would be for a neurologist to study seizure patterns in a patient with epilepsy.
BrainGate is currently recruiting patients with a range of neuromuscular and neurodegenerative conditions for pilot clinical trials in the United States.
WORKING:
Operating a BCI system is not simply a matter of tapping into the user's EEG and listening to whatever happens. The user deliberately generates a particular pattern of mental activity, which the system then detects and classifies.
PREPROCESSING:
The raw EEG signal requires some preprocessing before feature extraction. This preprocessing includes removing unnecessary frequency bands, averaging the current brain-activity level, transforming the measured scalp potentials to cortex potentials, and denoising.
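A minimal sketch of this stage in Python (assuming NumPy and SciPy): band-pass filtering removes the unneeded frequency bands, and a common-average reference stands in for the spatial cleanup step. The sampling rate, cut-offs and stand-in data are illustrative assumptions.

```python
# Minimal preprocessing sketch: band-pass filter plus common-average
# reference. Sampling rate, cut-offs and the stand-in data are assumed.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                                  # sampling rate in Hz (assumed)
raw = np.random.randn(8, 10 * fs)         # 8 scalp channels x 10 s (stand-in)

# 1. Keep roughly 1-40 Hz, where most task-related EEG activity lies.
b, a = butter(4, [1, 40], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, raw, axis=1)    # zero-phase filtering

# 2. Common-average reference: remove activity shared by all electrodes.
#    (Scalp-to-cortex transforms such as the surface Laplacian go further.)
clean = filtered - filtered.mean(axis=0, keepdims=True)
```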
DETECTION:
Detecting the user's input and translating it into an action is the key part of any BCI system. Detection means identifying these mental tasks in the EEG signal. It can be done in the time domain, e.g. by comparing amplitudes of the EEG, or in the frequency domain. This usually involves digital signal processing: sampling and band-pass filtering the signal, calculating time- or frequency-domain features, and then classifying them. Classification algorithms range from simple comparison of amplitudes, through linear and non-linear equations, to artificial neural networks. Through constant feedback between the user and the system, both partners gradually learn more from each other and improve the overall performance.
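The following Python sketch shows one plausible shape of this detection stage: frequency-domain features (band power per channel) are extracted from each EEG segment and fed to a linear classifier. The segments, labels and the choice of linear discriminant analysis are assumptions for illustration.

```python
# Hedged sketch of detection: band-power features per channel, classified
# with a linear model. Data, labels and classifier choice are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 256
rng = np.random.default_rng(1)

def features(segment):
    """Alpha and beta band power for every channel of one EEG segment."""
    freqs, psd = welch(segment, fs=fs, nperseg=fs, axis=1)
    alpha = psd[:, (freqs >= 8) & (freqs < 13)].sum(axis=1)
    beta = psd[:, (freqs >= 13) & (freqs < 30)].sum(axis=1)
    return np.concatenate([alpha, beta])

# 100 labelled two-second training segments, 8 channels each (stand-ins).
segments = rng.normal(size=(100, 8, 2 * fs))
labels = rng.integers(0, 2, size=100)   # 0 = "rest", 1 = "imagined movement"

X = np.array([features(s) for s in segments])
clf = LinearDiscriminantAnalysis().fit(X, labels)

new_segment = rng.normal(size=(8, 2 * fs))
print("detected task:", clf.predict([features(new_segment)])[0])
```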
CONTROL:
The final part consists of applying the will of the user to the application in use. The user chooses an action by controlling his brain activity, which is then detected and classified into the corresponding action. Feedback is provided to the user by audio-visual means, e.g. when typing with a virtual keyboard, the letter appears in the message box.
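A small Python sketch of this control loop follows, with a made-up four-command vocabulary and a screen cursor as the application; the command names and cursor model are assumptions, not any particular BCI's interface.

```python
# Hedged sketch of the control stage: each classified mental command is
# translated into an application action and echoed back as feedback.
def apply_command(command, cursor):
    """Move an (x, y) cursor according to one decoded command."""
    moves = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}
    dx, dy = moves.get(command, (0, 0))   # unknown commands do nothing
    cursor = (cursor[0] + dx, cursor[1] + dy)
    print(f"feedback: cursor now at {cursor}")  # visual feedback to the user
    return cursor

cursor = (0, 0)
for decoded in ["up", "up", "right"]:     # stream from the classifier
    cursor = apply_command(decoded, cursor)
```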
TRAINING:
Training is the phase in which the user adapts to the BCI system. It begins with very simple exercises that familiarize the user with the mental activity used to relay information to the computer. Motivation, frustration, fatigue, etc. also apply here, and their effects should be taken into consideration when planning the training procedures.
BIO FEEDBACK:
Biofeedback is defined as biological information that is returned to the source that created it, so that the source can understand it and gain control over it. In BCI systems this feedback is usually provided visually, e.g. the user sees the cursor moving up or down, or a letter being selected from the alphabet.
A BOON TO THE PARALYZED: The BrainGate Neural Interface System
The first patient, Matthew Nagle, a 25-year-old Massachusetts man with a
severe spinal cord injury, has been paralyzed from the neck down since 2001.
Nagle is unable to move his arms and legs after he was stabbed in the neck.
During 57 sessions at New England Sinai Hospital and Rehabilitation Center, Nagle learned to
open simulated e-mail, draw circular shapes using a paint program on the
computer and play a simple videogame, "neural Pong," using only his
thoughts. He could change the channel and adjust the volume on a television,
even while conversing. He was ultimately able to open and close the fingers of
a prosthetic hand and use a robotic limb to grasp and move objects. Despite a decline in neural signals after a few months, Nagle remained an active participant in the trial and continued to provide the clinical team with valuable feedback on the BrainGate technology.
NAGLE’S
STATEMENT:
“I can't put it
into words. It's just—I use my brain. I just thought it. I said, "Cursor
go up to the top right." And it did, and now I can control it all over the
screen. It will give me a sense of independence.”
OTHER APPLICATIONS:
[Figure: Rats implanted with BCIs in Theodore Berger's experiments.]
Several laboratories have managed to record signals from monkey and rat cerebral cortices in order to operate BCIs to carry
out movement. Monkeys have navigated computer cursors on screen and commanded
robotic arms to perform simple tasks simply by thinking about the task and
without any motor output. Other research on cats has decoded visual signals.
[Figure: Garrett Stanley's recordings of cat vision using a BCI implanted in the lateral geniculate nucleus (top row: original image; bottom row: recording).]
In 1999, researchers led by Garrett Stanley
at Harvard University decoded neuronal firings to
reproduce images seen by cats. The team used an array of electrodes embedded in
the thalamus (which integrates all of the brain’s sensory input)
of sharp-eyed cats. Researchers targeted 177 brain cells in the thalamus's lateral geniculate nucleus, the area that decodes signals from the retina. The cats were shown eight short movies, and
their neuron firings were recorded. Using mathematical filters, the researchers
decoded the signals to generate movies of what the cats saw and were able to
reconstruct recognisable scenes and moving objects.
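The essence of such a reconstruction can be shown with a toy linear decoder in Python: decoding weights are fit by least squares on training frames, then applied to held-out responses. Everything below (frame size, response model, fitting method) is a synthetic stand-in for Stanley's actual filters.

```python
# Toy sketch of linear stimulus reconstruction: each decoded frame is a
# weighted sum of neuron responses, with weights fit by least squares.
import numpy as np

rng = np.random.default_rng(2)
n_neurons, n_pixels, n_frames = 177, 16 * 16, 500   # sizes assumed

frames = rng.random((n_frames, n_pixels))           # training movie frames
encoding = rng.normal(size=(n_pixels, n_neurons))   # synthetic cells
responses = frames @ encoding + rng.normal(scale=0.1,
                                           size=(n_frames, n_neurons))

# Fit linear decoding filters mapping responses back to pixels.
W, *_ = np.linalg.lstsq(responses, frames, rcond=None)

test_frame = rng.random(n_pixels)                   # held-out frame
reconstruction = (test_frame @ encoding) @ W        # estimate of the frame
```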
In the 1980s, Apostolos Georgopoulos at Johns
Hopkins University found a mathematical relationship between the electrical
responses of single motor-cortex neurons in rhesus macaque monkeys and the direction that
monkeys moved their arms (based on a cosine
function). He also found that dispersed groups of neurons in different areas of
the brain collectively controlled motor commands but was only able to record
the firings of neurons in one area at a time because of technical limitations
imposed by his equipment.[4]
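The cosine relationship, and the population-vector decoding it enables, fit in a few lines of Python; the baseline rate, gain and neuron count below are illustrative numbers, not Georgopoulos's measurements.

```python
# Sketch of cosine tuning: a neuron fires at
#   rate = baseline + gain * cos(theta - preferred_direction),
# and a population vector recovers theta from many such neurons.
import numpy as np

rng = np.random.default_rng(3)
n_neurons = 50
preferred = rng.uniform(0, 2 * np.pi, n_neurons)  # preferred directions
baseline, gain = 20.0, 15.0                       # spikes/s (illustrative)

movement = np.deg2rad(30)                         # true movement direction
rates = rng.poisson(baseline + gain * np.cos(movement - preferred))

# Population vector: each neuron votes for its preferred direction,
# weighted by how far its rate sits above baseline.
w = rates - baseline
theta = np.arctan2((w * np.sin(preferred)).sum(),
                   (w * np.cos(preferred)).sum())
print("decoded direction:", round(np.degrees(theta), 1), "deg")
```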
There has been rapid development in BCIs
since the mid-1990s.[5] Several groups have been able to capture
complex brain motor centre signals using recordings from neural ensembles (groups of neurons) and use these
to control external devices, including research groups led by Richard Andersen,
John Donoghue, Phillip Kennedy, Miguel Nicolelis, and Andrew Schwartz.
[Figure: Diagram of the BCI developed by Miguel Nicolelis and colleagues for use on rhesus monkeys.]
Later experiments by Nicolelis using rhesus monkeys succeeded in closing the feedback loop and reproduced monkey reaching and grasping movements in a robot arm. With their
deeply cleft and furrowed brains, rhesus monkeys are considered to be better
models for human neurophysiology than owl monkeys. The
monkeys were trained to reach and grasp objects on a computer screen by
manipulating a joystick while corresponding movements by a robot arm were
hidden.[8][9] The monkeys were later shown the robot
directly and learned to control it by viewing its movements. The BCI used
velocity predictions to control reaching movements and simultaneously predicted
hand gripping force.
Other labs that develop BCIs and algorithms
that decode neuron signals include John Donoghue from Brown University, Andrew Schwartz
from the University of Pittsburgh and Richard Andersen from Caltech. These
researchers were able to produce working BCIs even though they recorded signals
from far fewer neurons than Nicolelis (15–30 neurons versus 50–200 neurons).
Donoghue's group reported training rhesus
monkeys to use a BCI to track visual targets on a computer screen with or
without assistance of a joystick (closed-loop BCI).[10] Schwartz's group created a BCI for
three-dimensional tracking in virtual reality and also reproduced BCI control
in a robotic arm.
CONCLUSION:
The idea of moving robots or prosthetic devices not by manual control but by mere "thinking" (i.e., the brain activity of human subjects) is a fascinating one. Medical cures are unavailable for many forms of neural and muscular paralysis, and the enormity of the deficits caused by paralysis is a strong motivation to pursue BMI solutions. This approach would let many patients control prosthetic devices of their own simply by thinking about the task.
This technology is well supported by rapid developments in biomedical instrumentation, microelectronics, signal processing, artificial neural networks and robotics. The hope is that such systems will be effectively implemented in many biomedical applications.