NEURAL MECHANISMS FOR VISUAL-MOTOR INTEGRATION, SPATIAL PERCEPTION AND MOTION PERCEPTION

While the concept of artificial intelligence has received a great deal of attention in the popular press, the actual determination of the neural basis of intelligence and behavior has proven to be a very difficult problem for neuroscientists.  Our behaviors are dictated by our intentions, but we have only recently begun to understand how the brain forms plans to act.  The posterior parietal cortex is situated between the sensory and the movement regions of the cerebral cortex and serves as a bridge from sensation to action.  We have found that an anatomical map of plans exists within this area, with one part devoted to planning eye movements and another part to planning arm movements.  The action plans in the arm movement area exist in a cognitive form, specifying the goal of the intended movement rather than particular signals to various muscle groups.

Neuroprosthetics.  One project in the lab is to develop a cognitive-based neural prosthesis for paralyzed patients.  This prosthetic system is designed to record the electrical activity of nerve cells in the posterior parietal cortex of paralyzed patients, interpret the patients’ intentions from these neural signals using computer algorithms, and convert the “decoded” plans into electrical control signals to operate external devices such as a robot arm, autonomous vehicle or a computer. 
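
To make this pipeline concrete, the short Python sketch below shows one way a record-decode-command loop might be organized.  It is a toy illustration under simplified assumptions, not the lab’s actual software: the names (GoalDecoder, record_spike_counts, issue_device_command), the electrode and target counts, and the nearest-template decoding rule are all hypothetical.

    # Illustrative sketch (not the lab's actual software) of the cognitive
    # prosthetic pipeline described above: record parietal activity, decode
    # the intended goal, and translate it into a device command.
    import numpy as np

    N_CELLS, N_GOALS = 96, 8              # hypothetical electrode and target counts

    class GoalDecoder:
        """Toy decoder: picks the goal whose stored template of firing
        rates best matches the observed spike counts (nearest template)."""
        def __init__(self, templates):
            self.templates = templates          # shape (N_GOALS, N_CELLS)

        def decode(self, spike_counts):
            distances = np.linalg.norm(self.templates - spike_counts, axis=1)
            return int(np.argmin(distances))    # index of the inferred goal

    def record_spike_counts(rng):
        """Stand-in for the implanted array: spike counts for each cell."""
        return rng.poisson(lam=5.0, size=N_CELLS)

    def issue_device_command(goal):
        """Stand-in for the external device (robot arm, cursor, vehicle)."""
        print(f"move effector toward target {goal}")

    rng = np.random.default_rng(0)
    templates = rng.poisson(lam=5.0, size=(N_GOALS, N_CELLS)).astype(float)
    decoder = GoalDecoder(templates)
    issue_device_command(decoder.decode(record_spike_counts(rng)))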

Recent attempts by other laboratories to develop neural prosthetics have focused on decoding intended hand trajectories from motor cortical neurons.  We have instead concentrated on higher-level signals related to the goals of movements.  Using healthy monkeys with implanted electrode arrays, we recorded neural activity related to the animals’ intended goals and used this signal to position cursors on a computer screen without the animals making any overt movements.  Their performance in this task improved over a period of weeks.  Expected-value signals, related to fluid preference and to the expected magnitude or probability of reward, were decoded simultaneously with the intended goal.  For neural prosthetic applications, the goal signals can be used to operate computers, robots and vehicles, while the expected-value signals can be used to continuously monitor a paralyzed patient’s preferences and motivation.
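
As an illustration of this kind of goal decoding, the sketch below applies maximum-likelihood classification under a Poisson spiking assumption to simulated spike counts and reads out a simulated expected-value signal by least squares.  The tuning model, parameters, and data are invented for the example and are not the published algorithm.

    # Hedged sketch of maximum-likelihood goal decoding under a Poisson
    # spiking assumption; the tuning curves and trials are simulated.
    import numpy as np

    rng = np.random.default_rng(1)
    n_cells, n_goals, n_trials = 40, 4, 200

    # Simulated tuning: each cell has a mean firing rate for each goal.
    tuning = rng.uniform(2.0, 20.0, size=(n_goals, n_cells))
    goals = rng.integers(0, n_goals, size=n_trials)
    counts = rng.poisson(tuning[goals])          # spike counts per trial

    def decode_goal(x, tuning):
        """Goal g maximizing sum_i [x_i*log(lambda_gi) - lambda_gi]."""
        loglik = x @ np.log(tuning).T - tuning.sum(axis=1)
        return int(np.argmax(loglik))

    decoded = np.array([decode_goal(x, tuning) for x in counts])
    print("goal decoding accuracy:", np.mean(decoded == goals))

    # Expected value (e.g., reward probability) modeled here as a gain on
    # the firing rates and read out with ordinary least squares.
    value = rng.uniform(0.0, 1.0, size=n_trials)
    value_counts = rng.poisson(tuning[goals] * (1.0 + value[:, None]))
    w, *_ = np.linalg.lstsq(value_counts, value, rcond=None)
    print("value readout r:", np.corrcoef(value_counts @ w, value)[0, 1])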

Movable probes.  In collaboration with Joel Burdick’s laboratory at Caltech, we have developed a system that can autonomously position recording electrodes to isolate and maintain optimal-quality extracellular recordings.  It consists of a novel motorized microdrive and a control algorithm.  The microdrive uses very small piezoelectric actuators that can position electrodes with micron accuracy over a 5 mm operating range.  The autonomous positioning algorithm is designed to detect, align, and cluster action potentials, and then command the microdrive to move the electrodes to optimize the recorded signals.  This system has been shown to autonomously isolate single-unit activity in monkey cortex.  In collaboration with Yu-Chong Tai’s lab and the Burdick lab, we are now developing an even more compact system that uses electrolysis-based actuators designed to independently move a large number of electrodes in a chronically implanted array.
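
The control idea can be caricatured as a hill-climbing loop: measure an isolation-quality metric, step the electrode, and keep moves that improve the metric.  The sketch below is a deliberately simplified illustration with an assumed, simulated quality function; the actual algorithm, which detects, aligns, and clusters action potentials, is considerably more sophisticated.

    # Toy version of autonomous electrode positioning: climb a simulated
    # signal-quality (SNR) curve that peaks near an unknown cell depth.
    import numpy as np

    rng = np.random.default_rng(2)
    CELL_DEPTH_UM = 1730.0             # unknown to the controller

    def measured_snr(depth_um):
        """Simulated isolation quality, best when the tip is near the cell."""
        return (np.exp(-((depth_um - CELL_DEPTH_UM) / 60.0) ** 2)
                + 0.05 * rng.standard_normal())

    depth, step = 1500.0, 20.0         # starting depth and step size (microns)
    best_snr = measured_snr(depth)
    for _ in range(100):
        trial_depth = depth + step
        snr = measured_snr(trial_depth)
        if snr > best_snr:             # keep moves that improve isolation
            depth, best_snr = trial_depth, snr
        else:
            step = -0.5 * step         # back off, reverse, and shrink the step
        if abs(step) < 1.0:            # converged near a quality peak
            break
    print(f"settled at {depth:.0f} um (cell at {CELL_DEPTH_UM:.0f} um)")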

Coordinate frames.  Our laboratory also examines the coordinate frames of the spatial maps in parietal cortical areas that code movement intentions.  Recently, we have discovered that plans to reach are initially coded in the coordinates of the eye.  This is a particularly interesting finding because it means the reach plan at this stage is still rather primitive, coding the plan in a visual coordinate frame rather than in terms of the torques and forces required to make the movement.  We have also discovered that when the animal plans a limb movement to a sound, the movement is still coded in the coordinates of the eye.  This finding indicates that vision predominates in the spatial programming of movements in primates.

We have also been examining the coordinate frame for coordinated movements of the hand and eyes.  In the dorsal premotor cortex we find that a novel, “relative” coordinate frame is used for hand-eye coordination.  Neurons in this cortical area encode the position of the eye relative to the target, the position of the hand relative to the target, and the relative position of the hand with respect to the eye.  A similar relative coding may be used for other tasks that involve movements of multiple body parts, such as bimanual movements.
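
A simple way to picture these reference-frame variables is sketched below: given target, eye, and hand positions expressed in a common frame, the eye-centered, hand-centered, and relative hand-eye quantities are just the corresponding difference vectors.  The 2-D coordinates are arbitrary example values.

    # Reference-frame variables from the paragraph above, for assumed
    # 2-D positions expressed in a common (e.g., body-centered) frame.
    import numpy as np

    T = np.array([12.0,  5.0])   # reach target position
    E = np.array([ 3.0,  2.0])   # eye (gaze fixation) position
    H = np.array([-4.0, -1.0])   # hand position

    target_re_eye  = T - E       # target in eye-centered coordinates
    target_re_hand = T - H       # target in hand-centered coordinates
    hand_re_eye    = H - E       # relative position of the hand and the eye

    print(target_re_eye, target_re_hand, hand_re_eye)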

Motion perception.  Another major effort of our lab is to examine the neural basis of motion perception.  One series of experiments is determining how optic flow signals and efference copy signals about eye movements are combined to perceive the direction of heading during self-motion.  These experiments are helping us understand how we navigate as we move through the world.  A second line of investigation asks how motion information is used to construct the three-dimensional shape of objects.  We asked monkeys to report which way they perceived an ambiguous object to be rotating.  We found an area of the brain where the neural activity changed according to what the monkey perceived, even though the animal was always viewing the same stimulus.  In other experiments we have been examining how we rotate mental images of objects in our minds, so-called mental rotation.  In the posterior parietal cortex we find that these rotations are made in a retinal coordinate frame rather than an object-based coordinate frame, and that the mental image of the object rotates through this retinotopic map.
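
The heading computation can be sketched in a few lines: simulated retinal flow is the sum of a translational component and a rotation-induced component, the efference copy supplies the eye rotation so that the rotational flow can be subtracted, and the heading is then recovered as the focus of expansion of the residual field.  The camera model, sign conventions, and parameters below are illustrative assumptions, not the analysis used in the experiments.

    # Sketch: recover heading from optic flow after removing the flow
    # caused by a pursuit eye movement, known from the efference copy.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200
    x = rng.uniform(-0.5, 0.5, n)          # image coordinates (focal length 1)
    y = rng.uniform(-0.5, 0.5, n)
    Z = rng.uniform(2.0, 10.0, n)          # depths of the scene points
    Tx, Ty, Tz = 0.2, 0.0, 1.0             # observer translation
    wy = 0.05                              # eye rotation about the vertical axis

    # Retinal flow = translational component + rotation-induced component.
    u = (x * Tz - Tx) / Z - (1 + x ** 2) * wy
    v = (y * Tz - Ty) / Z - (x * y) * wy

    # The efference copy provides wy, so the rotational flow is subtracted.
    u_t = u + (1 + x ** 2) * wy
    v_t = v + (x * y) * wy

    # The residual field expands radially from the heading point
    # (Tx/Tz, Ty/Tz); each vector constrains that point to lie on a line.
    A = np.column_stack([v_t, -u_t])
    b = v_t * x - u_t * y
    heading = np.linalg.lstsq(A, b, rcond=None)[0]
    print("recovered heading:", heading, " true heading:", (Tx / Tz, Ty / Tz))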

Local field potentials.  The cortical local field potential (LFP) is a summed signal of excitatory and inhibitory dendritic potentials that has recently attracted increasing interest.  We found that LFP signals in the saccade and reach regions provide information about the direction of planned movements as well as the behavioral state of the animal (e.g., baseline, planning a saccade, planning a reach, executing a saccade, or executing a reach).  This evidence provides further support for a role of the parietal cortex in movement planning.  It also shows that LFPs can be used for neural prosthetic applications.  Since LFP recordings from implanted electrode arrays are more robust and degrade less over time than single-cell recordings, this application is of enormous practical importance.
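
As a toy illustration of how such information might be read out of the LFP, the sketch below computes band power from simulated trials and classifies the behavioral state with a nearest-centroid rule.  The 30 Hz band, the simulated signals, and the classifier are assumptions chosen for the example, not the analysis used in the lab.

    # Toy LFP state decoding: band power per trial + nearest centroid.
    import numpy as np

    rng = np.random.default_rng(4)
    fs, dur = 1000, 1.0                    # sampling rate (Hz), trial length (s)
    t = np.arange(0, dur, 1 / fs)

    def simulate_lfp(state):
        """Simulated LFP: 'planning' trials carry extra power near 30 Hz."""
        amp = 2.0 if state == "planning" else 0.5
        return amp * np.sin(2 * np.pi * 30 * t) + rng.standard_normal(t.size)

    def band_power(sig, lo, hi):
        freqs = np.fft.rfftfreq(sig.size, 1 / fs)
        power = np.abs(np.fft.rfft(sig)) ** 2
        return power[(freqs >= lo) & (freqs <= hi)].mean()

    states = ["baseline", "planning"] * 50
    feats = np.array([band_power(simulate_lfp(s), 25, 35) for s in states])
    labels = np.array(states)

    # Nearest-centroid rule on the single band-power feature.
    centroids = {s: feats[labels == s].mean() for s in set(states)}
    pred = [min(centroids, key=lambda s: abs(f - centroids[s])) for f in feats]
    print("state classification accuracy:", np.mean(np.array(pred) == labels))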

fMRI in monkeys.  We have successfully performed functional magnetic resonance imaging (fMRI) experiments in awake, behaving monkeys.  This development is important because this type of experiment is performed routinely in humans and monitors changes in blood flow during different cognitive and motor tasks.  A direct correlation of neural activity with blood flow, however, cannot be achieved in humans but can be in monkeys.  Thus, correlating cellular recordings with fMRI activation in monkeys will give us a better understanding of the many experiments currently being performed in humans.  A 4.7 Tesla vertical magnet for monkey imaging has recently been installed in the new imaging center at Caltech.  We are using this magnet, combined with neural recordings, to examine the correlation between neural activity and fMRI signals.
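
The kind of comparison these experiments allow can be sketched as follows: convolve a neural activity trace with a canonical hemodynamic response function and correlate the prediction with the measured BOLD time series.  The double-gamma HRF, the sampling parameters, and the simulated data below are generic textbook assumptions, not results from these experiments.

    # Sketch: predict a BOLD time series from neural activity by
    # convolving with a canonical double-gamma HRF, then correlate.
    import numpy as np
    from math import gamma

    rng = np.random.default_rng(5)
    tr, n_vols = 2.0, 150                  # repetition time (s), volumes
    t = np.arange(0, 30, tr)               # HRF support (s)

    def double_gamma_hrf(t):
        """Canonical-form HRF: peak near 5 s, undershoot near 15 s."""
        a1, a2, c = 6.0, 16.0, 1.0 / 6.0
        return (t ** (a1 - 1) * np.exp(-t) / gamma(a1)
                - c * t ** (a2 - 1) * np.exp(-t) / gamma(a2))

    neural = (rng.uniform(size=n_vols) < 0.2).astype(float)   # event-like activity
    predicted_bold = np.convolve(neural, double_gamma_hrf(t))[:n_vols]
    measured_bold = predicted_bold + 0.1 * rng.standard_normal(n_vols)

    r = np.corrcoef(predicted_bold, measured_bold)[0, 1]
    print(f"neural-BOLD correlation: r = {r:.2f}")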

