Tricking Touch with Plaids

Imagine yourself at a street corner, watching cars go by and waiting for your turn to cross. When the eye tracks a moving object like a car, it inspires fireworks of activity in the visual systems of the brain. Initially, the information is pixelated into independent scraps, as primary visual neurons respond to their preferred shapes (edges, corners, and points on the car) and directions. But in a subconscious flash, the brain performs the complex calculations needed to transform those individual elements into a complete picture that accurately predicts the true movement of the car: to the left, to the right, or straight AT ME!

The laboratory of Sliman Bensmaïa, assistant professor of organismal biology and anatomy at the University of Chicago, has demonstrated that the brain processes the basic elements of touch very much like it handles visual information. Neurons in the somatosensory (the fancy neurobiology name for touch) system show directional selectivity just like primary visual neurons. Drag a pen cap along a finger, and one neuron responds to rightward movement, while another responds to the cap moving left.

Collecting information about the outside world in the same way presumably allows for easier communication between the visual and somatosensory systems in the brain, which would be important for tasks that require both seeing and touching. But do the similarities extend beyond the initial gathering of sensory information, including how that piecemeal data is assembled by the brain into a complete picture?

In a 2008 paper in the Proceedings of the National Academy of Sciences, Bensmaïa and collaborators Yu-Cheng Pei and Steven Hsiao determined the answer was yes. Complex stimuli such as the “barber pole illusion” or moving plaids are often used by scientists to test the limits of the visual system and determine which computational strategy the brain uses to predict the overall direction of motion from a bunch of sometimes contradictory signals. When the same complex patterns were recreated as a touch stimulus (a process that requires a multi-million-dollar one-of-a-kind machine that controls 400 tiny pins with 400 separate motors), the experiments suggested that the somatosensory system uses the same computational strategy as vision.

But upon further reflection, Bensmaïa and his team wondered if those tests were challenging enough to truly reveal the brain’s computational abilities. The plaids used in the PNAS paper, called Type I plaids, consisted of two gratings moving in different directions at the same speed – if one grating drifts northeast and the other southeast, both the eye and the finger perceive the in-between direction: east.

Yet that test failed to distinguish between the two most popular models for motion processing. In the simpler model, the brain merely averages all of the individual directions it perceives and takes the result as the “true direction.” In the more complex model, the brain uses an algorithm to compute the direction from the motion of individual edges. The motion of one edge constrains which directions the whole object could be moving – if you’re watching a car moving to the right, none of its edges will be moving left. In fact, only two edges are needed to calculate the true direction of motion for an object, but this process requires more complicated computation than mere averaging. Yet the simple model can also give the right answer for a complex stimulus if the brain weights more heavily the particular elements, called end-stop terminators, that give more accurate information about the overall direction – in the case of the plaids, the corners where the two gratings intersect.
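
To make the two strategies concrete, here is a minimal sketch in Python with illustrative numbers (the angles and speeds are not taken from the papers). The simple model averages the component velocity vectors of the two gratings; the complex model solves the constraint each grating places on the overall velocity, a computation known in the vision literature as the intersection-of-constraints rule. For a Type I plaid the two predictions agree; for a plaid whose gratings move at different angles and speeds, they diverge.

```python
import numpy as np

def component_velocities(angles_deg, speeds):
    """Each grating drifts along its own normal: a unit vector at its
    drift angle, scaled by that grating's speed."""
    angles = np.radians(angles_deg)
    normals = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    return normals, normals * np.asarray(speeds, float)[:, None]

def vector_average(angles_deg, speeds):
    """Simple model: just average the component velocity vectors."""
    _, vels = component_velocities(angles_deg, speeds)
    return vels.mean(axis=0)

def intersection_of_constraints(angles_deg, speeds):
    """Complex model: find the single pattern velocity v consistent with
    every grating's constraint v . n_i = s_i (a 2x2 linear system)."""
    normals, _ = component_velocities(angles_deg, speeds)
    return np.linalg.solve(normals, np.asarray(speeds, float))

def direction_deg(v):
    return np.degrees(np.arctan2(v[1], v[0])) % 360

# Type I plaid: gratings at +45 and -45 degrees, equal speeds.
# Both models predict motion straight to the right (0 degrees, "east").
print(direction_deg(vector_average([45, -45], [1.0, 1.0])))
print(direction_deg(intersection_of_constraints([45, -45], [1.0, 1.0])))

# Type II-style plaid: gratings at 135 and 160 degrees, unequal speeds.
# Now the two models disagree (roughly 142 vs. 85 degrees here).
print(direction_deg(vector_average([135, 160], [1.0, 0.4])))
print(direction_deg(intersection_of_constraints([135, 160], [1.0, 0.4])))
```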

To distinguish between these two models, more complex plaids were needed. In a paper published last week in Neuron, Bensmaïa’s team used Type II plaids (example at left), where the two gratings moved at different speeds, to try to confuse the sensory systems even further. This time, vision and touch were not found to be created equal.

“When we presented this plaid we fully expected the same thing would happen in vision and touch, and the truth of the matter is: it didn’t,” Bensmaïa said. “We found a tactile illusion.”

Even with the more complex plaids, the visual system could not be fooled about the true direction, indicating that the brain either used the more complex model or a simple model that prioritized the terminators. The somatosensory system, however, was tricked and perceived a different direction than the visual system – an illusion that can be modeled if the brain weights the edges and corners in the plaid equally and averages the two types to predict motion, rather than concentrating on the more reliable terminators.
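
As a rough illustration of that weighted-average idea – a toy sketch, not the model actually fit in the Neuron paper – the percept can be treated as a weighted average of the directions signaled by the gratings’ edges and by the terminators, with the terminator weight as the only knob. The directions and weights below are hypothetical.

```python
import numpy as np

def perceived_direction(edge_dirs_deg, terminator_dir_deg, w_terminator):
    """Weighted average of unit vectors pointing in the edge (component)
    directions and the terminator (true pattern) direction.
    w_terminator is the share of the weight given to the terminators."""
    dirs = np.radians(list(edge_dirs_deg) + [terminator_dir_deg])
    w_edge = (1.0 - w_terminator) / len(edge_dirs_deg)
    weights = np.array([w_edge] * len(edge_dirs_deg) + [w_terminator])
    vec = (weights[:, None] * np.stack([np.cos(dirs), np.sin(dirs)], axis=1)).sum(axis=0)
    return np.degrees(np.arctan2(vec[1], vec[0])) % 360

# Vision-like weighting: terminators dominate, so the percept stays
# near the true pattern direction (85 degrees in this made-up example).
print(perceived_direction([135, 160], 85, w_terminator=0.9))

# Touch-like weighting: edges and terminators count about equally,
# and the percept is pulled toward the average of the edge directions.
print(perceived_direction([135, 160], 85, w_terminator=0.33))
```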

Nevertheless, one interpretation of the result is that vision and touch are computed by the brain in largely the same fashion – both using the simple model, differing only in how heavily the informative terminators are weighted. That difference only matters for certain types of complex stimuli, like the plaids, and may reflect the more nuanced tasks that the visual system must perform, such as depth perception and transparency. For the somatosensory system – where the depth of an object is not hard to calculate, because you are touching it – such advanced computations are irrelevant.

“It’s important to know something is moving relative to your hand, for instance when it slips, but it doesn’t matter what direction it’s moving when it slips, because you’re going to do the same thing regardless – you’re going to tighten your grip on it,” Bensmaïa said.

In evolutionary terms, this set-up allowed the brain to duplicate its work and use similar computations for two different systems. In the case of vision, the computations are tricked out with some extra features to handle more complex tasks. But the two sensory systems prove themselves more similar than different, provided no sneaky plaids come along.

=====

Pei YC, Hsiao SS, Craig JC, & Bensmaia SJ (2011). Neural mechanisms of tactile motion integration in somatosensory cortex. Neuron, 69(3), 536–547. PMID: 21315263

About Rob Mitchum
Rob Mitchum is communications manager at the Computation Institute, a joint initiative between The University of Chicago and Argonne National Laboratory.