Luke Skywalker’s Hand and How Touch is Like Vision

Perhaps the most famous neuroprosthetic device in movie history shows up at the end of The Empire Strikes Back. In the final scene, Luke Skywalker is fitted for a new robotic hand to replace the one so cruelly lopped off by (spoiler alert!) his father’s lightsaber. To test out the new hand, Luke first flexes it a couple of times, then allows a droid to poke it in several places with a thin needle. That latter part is actually an important test, verifying the sensory ability of the prosthetic to mimic a real hand’s response to pain or pressure.

Last week, we wrote about BrainGate, a neuroprosthetic that allowed some quadriplegic subjects to control a robotic arm with their brain activity. But a successful prosthetic limb, whether for a quadriplegic or an amputee, would need not just motor control, but also some semblance of sensory feedback replicating the ability to touch and to sense the limb’s position in space. Imagine if you could move your hand but couldn’t feel it – tasks such as picking up a coffee cup without looking, catching a ball tossed your way, or pulling your hand off a hot stove would become considerably more difficult. So while some scientists focus on the commands that travel from brain to hand and direct movement, others study the messages from hand to brain that convey touch information.

That’s the research goal of Sliman Bensmaia, assistant professor of organismal biology and anatomy, who provided the excellent Star Wars example. Bensmaia, who joined the University of Chicago faculty last summer, studies the neural basis of perception: how the sensation of touch is represented in the brain. Some of his most interesting studies have found ways in which the somatosensory system (the neurobiological term for the sense of touch) resembles other sensory systems such as hearing and vision.

In a paper published last month in PLoS Biology, Bensmaia and colleagues identified in the somatosensory system a mechanism already famous in the visual system. Visual direction selectivity is a classic topic in neuroscience courses, based on experiments performed by David Hubel and Torsten Wiesel, part of work on the visual system that eventually won them the Nobel Prize in Physiology or Medicine in 1981. Recording from the visual cortex of cats as they were shown different stimuli, Hubel and Wiesel found certain neurons that were only activated when the stimulus moved in a particular direction. So when a dot or a bar was moved from left to right, a particular neuron would rapidly fire. But move the same dot or bar right to left, and that same neuron would be silent.

A degree of direction selectivity has been recognized in the somatosensory system since the 1970s, but Bensmaia’s experiments characterized just how selective these neurons truly are. Using an array of 400 pins (like a miniature version of those pin art toys everyone had as a kid, except with each pin under independent motor control), researchers stimulated the fingertips of rhesus monkeys with a variety of shapes scanned in all directions and recorded from somatosensory cortex. The experiments revealed neurons that were highly selective for direction in three different areas of somatosensory cortex. In one area (dubbed Area 1), the directional tuning was so strong in some neurons as to outweigh all other features of the stimulus – as long as the object was moving in the preferred direction, the neuron didn’t care if the object was a bar or a dot or a series of dots.
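The actual analyses in the paper are far more sophisticated, but the basic idea of a direction-tuned neuron can be sketched with a toy model. Everything below – the function name, the cosine-shaped tuning curve, the firing rates – is illustrative and not taken from the study:

```python
import math

def direction_selective_response(preferred_deg, stimulus_deg, max_rate=100.0):
    """Toy tuning curve: firing rate is highest when the stimulus moves in
    the neuron's preferred direction, and the neuron stays silent when the
    stimulus moves in the opposite direction."""
    diff = math.radians(stimulus_deg - preferred_deg)
    # Cosine tuning, rectified at zero so anti-preferred motion gives no response
    return max_rate * max(0.0, math.cos(diff))

# A neuron that prefers left-to-right motion (0 degrees):
for angle in (0, 90, 180):
    rate = direction_selective_response(preferred_deg=0, stimulus_deg=angle)
    print(f"motion at {angle} deg -> {rate:.1f} spikes/s")
```

Note that in this toy version, as in the Area 1 neurons described above, the response depends only on motion direction; a real model would also have to account for how strongly shape and speed modulate each cell.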

Such directional selectivity would help a person detect when an object they are touching is moving, and would allow the brain to focus on that one property when necessary, Bensmaia said.

“These neurons really produce the same response no matter what the shape of the stimulus is. They’ve sort of abstracted all information on shape and focused entirely on motion,” Bensmaia said. “If a neuron responds the same no matter what the shape and its response only depends on motion, the response of that neuron becomes completely unambiguous, and that’s useful. When the higher levels of the brain are interested in decoding just the motion of an object, they can pay attention to only these neurons and ignore all the others.”

The result also shows that touch is processed in the brain very much like vision, an overlap that probably isn’t a coincidence. Aside from the evolutionary economy of reusing the same machinery for multiple purposes, touch and vision also share the common problem of inferring shape and motion from a two-dimensional receptor sheet – the retina and the skin, respectively. It’s important to combine information about how that apple on your desk both looks and feels before you bite into it, so why not process the two senses using the same language?

“At some point, visual representations have to communicate with tactile representations,” Bensmaia said. “When I see this object and I want to grab it, I have this visual representation of its shape and when I touch it I have a tactile representation of its shape and somehow these two representations have to communicate.”

Understanding the common language of these sensory systems and decoding the neural code of touch will help scientists to bring Luke Skywalker’s hand closer to reality, Bensmaia said. In theory, motion detectors could be placed on the neuroprosthetic that, when stimulated, would send the proper signal to the brain to give the person a sensation of having felt a moving object. Experiments to test a primitive version of that workaround will take place soon in Bensmaia’s laboratory, as they attempt to “trick” monkeys into perceiving motion by directly activating motion-sensitive neurons in somatosensory cortex.

“It’s the final piece of the puzzle to demonstrate that these neurons are critical in the perception of tactile motion,” Bensmaia said. “If we stimulate these neurons and evoke the perception of motion, then you can attach motion processors on a prosthesis and electrically stimulate these neurons every time there’s motion. That would hopefully give an amputee a conscious experience of motion in a specific direction.”
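The feedback loop Bensmaia describes – a motion processor on the prosthesis triggering stimulation of the matching direction-tuned neurons – can be sketched in a few lines. This is purely a hypothetical illustration of the idea; the electrode labels, the preferred directions, and the matching rule are all invented for this example:

```python
# Hypothetical mapping from implanted electrodes to the preferred
# direction (in degrees) of the neurons each one stimulates.
PREFERRED_DIRECTIONS = {"e1": 0, "e2": 90, "e3": 180, "e4": 270}

def angular_distance(a, b):
    """Smallest angle between two directions, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def choose_electrode(motion_deg):
    """When the prosthesis senses motion across the fingertip, pick the
    electrode whose neurons prefer the closest direction, so stimulating
    it would (in principle) evoke a sensation of motion that way."""
    return min(PREFERRED_DIRECTIONS,
               key=lambda e: angular_distance(PREFERRED_DIRECTIONS[e], motion_deg))

# An object slides across the sensor at roughly 10 degrees:
print(choose_electrode(10))  # nearest preferred direction is 0 deg -> "e1"
```

The real experiments face much harder problems, of course – delivering stimulation patterns that neurons actually interpret as motion is exactly what the monkey studies are meant to test.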

And, presumably, restore their lightsaber skills as well.

About Rob Mitchum
Rob Mitchum is communications manager at the Computation Institute, a joint initiative between The University of Chicago and Argonne National Laboratory.
