More than moving: Somatosensation in brain-controlled prosthetics

An invention that has for decades been used in science-fiction movies[i] to portray the technological advance of futuristic civilizations seems ever closer: the replacement of a lost limb (here, the arm) with a bionic one that restores much of the functionality of a biological human limb[ii]. For example, first experiments with prosthetic devices that receive control signals directly from the nerves innervating the remaining part of the limb have shown a remarkable recovery of dexterity[iii].

There is, however, a group of patients for whom these approaches are not feasible: tetraplegia[iv] is a severe condition in which patients, due to damage to the spinal cord or the brain, lose both control of and sensation in their limbs. Generalizing to some extent, this means that the signals the brain sends downstream to move the limbs do not reach their target. Likewise, the “feedback” it receives (i.e. sensory information) gets lost on the way. Hence, any prosthetic device designed for tetraplegic patients has to bridge this gap in the nervous system.

New technological advances are trying to give such patients back the possibility of interacting independently with the world: by creating robotic arms that are controlled by motor signals inferred directly from the user’s brain, researchers hope that tetraplegic patients can one day use these arms to interact with objects as skillfully as with a human hand.

Frequent visitors of this site will have read a recent post on this blog that gave a great introduction to such Brain-Machine-Interface[v] (BMI) prosthetic devices (if you haven’t read it, check it out now). In short, a Brain-Machine-Interface is any device that derives a signal from brain activity and transfers this signal to a machine. In the case discussed here, the signals of interest are the movement intentions a patient has for their prosthesis.

Here, I will give you a follow-up on the topic, discussing some recent developments in BMI prosthetic hands. To be precise: how could they include a sense of touch? And how could that increase the dexterity of the arm?

The importance of touch for skillful movements

One feature that many modern BMI applications have not yet employed is somatosensory[vi] feedback from the prosthetic hand. Most devices[vii] work in a one-way fashion, replacing the efferent[viii] signals that normally travel from the cortex downstream to the muscles with “decoded[ix]” signals to the prosthetic hand. That is, they look for activity related to the output the brain would normally send to the limbs and use this brain activity to infer what movements the prosthetic device should make. This top-down aspect, however, is only one part of movement control in subjects without injuries or diseases. Another important part of control is feedback.
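To give an intuition for what this “decoding” means in practice, here is a minimal sketch in Python. Real decoders, like the Kalman-filter approach in the Kim et al. (2008) study cited above, are recursive and far more careful; the linear readout, the simulated firing rates, and the two-dimensional velocity below are illustrative assumptions only.

import numpy as np

# Minimal sketch: decode a 2-D velocity command from the firing rates
# of n recorded neurons with a linear readout fit by least squares.
# All data here are simulated for illustration.
rng = np.random.default_rng(0)
n_neurons, n_samples = 96, 500

true_velocity = rng.standard_normal((n_samples, 2))   # calibration movements
tuning = rng.standard_normal((n_neurons, 2))          # each neuron's preferred direction
rates = true_velocity @ tuning.T + 0.5 * rng.standard_normal((n_samples, n_neurons))

# Fit readout weights W so that rates @ W approximates the velocities
W, *_ = np.linalg.lstsq(rates, true_velocity, rcond=None)

# "Online" use: a new vector of firing rates becomes a velocity command
decoded_velocity = rates[0] @ W
print("decoded velocity command:", decoded_velocity)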

Indeed, there is no doubt about the importance of feedback from tactile afferents[x], e.g. for the manipulation[xi] of objects. Imagine a relatively simple task: you have to pick up an item, let’s say a coin, turn it around between your fingers, and put it inside a box. This sounds easy, but a lot could go wrong. You might miss the coin, reaching somewhere away from it. You might apply too little grip, so that the coin slips away when you turn it. Or you might apply an unnecessary amount of grip. The list could go on, but the important point is this: if we just had “one-way” control over our hand, without feedback, we could not correct such mistakes as they happen. Of course, there is also other information, e.g. from vision, but this is not enough: for example, one does not visually perceive on a fine scale how much force one’s own fingertips are applying to an object. So there clearly seems to be a crucial role for somatosensation in this case.

Johansson & Flanagan (2009)[xii] give a simple conceptualization of how sensory information is involved in a task like this: The whole process of the voluntary manipulation can usually be discretized in different subgoals. In our example above, reaching for the coin, picking it up etc. For each of this subgoals, there is a corresponding sensation – e.g. you have reached the coin when you feel your fingertips touching the coin, you have established a stable grip when it does not slip while moving. The idea is that the brain already predicts this sensations signaling the achievement of a subgoal when the motor command is issued. This then allows for fast correction of the movement, when a deviation of the predicted sensation is experienced.  Although there are[xiii] other[xiv] theories[xv] on how exactly somatosensory and proprioceptive[xvi] feedback are integrated into the control of movements, the consensus is that without the sensory feedback, erroneous movements cannot be as efficiently corrected.

How essential the somatosenses are becomes drastically clear in clinical cases: there are reports of people losing[xvii] all somatosensation from the neck down. Although still able to control their muscles and perform movements, they initially fail at everyday tasks like buttoning a shirt or drinking a cup of coffee, and have to learn to compensate with their other senses and their memory. Daily routines they once performed without effort now need constant monitoring, e.g. by vision. Often the movements do not come out as smoothly as with an intact sense of touch; as noted before, the somatosenses convey rich information, some of which is hardly available through seeing. This leads to the question: how is this variety of information actually achieved?

When we manipulate an object with our hand, a rich set of tactile features of the object becomes available to our somatosensory cortices, e.g. information about the object’s spatial features, its dynamics, the texture of its surface, and so on. To capture these different aspects of somatosensation, there are different types of receptors[xviii] and afferent[xix] fibers, each with its own spatial and temporal sensitivity. The following table gives you a short overview of these types, the stimulus qualities they are sensitive to, and how they are distributed in the hand; to really discuss them would take a whole blog post.

Types of cutaneous mechanoreceptors in the hand and their response properties
The glabrous skin contains mainly four different types of fibers/nerve endings, each with its own spatial and temporal sensitivity. That is, they respond to either high or low frequencies of (mechanical) stimulation (top vs. bottom) and have either fine or larger receptive fields (left vs. right). By integrating information from the different receptor types, a variety of information about the properties of an object is acquired, e.g. texture (grip) or motion direction. Adapted from Johansson & Flanagan (2009).
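Since the table itself is an image, here is its gist in plain text, summarized from the figure in Johansson & Flanagan (2009):

                         Fine receptive fields             Larger receptive fields
Fast-adapting            FA-I (Meissner endings):          FA-II (Pacinian endings):
(dynamic stimulation)    skin motion, grip adjustment,     high-frequency vibration,
                         slip detection                    mechanical transients
Slow-adapting            SA-I (Merkel endings):            SA-II (Ruffini endings):
(sustained stimulation)  sustained pressure, edges,        skin stretch, hand and
                         fine spatial detail               finger conformation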

Eliciting percepts via cortical microstimulation

We now have an idea of why the somatosenses are important for restoring motor function. We will get back to the receptors in the hand later; first, let us return to the principal question of this post:

How can you elicit feelings of touch in people in whom the connection between the peripheral receptors and the brain is lost, e.g. in tetraplegic patients? Some studies are following an ambitious approach: by implanting electrodes[xx] in the cortex, they try to elicit “natural” haptic sensations directly in the brain. The basic ideas motivating this technique of intracortical microstimulation (ICMS) are of course not new: more than 80 years ago, during neurosurgeries[xxi], it was already shown that by stimulating certain surface areas of the primary somatosensory cortex (S1), patients would feel e.g. tingling sensations in their bodies. Importantly, these sensations are spatially organized in a well-ordered manner; the image of the cortical homunculus[xxii] has become popular knowledge. This somatotopy of the cortex gives researchers hope of using the spatial mapping to specifically target[xxiii] the cortical area corresponding to a stimulation site on the prosthetic device.

Box 1: Application of Intracortical Microstimulation

That means a prosthetic hand with feedback implementation would contain sensors at various positions in the fingers, the palm, the forearm, etc. When the device makes contact with an object, the corresponding sensor activity is encoded in a signal that mimics the signal the biological receptors in the hand (as introduced above) would produce when interacting with the object in the same way. This signal then triggers ICMS, which is delivered through small electrode arrays implanted directly into the cortex (see Box 1, Box 2). This finally elicits a sensation that is correctly localized to the area of contact and ideally represents qualitative aspects of the stimulus.

Box 2: Schematic illustration of a BMI employing ICMS
Intended movements for object manipulation are decoded from cortical signals. Interaction with the object generates a signal in the tactile sensors of the prosthetic device. From this signal, the corresponding stimulation routine for somatosensory areas is computed. This finally leads to somatosensation and its intracortical propagation via natural pathways, i.e. feedback about the movement.
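To illustrate what the step from sensor signal to stimulation routine could look like, here is a deliberately simple sketch: the force measured by a sensor is mapped linearly onto a stimulation amplitude between a detection threshold and a safety limit, and routed to the electrode whose evoked sensation matches the sensor’s location. All numbers, names, and the linear mapping are assumptions for illustration; real systems use carefully calibrated, safety-constrained encoding schemes.

# Hypothetical encoding: fingertip sensor force -> ICMS pulse amplitude.
DETECTION_THRESHOLD_UA = 10.0   # assumed minimal perceptible amplitude (microamps)
SAFETY_LIMIT_UA = 60.0          # assumed upper bound for safe stimulation
MAX_FORCE_N = 10.0              # sensor reading that maps to full amplitude

def encode_amplitude(force_n):
    """Map a sensor force (newtons) onto a stimulation amplitude (microamps)."""
    if force_n <= 0.0:
        return 0.0                        # no contact, no stimulation
    level = min(force_n / MAX_FORCE_N, 1.0)
    return DETECTION_THRESHOLD_UA + level * (SAFETY_LIMIT_UA - DETECTION_THRESHOLD_UA)

# Made-up mapping from sensors to the electrodes covering the same skin region
electrode_for_sensor = {"index_tip": 12, "middle_tip": 27}

readings = {"index_tip": 2.5, "middle_tip": 0.0}  # newtons, one control cycle
for sensor, force in readings.items():
    amplitude = encode_amplitude(force)
    if amplitude > 0.0:
        print(f"stimulate electrode {electrode_for_sensor[sensor]} at {amplitude:.1f} uA")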

A number of animal studies provide promising results on the feasibility of this approach: by implanting small microelectrode arrays in area 3b of S1, Romo et al. (1998)[xxiv] could specifically target the fingertips of monkeys. The monkeys had previously been trained to discriminate two “naturally” (i.e. by mechanical force on the skin) evoked stimuli at the same location by their frequencies. When one of the two stimuli was replaced with the corresponding intracortical stimulation, the monkeys could still perform the task reliably.
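At its core, the artificial stimulus in such an experiment is just a periodic train of current pulses whose repetition rate carries the frequency to be discriminated. A minimal sketch (duration and frequencies are illustrative values in the flutter range the study worked with):

import numpy as np

def pulse_onsets(frequency_hz, duration_s=0.5):
    """Onset times (s) of a periodic ICMS pulse train at a given frequency."""
    return np.arange(0.0, duration_s, 1.0 / frequency_hz)

# Two trains for the monkey to discriminate, e.g. 20 Hz vs. 28 Hz
print(pulse_onsets(20.0)[:5])   # [0.   0.05 0.1  0.15 0.2 ]
print(pulse_onsets(28.0)[:5])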

With a similar ICMS approach, O’Doherty et al. (2011)[xxv] also provided first evidence that this could actually add functionality to a BMI. They taught monkeys to perform a task in which they had to control a hand in virtual reality by moving a joystick and then explore a set of items to find the one yielding a reward. The important point is that the information needed to perform the task lay in the “tactile” properties of the virtual objects, and these tactile properties were conveyed via ICMS electrodes in S1. Later on, the joystick was disconnected from the virtual hand and control was exerted directly by decoding motor cortex output (while the monkey was still moving the joystick). Hence, the loop was closed and what might be called a “Brain-Machine-Brain-Interface” was achieved.

In a recent pioneering study[xxvi], this was also achieved in a tetraplegic patient, who had acquired the condition through a spinal cord injury. Microelectrode arrays were implanted in S1, close to areas identified prior to implantation as corresponding to the patient’s right hand. After training, activation of some of the electrodes was consistently felt by the patient at the base of his fingers D2 to D5 (i.e. from the index finger to the pinky). The intensity of the sensation could be manipulated by adjusting the amplitude of the stimulation. Moreover, the patient rated these stimulations as somewhat close to a natural sensation of touch, as far as he could judge. This is an important feature of the stimulation, as in previous human studies that stimulated the cortical surface, the sensations were often rated as tingling or diffuse[xxvii].

Then the prosthetic hand the patient was already using was equipped with a “sense of touch”: by converting the torque exerted on a finger of the prosthetic hand into a signal to the intracortical electrodes, the researchers could elicit sensations just by touching the prosthetic hand. How did this work out? See for yourself (from minute 3:10):

https://www.youtube.com/watch?v=L1bO-29FhMU

As you can see, the combination of BMI and ICMS has already proven worth investigating. So what’s next? As mentioned before, there are different types of receptors in the hand that are sensitive to different qualities of touch. Likewise, the receptive fields in the somatosensory cortex seem to contain neurons that are sensitive to different aspects of stimulation. Some of these correspond quite directly to afferent properties[xxviii] (e.g. slowly adapting vs. quickly adapting neurons), while others seem to be sensitive to information that only becomes available at a higher level, like stimulus orientation[xxix] or motion direction[xxx]. As these neurons seem to be spatially grouped (i.e. neurons close to each other respond to similar stimulus qualities), it would in principle be possible[xxxi] to independently stimulate neurons sensitive to different qualities of an object. This way, patients could come to feel more aspects of the world they are interacting with than before.

Another ongoing field of research concerns how to encode touch at the level of the sensors in the hand. To make the signals from those sensors mimic the signals from the receptors in a natural hand as closely as possible, researchers need accurate models[xxxii] of how the latter are formed in the hand. These models try to capture the complex biophysical properties of the human hand and could then be applied to the robotic hand to create similar signals.
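As a cartoon of what such a model captures (the published models, e.g. Saal et al. (2017), simulate entire afferent populations with millisecond precision; the stimulus, gains, and filtering below are invented for illustration): slowly adapting afferents track sustained skin indentation, while fast-adapting afferents respond mainly to its rate of change.

import numpy as np

# Cartoon afferent model: from a skin-indentation trace to firing rates.
dt = 0.001                                   # 1 ms resolution
t = np.arange(0.0, 1.0, dt)
indentation = np.clip(np.sin(2 * np.pi * t), 0.0, None)   # toy press-and-release (mm)

sa_rate = 80.0 * indentation                          # slowly adapting: sustained depth
fa_rate = 5.0 * np.abs(np.gradient(indentation, dt))  # fast adapting: transients only

print(f"peak SA rate: {sa_rate.max():.0f} spikes/s")
print(f"peak FA rate: {fa_rate.max():.0f} spikes/s")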

To conclude, there is still a lot of work to do until tetraplegic patients can regain a sense of touch via a BMI. But a lot of exciting scientific progress is being made these days, and maybe the day when touch is restored is closer than we think.


[i] https://www.youtube.com/watch?v=cik8cl_n9AE

[ii] Schultz, A. E., & Kuiken, T. A. (2011). Neural interfaces for control of upper limb prostheses: the state of the art and future possibilities. PM&R, 3(1), 55-67.

[iii] https://www.youtube.com/watch?v=xKUn0-Bhb7U

[iv] https://en.wikipedia.org/wiki/Tetraplegia

[v] https://en.wikipedia.org/wiki/Brain%E2%80%93computer_interface

[vi] https://en.oxforddictionaries.com/definition/somatosensory

[vii] Lebedev, M. A., & Nicolelis, M. A. (2006). Brain–machine interfaces: past, present and future. TRENDS in Neurosciences, 29(9), 536-546.

[viii] https://en.wikipedia.org/wiki/Efferent_nerve_fiber

[ix] Kim, S. P., Simeral, J. D., Hochberg, L. R., Donoghue, J. P., & Black, M. J. (2008). Neural control of computer cursor velocity by decoding motor cortical spiking activity in humans with tetraplegia. Journal of neural engineering, 5(4), 455.

[x] https://en.wikipedia.org/wiki/Afferent_nerve_fiber

[xi] Johansson, R. S., & Flanagan, J. R. (2009). Coding and use of tactile signals from the fingertips in object manipulation tasks. Nature Reviews Neuroscience, 10(5), 345.

[xii] Ibid.

[xiii] Schwartz, A. B. (2016). Movement: how the brain communicates with the world. Cell, 164(6), 1122-1135.

[xiv] Scott, S. H. (2012). The computational and neural basis of voluntary motor control and planning. Trends in cognitive sciences, 16(11), 541-549.

[xv] Friston, K. (2011). What is optimal about motor control?. Neuron, 72(3), 488-498.

[xvi] https://www.dictionary.com/browse/proprioception

[xvii] Rothwell, J. C., Traub, M. M., Day, B. L., Obeso, J. A., Thomas, P. K., & Marsden, C. D. (1982). Manual motor performance in a deafferented man. Brain, 105(3), 515-542.

[xviii] Vallbo, A. B., & Johansson, R. S. (1984). Properties of cutaneous mechanoreceptors in the human hand related to touch sensation. Hum Neurobiol, 3(1), 3-14.

[xix] Macefield, G., Gandevia, S. C., & Burke, D. (1990). Perceptual responses to microstimulation of single afferents innervating joints, muscles and skin of the human hand. The Journal of Physiology, 429(1), 113-129.

[xx] Tehovnik, E. J., Tolias, A. S., Sultan, F., Slocum, W. M., & Logothetis, N. K. (2006). Direct and indirect activation of cortical neurons by electrical microstimulation. Journal of neurophysiology, 96(2), 512-521.

[xxi] Penfield, W., & Boldrey, E. (1937). Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain, 60(4), 389-443.

[xxii] https://en.wikipedia.org/wiki/Cortical_homunculus

[xxiii] Bensmaia, S. J., & Miller, L. E. (2014). Restoring sensorimotor function through intracortical interfaces: progress and looming challenges. Nature Reviews Neuroscience, 15(5), 313.

[xxiv] Romo, R., Hernández, A., Zainos, A., & Salinas, E. (1998). Somatosensory discrimination based on cortical microstimulation. Nature, 392(6674), 387.

[xxv] O’Doherty, J. E., Lebedev, M. A., Ifft, P. J., Zhuang, K. Z., Shokur, S., Bleuler, H., & Nicolelis, M. A. (2011). Active tactile exploration using a brain–machine–brain interface. Nature, 479(7372), 228.

[xxvi] Flesher, S. N., Collinger, J. L., Foldes, S. T., Weiss, J. M., Downey, J. E., Tyler-Kabara, E. C., … & Gaunt, R. A. (2016). Intracortical microstimulation of human somatosensory cortex. Science translational medicine, 8(361), 361ra141-361ra141.

[xxvii] Johnson, L. A., Wander, J. D., Sarma, D., Su, D. K., Fetz, E. E., & Ojemann, J. G. (2013). Direct electrical stimulation of the somatosensory cortex in humans using electrocorticography electrodes: a qualitative and quantitative report. Journal of neural engineering, 10(3), 036021.

[xxviii] Mountcastle, V. B., Steinmetz, M. A., & Romo, R. (1990). Frequency discrimination in the sense of flutter: psychophysical measurements correlated with postcentral events in behaving monkeys. Journal of Neuroscience, 10(9), 3032-3044.

[xxix] Bensmaia, S. J., Denchev, P. V., Dammann, J. F., Craig, J. C., & Hsiao, S. S. (2008). The representation of stimulus orientation in the early stages of somatosensory processing. Journal of Neuroscience, 28(3), 776-786.

[xxx] Pei, Y. C., Hsiao, S. S., Craig, J. C., & Bensmaia, S. J. (2010). Shape invariant coding of motion direction in somatosensory cortex. PLoS biology, 8(2), e1000305.

[xxxi] Bensmaia, S. J., & Miller, L. E. (2014). Restoring sensorimotor function through intracortical interfaces: progress and looming challenges. Nature Reviews Neuroscience, 15(5), 313.

[xxxii] Saal, H. P., Delhaye, B. P., Rayhaun, B. C., & Bensmaia, S. J. (2017). Simulating tactile signals from the whole hand with millisecond precision. Proceedings of the National Academy of Sciences, 114(28), E5693-E5702.

For images:

https://obamawhitehouse.archives.gov/photos-and-video/photo/2016/10/president-obama-fist-bumps-nathan-copeland (retrieved on 08.06.19)

Flesher, S. N., Collinger, J. L., Foldes, S. T., Weiss, J. M., Downey, J. E., Tyler-Kabara, E. C., … & Gaunt, R. A. (2016). Intracortical microstimulation of human somatosensory cortex. Science translational medicine, 8(361), 361ra141-361ra141.

Johansson, R. S., & Flanagan, J. R. (2009). Coding and use of tactile signals from the fingertips in object manipulation tasks. Nature Reviews Neuroscience, 10(5), 345.

Salas, M. A., Bashford, L., Kellis, S., Jafari, M., Jo, H., Kramer, D., … & Andersen, R. A. (2018). Proprioceptive and cutaneous sensations in humans elicited by intracortical microstimulation. Elife, 7, e32904.

Seymour, J. P., Wu, F., Wise, K. D., & Yoon, E. (2017). State-of-the-art MEMS and microsystem tools for brain research. Microsystems & Nanoengineering, 3, 16066.
