Cell computing
Study Finds Cell-Wide Web of Tubules, Suggests Cells Work Like Computers
Edge-orientation processing in first-order tactile neurons
Tactile neurons innervating end organs in the human fingertips have receptive fields with many highly sensitive zones and are critical for conveying detailed spatial information about touched objects. Skin mechanics along with nonlinear processing in the terminal arborization enable first-order tactile neurons to perform a host of complex computations. A fundamental feature of first-order neurons in the tactile system is that their distal axon branches in the skin and forms many transduction sites, yielding complex receptive fields with many highly sensitive zones. We found that this arrangement constitutes a peripheral neural mechanism that allows individual neurons to signal geometric features of touched objects. Specifically, we observed that two types of first-order tactile neurons that densely innervate the glabrous skin of the human fingertips signaled edge orientation via both the intensity and the temporal structure of their responses. Moreover, we found that the spatial layout of a neuron’s highly sensitive zones predicted its sensitivity to particular edge orientations. We submit that peripheral neurons in the touch-processing pathway, as with peripheral neurons in the visual-processing pathway, perform feature extraction computations that are typically attributed to neurons in the cerebral cortex. 1
Visual and tactile sensory processing both involve neural mechanisms that extract high-level geometric features of a stimulus, such as the orientation of an edge, by integrating information from many low-level inputs. Although geometric feature extraction is generally attributed to neural processing in the cerebral cortex, there is growing evidence in the visual system that feature extraction begins very early in the processing pathway, even at the level of first-order (that is, bipolar) neurons in the retina. We found that feature extraction also begins very early in the tactile processing pathway, at the distal arborization of first-order tactile neurons.
The temporal structure of a neuron’s response provides substantial information about edge orientation. Edge orientation discrimination arises because different edges cause different patterns of spatial and temporal coincidence between a neuron’s many transduction sites and the moving stimulus. This peripheral mechanism may also permit first-order human tactile neurons to signal information about higher order aspects of a touched object, such as the curvature of an edge and its motion direction.
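The coincidence mechanism can be illustrated with a toy model (a hypothetical sketch, not the authors' model): a neuron with several transduction sites at fixed skin positions, and a straight edge sweeping across them. The edge's orientation determines when each site is crossed, so both the temporal spread of the response and its peak coincidence vary with orientation.

```python
import math

# Hypothetical toy model (not from the paper): one first-order tactile
# neuron with several transduction sites (its highly sensitive zones)
# at fixed 2D skin positions, in mm.
sites = [(0.0, 0.0), (0.3, 0.8), (0.6, 0.2), (0.9, 1.0), (1.2, 0.4)]

def crossing_times(theta_deg, speed=10.0):
    """Times (ms) at which an edge tilted theta_deg from vertical,
    sweeping along x at `speed` mm/s, crosses each transduction site.
    At time t the edge occupies the line x = speed*t + y*tan(theta),
    so site (x, y) is crossed when t = (x - y*tan(theta)) / speed."""
    t = math.tan(math.radians(theta_deg))
    return sorted((x - y * t) / speed * 1000 for x, y in sites)

def temporal_spread(theta_deg):
    """Spread of crossing times: a small spread means many zones fire
    near-simultaneously (a strong, brief burst); a large spread means
    a weaker, temporally dispersed response. Either way, the pattern
    of spike timing carries information about edge orientation."""
    ts = crossing_times(theta_deg)
    return max(ts) - min(ts)

for theta in (-45, 0, 45):
    print(f"edge at {theta:+3d} deg: spread = {temporal_spread(theta):.1f} ms")
```

Different orientations yield different firing patterns from the same neuron, which is the sense in which a single peripheral neuron already "computes" edge orientation.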
A neuron with multiple highly sensitive zones can still only signal that a stimulus lies somewhere within its receptive field; a population of such neurons with overlapping receptive fields, however, provides higher spatial resolution.
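This is the classic coarse-coding idea, which can be sketched as follows (a minimal illustration with made-up 1D receptive fields, not data from the study): each neuron only reports "stimulus somewhere in my field", but intersecting the fields of all responding neurons localizes the stimulus much more precisely than any single field.

```python
# Hypothetical overlapping 1D receptive fields on the skin, as (lo, hi)
# intervals in mm. Each is 4 mm wide, but they are staggered by 1 mm.
fields = [(0.0, 4.0), (1.0, 5.0), (2.0, 6.0), (3.0, 7.0)]

def localize(stimulus_pos):
    """Intersect the receptive fields of every neuron that responds
    to a stimulus at stimulus_pos; the result is the smallest region
    the population as a whole can pin the stimulus down to."""
    active = [(lo, hi) for lo, hi in fields if lo <= stimulus_pos <= hi]
    return (max(lo for lo, _ in active), min(hi for _, hi in active))

# A single field is 4 mm wide, but where all four overlap the
# population narrows the estimate to a 1 mm interval:
print(localize(3.5))  # → (3.0, 4.0)
```

The resolution of the population estimate depends on how many fields overlap at the stimulus location, which is why dense innervation of the fingertips matters.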
1. https://sci-hub.tw/https://www.nature.com/articles/nn.3804