Tactile discrimination refers to the somatosensory cortex receiving and processing detailed tactile information, together with proprioceptive information, and integrating it with detailed input from the other senses, such as vision and hearing, to build neuronal connections and create libraries of malleable memory maps of the world around us. These memory maps are ‘experience-dependent’: we have to explore the world to build the library.
We gain our discriminative tactile information largely through the receptors in our glabrous skin: the lips, tongue, mouth, palms and soles. Moving objects across these glabrous skin areas yields rich tactile information. Mouthing objects, moving the tongue across a surface, or manipulating an object in-hand enables our somatosensory system to “see” objects. It is akin to reading Braille.
For humans, our two main discriminative tactile sites, the hands and mouth, are functionally linked before birth. Hand-to-mouth coordination develops in utero as early as 12 weeks gestational age, and this prenatal partnership supports feeding and regulation behaviours at birth. Within weeks, hand and mouth work together for multimodal exploration and manipulation of objects, the means by which we will gather rich sensory information.

When we put objects in our mouth, we gain tactile, taste, smell and proprioceptive information. Our mouth also creates a boundaried space that surrounds the object and enables us to gain a tactile 3-D image of the object.
Mouthing of objects, through hand-mouth coordination, is a typical part of our development and appears to underpin the next stage: eye-hand-mouth coordination.
Initially, when an object is placed in the hand and grasped, it is brought straight to the mouth for exploration. At around 4-5 months, the grasped object is first brought into the visual field for inspection before going to the mouth for further tactile and proprioceptive exploration.
Our hands also create a boundaried 3-D space for objects. As we manipulate objects in our hands, the movement fires the tactile receptors, giving information about size, shape and texture.
When we begin to look at objects, we reach for them to explore through our tactile system and link this information with vision. In this way the distal information of vision becomes proximal information, grounded within us and made real to us. Even as adults, if we see something new, we want to touch it, to confirm what it feels like and what it is. It is in this way that the tactile system also underpins the development of our visual perceptual skills.
For babies the whole world is new; they are driven to touch everything they see and to go out and explore it, bringing multimodal discriminative sensory information together. The world is full of textures and objects to be explored. However, we will only move out to explore and build these libraries of information if we feel safe, are regulated, and have the postural skills to do so.
Supporting sense of safety and regulation is key to our discriminative development.