The sense of touch plays an important role in building a perceptual representation of motion and space. This aspect deserves specific attention, since it can guide the technological development of devices that encode touch-related spatial information, as is the case for Braille displays, and shed light on the neuroscientific mechanisms underpinning motion control and execution. In this paper, we report on the design and validation of an electromagnetic single-cell refreshable Braille display, named Readable. We also analyse and model the role of touch in guiding hand movement. To this aim, we performed experiments in which blindfolded participants were asked to slide their fingertip along a ridged plate. We found that the orientation of the ridges biased the direction of hand motion: the tactile cues induced the illusion that the trajectory was bending towards the direction perpendicular to the ridge orientation, which triggered a correction of the movement in the opposite direction. We modelled the integration of touch with classical musculo-skeletal proprioception using a Kalman filter.
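As a minimal sketch of such an integration scheme (the specific state variables, observation model, and noise covariances here are illustrative assumptions, not taken from the study), the proprioceptive prediction of the hand state $\hat{x}_{k|k-1}$ can be corrected by the tactile measurement $z_k$ through the standard Kalman filter recursion:
\begin{align}
\hat{x}_{k|k-1} &= A\,\hat{x}_{k-1}, \qquad P_{k|k-1} = A P_{k-1} A^{\top} + Q,\\
K_k &= P_{k|k-1} H^{\top}\bigl(H P_{k|k-1} H^{\top} + R\bigr)^{-1},\\
\hat{x}_{k} &= \hat{x}_{k|k-1} + K_k\bigl(z_k - H\,\hat{x}_{k|k-1}\bigr),\\
P_{k} &= (I - K_k H)\,P_{k|k-1},
\end{align}
where the relative weighting of the tactile cue against proprioception is set by the gain $K_k$, which in turn depends on the assumed process and measurement noise covariances $Q$ and $R$.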