In this study we explored the predictive link between visual stimuli moving towards the body and the tactile consequences that follow. Specifically, we tested whether information derived from a visual stimulus approaching the region directly surrounding the body (the peripersonal space) can be used to judge the location and timing of impending tactile contact. Moving arm stimuli, displayed on a computer screen, appeared to travel either towards the participant's face (the middle of the left or right cheek) or slightly away from it. Each visual stimulus was followed by tactile stimulation of the left or right cheek. The lag between the visual stimulus and the tactile stimulation was also manipulated so that tactile contact occurred at a time either consistent or inconsistent with the speed of the approaching hand. Reaction times were faster when the arm moved towards the hemispace in which the tactile stimulation was delivered, irrespective of whether the arm was heading towards the cheek or slightly away from it. Moreover, responses were fastest when the tactile stimulation arrived at the moment consistent with the speed of the moving arm. These effects disappeared when the arm appeared to be retracting from the participant's face. These results suggest a predictive mechanism that exploits visual information from objects approaching the body to judge the time and location of impending tactile contact.