It has been argued that representations of peripersonal space based on haptic input are systematically distorted by egocentric reference frames. Interestingly, a recent study has shown that noninformative vision (i.e., freely viewing the region above the haptic workspace) improves performance on the so-called haptic parallel-setting task, in which participants are instructed to rotate a test bar until it is parallel to a reference bar. In the present study, we took a first step toward identifying the sensory integration mechanisms involved in haptic space perception by distinguishing the possible effects of orienting mechanisms from those of noninformative vision. We found that both the orienting direction of the head and eyes and the availability of noninformative vision affect parallel-setting performance, and that they do so independently: orienting towards the reference bar facilitated the parallel setting of the test bar in both the no-vision and noninformative-vision conditions, and noninformative vision improved performance irrespective of orienting direction. These results suggest that the effects of orienting and of noninformative vision on haptic space perception depend on distinct neurocognitive mechanisms, likely expressed in different modulations of neural activation within the multimodal parietofrontal network thought to subserve multimodal representations of peripersonal space.