We studied the low-level interactions between motion coherence detection and binocular correlation detection. It is well established that depth information from motion parallax and from binocular disparities, for example, is effectively integrated. The question we aimed to answer was whether such interactions also exist at the very first correlation level that both mechanisms might have in common. First, we quantitatively compared motion coherence detection and binocular correlation detection using similar stimuli (random pixel arrays, RPAs) and the same noise-masking paradigm (luminance signal-to-noise ratio, LSNR). This showed that human observers are much more sensitive to motion than to binocular correlation; adding noise therefore has a much stronger effect on binocular correlation detection than on motion detection. Next, we manipulated the shape of the stimulus aperture to equalize LSNR thresholds for motion and binocular correlation. Motion sensitivity could be progressively reduced by shortening the length of the motion path while keeping the aperture area constant, whereas changing the shape of the aperture did not affect binocular correlation sensitivity. A 'balanced' stimulus, one with equal strengths of the motion and binocular correlation signals, was then used to study the mutual interactions. In accordance with previous results, motion was found to greatly facilitate binocular correlation detection. Binocular correlation, however, did not facilitate motion detection. We conclude that the interactions are asymmetrical: fronto-parallel motion is primarily detected monocularly, and this information can then be used to facilitate binocular correlation, whereas binocular correlation cannot improve motion sensitivity.