Access to many sources of satellite information is nowadays a reality. However, few methods can handle data coming from different sensors simultaneously, due to differences in the number of bands, in spatial resolution, and in acquisition conditions. In this paper, we propose a methodology to align the data structures (also called manifolds) of two (or more) images and to exploit them jointly in a common latent space. Since the method is invertible, it also has the interesting property of allowing image pixels to be projected from one sensor to another, thus making it possible to synthesize the bands of one sensor from the pixels of the other through the learned projection. Experiments on QuickBird and WorldView-2 images show the properties of the method and open new opportunities for multisensor remote sensing.