Multimodal data fusion of remotely sensed and in-situ acquired sensor data
While Remote Sensing (optical or non-optical) provides highly resolved information on a global scale, it is restricted to effects that are remotely visible. 3D in-situ exploration, with strong applications in mobile cooperative robotics, robotic rescue, and catastrophe and disaster recovery, therefore opens up complementary possibilities for information mining. When processing such multimodally acquired information, the different imaging geometries, noise features, visibility characteristics, and also the different scales and resolutions of the primary sensors must be taken into account; they must be modelled and incorporated during image or information fusion. Special points of interest are new developments in information theory relevant to inference, as well as general methods for image and geoinformation processing and understanding.
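The two modelling steps named above, bringing sensors with different resolutions onto a common grid and weighting them by their noise characteristics, can be illustrated with a minimal sketch. All inputs (the synthetic images, the upsampling factor, and the assumed noise standard deviations) are hypothetical and stand in for a real geocoding and fusion pipeline:

```python
import numpy as np

# Hypothetical inputs: a coarse-resolution SAR-like amplitude image and a
# fine-resolution optical-like image of the same scene (values are synthetic).
sar_coarse = np.array([[0.2, 0.8],
                       [0.5, 0.1]])                      # 2x2, coarse grid
optical_fine = np.random.default_rng(0).random((4, 4))   # 4x4, fine grid

# Step 1: bring both sensors onto a common grid. Here the coarse image is
# upsampled by simple pixel replication; a real pipeline would model the
# sensors' imaging geometries (orthorectification, resampling kernels).
factor = optical_fine.shape[0] // sar_coarse.shape[0]
sar_fine = np.kron(sar_coarse, np.ones((factor, factor)))

# Step 2: inverse-variance weighting, a basic way to incorporate the
# different noise features of the two sensors (sigmas are illustrative).
sigma_sar, sigma_opt = 0.3, 0.1
w_sar = 1.0 / sigma_sar**2
w_opt = 1.0 / sigma_opt**2
fused = (w_sar * sar_fine + w_opt * optical_fine) / (w_sar + w_opt)

print(fused.shape)  # fused product on the fine 4x4 grid
```

Because the fusion is a convex combination, each fused pixel lies between the two sensor values; the weights simply trust the less noisy sensor more.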
Figure: Image fusion of a bistatic SAR image (raw data: © FHR/FGAN, processing: © ZESS) and an optical orthophoto (© Landesvermessungsamt Bayern).