Nowadays we observe and model the Earth with a wealth of observations from a plethora of sensors that measure states, fluxes, processes, and variables at unprecedented spatial and temporal resolutions. Earth observation is well equipped with diverse remote sensing systems mounted on satellite, airborne, and UAV platforms. Multisensor data fusion, a vibrant field of research in the remote sensing community, has evolved over decades to combine heterogeneous data types for scene representation and classification. Advances in multisensor data fusion have been driven by two fields of research, image and signal processing and machine (deep) learning, leading to two families of approaches commonly referred to as shallow and deep techniques. In this seminar, each group of students presents one approach in a lecture and a written elaboration, with an emphasis on the use of shallow and deep models for multisensor data fusion. Students also apply a number of shallow and deep fusion models to real multisensor satellite images; for this purpose, a collection of code is distributed among the students for benchmarking and evaluation.
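For orientation, the sketch below illustrates what a shallow, feature-level fusion baseline of the kind discussed in the seminar might look like: per-pixel features from two co-registered sensors are simply concatenated and fed to a conventional classifier. The sensor arrays, band counts, and labels are synthetic placeholders for illustration only, not data or code from the course.

```python
# Minimal sketch of shallow (feature-level) multisensor fusion:
# per-pixel features from two co-registered sensors are concatenated
# and classified with a conventional (non-deep) model.
# All data below are random placeholders standing in for real imagery.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

n_pixels = 5000          # number of labelled pixels (hypothetical)
n_bands_optical = 10     # e.g. multispectral bands (assumed)
n_bands_sar = 4          # e.g. SAR polarimetric channels (assumed)
n_classes = 6            # land-cover classes (assumed)

optical = rng.normal(size=(n_pixels, n_bands_optical))   # sensor 1 features
sar = rng.normal(size=(n_pixels, n_bands_sar))           # sensor 2 features
labels = rng.integers(0, n_classes, size=n_pixels)       # reference labels

# Shallow fusion: stack the feature vectors of both sensors per pixel.
fused = np.concatenate([optical, sar], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("Overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

A deep counterpart would typically replace the concatenation-plus-classifier step with a multi-branch neural network that learns the fusion jointly with the classification.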
- Course coordinator: Pedram Ghamisi