Robust Multi-Modal Sensor Fusion: An Adversarial Approach

Virtual: https://events.vtools.ieee.org/m/284533

Abstract. Multi-Domain Operations, of increasing importance, entail multi-modal sensing, which in turn may require considerable effort to harness the resulting information for exploitation. As a result, research interest in multi-modal fusion has surged across industry, academia, and government. Combining multi-sensor information is of particular interest for inference and target detection, where it can enhance performance in challenging and adversarial environments. While fusion is not strictly new, a more principled approach has emerged only recently, driven by its ubiquitous need. The viability of these fusion approaches hinges strongly on all sensors functioning simultaneously, which limits their efficacy in real environments. This limitation is even more pronounced in unconstrained surveillance settings, where environmental conditions directly affect the sensors and close manual monitoring is difficult or even impractical. In the absence of timely failure detection, partial sensor failure can therefore cause a major drop in a fusion system's performance. This talk describes a data-driven approach to multi-modal fusion in which optimal features for each sensor are selected from a hidden latent space shared among the different modalities. This hidden space is learned via a generative network conditioned on the individual sensor modalities. The hidden space, as an intrinsic structure, is then exploited as a palliative proxy not only for detecting damaged sensors but also for subsequently safeguarding the performance of the fused sensor system. Experimental results show that such an approach can make an inference system robust against noisy or damaged sensors without requiring human intervention to inform the system about the damage.

Co-sponsored by: IEEE SP Atlanta Chapter & IEEE AESS/GRSS Atlanta Chapter

Speaker(s): Dr. Hamid Krim
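
To make the idea in the abstract more concrete, the sketch below is a purely illustrative example and not the speaker's implementation: it stands in for the conditional generative network with simple per-modality autoencoders that map into a shared latent space, uses each modality's reconstruction error as a proxy for sensor health, and fuses only the latent codes of modalities deemed healthy. All class names, layer sizes, and the error threshold are assumptions made for this example (Python/PyTorch).

    # Hypothetical sketch of shared-latent-space fusion with reconstruction-based
    # damage detection. Names, dimensions, and threshold are illustrative assumptions.
    import torch
    import torch.nn as nn

    class ModalityAutoencoder(nn.Module):
        """Per-modality encoder/decoder pair mapping into a common latent space."""
        def __init__(self, input_dim: int, latent_dim: int):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(input_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim)
            )
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, input_dim)
            )

        def forward(self, x):
            z = self.encoder(x)
            return z, self.decoder(z)

    class LatentFusion(nn.Module):
        """Fuses latent codes of the modalities that pass a reconstruction-error check."""
        def __init__(self, input_dims, latent_dim=16, num_classes=2, error_threshold=1.0):
            super().__init__()
            self.branches = nn.ModuleList(
                ModalityAutoencoder(d, latent_dim) for d in input_dims
            )
            self.classifier = nn.Linear(latent_dim, num_classes)
            self.error_threshold = error_threshold  # assumed calibrated on clean data

        def forward(self, inputs):
            latents, weights = [], []
            for x, branch in zip(inputs, self.branches):
                z, x_hat = branch(x)
                # Per-sample reconstruction error acts as a proxy for sensor health.
                err = ((x - x_hat) ** 2).mean(dim=1, keepdim=True)
                healthy = (err < self.error_threshold).float()  # 0 => treated as damaged
                latents.append(z)
                weights.append(healthy)
            z_stack = torch.stack(latents)   # (num_modalities, batch, latent_dim)
            w_stack = torch.stack(weights)   # (num_modalities, batch, 1)
            # Weighted average over healthy modalities; clamp avoids division by zero
            # when every modality is flagged as damaged.
            denom = w_stack.sum(dim=0).clamp(min=1.0)
            z_fused = (z_stack * w_stack).sum(dim=0) / denom
            return self.classifier(z_fused)

    if __name__ == "__main__":
        model = LatentFusion(input_dims=[32, 48])   # e.g. two sensor modalities
        clean = [torch.randn(4, 32), torch.randn(4, 48)]
        noisy = [torch.randn(4, 32), 10.0 * torch.randn(4, 48)]  # second sensor "damaged"
        print(model(clean).shape, model(noisy).shape)  # (4, 2) class logits in both cases

In this toy setup, heavily corrupted input to one sensor raises that branch's reconstruction error, so its latent code is down-weighted and the classifier relies on the remaining modalities, mirroring the failure-robust behavior described in the abstract.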
