
Our method compared to baseline SOTA methods: (a) Panchromatic (Pan) input. (b) CycleGAN output. (c) CUT output. (d) PETIT output (ours). (e) Real unpaired monochromatic (Mono) image for reference.

Abstract

Thermal multispectral imagery is essential for many environmental applications. Unfortunately, no publicly available datasets of high-spatial-resolution thermal multispectral images exist that would enable the development of algorithms and systems in this field. However, image-to-image (I2I) translation can be used to synthesize such data artificially by transforming widely available datasets of other visual modalities. In most cases, pairs of content-aligned input-target images are unavailable, making unpaired I2I (UI2I) translation harder to train and to converge to a satisfying solution. Nevertheless, some data domains, and the thermal domain in particular, have unique physical properties tying the input to the output that can help mitigate these weaknesses. We propose PETIT-GAN, a physically enhanced thermal image-translating generative adversarial network that transforms between different thermal modalities, a step toward synthesizing a complete thermal multispectral dataset. Our novel approach embeds physically modeled prior information in a UI2I translation to produce outputs with greater fidelity to the target modality. We further show that our solution outperforms current state-of-the-art architectures at thermal UI2I translation by approximately 50% with respect to standard perceptual metrics, and enjoys a more robust training procedure.
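The core idea of embedding a physically modeled prior in the translation can be illustrated with a minimal conceptual sketch. This is not the paper's actual model: `physical_prior`, its affine `gain`/`offset` parameters, and the `residual_net` placeholder are all hypothetical stand-ins for PETIT's physics module and learned generator, shown only to convey the prior-plus-learned-correction structure.

```python
import numpy as np

def physical_prior(pan, gain=0.8, offset=0.1):
    """Hypothetical physics-based estimate of the target modality.

    A per-pixel affine map stands in for a radiometric model tying the
    panchromatic input to monochromatic intensities (assumed, not PETIT's).
    """
    return gain * pan + offset

def prior_guided_generator(pan, residual_net, alpha=0.5):
    """Combine the physically modeled estimate with a learned correction.

    residual_net is a placeholder for a trained GAN generator that only
    needs to model the residual the physics cannot explain.
    """
    return physical_prior(pan) + alpha * residual_net(pan)

# Toy usage: an untrained "network" contributing zero residual,
# so the output reduces to the physical prior alone.
pan = np.random.rand(4, 4).astype(np.float32)
out = prior_guided_generator(pan, residual_net=lambda x: np.zeros_like(x))
```

The design intuition is that anchoring the generator to a physical estimate constrains the solution space of the otherwise under-determined unpaired translation, which is consistent with the improved fidelity and training robustness reported above.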

Methods

Dataset

BibTeX

@InProceedings{Berman_2024_WACV,
  author    = {Berman, Omri and Oz, Navot and Mendlovic, David and Sochen, Nir and Cohen, Yafit and Klapp, Iftach},
  title     = {PETIT-GAN: Physically Enhanced Thermal Image-Translating Generative Adversarial Network},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {January},
  year      = {2024},
  pages     = {1618--1627}
}