Trainee Project
Title:
Tensor learning for color and image processing
Dates:
2023/03/01 - 2023/08/31
Description:
Context
Many imaging applications rely on the acquisition, processing and analysis of images whose pixels are 3D or 4D vector-valued: this notably includes color imaging (red, green and blue channels) and polarimetric imaging (the four Stokes parameters at each pixel). Such multichannel data are often represented using quaternions, a four-dimensional generalization of complex numbers, in order to simplify expressions and leverage the unique geometric and physical insights offered by this algebraic representation. Datasets of color or polarimetric images can therefore be viewed as collections of quaternion-valued matrices, which form multidimensional quaternion arrays, also called quaternion tensors.
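For readers unfamiliar with this representation, a minimal sketch of how a color image maps to a pure-quaternion array is given below; storing the four quaternion components along a trailing real axis with NumPy is an assumption made here for readability, not the encoding used in [1].

import numpy as np

def rgb_to_quaternion_array(image: np.ndarray) -> np.ndarray:
    """Map an H x W x 3 RGB image to an H x W x 4 array of pure quaternions
    (w, x, y, z) = (0, R, G, B)."""
    h, w, _ = image.shape
    quat = np.zeros((h, w, 4), dtype=np.float64)
    quat[..., 1:] = image  # real part stays zero; R, G, B fill the imaginary parts
    return quat

# A dataset of N color images then stacks into a 3rd-order quaternion tensor
# of shape (N, H, W) with quaternion-valued entries (here: trailing axis of size 4).
images = np.random.rand(10, 32, 32, 3)  # placeholder data
quaternion_tensor = np.stack([rgb_to_quaternion_array(im) for im in images])
print(quaternion_tensor.shape)  # (10, 32, 32, 4)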

Summary
The aim of this internship is to demonstrate the potential of quaternion tensor decompositions for learning features from databases of color and polarimetric images. Quaternion tensor decompositions have only been introduced recently [1]; they generalize the usual tensor decompositions [2] to the quaternion field. The candidate will take advantage of the algorithms proposed in [1] and will focus on two main use cases of quaternion tensor decompositions (Canonical Polyadic and Tucker):
1. learning features from a standard color image database (such as ImageNet)
2. performing source separation on polarimetric hyperspectral data
One key complementary objective will be to benchmark the performance of quaternion tensor decompositions against standard real-domain tensor decompositions.
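As a rough illustration of the real-domain side of this benchmark, the sketch below runs CP and Tucker decompositions on a stack of color images with the TensorLy library and reports relative reconstruction errors; the choice of TensorLy, the ranks and the error metric are assumptions made here for illustration, and the quaternion-domain algorithms of [1] are not depicted.

import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac, tucker

# Placeholder dataset: 10 color images of size 32 x 32 x 3, stacked
# into a real-valued 4th-order tensor of shape (10, 32, 32, 3).
tensor = tl.tensor(np.random.rand(10, 32, 32, 3))

# Real-domain Canonical Polyadic (CP) decomposition; rank chosen arbitrarily.
cp_decomp = parafac(tensor, rank=5)
cp_rec = tl.cp_to_tensor(cp_decomp)

# Real-domain Tucker decomposition; multilinear ranks chosen arbitrarily.
core, factors = tucker(tensor, rank=[5, 8, 8, 3])
tucker_rec = tl.tucker_to_tensor((core, factors))

# Relative reconstruction error, a simple figure of merit for the benchmark.
def rel_error(x, x_hat):
    return float(tl.norm(x - x_hat) / tl.norm(x))

print("CP relative error:", rel_error(tensor, cp_rec))
print("Tucker relative error:", rel_error(tensor, tucker_rec))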

[1] J. Flamant, X. Luciani, Y. Zniyed, and S. Miron, "Tenseurs à valeurs quaternioniques : un objet mathématique à identifier," in GRETSI 2022 - XXVIIIème Colloque Francophone de Traitement du Signal et des Images, Nancy, France, Sep. 2022.
[2] T. G. Kolda and B. W. Bader, "Tensor decompositions and applications," SIAM Review, vol. 51, no. 3, pp. 455-500, 2009.
Department(s): 
Biology, Signals and Systems in Cancer and Neuroscience
Funds:
CNRS