PhD Position F/M [campagne doc mi-nf-lys-2024] Low-rank Tensor Representations of Convolutional Neural Networks

Contract type : Fixed-term contract

Level of qualifications required : Graduate degree or equivalent

Other valued qualifications : Master's degree

Fonction : PhD Position

About the research centre or Inria department

The Inria research centre in Lyon is the 9th Inria research centre, formally created in January 2022.  It brings together approximately 300 people in 16 research teams and research support services.

Its staff are currently distributed across two campuses: Villeurbanne La Doua (Centre / INSA Lyon / UCBL) on the one hand, and Lyon Gerland (ENS de Lyon) on the other.

The Lyon centre is active in the fields of software, distributed and high-performance computing, embedded systems, quantum computing and privacy in the digital world, but also in digital health and computational biology.


Convolutional neural networks (CNNs) are currently the state-of-the-art models for classifying objects in several domains, such as computer vision, speech recognition, and text processing. Thanks to improved computational capabilities, increasingly deep and complex CNNs have become popular. For example, AlexNet is 8 layers deep, while ResNet employs shortcut connections and is represented with 152 layers; both have about 60M parameters. CNNs have intensive computational requirements due to their high complexity and large number of parameters.


Tensors are a natural way to represent high-dimensional data in numerous applications in computational science and data science [1]. CP, Tucker, and Tensor Train are the most widely used tensor decomposition methods in the literature. These decompositions represent a high-dimensional object with a small set of low-dimensional objects.
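As a rough illustration (not part of the announcement itself), the Tensor Train decomposition can be computed with the classical TT-SVD procedure via successive truncated SVDs; the helper names and the toy tensor below are our own, and NumPy is assumed:

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Decompose a d-way tensor into d three-way TT cores via successive SVDs.

    Illustrative sketch only: each step reshapes the remainder into a matrix,
    truncates its SVD at `max_rank`, and keeps the left factor as a core.
    """
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        mat = mat.reshape(rank * dims[k], -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(u[:, :r].reshape(rank, dims[k], r))
        mat = np.diag(s[:r]) @ vt[:r]   # remainder carried to the next mode
        rank = r
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor (for checking only)."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result.squeeze(axis=(0, -1))
```

With no truncation the reconstruction is exact; with a small `max_rank` one trades accuracy for a drastic reduction in stored parameters, which is precisely the effect exploited in [2] and [3].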


Representing a high-dimensional tensor with a set of lower-dimensional objects drastically reduces the overall number of parameters. This has led to the use of low-rank tensor representations at different layers of CNNs. For example, it has been shown that replacing the convolution kernels of ResNet with their low-rank approximations in Tucker tensor representations significantly reduces the number of parameters and improves the overall performance [2]. In a separate work, contributions have been made to replace the dense weight matrices of the fully connected layers of AlexNet with their approximations in Tensor Train format [3]. This approach also significantly reduces the number of parameters while achieving similar accuracy. The above contributions strongly advocate employing low-rank tensor representations in CNNs. We view the full CNN as a large tensor and aim to replace it with a set of smaller tensors.
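A back-of-the-envelope count makes the reduction concrete. The kernel shape below is typical of a ResNet convolution layer, but the Tucker ranks are hypothetical choices for illustration, not the ranks used in [2]:

```python
# A 3x3 convolution kernel with 256 input and 256 output channels,
# compressed with a Tucker-2 decomposition along the two channel modes.
cin, cout, k = 256, 256, 3
r_in, r_out = 64, 64   # assumed (hypothetical) Tucker ranks

full = cout * cin * k * k                                   # dense kernel
tucker = r_out * r_in * k * k + cin * r_in + cout * r_out   # core + 2 factors

print(full, tucker, round(full / tucker, 2))  # 589824 69632 8.47
```

Even these modest ranks shrink the layer by roughly 8.5x; the actual rank choice controls the trade-off between compression and accuracy.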


[1] T. G. Kolda and B. W. Bader, "Tensor decompositions and applications," SIAM Review, vol. 51, no. 3, pp. 455–500, 2009.

[2] A.-H. Phan, K. Sobolev, K. Sozykin, D. Ermilov, J. Gusak, P. Tichavský, V. Glukhov, I. Oseledets, and A. Cichocki, "Stable low-rank tensor decomposition for compression of convolutional neural network," in Computer Vision – ECCV 2020, pp. 522–539.

[3] A. Novikov, D. Podoprikhin, A. Osokin, and D. P. Vetrov, "Tensorizing neural networks," in Advances in Neural Information Processing Systems, vol. 28, 2015.


We view CNN models as large tensors and plan to represent them with low-rank tensor representations. The main goal of this PhD thesis is to take advantage of work on parallel tensor computations, together with various methods to iteratively train tensor-based frameworks, in order to achieve efficient training and prediction with popular CNN models.

This PhD thesis will be held in the ROMA Inria team at LIP, ENS Lyon under the supervision of Suraj Kumar and Loris Marchal.

Main activities

The candidate is expected to perform the following activities:

  • Analyze existing training methods for CNNs and adapt them to tensor-based models
  • Represent popular CNN models with low-rank tensor representations
  • Evaluate the proposed models on the MNIST, CIFAR, and ImageNet datasets
  • Design parallel algorithms for the proposed models


The candidate must have a Master’s degree in Computer Science, Computational Sciences, Applied Mathematics, or a related technical field.

Familiarity with linear algebra computations and neural networks will be much appreciated.

Benefits package

  • Subsidized meals
  • Partial reimbursement of public transport costs
  • Leave: 7 weeks of annual leave + 10 extra days off due to RTT (statutory reduction in working hours) + possibility of exceptional leave (sick children, moving home, etc.)
  • Possibility of teleworking (after 6 months of employment) and flexible organization of working hours
  • Professional equipment available (videoconferencing, loan of computer equipment, etc.)
  • Social, cultural and sports events and activities
  • Access to vocational training
  • Social security coverage


Remuneration

1st and 2nd year: 2100 euros gross salary/month
3rd year: 2190 euros gross salary/month