Learning Computationally Efficient Metrics for Large Scale Person Identification

Nearest neighbor (NN) classifiers rely on a distance metric that is either fixed a priori or estimated beforehand through metric learning. When such metric learning is performed, a natural objective is to minimize the classification errors of the NN classifier. This learning procedure is, however, commonly carried out without any regard to the computational efficiency of the NN classifier at test time. In this work, we propose to formulate the metric learning problem as a multi-objective trade-off between classification performance and computational efficiency at test time. We illustrate this approach in the context of large scale person identification. Our method significantly improves the search efficiency of a NN classifier compared to standard soft-margin maximization metrics. In the presence of hard time and space constraints, it leads to a drastic improvement in identification performance.
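
As a rough illustration of the general idea only (the poster's actual formulation and optimizer are not reproduced here), a scalarized version of such a trade-off can be sketched as follows: a linear, Mahalanobis-style metric is trained with a soft-margin triplet hinge loss standing in for the classification objective, while a column-sparsity penalty acts as a hypothetical proxy for test-time cost, the two terms being balanced by a weight lam. All names, the choice of efficiency proxy, and the toy data below are assumptions made for this sketch.

# Hypothetical sketch, not the poster's method: a scalarized trade-off between
# a soft-margin (triplet hinge) loss and a column-sparsity penalty used as a
# stand-in for the test-time cost of the linear metric d(a,b) = ||L(a-b)||^2.
import numpy as np

rng = np.random.default_rng(0)

def triplet_hinge_loss_grad(L, X, y, margin=1.0, n_triplets=500):
    # Soft-margin surrogate for NN classification errors, with gradient w.r.t. L.
    n, grad, loss = len(y), np.zeros_like(L), 0.0
    for _ in range(n_triplets):
        i = rng.integers(n)
        same = np.flatnonzero((y == y[i]) & (np.arange(n) != i))
        diff = np.flatnonzero(y != y[i])
        if same.size == 0 or diff.size == 0:
            continue
        dp = X[i] - X[rng.choice(same)]          # same-identity difference
        dn = X[i] - X[rng.choice(diff)]          # different-identity difference
        slack = margin + dp @ L.T @ L @ dp - dn @ L.T @ L @ dn
        if slack > 0:                            # hinge is active
            loss += slack
            grad += 2 * L @ (np.outer(dp, dp) - np.outer(dn, dn))
    return loss / n_triplets, grad / n_triplets

def efficiency_penalty_grad(L, eps=1e-8):
    # Column-wise L2,1 norm: a crude proxy for how many input features must
    # actually be computed and compared at test time (an assumption here).
    col_norms = np.sqrt((L ** 2).sum(axis=0) + eps)
    return col_norms.sum(), L / col_norms

def learn_metric(X, y, d_out=10, lam=0.1, lr=1e-2, n_iter=200):
    # Plain gradient descent on the scalarized objective: loss + lam * penalty.
    L = rng.normal(scale=0.1, size=(d_out, X.shape[1]))
    for _ in range(n_iter):
        _, g_loss = triplet_hinge_loss_grad(L, X, y)
        _, g_pen = efficiency_penalty_grad(L)
        L -= lr * (g_loss + lam * g_pen)
    return L

# Toy usage: 200 points, 20 features, 5 identities.
X = rng.normal(size=(200, 20))
y = rng.integers(5, size=200)
L = learn_metric(X, y, lam=0.5)
print("smallest feature-column norms:", np.round(np.sort(np.linalg.norm(L, axis=0))[:5], 3))

Sweeping the weight lam over a range of values would trace out the trade-off curve between the two objectives, which is the kind of multi-objective compromise the abstract refers to.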

 

Authors:

Victor Hamer is a research assistant in the INGI department at UCLouvain, Belgium. He completed a Master's degree in Computer Science and Engineering at UCLouvain in 2018. The article originates from his Master's thesis.


Pierre Dupont is a Professor at the Louvain School of Engineering at UCLouvain, Belgium, and the co-founder of the UCL Machine Learning Group.


This poster will be presented at Benelearn 2018, Jheronimus Academy of Data Science, on Nov. 8-9, 2018.

 

Published on November 06, 2018