TY - GEN
T1 - Learning a low-rank shared dictionary for object classification
AU - Vu, Tiep H.
AU - Monga, Vishal
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/8/3
Y1 - 2016/8/3
AB - Despite the fact that different objects possess distinct class-specific features, they also usually share common patterns. Inspired by this observation, we propose a novel method to explicitly and simultaneously learn a set of common patterns as well as class-specific features for classification. Our dictionary learning framework is hence characterized by both a shared dictionary and particular (class-specific) dictionaries. For the shared dictionary, we enforce a low-rank constraint, i.e., we require that its spanning subspace have low dimension and that the coefficients corresponding to this dictionary be similar. For the particular dictionaries, we impose the well-known constraints of Fisher discrimination dictionary learning (FDDL). Further, we propose a new fast and accurate algorithm to solve the sparse coding problems in the learning step, accelerating its convergence. This algorithm can also be applied to FDDL and its extensions. Experimental results on widely used image databases establish the advantages of our method over state-of-the-art dictionary learning methods.
UR - http://www.scopus.com/inward/record.url?scp=85006805250&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85006805250&partnerID=8YFLogxK
DO - 10.1109/ICIP.2016.7533197
M3 - Conference contribution
AN - SCOPUS:85006805250
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 4428
EP - 4432
BT - 2016 IEEE International Conference on Image Processing, ICIP 2016 - Proceedings
PB - IEEE Computer Society
T2 - 23rd IEEE International Conference on Image Processing, ICIP 2016
Y2 - 25 September 2016 through 28 September 2016
ER -