
Efficient Sparsity Estimation via Marginal-Lasso Coding

Tzu-Yi Hung1, Jiwen Lu2, Yap-Peng Tan1, and Shenghua Gao3

1School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore

2Advanced Digital Sciences Center, Singapore

3ShanghaiTech University, Shanghai, China

Abstract. This paper presents a generic optimization framework for efficient feature quantization via sparse coding that can be applied to many computer vision tasks. Although there is a large body of work on sparse coding and dictionary learning, none of it exploits the advantages of marginal regression and the lasso simultaneously to obtain solutions that are both more efficient and more effective. In this work we provide such an approach, together with theoretical support. As a result, the computational complexity of the proposed method can be two orders of magnitude lower than that of the lasso, at the cost of a modest, unavoidable quantization error; at the same time, the proposed method is more robust than conventional marginal-regression-based methods. We also provide an adaptive regularization parameter selection scheme and a dictionary learning method that incorporates the proposed sparsity estimation algorithm. Experimental results and a detailed model analysis demonstrate the efficacy of the proposed methods.

Keywords: Sparsity estimation, marginal regression, sparse coding, lasso, dictionary learning, adaptive regularization parameter
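
The following minimal sketch (Python with NumPy and scikit-learn) is only meant to illustrate the efficiency/accuracy trade-off the abstract refers to, by contrasting off-the-shelf lasso coding with plain marginal-regression coding via soft-thresholded correlations. The dictionary, signal, and threshold value are arbitrary placeholders; this is not the paper's marginal-lasso estimator, its adaptive regularization scheme, or its dictionary learning method.

# Hypothetical illustration only: contrasts an iterative lasso solver with
# simple marginal-regression coding (soft-thresholded correlations).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))      # dictionary: 256 atoms of dimension 64
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x = rng.standard_normal(64)             # feature vector to encode

# Lasso coding: accurate, but needs an iterative solver for every feature.
alpha_lasso = Lasso(alpha=0.05, fit_intercept=False, max_iter=5000).fit(D, x).coef_

# Marginal-regression coding: one matrix-vector product plus a thresholding pass,
# which is the source of the speed-up over lasso-based coding.
corr = D.T @ x                          # marginal correlations of x with each atom
lam = 0.05                              # fixed threshold (assumed; not the adaptive scheme)
alpha_marg = np.sign(corr) * np.maximum(np.abs(corr) - lam, 0.0)

print("lasso nonzeros:   ", np.count_nonzero(alpha_lasso))
print("marginal nonzeros:", np.count_nonzero(alpha_marg))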

LNCS 8692, p. 578 ff.



© Springer International Publishing Switzerland 2014