
Transfer Learning Based Visual Tracking with Gaussian Processes Regression

Jin Gao1,2, Haibin Ling2, Weiming Hu1, and Junliang Xing1

1National Laboratory of Pattern Recognition, Institute of Automation, CAS, Beijing, China
jin.gao@nlpr.ia.ac.cn
wmhu@nlpr.ia.ac.cn
jlxing@nlpr.ia.ac.cn

2Department of Computer and Information Sciences, Temple University, Philadelphia, USA
hbling@temple.edu

Abstract. Modeling the target appearance is critical in many modern visual tracking algorithms. Many tracking-by-detection algorithms formulate the probability of target appearance as exponentially related to the confidence of a classifier output. By contrast, in this paper we directly analyze this probability using Gaussian Processes Regression (GPR), and introduce a latent variable to assist the tracking decision. Our observation model for regression is learnt in a semi-supervised fashion from both labeled samples drawn from previous frames and unlabeled samples, which are the tracking candidates extracted from the current frame. We further divide the labeled samples into two categories: auxiliary samples collected from the very early frames and target samples from the most recent frames. The auxiliary samples are dynamically re-weighted by the regression, and the final tracking result is determined by fusing the decisions of two individual trackers, one derived from the auxiliary samples and the other from the target samples. Together, these ingredients enable our tracker, denoted TGPR, to alleviate the drifting issue from several aspects. The effectiveness of TGPR is clearly demonstrated by its excellent performance, in comparison with state-of-the-art trackers, on three recently proposed public benchmarks involving 161 sequences in total.
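To illustrate the core idea of scoring tracking candidates by regression rather than by a classifier's exponentiated confidence, the Python sketch below fits a plain GPR model to labeled appearance samples from previous frames and evaluates its predictive mean on candidates from the current frame. This is only a minimal sketch under standard GPR assumptions; it is not the authors' TGPR implementation (no latent variable, no auxiliary/target split or re-weighting), and all function and variable names are illustrative.

import numpy as np

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    # Squared-exponential kernel between two sets of row-vector features.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gpr_predict(X_lab, y_lab, X_cand, noise=1e-2):
    # GP posterior predictive mean at the candidate locations:
    #   mu = K(cand, lab) [K(lab, lab) + noise * I]^{-1} y
    K = rbf_kernel(X_lab, X_lab) + noise * np.eye(len(X_lab))
    K_star = rbf_kernel(X_cand, X_lab)
    return K_star @ np.linalg.solve(K, y_lab)

# Toy usage: labeled appearance features (+1 target, -1 background) from
# earlier frames, and candidate features extracted from the current frame.
rng = np.random.default_rng(0)
X_lab = rng.normal(size=(20, 8))
y_lab = np.sign(rng.normal(size=20))
X_cand = rng.normal(size=(50, 8))
scores = gpr_predict(X_lab, y_lab, X_cand)
best = int(np.argmax(scores))  # candidate with the highest regressed confidence

In this simplified setting the tracking decision is simply the candidate with the highest predictive mean; the paper's semi-supervised formulation additionally lets the unlabeled candidates influence the learnt model.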

LNCS 8691, p. 188 ff.



© Springer International Publishing Switzerland 2014