ITEM METADATA RECORD
Title: Hough Forests for Object Detection, Tracking, and Action Recognition
Authors: Gall, Juergen ×
Yao, Angela
Razavi, Nima
Van Gool, Luc
Lempitsky, Victor #
Issue Date: Nov-2011
Publisher: IEEE Computer Society
Series Title: IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, issue 11, pages 2188-2202
Abstract: The paper introduces Hough forests, which are random forests adapted to perform a generalized Hough transform in an efficient way. Compared to previous Hough-based systems such as implicit shape models, Hough forests improve the performance of the generalized Hough transform for object detection on a categorical level. At the same time, their flexibility permits extensions of the Hough transform to new domains such as object tracking and action recognition. Hough forests can be regarded as task-adapted codebooks of local appearance that allow fast supervised training and fast matching at test time. They achieve high detection accuracy since the entries of such codebooks are optimized to cast Hough votes with small variance and since their efficiency permits dense sampling of local image patches or video cuboids during detection. The efficacy of Hough forests for a set of computer vision tasks is validated through experiments on a large set of publicly available benchmark data sets and comparisons with the state-of-the-art.
Description: Gall J., Yao A., Razavi N., Van Gool L., Lempitsky V., "Hough forests for object detection, tracking, and action recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 11, pp. 2188-2202, November 2011.
ISSN: 0162-8828
Publication status: published
KU Leuven publication type: IT
Appears in Collections: ESAT - PSI, Processing Speech and Images
× corresponding author
# (joint) last author
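
Illustrative note: the abstract above describes patches casting Hough votes for an object's position, with detections read off as maxima of the accumulated vote map. The sketch below is only a loose, simplified illustration of that voting idea, not the authors' Hough-forest implementation; it substitutes scikit-learn's RandomForestRegressor for the paper's class/offset leaf statistics, and all function names, descriptors, and parameters are hypothetical.

```python
# Minimal sketch of Hough-style voting with a random forest (assumption-laden,
# not the method published in the paper above).
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def train_offset_forest(patch_descriptors, offsets_to_center, n_trees=10):
    """Fit a forest mapping patch descriptors to 2D offsets toward the object center."""
    forest = RandomForestRegressor(n_estimators=n_trees)
    forest.fit(patch_descriptors, offsets_to_center)
    return forest


def hough_detect(forest, patch_descriptors, patch_positions, image_shape):
    """Let every densely sampled patch cast a vote for the object center;
    the strongest peak in the vote map is returned as the detection hypothesis."""
    votes = np.zeros(image_shape, dtype=float)
    predicted = forest.predict(patch_descriptors)  # shape (N, 2): (dy, dx) per patch
    for (y, x), (dy, dx) in zip(patch_positions, predicted):
        cy, cx = int(round(y + dy)), int(round(x + dx))
        if 0 <= cy < image_shape[0] and 0 <= cx < image_shape[1]:
            votes[cy, cx] += 1.0
    peak = np.unravel_index(np.argmax(votes), votes.shape)
    return peak, votes
```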

Files in This Item:
File: 3393.pdf | Description: - | Status: Published | Size: 1998 Kb | Format: Adobe PDF

These files are only available to some KU Leuven Association staff members
All items in Lirias are protected by copyright, with all rights reserved.
