Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/105212
Type: Journal article
Title: Efficient globally optimal consensus maximisation with tree search
Author: Chin, T.; Purkait, P.; Eriksson, A.; Suter, D.
Citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017; 39(4):758-772
Publisher: IEEE Computer Society
Issue Date: 2017
ISSN: 0162-8828 (print); 1939-3539 (online)
Statement of Responsibility: Tat-Jun Chin, Pulak Purkait, Anders Eriksson and David Suter
Abstract: Maximum consensus is one of the most popular criteria for robust estimation in computer vision. Despite its widespread use, optimising the criterion is still customarily done by randomised sample-and-test techniques, which do not guarantee optimality of the result. Several globally optimal algorithms exist, but they are too slow to challenge the dominance of randomised methods. Our work aims to change this state of affairs by proposing an efficient algorithm for global maximisation of consensus. Under the framework of LP-type methods, we show how consensus maximisation for a wide variety of vision tasks can be posed as a tree search problem. This insight leads to a novel algorithm based on A* search. We propose efficient heuristic and support set updating routines that enable A* search to efficiently find globally optimal results. On common estimation problems, our algorithm is much faster than previous exact methods. Our work identifies a promising direction for globally optimal consensus maximisation.
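For context on the criterion being optimised, here is a minimal, hypothetical Python sketch (not the authors' A* algorithm): it scores a candidate 2D line by its consensus, i.e. the number of points within a tolerance eps, and brute-forces candidate lines from all 2-point subsets. The names (consensus, max_consensus_by_enumeration) and the line-fitting setup are illustrative assumptions; the enumeration merely conveys the combinatorial search space that the paper's tree search explores far more efficiently, and it is a heuristic baseline rather than a certified global optimum.

import itertools
import numpy as np

def consensus(points, a, b, eps):
    # Consensus of the line y = a*x + b: the number of points whose
    # vertical residual |y - (a*x + b)| is within the tolerance eps.
    residuals = np.abs(points[:, 1] - (a * points[:, 0] + b))
    return int(np.sum(residuals <= eps))

def max_consensus_by_enumeration(points, eps):
    # Brute-force baseline: fit a line through every pair of points and
    # keep the fit with the largest consensus. The O(n^2) candidate set
    # illustrates the combinatorial structure of the problem.
    best_score, best_model = 0, None
    for i, j in itertools.combinations(range(len(points)), 2):
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:
            continue  # skip vertical lines in this simplified sketch
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        score = consensus(points, a, b, eps)
        if score > best_score:
            best_score, best_model = score, (a, b)
    return best_score, best_model

# Usage on synthetic data: 60 inliers near y = 2x + 1 plus 20 gross outliers.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 80)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, 80)
points = np.column_stack([x, y])
points[:20, 1] += rng.uniform(5.0, 15.0, 20)  # corrupt the first 20 responses
score, (a, b) = max_consensus_by_enumeration(points, eps=0.2)

A randomised sample-and-test method (the dominant approach the abstract refers to) would instead draw random pairs and keep the best consensus seen so far, trading the guarantees of exhaustive or tree-search methods for speed.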
Rights: © 2016 IEEE
DOI: 10.1109/TPAMI.2016.2631531
Grant ID: http://purl.org/au-research/grants/arc/DP130102524
http://purl.org/au-research/grants/arc/DE130101775
http://purl.org/au-research/grants/arc/DP160103490
Published version: http://dx.doi.org/10.1109/tpami.2016.2631531
Appears in Collections: Aurora harvest 8
Electrical and Electronic Engineering publications

Files in This Item:
There are no files associated with this item.

