Knowledge Vault 5/4 - CVPR 2015
Efficient Globally Optimal Consensus Maximisation with Tree Search
Tat-Jun Chin, Pulak Purkait, Anders Eriksson, David Suter
< Resume Image >

Concept Graph & Resume using Claude 3 Opus | ChatGPT-4o | Llama 3:

graph LR
classDef consensus fill:#f9d4d4, font-weight:bold, font-size:14px
classDef residuals fill:#d4f9d4, font-weight:bold, font-size:14px
classDef algorithm fill:#d4d4f9, font-weight:bold, font-size:14px
classDef search fill:#f9f9d4, font-weight:bold, font-size:14px
classDef generalization fill:#f9d4f9, font-weight:bold, font-size:14px
A[Efficient Globally Optimal Consensus Maximisation with Tree Search] --> B[Maximum consensus: find agreeing model parameters 1]
B --> C[Line fitting maximum consensus 2]
B --> D[Computer vision maximum consensus 3]
B --> E[1D linear regression example 4]
A --> F[Residual: point-line vertical distance 5]
F --> G[RANSAC: subsets, no global guarantee 6]
F --> H[Min-max: minimize largest residual 7]
H --> I[L-infinity: two maximum residuals 8]
F --> J[Parameter space: V-shaped residuals 9]
J --> K[V-curves maximum: convex, linear programming 10]
K --> L[Simplex: vertex-to-vertex convex curve 11]
H --> M[Support set: min-max solution points 12]
M --> N[Combinatorial dimension: support set size 13]
B --> O[Maximum consensus: residual, size p+1 14]
A --> P[Algorithm: p+1 subsets, min-max, coverage 15]
P --> Q[Subsets: polynomial in n 16]
B --> R[Linear regression maximum consensus: not NP-hard 17]
F --> S[Residual functions: tree of bases 18]
S --> T[Root: min-max. Children: point removal 19]
S --> U[Level: points outside coverage. Feasible: residual ≤ ε 20]
P --> V[Goal: lowest feasible basis level 21]
A --> W[Breadth-first search: lowest feasible basis 22]
A --> X[A* search: level + feasibility heuristic 23]
X --> Y[Heuristic: removed points. Admissible: optimality 24]
P --> Z[Outlier ratio limit: 1/p+1 25]
A --> AA[Generalizes to pseudo-convex residuals 26]
AA --> AB[Pseudo-convex alpha sub-level sets: convex 27]
AA --> AC[Pseudo-convex functions: quick iterative optimization 28]
AA --> AD[Pseudo-convex: tree structure, search methods 29]
B --> AE[Multiple hypotheses: increased runtime, exact solution 30]
class A,B,C,D,E,O,R consensus
class F,G,H,I,J,S,T,U residuals
class P,Q,V,Z algorithm
class W,X,Y search
class AA,AB,AC,AD,AE generalization

Resume:

1.- Maximum consensus problem: find model parameters that agree with the maximum number of data points.
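
In symbols (a standard formulation, with ε the inlier threshold and r_i(θ) the residual of point i, both introduced below):

```latex
\theta^{*} \;=\; \arg\max_{\theta}\; \bigl|\{\, i \in \{1,\dots,n\} \;:\; |r_i(\theta)| \le \epsilon \,\}\bigr|
```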

2.- Line fitting example of maximum consensus.

3.- Triangulation and homography fitting are practical computer vision examples of maximum consensus.

4.- 1D linear regression is the running example used to illustrate ideas.

5.- Residual is the vertical distance of a point to the estimated line.
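
A minimal sketch of these definitions for the running 1D example (model y = θx; x, y are NumPy arrays; helper names like consensus_size are mine, not from the talk):

```python
import numpy as np

def residuals(theta, x, y):
    """Vertical distance of each point (x_i, y_i) to the line y = theta * x."""
    return np.abs(y - theta * x)

def consensus_size(theta, x, y, eps):
    """How many points the model theta agrees with, up to tolerance eps."""
    return int(np.sum(residuals(theta, x, y) <= eps))
```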

6.- RANSAC samples minimal subsets to fit models but doesn't guarantee finding the global maximum consensus.
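
A vanilla RANSAC sketch for the 1D example (a minimal subset is a single point, since the model has one parameter); this is the baseline the paper improves on, not the paper's method:

```python
import numpy as np

def ransac_line(x, y, eps, iters=1000, rng=np.random.default_rng(0)):
    """Sample minimal subsets, fit, keep the best consensus.
    No guarantee the returned theta is the global maximum consensus."""
    best_theta, best_count = None, -1
    for _ in range(iters):
        i = rng.integers(len(x))
        if x[i] == 0:
            continue  # degenerate minimal subset
        theta = y[i] / x[i]  # model fitted to the minimal subset
        count = int(np.sum(np.abs(y - theta * x) <= eps))
        if count > best_count:
            best_theta, best_count = theta, count
    return best_theta, best_count
```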

7.- Min-max problem minimizes the largest residual (L-infinity minimization or Chebyshev regression).
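
For the running example y = θx, the min-max problem reads:

```latex
\min_{\theta} \;\max_{i=1,\dots,n}\; |y_i - \theta x_i|
```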

8.- At the L-infinity solution, two points attain the same maximum residual, which is the value being minimized.

9.- Plotting residuals in parameter space yields V-shaped curves for each data point.

10.- The pointwise maximum of the V-curves is convex, so the min-max problem can be solved via linear programming.
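
A sketch of the LP reformulation with SciPy: an epigraph variable gamma bounds every residual from above, and minimizing gamma gives the Chebyshev fit:

```python
import numpy as np
from scipy.optimize import linprog

def chebyshev_fit(x, y):
    """Solve min_theta max_i |y_i - theta * x_i| as a linear program.
    Variables: [theta, gamma]; minimize gamma subject to
    theta*x_i - y_i <= gamma and -(theta*x_i - y_i) <= gamma."""
    n = len(x)
    c = np.array([0.0, 1.0])  # objective: minimize gamma
    A = np.vstack([np.column_stack([x, -np.ones(n)]),
                   np.column_stack([-x, -np.ones(n)])])
    b = np.concatenate([y, -y])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None), (0, None)])
    theta, gamma = res.x
    return theta, gamma
```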

11.- The simplex algorithm is equivalent to descending the convex curve from vertex to vertex.

12.- The two points that attain the minimized maximum residual form the active set, also called the basis or support set.

13.- The combinatorial dimension is the size of the support set: p+1, where p is the number of model parameters.

14.- Key property: fitted on the maximum consensus set, the min-max solution has residual ≤ ε and a support set of size p+1.

15.- Algorithm: enumerate all (p+1)-subsets, solve min-max on each, keep the bases with residual ≤ ε, and report the basis with the highest coverage.
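
A brute-force sketch of this enumeration for the 1D example (p = 1, so bases are pairs; reuses chebyshev_fit from the previous sketch):

```python
from itertools import combinations
import numpy as np

def exhaustive_maximum_consensus(x, y, eps):
    """Enumerate all (p+1)-subsets, solve min-max on each, keep feasible
    bases (residual <= eps), return the theta with the most coverage."""
    best_theta, best_coverage = None, -1
    for subset in combinations(range(len(x)), 2):  # candidate bases
        idx = list(subset)
        theta, gamma = chebyshev_fit(x[idx], y[idx])
        if gamma <= eps:  # feasible basis
            coverage = int(np.sum(np.abs(y - theta * x) <= eps))
            if coverage > best_coverage:
                best_theta, best_coverage = theta, coverage
    return best_theta, best_coverage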

16.- The number of subsets, C(n, p+1) = O(n^(p+1)), is polynomial in n for fixed p; e.g., 4950 pairs for n = 100 and p = 1.

17.- Maximum consensus for linear regression is therefore not NP-hard in n for fixed dimension p, contrary to some claims.

18.- Intersections of residual functions form a tree structure of bases.

19.- The root basis is the min-max solution on the full dataset; child bases are formed by removing support points and re-solving.

20.- Level in basis tree indicates number of points outside coverage. Feasible bases have residual ≤ ε.

21.- Goal: find lowest level feasible basis, guaranteeing maximum consensus solution.

22.- Breadth-first search of the basis tree, level by level, finds the lowest feasible basis first.
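
A simplified sketch of the level-by-level search (reuses chebyshev_fit; the support set is read off as the constraints active at the LP optimum):

```python
from collections import deque
import numpy as np

def support_set(theta, gamma, x, y, tol=1e-6):
    """Indices attaining the maximum residual (active constraints)."""
    r = np.abs(y - theta * x)
    return [i for i in range(len(x)) if r[i] >= gamma - tol]

def bfs_basis_tree(x, y, eps):
    """BFS over bases: the root is the min-max fit on all points; children
    remove one support point each. The first feasible basis found is at
    the lowest level, i.e. the maximum consensus fit."""
    root = frozenset()  # set of removed point indices
    queue, seen = deque([root]), {root}
    while queue:
        removed = queue.popleft()  # level = len(removed)
        keep = np.array([i for i in range(len(x)) if i not in removed])
        theta, gamma = chebyshev_fit(x[keep], y[keep])
        if gamma <= eps:
            return theta, len(keep)  # feasible: done
        for s in support_set(theta, gamma, x[keep], y[keep]):
            child = removed | {int(keep[s])}
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return None
```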

23.- A* search prioritizes nodes by level plus a heuristic estimating the distance to feasibility.

24.- The heuristic estimates the number of points still to be removed to reach feasibility, i.e., the remaining outliers. Admissible heuristics, which never overestimate, guarantee optimality.
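
The same search driven by a priority queue (reuses chebyshev_fit and support_set from above). With the trivial unit heuristic below the ordering matches BFS; a tighter admissible estimate of the remaining outliers, as in the paper, is what makes A* pay off:

```python
import heapq
import numpy as np

def astar_basis_tree(x, y, eps):
    """A* over the basis tree: priority = level + heuristic. Each child is
    pushed with priority len(child) + 1, a unit lower bound on the work an
    untested node still needs; a feasible basis at level k (priority k + 1)
    pops before anything at level k + 1 or deeper, so the first feasible
    basis popped is at the lowest level."""
    counter = 0  # tie-breaker so heapq never compares frozensets
    heap, seen = [(0, counter, frozenset())], {frozenset()}
    while heap:
        _, _, removed = heapq.heappop(heap)
        keep = np.array([i for i in range(len(x)) if i not in removed])
        theta, gamma = chebyshev_fit(x[keep], y[keep])
        if gamma <= eps:
            return theta, len(keep)
        for s in support_set(theta, gamma, x[keep], y[keep]):
            child = removed | {int(keep[s])}
            if child not in seen:
                seen.add(child)
                counter += 1
                heapq.heappush(heap, (len(child) + 1, counter, child))
    return None
```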

25.- The method performs well, but the heuristic loses effectiveness when the outlier ratio exceeds 1/(p+1).

26.- Approach generalizes beyond linear regression to pseudo-convex residuals like transfer error and reprojection error.

27.- Alpha sub-level sets of pseudo-convex functions are convex.
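
Formally (a textbook characterization, not verbatim from the talk): for a quasi/pseudo-convex residual f, every alpha sub-level set

```latex
S_{\alpha} \;=\; \{\, \theta \;:\; f(\theta) \le \alpha \,\}
```

is a convex set, which is what lets the LP-style machinery above carry over.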

28.- Iterative optimization quickly finds the optimum of pseudo-convex functions.

29.- Tree structure and search methods still apply for pseudo-convex residuals.

30.- The exact algorithm still finds the maximum consensus when multiple true hypotheses are present, but runtime grows because the other structures act as outliers.

Knowledge Vault built by David Vivancos 2024