Concept Graph & Resume using Claude 3 Opus | ChatGPT-4o | Llama 3:
Resume:
1.- Maximum consensus problem: find model parameters that agree with the maximum number of data points.
2.- Line fitting example of maximum consensus.
3.- Triangulation and homography fitting are practical computer vision examples of maximum consensus.
4.- 1D linear regression is the running example used to illustrate ideas.
5.- Residual is the vertical distance of a point to the estimated line.
6.- RANSAC samples minimal subsets to fit models but doesn't guarantee finding the global maximum consensus.
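As a minimal sketch of the idea (illustrative code, not the lecture's implementation), RANSAC for the line y = a*x + b repeatedly fits a 2-point minimal subset and keeps the model agreeing with the most points:

```python
import random

def ransac_line(points, iters=1000, eps=0.1):
    """Fit y = a*x + b by RANSAC: fit a line to random minimal 2-point
    samples and keep the one with the largest consensus (inlier) set."""
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        if x1 == x2:                      # degenerate sample, skip it
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) <= eps]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers
```

Because the samples are random, the result is only probabilistically close to the maximum consensus; no finite number of iterations guarantees the global optimum.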
7.- Min-max problem minimizes the largest residual (L-infinity minimization or Chebyshev regression).
8.- At the L-infinity solution, two points attain the same (minimized) maximum residual.
9.- Plotting each data point's absolute residual as a function of the parameters yields a V-shaped curve per point.
10.- The pointwise maximum of the V-curves is convex, so the min-max problem can be solved via linear programming.
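A sketch of that LP (assumes SciPy's `linprog`; for the two-parameter line y = a*x + b the variables are a, b and a bound s on the largest absolute residual):

```python
import numpy as np
from scipy.optimize import linprog

def chebyshev_line(x, y):
    """Min-max (Chebyshev) line fit: minimize s subject to
    |y_i - (a*x_i + b)| <= s, a linear program in z = (a, b, s)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    # Encode  a*x_i + b - y_i <= s  and  y_i - a*x_i - b <= s
    A_ub = np.block([[np.c_[x, np.ones(n)], -np.ones((n, 1))],
                     [np.c_[-x, -np.ones(n)], -np.ones((n, 1))]])
    b_ub = np.concatenate([y, -y])
    res = linprog([0, 0, 1], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * 3)   # a, b, s are free
    a, b, s = res.x
    return a, b, s
```

For example, x = (0, 1, 2), y = (0, 1, 4) gives a = 2, b = -1/2, s = 1/2, with equioscillating residuals (+1/2, -1/2, +1/2) as in note 8.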
11.- The simplex algorithm amounts to descending this convex curve vertex by vertex.
12.- The two points attaining the minimized maximum residual form the active set, also called the basis or support set.
13.- The combinatorial dimension is the size of the support set, p + 1, where p is the number of model parameters.
14.- The min-max solution on the maximum consensus set has residual ≤ ε and a support set of size p + 1.
15.- Algorithm: enumerate all (p+1)-point subsets, solve min-max on each, keep bases with residual ≤ ε, and report the basis with the highest coverage.
16.- The number of subsets, C(n, p+1) = O(n^(p+1)), is polynomial in n for fixed p.
17.- For fixed dimension p, maximum consensus for linear regression is therefore solvable in polynomial time, i.e. not NP-hard in n, contrary to some claims.
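The enumeration algorithm of note 15 can be sketched for the two-parameter line y = a*x + b, so p + 1 = 3 (illustrative code, not the lecture's; the three-point min-max fit is solved in closed form via the equioscillation condition):

```python
from itertools import combinations

def minimax_line_3pts(p1, p2, p3):
    """Closed-form Chebyshev (min-max) line for three points with distinct
    x: the slope is that of the chord through the outer points, and the
    intercept splits the middle point's deviation so the residuals
    equioscillate (-s, +s, -s)."""
    (x1, y1), (x2, y2), (x3, y3) = sorted((p1, p2, p3))
    a = (y3 - y1) / (x3 - x1)
    d = y2 - (a * x2 + (y1 - a * x1))     # middle point's deviation
    b = (y1 - a * x1) + d / 2
    return a, b, abs(d) / 2

def max_consensus_exact(points, eps):
    """Enumerate all p+1 = 3 point subsets, keep bases whose min-max
    residual is <= eps, and report the one covering the most points."""
    best = (None, [])
    for subset in combinations(points, 3):
        if len({p[0] for p in subset}) < 3:
            continue                      # degenerate: repeated x value
        a, b, s = minimax_line_3pts(*subset)
        if s > eps:
            continue                      # infeasible basis
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) <= eps]
        if len(inliers) > len(best[1]):
            best = ((a, b), inliers)
    return best
```

The O(n^3) loop makes the polynomial (but expensive) complexity of note 16 concrete.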
18.- Intersections of residual functions form a tree structure of bases.
19.- The root basis is the min-max solution on the full dataset; child bases are formed by removing points of the parent's support set.
20.- Level in basis tree indicates number of points outside coverage. Feasible bases have residual ≤ ε.
21.- Goal: find the lowest-level feasible basis, which guarantees the maximum consensus solution.
22.- Breadth-first search of basis tree level-by-level finds lowest feasible basis.
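The breadth-first search of notes 18-22 can be sketched as follows (illustrative, assuming SciPy's `linprog` for the min-max subproblem; a node is the set of removed point indices, and children remove one support point each):

```python
import numpy as np
from collections import deque
from scipy.optimize import linprog

def chebyshev(x, y):
    """Min-max line fit via LP over z = (a, b, s), as in note 10."""
    n = len(x)
    A_ub = np.block([[np.c_[x, np.ones(n)], -np.ones((n, 1))],
                     [np.c_[-x, -np.ones(n)], -np.ones((n, 1))]])
    res = linprog([0, 0, 1], A_ub=A_ub, b_ub=np.concatenate([y, -y]),
                  bounds=[(None, None)] * 3)
    a, b, s = res.x
    return a, b, s

def bfs_consensus(x, y, eps, tol=1e-6):
    """BFS over the basis tree: the root removes nothing; each child
    removes one point attaining the max residual (a support point).
    The first feasible node (s <= eps) sits at the lowest level, i.e.
    it removes the fewest points, so its kept set is maximum consensus."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    queue, seen = deque([frozenset()]), {frozenset()}
    while queue:
        removed = queue.popleft()
        keep = [i for i in range(len(x)) if i not in removed]
        a, b, s = chebyshev(x[keep], y[keep])
        if s <= eps:
            return (a, b), keep           # maximum consensus set found
        support = [i for i in keep
                   if abs(abs(y[i] - (a * x[i] + b)) - s) <= tol]
        for i in support:
            child = removed | {i}
            if child not in seen:
                seen.add(child)
                queue.append(child)
```

FIFO order guarantees all level-k nodes are tried before level k+1, which is what makes the first feasible node optimal.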
23.- A* search prioritizes nodes by level plus a heuristic estimating the distance to feasibility.
24.- Heuristic: the number of removals needed to reach feasibility lower-bounds the number of outliers; an admissible heuristic (one that never overestimates) guarantees optimality.
25.- The method performs well, but the heuristic becomes less informative when the outlier ratio exceeds 1/(p+1).
26.- Approach generalizes beyond linear regression to pseudo-convex residuals like transfer error and reprojection error.
27.- Alpha sub-level sets of pseudo-convex functions are convex.
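As background for notes 26-29 (standard definitions from the L-infinity vision literature, not stated in the summary): the alpha sub-level set of a residual f is the set of parameters with residual at most alpha, and quasi-convexity of f means every such set is convex; the reprojection error has the quasi-convex ratio-of-affine form shown on the right:

```latex
S_\alpha(f) = \{\, \theta : f(\theta) \le \alpha \,\},
\qquad
f_{\mathrm{reproj}}(\theta) = \frac{\lVert A\theta + b \rVert_2}{c^{\top}\theta + d}
\quad \text{on } c^{\top}\theta + d > 0 .
```

Convexity of every sub-level set is what lets the feasibility test "is the residual ≤ ε?" remain tractable, so the basis-tree machinery above carries over.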
28.- Iterative local optimization quickly finds the optimum of a pseudo-convex function, since every local minimum is global.
29.- Tree structure and search methods still apply for pseudo-convex residuals.
30.- The exact algorithm finds the maximum consensus even when multiple true structures are present, but runtime grows because points from the other structures act as outliers.
Knowledge Vault built by David Vivancos 2024