Knowledge Vault 5/75 - CVPR 2022
Learning to Solve Hard Minimal Problems
Petr Hruby, Timothy Duff, Anton Leykin, and Tomas Pajdla
< Summary Image >

Concept Graph & Summary using Claude 3 Opus | ChatGPT-4o | Llama 3:

```mermaid
graph LR
    classDef main fill:#f9f9f9,stroke:#333,stroke-width:1px,font-weight:bold,font-size:14px
    classDef opt fill:#d4f9d4,stroke:#333,stroke-width:1px,font-weight:bold,font-size:14px
    classDef pose fill:#d4d4f9,stroke:#333,stroke-width:1px,font-weight:bold,font-size:14px
    classDef solve fill:#f9d4d4,stroke:#333,stroke-width:1px,font-weight:bold,font-size:14px
    classDef learn fill:#f9f9d4,stroke:#333,stroke-width:1px,font-weight:bold,font-size:14px
    classDef manifold fill:#f9d4f9,stroke:#333,stroke-width:1px,font-weight:bold,font-size:14px
    classDef homotopy fill:#d4f9f9,stroke:#333,stroke-width:1px,font-weight:bold,font-size:14px

    A["Learning to Solve<br>Hard Minimal Problems"] --> B["Solve geometric optimization<br>avoid spurious solutions 1"]
    B --> C["Compute camera pose<br>relaxed 4 point problem 2"]
    C --> D["10x speedup vs<br>careful approximation 3"]
    B --> E["Applicable to hard<br>minimal problems 4"]
    B --> F["Couple solver with<br>real data distribution 5"]
    A --> G["Problem-solution manifold<br>projected to problem space 6"]
    G --> H["Probability density on<br>problem-solution manifold 7"]
    H --> I["Assumption: dominant meaningful<br>solution in real data 8"]
    A --> J["Pick promising start<br>continue to solution 9"]
    A --> K["Offline: sample, cover<br>anchors, learn selection 10"]
    A --> L["Online: preprocess, select<br>start, construct, compute 11"]
    K --> M["Anchor selection: graph<br>dominating set 12"]
    K --> N["Learn starting pair<br>selection as classification 13"]
    L --> O["Homotopy continuation<br>track solution path 14"]
    O --> P["Efficient homotopy<br>optimized predictor-corrector 15"]
    A --> Q["Minimal problem formulations<br>depths, constraints, relaxations 16"]
    A --> R["Normalize/preprocess input<br>for learning, tracking 17"]
    A --> S["RANSAC experiments<br>evaluate generalization, robustness 18"]
    S --> T["Lower success rate<br>needs more RANSAC 19"]
    A --> U["Code and data<br>publicly available 20"]

    class A main
    class B,E,F opt
    class C,D,Q,R pose
    class G,H,I manifold
    class J,S solve
    class K,M,N learn
    class L,O,P,T homotopy
```


1.- Solving hard geometric optimization problems in RANSAC by avoiding spurious solutions using learning and homotopy continuation.

2.- Demonstrating the approach on computing relative pose of three calibrated cameras via relaxed minimal problem with four points per view.

3.- A single problem is solved in under 70 μs on average, over a 10x speedup compared to the previous careful approximation.

4.- General approach applicable to other hard minimal problems, even some with infinite spurious solution families.

5.- Coupling the solver with the real data distribution of the specific computer vision problem.

6.- The problem-solution manifold consists of problem-solution pairs; projecting it onto the problem space yields the input problems.

7.- Introducing probability density on the problem-solution manifold representing real-world problem-solution pair distribution.

8.- Assumption: an input problem likely has one dominant meaningful solution occurring frequently in real data.

9.- Pick & solve approach: pick promising starting point, then continue it to a meaningful solution.

10.- Offline stage: sampling data representing manifold and distribution, covering with anchors, learning anchor selection model.

11.- Online stage: preprocessing input, selecting starting pair, constructing equations, computing solution by homotopy continuation from starting pair.

12.- Anchor selection by building graph of problem-solution pairs and finding dominating set.
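
The anchor-covering idea in item 12 can be illustrated with a greedy dominating-set computation on a graph of sampled problem-solution pairs. This is only a sketch: the toy graph, the greedy heuristic, and the function name are illustrative assumptions, not the paper's actual graph construction.

```python
# Greedy dominating set: pick anchors so that every sampled
# problem-solution pair is either an anchor or adjacent to one.
# Hypothetical sketch; similarity between pairs defines the edges.

def greedy_dominating_set(adjacency):
    """adjacency: dict mapping node -> set of neighboring nodes."""
    uncovered = set(adjacency)
    anchors = []
    while uncovered:
        # Pick the node that covers the most uncovered nodes
        # (itself plus its neighbors).
        best = max(adjacency,
                   key=lambda v: len(({v} | adjacency[v]) & uncovered))
        anchors.append(best)
        uncovered -= {best} | adjacency[best]
    return anchors

# Toy path graph 0-1-2-3: nodes 1 and 2 together dominate everything.
graph = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(greedy_dominating_set(graph))
```

The greedy heuristic does not guarantee a minimum dominating set (the problem is NP-hard), but it yields a small anchor set that covers all samples.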

13.- Learning starting pair selection as classification task using multi-layer perceptron.
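
Item 13 frames starting-pair selection as multi-class classification: features of the preprocessed input problem go in, one score per anchor comes out, and the highest-scoring anchor supplies the starting pair. A minimal NumPy sketch of the inference step; the layer sizes, random weights, and feature dimension are placeholders, not the paper's trained model.

```python
import numpy as np

def mlp_select_anchor(features, W1, b1, W2, b2):
    """Score each anchor with a one-hidden-layer MLP and return
    the index of the most promising starting problem-solution pair."""
    hidden = np.maximum(0.0, W1 @ features + b1)  # ReLU hidden layer
    scores = W2 @ hidden + b2                     # one score per anchor
    return int(np.argmax(scores))

# Toy dimensions: 6-dim problem features, 8 hidden units, 4 anchors.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((8, 6)), np.zeros(8)
W2, b2 = rng.standard_normal((4, 8)), np.zeros(4)
anchor = mlp_select_anchor(rng.standard_normal(6), W1, b1, W2, b2)
print("selected anchor:", anchor)
```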

14.- Homotopy continuation to track one real solution path from start to target problem.

15.- Efficient homotopy continuation implementation with optimized predictor-corrector steps using sparsity.
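
Items 14-15 describe predictor-corrector homotopy continuation. A minimal univariate sketch, assuming a straight-line homotopy H(x,t) = (1-t)·g(x) + t·f(x): an Euler predictor follows the solution path and a few Newton steps correct back onto it. The paper's solver tracks systems of polynomial equations with sparsity-optimized steps; this toy version only illustrates the mechanism.

```python
def track(f, g, df, dg, x, steps=100):
    """Track a root of H(x,t) = (1-t)*g(x) + t*f(x) from a known
    root x of the start system g (t=0) to the target system f (t=1)."""
    dt = 1.0 / steps
    t = 0.0
    for _ in range(steps):
        # Predictor (Euler): dx/dt = -(dH/dt) / (dH/dx), dH/dt = f - g.
        dH_dx = (1 - t) * dg(x) + t * df(x)
        x += -(f(x) - g(x)) / dH_dx * dt
        t += dt
        # Corrector: a couple of Newton steps on H(., t).
        for _ in range(2):
            H = (1 - t) * g(x) + t * f(x)
            dH_dx = (1 - t) * dg(x) + t * df(x)
            x -= H / dH_dx
    return x

# Start system g has the known root x=1; target f has the root x=2.
g, dg = lambda x: x**2 - 1, lambda x: 2 * x
f, df = lambda x: x**2 - 4, lambda x: 2 * x
print(track(f, g, df, dg, 1.0))  # converges near 2.0
```

Tracking a single path from a well-chosen start is what makes the online stage fast: the solver never enumerates the many spurious solutions.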

16.- Minimal problem formulations using unknown depths, with basic multiview constraints and problem-specific relaxations.

17.- Normalizing/preprocessing input image correspondences for simplified learning and tracking.
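
As an illustration of the normalization in item 17, here is standard Hartley-style normalization of 2D correspondences (centroid to the origin, mean distance scaled to √2). The paper's exact preprocessing may differ; this is a common choice shown only as an assumption.

```python
import numpy as np

def normalize_points(pts):
    """Hartley-style normalization of an (n, 2) array of image points:
    translate the centroid to the origin and scale so the mean distance
    from the origin is sqrt(2). Returns the points and the 3x3 transform."""
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    s = np.sqrt(2) / np.mean(np.linalg.norm(centered, axis=1))
    T = np.array([[s, 0, -s * centroid[0]],
                  [0, s, -s * centroid[1]],
                  [0, 0, 1.0]])
    return centered * s, T

pts = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 2.0], [0.0, 2.0]])
norm, T = normalize_points(pts)
print(norm.mean(axis=0))  # centroid at the origin
```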

18.- Experiments with RANSAC on real datasets to evaluate generalization, noise/mismatch robustness.

19.- Results show lower success rate can be compensated by more RANSAC iterations.
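
The compensation in item 19 follows from the standard RANSAC iteration formula: if the solver itself only succeeds with some probability, the per-sample success probability shrinks and the required iteration count grows accordingly. A small sketch, where folding the solver success rate into the formula this way is my assumption for illustration.

```python
import math

def ransac_iterations(confidence, inlier_ratio, sample_size, solver_success):
    """Iterations needed to reach the given confidence when a minimal
    sample of `sample_size` points is all-inlier with probability
    inlier_ratio**sample_size and the solver succeeds on a good sample
    with probability solver_success."""
    good = (inlier_ratio ** sample_size) * solver_success
    return math.ceil(math.log(1 - confidence) / math.log(1 - good))

# A solver that succeeds 70% of the time needs proportionally more
# iterations than a perfect one, but each iteration can be much cheaper.
for q in (1.0, 0.7, 0.5):
    print(q, ransac_iterations(0.99, 0.5, 4, q))
```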

20.- Code and data made publicly available.

Knowledge Vault built by David Vivancos 2024