Knowledge Vault 5/15 - CVPR 2016
Sublabel-Accurate Relaxation of Nonconvex Energies
Thomas Möllenhoff, Emanuel Laude, Michael Moeller, Jan Lellmann, Daniel Cremers

Concept Graph & Summary using Claude 3 Opus | ChatGPT-4o | Llama 3:

graph LR
  classDef optimization fill:#f9d4d4, font-weight:bold, font-size:14px
  classDef lifting fill:#d4f9d4, font-weight:bold, font-size:14px
  classDef approximation fill:#d4d4f9, font-weight:bold, font-size:14px
  classDef results fill:#f9f9d4, font-weight:bold, font-size:14px
  A[Sublabel-Accurate Relaxation of Nonconvex Energies] --> B[Variational computer vision minimizes data and regularizer. 1]
  A --> C[Challenging non-convex optimization problems. 2]
  C --> D[Gradient descent finds local optima, unintuitive parameters. 3]
  C --> E[Ishikawa 2003: globally optimizing graph cut, grid/label bias. 4]
  C --> F[Pock et al. 2008: continuous setting, label bias remains. 5]
  A --> G[Stereo matching: more labels needed, becomes intractable. 6]
  A --> H[Proposes piecewise convex approximation, fewer labels. 7]
  A --> I[Related work: continuous MRFs, specific regularizers. 8]
  A --> J[Contributions: continuous sub-label accurate, tightest convex relaxation. 9]
  A --> K[Convexification and functional lifting explained. 10]
  K --> L[Traditional lifting samples labels, takes convex envelope. 11]
  K --> M[Proposed: costs between labels, closer approximation, convex. 12]
  K --> N[Non-convex energy folded in higher dimensions. 13]
  K --> O[Reformulated energy defined along convex combinations. 14]
  K --> P[Tightest convex envelope found analytically. 15]
  K --> Q[C characterized by convex conjugates epigraphs. 16]
  A --> R[Implementation: projecting onto epigraphs, piecewise linear/quadratic. 17]
  A --> S[Total variation regularizer lifted to higher dimensions. 18]
  A --> T[Convex-concave saddle point problem, primal-dual GPU algorithm. 19]
  A --> U[Solution obtained by back-projecting. 20]
  A --> V[Convex model: sub-label accurate matches direct optimization. 21]
  V --> W[Transitions between direct optimization and lifting. 22]
  A --> X[Exact solution with 10 labels, traditional needs more. 23]
  A --> Y[Stereo: traditional has label bias, many labels. 24]
  A --> Z[Proposed: smooth results, few labels, less runtime/memory. 25]
  Z --> AA[Improvement with equal labels, reasonable with 2. 26]
  A --> AB[First continuous sub-label accurate relaxation for non-convex energies. 27]
  A --> AC[Fewer labels than traditional, improves runtime/memory. 28]
  A --> AD[Generalizes to piecewise convex approximations. 29]
  A --> AE[Code available online. 30]
  class B,C,D,E,F optimization
  class G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U lifting
  class V,W,X,Y,Z,AA,AB,AC,AD approximation
  class AE results


1.- Variational approaches in computer vision minimize energies with a data term and total variation regularizer.
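Such an energy can be sketched numerically for a 1D signal. The quadratic data term, the weight `lam`, and the test signals below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def tv(u):
    """Total variation of a 1D signal: sum of absolute jumps."""
    return np.sum(np.abs(np.diff(u)))

def energy(u, f, lam):
    """Quadratic data term plus TV regularizer (illustrative choice)."""
    return 0.5 * np.sum((u - f) ** 2) + lam * tv(u)

f = np.array([0.0, 0.1, 0.9, 1.0])      # noisy step-like observation
u = np.array([0.05, 0.05, 0.95, 0.95])  # a smoother candidate solution
print(energy(u, f, lam=0.1))
```

The candidate `u` pays a small data-fidelity price but has lower total variation than `f`, which is the trade-off these energies encode.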

2.- These are challenging non-convex optimization problems.

3.- Classical methods like gradient descent only find local optima and have unintuitive parameters.
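A toy illustration of this point (the energy `cos(3x) + 0.1*x**2` and the step size are arbitrary choices, not from the talk): plain gradient descent lands in different local minima depending on where it starts, and the step size itself is a hand-tuned parameter.

```python
import numpy as np

def grad_descent(df, x0, step=0.05, iters=500):
    """Plain gradient descent; the step size is a hand-tuned parameter."""
    x = x0
    for _ in range(iters):
        x -= step * df(x)
    return x

# Toy non-convex energy with several local minima.
f  = lambda x: np.cos(3 * x) + 0.1 * x ** 2
df = lambda x: -3 * np.sin(3 * x) + 0.2 * x

a = grad_descent(df, x0=0.5)    # converges to the minimum near x ~ +1.0
b = grad_descent(df, x0=-0.5)   # converges to the minimum near x ~ -1.0
```

Neither run can tell whether it found the global minimum, which is the motivation for the global (lifting-based) approaches discussed next.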

4.- Ishikawa (2003) showed global optimization is possible by lifting the problem to the product space of image domain and label range and solving a graph cut, but the result suffers from grid and label bias.

5.- Pock et al. (2008) generalized this to a continuous setting, eliminating grid bias but still having label bias due to range discretization.

6.- Label discretization issues are illustrated for stereo matching: smooth results require many labels, which quickly becomes intractable.

7.- This work proposes solving with fewer labels by using a piecewise convex approximation instead of piecewise linear.

8.- Some related work treats MRFs with continuous state spaces, but is restricted to the discrete setting or to specific regularizers.

9.- Key contributions - first spatially continuous fully sub-label accurate formulation, proof of tightest convex relaxation, unifies lifting and direct optimization.

10.- Idea of convexification and functional lifting explained - transform energy to higher dimensional space.

11.- Traditional multi-labeling samples the lifted energy at the labels and takes the convex envelope, yielding a piecewise linear relaxation.
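The effect of sampling only at the labels can be sketched as follows; the data term `rho`, the label range, and the label counts are assumed examples. Between labels the traditional relaxation is linear (here mimicked with `np.interp`), so its worst-case gap to the true cost shrinks only as the label count grows:

```python
import numpy as np

rho = lambda u: (u ** 2 - 1) ** 2      # non-convex data term (assumed example)
u = np.linspace(-1.5, 1.5, 2001)       # fine grid for measuring the gap

def linear_relaxation_error(n_labels):
    """Sample rho at n_labels labels; between labels the traditional
    relaxation is linear, so measure the worst gap to the true rho."""
    labels = np.linspace(-1.5, 1.5, n_labels)
    return np.max(np.abs(np.interp(u, labels, rho(labels)) - rho(u)))

print(linear_relaxation_error(5), linear_relaxation_error(50))
```

With 5 labels the worst-case gap is large; driving it down requires many labels, which is exactly the cost the sub-label approach avoids.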

12.- Proposed approach assigns costs between labels too before convex envelope - closer approximation, still convex.

13.- Non-convex energy folded along basis vectors in higher dimensional space.

14.- Reformulated energy defined as original along convex combinations of basis vectors, infinity elsewhere.

15.- The tightest convex envelope is found analytically; it is the support function of a convex set C.

16.- C is characterized by the epigraphs of the convex conjugates of the pieces of the non-convex energy.
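A numerical sketch of a convex conjugate over a grid; the piece `0.5*x**2` is an assumed example, chosen because it is its own conjugate, which makes the result easy to check:

```python
import numpy as np

def conjugate(f, grid):
    """Numerical convex conjugate f*(v) = sup_u (u*v - f(u)), taken over a grid."""
    return lambda v: np.max(grid * v - f(grid))

u = np.linspace(-5.0, 5.0, 2001)
f = lambda x: 0.5 * x ** 2        # one convex piece (assumed example)
fstar = conjugate(f, u)

# 0.5*x^2 is its own conjugate, so fstar(v) should be close to 0.5*v^2.
print(fstar(1.0))
```

The epigraphs of such conjugates, one per piece, are what characterize the set C in the paper's construction.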

17.- Implementation involves projecting onto epigraphs, worked out for piecewise linear and piecewise quadratic pieces so far.
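For the simplest case, an affine piece, the epigraph is a halfspace and the projection has a closed form. The function below is a generic sketch of that halfspace projection, not the paper's implementation:

```python
def project_epi_affine(x, t, a, b):
    """Project the point (x, t) onto the epigraph {(x, t) : t >= a*x + b}
    of an affine function, i.e. onto a halfspace with normal (a, -1)."""
    viol = a * x + b - t               # > 0 means the point lies below the graph
    if viol <= 0:
        return x, t                    # already inside the epigraph
    scale = viol / (a * a + 1.0)       # closed-form halfspace projection
    return x - scale * a, t + scale

# Project the point (0, -1), which lies below the line t = x.
x, t = project_epi_affine(0.0, -1.0, a=1.0, b=0.0)
```

For quadratic pieces the projection requires solving a small polynomial equation per point, which is still cheap enough for a GPU solver.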

18.- Total variation regularizer also lifted to higher dimensional space, same as Pock et al. 2008.

19.- Results in a convex-concave saddle point problem, solved by primal-dual algorithm on GPU.
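A minimal CPU sketch of such a primal-dual iteration, applied to 1D ROF-style TV denoising rather than the paper's lifted problem; the signal, the weight `lam`, and the steps `tau`, `sigma` are assumed choices satisfying the usual step-size condition `tau*sigma*||D||^2 <= 1`:

```python
import numpy as np

def rof_primal_dual(f, lam=0.5, tau=0.25, sigma=0.25, iters=500):
    """Primal-dual (Chambolle-Pock style) iteration for 1D ROF denoising:
    min_u 0.5*||u - f||^2 + lam*TV(u), written as a saddle point over (u, p)."""
    u = f.copy()
    ubar = f.copy()
    p = np.zeros(len(f) - 1)                 # dual variable, one per jump
    for _ in range(iters):
        # Dual step: ascent in p, then projection onto the box |p| <= lam.
        p = np.clip(p + sigma * np.diff(ubar), -lam, lam)
        # Primal step: apply D^T p, then the closed-form prox of the data term.
        div = np.concatenate(([-p[0]], p[:-1] - p[1:], [p[-1]]))   # D^T p
        u_new = (u - tau * div + tau * f) / (1.0 + tau)
        ubar = 2.0 * u_new - u               # over-relaxation / extrapolation
        u = u_new
    return u

n = 64
f = np.where(np.arange(n) < n // 2, 0.0, 1.0) + 0.3 * np.cos(np.arange(n))
u = rof_primal_dual(f)
```

The oscillating component of `f` is smoothed away while the step survives; the paper applies the same algorithmic pattern to its lifted saddle-point problem on the GPU.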

20.- Solution obtained by back-projecting from higher to lower dimensional space.
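Back-projection itself is just a weighted average of the label values; a minimal sketch with assumed labels:

```python
import numpy as np

labels = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # assumed label values

def back_project(v):
    """Map a lifted vector (convex-combination weights over the labels)
    back to a scalar value in the original range."""
    return float(np.dot(v, labels))

# Weights 0.6/0.4 on two adjacent labels recover a sub-label value:
# 0.6*0.25 + 0.4*0.5 = 0.35, which lies strictly between the labels.
v = np.array([0.0, 0.6, 0.4, 0.0, 0.0])
print(back_project(v))
```

This is what makes the method sub-label accurate: the recovered value need not coincide with any of the discrete labels.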

21.- Evaluated on a convex model, the sub-label accurate method finds the same solution as direct optimization with little overhead.

22.- Provides transition between direct optimization and functional lifting.

23.- Finds exact solution with 10 labels while traditional needs many more labels and memory.

24.- For stereo, traditional has visible label bias even with many labels.

25.- Proposed method gives smooth results with few labels and less runtime/memory.

26.- Clear improvement with equal label counts, reasonable results with just 2 labels.

27.- Concludes it is the first spatially continuous sub-label accurate relaxation for certain non-convex energies.

28.- Uses far fewer labels than traditional lifting while improving runtime and memory.

29.- Generalizes from piecewise linear to piecewise convex approximations.

30.- Code is available online.

Knowledge Vault built by David Vivancos 2024