Concept Graph & Resume using Claude 3 Opus | ChatGPT 4o | Llama 3:
Resume:
1.- Variational approaches in computer vision minimize energies with a data term and total variation regularizer.
2.- These are challenging non-convex optimization problems.
3.- Classical methods like gradient descent only find local optima and have unintuitive parameters.
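Point 3 can be sketched numerically. The following is a toy illustration (a hypothetical 1-D energy with a tilted double-well data term and a smoothed TV term, not the paper's actual model): gradient descent from different initializations lands in different stationary points, one of which has strictly higher energy.

```python
import numpy as np

# Hypothetical 1-D energy E(u) = sum_i rho(u_i) + lam * TV(u) with a
# tilted double-well data term rho(v) = (v^2 - 1)^2 + 0.3*v (nonconvex).
def energy(u, lam=0.1):
    data = ((u**2 - 1.0)**2 + 0.3 * u).sum()
    tv = np.abs(np.diff(u)).sum()        # total variation of the signal
    return data + lam * tv

def tv_grad(u, eps=1e-3):
    # gradient of the smoothed TV term sum_i sqrt((u_{i+1}-u_i)^2 + eps^2)
    d = np.diff(u)
    w = d / np.sqrt(d**2 + eps**2)
    g = np.zeros_like(u)
    g[:-1] -= w
    g[1:] += w
    return g

def grad_descent(u0, lam=0.1, step=1e-2, iters=5000):
    u = u0.copy()
    for _ in range(iters):
        g = 4.0 * u * (u**2 - 1.0) + 0.3 + lam * tv_grad(u)
        u = u - step * g
    return u

u_plus = grad_descent(np.full(8, 0.9))    # stuck in the shallow well near +1
u_minus = grad_descent(np.full(8, -0.9))  # reaches the deeper well near -1
```

Both runs satisfy the first-order conditions, yet the energy of `u_plus` is strictly higher than that of `u_minus`: the outcome depends on initialization, which is exactly the failure mode global (lifted) formulations are designed to avoid.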
4.- Ishikawa (2003) proposed global optimization by lifting the problem to the product space of image domain and label range and computing a minimum graph cut, but this discrete construction has grid and label bias.
5.- Pock et al. (2008) generalized this to a spatially continuous setting, eliminating grid bias, but label bias remains because the range is still discretized.
6.- Label discretization issues are illustrated for stereo matching: smooth results require many labels, but memory and runtime grow with the label count and quickly become intractable.
7.- This work proposes solving with fewer labels by using a piecewise convex approximation instead of piecewise linear.
8.- Some related work treats MRFs with continuous state spaces, but it is restricted to the spatially discrete setting or to specific regularizers.
9.- Key contributions - first spatially continuous fully sub-label accurate formulation, proof of tightest convex relaxation, unifies lifting and direct optimization.
10.- Idea of convexification and functional lifting explained - transform energy to higher dimensional space.
11.- Traditional multi-labeling samples the lifted energy at the labels and takes the convex envelope - a piecewise linear relaxation.
12.- The proposed approach also assigns costs to values between labels before taking the convex envelope - a closer approximation that is still convex.
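Points 11-12 can be contrasted on a toy 1-D example (made-up labels and data cost, not the paper's solver): a piecewise linear surrogate built from label samples always picks a label, while keeping the convex pieces themselves allows minimizers strictly between labels.

```python
import numpy as np

# Toy data cost rho(u) = (u - 0.3)^2 on [0, 1] with labels {0.0, 0.5, 1.0}.
labels = np.array([0.0, 0.5, 1.0])
rho = lambda u: (u - 0.3)**2

# Traditional lifting: sample rho at the labels and interpolate linearly
# between them; a piecewise linear function attains its minimum at a
# vertex, i.e. at a label (label bias).
def pw_linear_min(labels, rho):
    vals = rho(labels)
    return labels[np.argmin(vals)]

# Sub-label idea (sketch): on each label interval keep the convex piece
# itself, so the minimizer can lie strictly between labels.
def pw_convex_min(labels, rho):
    best_u, best_v = None, np.inf
    for a, b in zip(labels[:-1], labels[1:]):
        grid = np.linspace(a, b, 1001)   # dense grid stands in for a closed-form argmin
        v = rho(grid)
        if v.min() < best_v:
            best_v, best_u = v.min(), grid[np.argmin(v)]
    return best_u

print(pw_linear_min(labels, rho))  # 0.5 -> snaps to the nearest label
print(pw_convex_min(labels, rho))  # 0.3 -> sub-label accurate
```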
13.- Non-convex energy folded along basis vectors in higher dimensional space.
14.- The reformulated energy agrees with the original along convex combinations of consecutive basis vectors and is infinite elsewhere.
15.- The tightest convex envelope is computed analytically; it is the support function of a convex set C.
16.- C is characterized via the epigraphs of the convex conjugates of the pieces of the non-convex energy.
17.- The implementation requires projections onto these epigraphs, derived so far for piecewise linear and piecewise quadratic pieces.
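As a sketch of the kind of projection point 17 refers to, here is the simplest case: projecting a point onto the epigraph of an affine function, which is just a halfspace projection. (The quadratic case in the paper additionally requires solving a cubic equation; it is not shown here.)

```python
# Project (x0, t0) onto the epigraph {(x, t) : t >= a*x + b} of an affine
# function -- i.e. onto the halfspace a*x - t + b <= 0.
def project_epi_affine(x0, t0, a, b):
    viol = a * x0 + b - t0
    if viol <= 0.0:                  # already inside the epigraph
        return x0, t0
    # orthogonal projection onto the boundary hyperplane t = a*x + b,
    # whose (unnormalized) normal is (a, -1)
    scale = viol / (a * a + 1.0)
    return x0 - scale * a, t0 + scale

print(project_epi_affine(1.0, 0.0, 1.0, 0.0))  # (0.5, 0.5): foot of the perpendicular on t = x
```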
18.- Total variation regularizer also lifted to higher dimensional space, same as Pock et al. 2008.
19.- This yields a convex-concave saddle point problem, solved with a first-order primal-dual algorithm on the GPU.
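The solver family in point 19 can be illustrated on a small convex stand-in: first-order primal-dual (Chambolle-Pock) iterations for 1-D TV denoising, min_u 0.5*||u - f||^2 + lam*TV(u). This is a CPU sketch of the algorithm class, not the paper's lifted GPU solver.

```python
import numpy as np

def tv_denoise_pd(f, lam=0.5, tau=0.25, sigma=0.25, iters=3000):
    """Primal-dual iterations for min_u 0.5*||u-f||^2 + lam*TV(u) in 1-D."""
    u = f.copy()
    u_bar = u.copy()
    p = np.zeros(len(f) - 1)           # dual variable, one per finite difference
    for _ in range(iters):
        # dual ascent step, then projection onto the constraint {|p| <= lam}
        p = np.clip(p + sigma * np.diff(u_bar), -lam, lam)
        # primal descent step with the prox of the quadratic data term
        u_old = u
        div_p = np.zeros_like(u)       # div_p = D^T p (negative divergence)
        div_p[:-1] -= p
        div_p[1:] += p
        u = (u_old - tau * div_p + tau * f) / (1.0 + tau)
        u_bar = 2.0 * u - u_old        # over-relaxation
    return u

# a unit step: the TV solution shrinks the jump toward the mean on each side
f = np.array([0.0] * 5 + [1.0] * 5)
u = tv_denoise_pd(f, lam=0.5)
```

The step sizes satisfy tau*sigma*||D||^2 <= 0.25 < 1 for the forward-difference operator D, which is the standard convergence condition for this scheme.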
20.- The final solution is obtained by back-projecting from the higher dimensional space to the original range.
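A hedged sketch of the back-projection in point 20, assuming each pixel carries simplex weights over the labels (one common lifted representation; values below are made up): the scalar solution is recovered as the weighted label average.

```python
import numpy as np

gamma = np.array([0.0, 0.5, 1.0])      # labels
alpha = np.array([[1.0, 0.0, 0.0],     # pixel 1: exactly label 0.0
                  [0.0, 0.4, 0.6],     # pixel 2: between labels 0.5 and 1.0
                  [0.0, 0.0, 1.0]])    # pixel 3: exactly label 1.0

# back-projection u(x) = sum_i gamma_i * alpha_i(x)
u = alpha @ gamma
# u is [0.0, 0.8, 1.0]: pixel 2 takes a value strictly between the labels
```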
21.- Evaluated on a convex model, the sub-label accurate method finds the same solution as direct optimization with little overhead.
22.- Provides transition between direct optimization and functional lifting.
23.- Finds exact solution with 10 labels while traditional needs many more labels and memory.
24.- For stereo matching, the traditional approach shows visible label bias even with many labels.
25.- Proposed method gives smooth results with few labels and less runtime/memory.
26.- Clear improvement with equal label counts, reasonable results with just 2 labels.
27.- Concludes it's first spatially continuous sub-label accurate relaxation for certain non-convex energies.
28.- Uses far fewer labels than traditional lifting while improving runtime and memory.
29.- Generalizes from piecewise linear to piecewise convex approximations.
30.- Code is available online.
Knowledge Vault built by David Vivancos 2024