Knowledge Vault 5/61 - CVPR 2021
Unsupervised Learning Of Equivariant Space-Time Capsules
Max Welling

Concept Graph & Resume using Claude 3 Opus | ChatGPT-4o | Llama 3:

```mermaid
graph LR
  classDef equivariance fill:#f9d4d4, font-weight:bold, font-size:14px
  classDef symmetry fill:#d4f9d4, font-weight:bold, font-size:14px
  classDef advantages fill:#d4d4f9, font-weight:bold, font-size:14px
  classDef practical fill:#f9f9d4, font-weight:bold, font-size:14px
  classDef applications fill:#f9d4f9, font-weight:bold, font-size:14px
  classDef brain fill:#d4f9f9, font-weight:bold, font-size:14px
  classDef quantum fill:#f9d4d4, font-weight:bold, font-size:14px
  A[Unsupervised Learning Of Equivariant Space-Time Capsules] --> B[Embed symmetries, transform predictions 1]
  A --> C[Physics symmetries crucial 2]
  A --> D[Efficiency, sharing, disentangling, depth 3]
  A --> E[Group representations, generators, basis 4]
  A --> F[Convolutions on spheres, graphs 5]
  F --> G[Handle neighborhoods, rotation invariant 6]
  A --> H[Assemble molecules, map numbers 7]
  A --> I[Disentangle pose in capsules 8]
  A --> J[Remove dependencies, topographic maps 9]
  J --> K[Brain maps body, vision smoothly 10]
  J --> L[Wavelets decorrelate, leave structure 11]
  A --> M[Infer latents, amortized posteriors 12]
  M --> N[Decoder physics, causal models 13]
  M --> O[Sparse variables, correlated capsules 14]
  A --> P[Slow change, sensory noise 15]
  P --> Q[Roll activations in capsules 16]
  A --> R[Redundancy, disentangling, topography, equivariance 17]
  R --> S[Presence norms, pose patterns 18]
  A --> T[Continuous signals, irregular sampling 19]
  T --> U[Gaussian processes, uncertainty 20]
  T --> V[PDEs, sampling grid invariance 21]
  A --> W[Oscillators, measure states 22]
  W --> X[Quantum neural networks 23]
  W --> Y[Particle excitations in networks 24]
  W --> Z[Unsupervised capsules resemble oscillators 25]
  A --> AA[Leverage dynamics, temporal smoothness 26]
  A --> AB[Equivariance structures networks 27]
  A --> AC[Topographic maps inspire architectures 28]
  A --> AD[Quantum impact via fields 29]
  A --> AE[Learn quantum information theory 30]
  class A,B,R,S,AB equivariance
  class C symmetry
  class D,AA advantages
  class E,F,G,H,T,U,V practical
  class I,J,K,L,M,N,O,P,Q applications
  class AC,Z brain
  class W,X,Y,AD,AE quantum
```

Resume:

1.- Equivariance: Neural networks can embed symmetries so that their predictions transform in lockstep with transformations of the input, an idea inspired by physics.
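
A minimal sketch of the property in code (my own toy, not from the talk), using a 1-D circular convolution as the "network": shifting the input and then applying the layer matches applying the layer and then shifting its output.

```python
import numpy as np

def circ_conv(x, w):
    """Circular convolution via FFT; translation-equivariant by construction."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(w, n=len(x))))

rng = np.random.default_rng(0)
x = rng.standard_normal(16)          # input signal
w = rng.standard_normal(5)           # filter

lhs = circ_conv(np.roll(x, 3), w)    # shift input, then convolve
rhs = np.roll(circ_conv(x, w), 3)    # convolve, then shift output
assert np.allclose(lhs, rhs)         # f(T x) == T f(x): equivariance
```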

2.- Symmetry in physics: Symmetries play a major role in physics theories like electromagnetism, general relativity, and elementary particles.

3.- Equivariance advantages: Equivariance enables data efficiency, parameter sharing, disentangling latent representations, and building deep networks.

4.- Practical equivariant basis: Specifying group representations and generators allows computing an equivariant basis, simplifying implementation.
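
A hedged numerical sketch of this recipe (in the spirit of Finzi, Welling & Wilson's equivariant MLP work, though this simplification is mine): equivariant linear maps W satisfy rho_out(g) W = W rho_in(g) on the group generators, so a basis falls out of the null space of a linearized constraint. The toy group below is C4 acting by cyclic permutation.

```python
import numpy as np

P = np.roll(np.eye(4), 1, axis=0)     # generator of C4 as a permutation matrix
rho_in, rho_out = P, P                # same representation on input and output

# Constraint rho_out @ W - W @ rho_in = 0, linearized over the row-major
# flattening vec(W): (rho_out ⊗ I - I ⊗ rho_in^T) vec(W) = 0.
C = np.kron(rho_out, np.eye(4)) - np.kron(np.eye(4), rho_in.T)
_, s, Vt = np.linalg.svd(C)
basis = Vt[s < 1e-10]                 # null-space vectors span the equivariant maps
print(basis.shape[0])                 # 4: exactly the circulant 4x4 matrices
```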

5.- Equivariance on manifolds: Equivariance helps define convolutions on general manifolds like spheres and graphs.

6.- Graph neural networks: Equivariant graph neural networks handle arbitrary neighborhood sizes and are rotationally equivariant.
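
A toy sketch of the idea (loosely modeled on the E(n)-equivariant GNN of Satorras, Hoogeboom & Welling; the message function here is a made-up placeholder): messages depend only on invariant distances, and coordinates update along difference vectors, so rotating the input rotates the output.

```python
import numpy as np

def egnn_layer(h, x):
    """h: (n, d) invariant features; x: (n, 3) coordinates."""
    diff = x[:, None, :] - x[None, :, :]            # (n, n, 3) relative positions
    dist2 = (diff ** 2).sum(-1, keepdims=True)      # rotation-invariant distances
    m = np.tanh(dist2 + h.sum(-1)[None, :, None])   # placeholder invariant message
    x_new = x + (diff * m).sum(axis=1) / len(x)     # equivariant coordinate update
    h_new = h + m.sum(axis=1)                       # invariant feature update
    return h_new, x_new

rng = np.random.default_rng(1)
h, x = rng.standard_normal((5, 1)), rng.standard_normal((5, 3))
R = np.linalg.qr(rng.standard_normal((3, 3)))[0]    # random orthogonal matrix
h_rot, x_rot = egnn_layer(h, x @ R.T)
h_out, x_out = egnn_layer(h, x)
assert np.allclose(x_rot, x_out @ R.T) and np.allclose(h_rot, h_out)
```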

7.- Equivariant flows: Equivariant flows assemble molecules by mapping random numbers to atom types, interactions, and positions.
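
A toy illustration of one equivariant invertible layer (my own construction, far simpler than the molecular flows discussed): scaling each point by a function of its norm is invertible and commutes with rotations, and stacks of such layers can shape random coordinates into structured ones.

```python
import numpy as np

def radial_flow(x, alpha=0.5):
    """x: (n, 3) coordinates. Invertible for alpha > 0; commutes with rotations."""
    r = np.linalg.norm(x, axis=1, keepdims=True)
    return x * (1.0 + alpha / (1.0 + r))   # scale depends only on the invariant norm

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))            # the "random numbers" to be shaped
R = np.linalg.qr(rng.standard_normal((3, 3)))[0]
assert np.allclose(radial_flow(x @ R.T), radial_flow(x) @ R.T)   # equivariant
```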

8.- Disentangling: Equivariance relates to disentangling, structuring latent spaces into relatively independent capsules encoding pose.

9.- Topographic ICA: Topographic independent component analysis removes higher-order dependencies by organizing filters in topographic maps.

10.- Brain topographic maps: The brain maps the body and visual orientations smoothly onto cortical areas.

11.- Higher-order dependencies: Wavelet filters decorrelate natural images but leave structured higher-order dependencies in their activation energies (volatility).

12.- Variational autoencoders: VAEs learn generative models by inferring latent variables using amortized approximate posteriors.
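
A minimal PyTorch sketch of the standard VAE objective (generic textbook form, not the talk's specific model): an amortized encoder outputs the parameters of q(z|x), and the ELBO is reconstruction log-likelihood minus a KL term.

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, d_x=784, d_z=16):
        super().__init__()
        self.enc = nn.Linear(d_x, 2 * d_z)   # amortized posterior parameters
        self.dec = nn.Linear(d_z, d_x)       # generative decoder

    def elbo(self, x):
        mu, log_var = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()   # reparameterization
        recon = -((self.dec(z) - x) ** 2).sum(-1)   # Gaussian log-lik, up to a constant
        kl = 0.5 * (mu ** 2 + log_var.exp() - 1.0 - log_var).sum(-1)
        return (recon - kl).mean()

x = torch.randn(8, 784)
loss = -VAE().elbo(x)    # maximizing the ELBO trains encoder and decoder jointly
```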

13.- Decoder physics: The VAE decoder can incorporate physical/causal world models, even simulators.

14.- Topographic VAE: Generates sparse latent variables by dividing Gaussian activations by locally pooled Gaussian energies, which correlates variables within capsules.
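
A numpy sketch of the construction as I understand it from Keller & Welling's Topographic VAE (the ring-shaped pooling matrix is my illustrative assumption): a heavy-tailed, sparse variable arises as a Gaussian divided by the square root of locally pooled Gaussian energies, so units in the same neighborhood share volatility.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
z, u = rng.standard_normal(n), rng.standard_normal(n)   # two Gaussian codes

# Fixed topographic pooling matrix W: each unit pools the energies of its
# neighbors on a 1-D ring (the capsule neighborhood).
W = np.zeros((n, n))
for i in range(n):
    for k in range(-2, 3):
        W[i, (i + k) % n] = 1.0

sigma = np.sqrt(W @ u ** 2)   # locally shared scale ("energy")
t = z / sigma                 # heavy-tailed, correlated within neighborhoods
```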

15.- Temporal coherence: Higher-order world concepts change slowly over time despite sensory noise.

16.- Sequence model: Energies are correlated across time by "rolling" activations forward within capsules from one time step to the next.
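
A hedged toy of the "roll" (capsule size, window length, and exact pooling are my assumptions): the pooled energy at time t combines u-activations from neighboring time steps, each cyclically shifted within its capsule by the temporal offset.

```python
import numpy as np

def rolled_energy(us, cap_size=8):
    """us: u-arrays for times t-L..t+L; offset k rolls k steps inside each capsule."""
    n = us[0].shape[0]
    total = np.zeros(n)
    offsets = range(-(len(us) // 2), len(us) // 2 + 1)
    for u, k in zip(us, offsets):
        caps = u.reshape(-1, cap_size)                 # roll within capsules only
        total += np.roll(caps, k, axis=1).reshape(n) ** 2
    return np.sqrt(total)

rng = np.random.default_rng(0)
us = [rng.standard_normal(16) for _ in range(3)]       # u at times t-1, t, t+1
sigma_t = rolled_energy(us)                            # shared scale for time t
```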

17.- Statistical and equivariant concepts: Redundancy reduction, disentangling, topography, and equivariance are related ways to structure representations.

18.- Presence and pose: Object presence is encoded in capsule norms, pose in activation patterns.
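
A small sketch of this reading (assuming a capsule is a contiguous block of latent units): presence is the block's norm; pose is the normalized activation pattern within the block.

```python
import numpy as np

def read_capsules(t, cap_size=8):
    caps = t.reshape(-1, cap_size)               # one row per capsule
    presence = np.linalg.norm(caps, axis=1)      # "is the object there?"
    pose = caps / (presence[:, None] + 1e-8)     # unit-norm pattern: "in what pose?"
    return presence, pose

t = np.random.default_rng(0).standard_normal(32)
presence, pose = read_capsules(t)                # 4 capsules of 8 units each
```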

19.- Continuous image signals: Images can be viewed as continuous signals irregularly sampled by pixels or superpixels.

20.- Gaussian process images: Gaussian processes turn discrete samples into continuous signals with uncertainty.
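
A compact 1-D sketch of this view (standard GP regression with an RBF kernel; kernel and noise level are illustrative choices): the posterior mean and variance give a continuous signal with uncertainty at any query coordinate.

```python
import numpy as np

def rbf(a, b, ell=0.1):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(0)
xs = np.sort(rng.uniform(0, 1, 20))          # irregular pixel locations
ys = np.sin(2 * np.pi * xs)                  # observed intensities
xq = np.linspace(0, 1, 100)                  # continuous query coordinates

K = rbf(xs, xs) + 1e-4 * np.eye(len(xs))     # kernel matrix with pixel noise
Kq = rbf(xq, xs)
mean = Kq @ np.linalg.solve(K, ys)                           # continuous signal
var = 1.0 - np.sum(Kq * np.linalg.solve(K, Kq.T).T, axis=1)  # its uncertainty
```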

21.- PDEs and convolutions: Partial differential equations implement convolutions in the continuum limit, making the model invariant to the choice of sampling grid.
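
A small sketch of the continuum view (my toy, using the heat equation u_t = u_xx): one explicit Euler step is exactly a small fixed convolution, while the underlying PDE stays the same as the sampling grid is refined.

```python
import numpy as np

def heat_step(u, dx):
    """One explicit Euler step of u_t = u_xx on a periodic grid."""
    dt = 0.25 * dx ** 2                                       # CFL-stable step
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
    return u + dt * lap           # equals convolution with [0.25, 0.5, 0.25]

# The same continuum operator, discretized on grids of different resolution:
for n in (64, 256):
    x = np.linspace(0, 1, n, endpoint=False)
    u = heat_step(np.sin(2 * np.pi * x), dx=1.0 / n)
```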

22.- Quantum field theory: A quantum field behaves like an oscillator at each point in space; measuring it samples entire field states.

23.- Quantum neural networks: Neural networks can be formulated as quantum field theories implementable on quantum computers.

24.- Hinton particles: Quantum field formulation reveals "particle" excitations in neural networks.

25.- Equivariant capsules as oscillators: Learned unsupervised equivariant capsules resemble coupled oscillators.

26.- Dynamics and time: Neural networks should leverage dynamics and temporal smoothness priors reflecting the world.

27.- Mathematical structures: Deep mathematical theories like equivariance help structure neural networks.

28.- Brain inspiration: Neuroscience ideas like topographic maps inspire neural network architectures.

29.- Quantum computing potential: Quantum computers may significantly impact computer vision via quantum field formulations.

30.- Quantum information theory: Computer vision researchers can benefit from learning quantum information theory.

Knowledge Vault built by David Vivancos 2024