Concept Graph & Resume using Claude 3 Opus | Chat GPT4o | Llama 3:
Resume:
1.- Equivariance: Neural networks can embed symmetries so that when an input is transformed, the prediction transforms correspondingly, an idea inspired by physics (see the code sketches after this list).
2.- Symmetry in physics: Symmetries play a central role in physical theories such as electromagnetism, general relativity, and elementary particle physics.
3.- Equivariance advantages: Equivariance enables data efficiency, parameter sharing, disentangled latent representations, and deep networks built from composable equivariant layers.
4.- Practical equivariant basis: Specifying the group representations and their generators makes it possible to compute a basis of equivariant linear maps numerically, simplifying implementation (sketch below).
5.- Equivariance on manifolds: Equivariance helps define convolutions on general manifolds like spheres and graphs.
6.- Graph neural networks: Equivariant graph neural networks handle arbitrary neighborhood sizes and are rotationally equivariant (sketch below).
7.- Equivariant flows: Equivariant flows assemble molecules by mapping random numbers to atom types, interactions, and positions.
8.- Disentangling: Equivariance relates to disentangling, structuring latent spaces into relatively independent capsules encoding pose.
9.- Topographic ICA: Topographic independent component analysis captures the higher-order dependencies that ICA leaves behind by arranging dependent filters near each other on topographic maps (sketch below).
10.- Brain topographic maps: The brain maps the body and visual orientations smoothly onto cortical areas.
11.- Higher-order dependencies: Wavelets decorrelate but leave structured dependencies in activation energy/volatility.
12.- Variational autoencoders: VAEs learn generative models by inferring latent variables with an amortized approximate posterior (sketch below).
13.- Decoder physics: The VAE decoder can incorporate physical/causal world models, even simulators.
14.- Topographic VAE: Sparse latent variables are generated by dividing Gaussian activations by energies pooled within capsules, so units are correlated inside each capsule (sketch below).
15.- Temporal coherence: Higher-order world concepts change slowly over time despite sensory noise.
16.- Sequence model: Energies are correlated across time by "rolling" activations forward within capsules (sketch below).
17.- Statistical and equivariant concepts: Redundancy reduction, disentangling, topography, and equivariance are related ways to structure representations.
18.- Presence and pose: Object presence is encoded in capsule norms, pose in the pattern of activations (sketch below).
19.- Continuous image signals: Images can be viewed as continuous signals irregularly sampled by pixels or superpixels.
20.- Gaussian process images: Gaussian processes turn discrete pixel samples into continuous signals with uncertainty estimates (sketch below).
21.- PDEs and convolutions: Partial differential equations implement continuum-limit convolutions that are invariant to the sampling grid (sketch below).
22.- Quantum field theory: Quantum fields behave like oscillators at every point in space; a measurement samples an entire field state.
23.- Quantum neural networks: Neural networks can be formulated as quantum field theories implementable on quantum computers.
24.- Hinton particles: Quantum field formulation reveals "particle" excitations in neural networks.
25.- Equivariant capsules as oscillators: Learned unsupervised equivariant capsules resemble coupled oscillators.
26.- Dynamics and time: Neural networks should leverage dynamics and temporal smoothness priors reflecting the world.
27.- Mathematical structures: Deep mathematical theories like equivariance help structure neural networks.
28.- Brain inspiration: Neuroscience ideas like topographic maps inspire neural network architectures.
29.- Quantum computing potential: Quantum computers may significantly impact computer vision via quantum field formulations.
30.- Quantum information theory: Computer vision researchers can benefit from learning quantum information theory.
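Code sketches:
The NumPy sketches below illustrate several items above. All are minimal illustrations under stated assumptions, written for this summary; none is the speaker's implementation.

For item 1, a circular 1-D convolution is translation-equivariant: shifting the input and then convolving gives the same result as convolving and then shifting. The signal, kernel, and shift amount are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=32)          # input signal
k = rng.normal(size=5)           # convolution kernel

def conv(signal, kernel):
    # circular convolution, so the shift group acts exactly on the domain
    idx = np.arange(len(kernel))
    return np.array([kernel @ np.take(signal, i + idx, mode='wrap')
                     for i in range(len(signal))])

shift = lambda s, t: np.roll(s, t)

lhs = conv(shift(x, 3), k)       # transform the input, then apply the layer
rhs = shift(conv(x, k), 3)       # apply the layer, then transform the output
assert np.allclose(lhs, rhs)     # f(g.x) == g.f(x): the layer is equivariant
```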
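For item 4: a linear map W is equivariant iff A_out W - W A_in = 0 for every Lie-algebra generator A, and vectorizing turns this into a null-space problem. The so(2) generator acting in the standard representation on both input and output is an assumed toy setup.

```python
import numpy as np

A = np.array([[0., -1.], [1., 0.]])     # so(2) rotation generator
A_in, A_out = A, A                       # same representation on input and output

# constraint A_out @ W - W @ A_in = 0, vectorized with column-stacking vec:
# (I kron A_out - A_in^T kron I) vec(W) = 0
C = np.kron(np.eye(2), A_out) - np.kron(A_in.T, np.eye(2))

# null space of C via SVD: rows of Vh whose singular values vanish
_, s, Vh = np.linalg.svd(C)
basis = Vh[s < 1e-10]                    # two vectors here, spanning I and the rotation J
for v in basis:
    print(np.round(v.reshape(2, 2, order='F'), 3))   # un-vectorize each basis map
```

Every rotation-equivariant linear map on the plane is then a combination of these basis maps.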
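For item 6, one message-passing step in the spirit of EGNN (Satorras et al.): messages depend only on invariants, and coordinates move along relative position vectors, which makes the update rotation- and translation-equivariant for any neighborhood size. The tiny linear "MLPs" and all sizes are stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, f = 5, 3, 4                       # nodes, spatial dims, feature dims
x = rng.normal(size=(n, d))             # coordinates (equivariant part)
h = rng.normal(size=(n, f))             # node features (invariant part)

W_e = rng.normal(size=(2 * f + 1, f))   # edge "MLP" (assumption: one linear layer)
W_x = rng.normal(size=(f, 1))           # coordinate-weight "MLP"
W_h = rng.normal(size=(2 * f, f))       # node-update "MLP"

def egnn_step(x, h):
    diff = x[:, None, :] - x[None, :, :]               # relative vectors x_i - x_j
    dist2 = (diff ** 2).sum(-1, keepdims=True)         # invariant squared distances
    hi = np.broadcast_to(h[:, None, :], (n, n, f))
    hj = np.broadcast_to(h[None, :, :], (n, n, f))
    m = np.tanh(np.concatenate([hi, hj, dist2], -1) @ W_e)   # messages m_ij
    x_new = x + (diff * (m @ W_x)).mean(axis=1)        # move along relative vectors
    h_new = np.tanh(np.concatenate([h, m.sum(axis=1)], -1) @ W_h)
    return x_new, h_new

x2, h2 = egnn_step(x, h)
R = np.linalg.qr(rng.normal(size=(d, d)))[0]           # random rotation/reflection
xr, hr = egnn_step(x @ R, h)
assert np.allclose(xr, x2 @ R) and np.allclose(hr, h2) # coordinates rotate, features don't
```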
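Items 9 and 14 rest on the same mechanism: dividing Gaussian activations z by a volatility pooled over a topographic neighborhood of a second Gaussian set u gives heavy-tailed variables whose energies correlate only for nearby units. The ring topography, neighborhood width, and sample count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
B, n, W = 10000, 64, 5                   # samples, ring size, neighborhood width
z = rng.normal(size=(B, n))
u = rng.normal(size=(B, n))

# volatility pooled over a topographic neighborhood on the ring
sigma2 = sum(np.roll(u ** 2, k, axis=1) for k in range(-(W // 2), W // 2 + 1))
t = z / np.sqrt(sigma2)                  # sparse, heavy-tailed latents

e = t ** 2                               # energies
near = np.corrcoef(e[:, 0], e[:, 1])[0, 1]    # neighbors share volatility
far = np.corrcoef(e[:, 0], e[:, 32])[0, 1]    # opposite side of the ring
print(near, far)                               # near >> far ~ 0
```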
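For item 12, the standard single-sample ELBO with a Gaussian amortized posterior and a Bernoulli decoder; the toy linear encoder/decoder weights are placeholders, not a real architecture.

```python
import numpy as np

rng = np.random.default_rng(3)
d_x, d_z = 8, 2
x = (rng.uniform(size=d_x) < 0.5).astype(float)        # toy binary datum
W_mu = rng.normal(size=(d_x, d_z))                     # encoder mean weights
W_ls = rng.normal(size=(d_x, d_z)) * 0.1               # encoder log-std weights
W_dec = rng.normal(size=(d_z, d_x))                    # decoder weights

mu, log_sig = x @ W_mu, x @ W_ls                       # amortized posterior q(z|x)
z = mu + np.exp(log_sig) * rng.normal(size=d_z)        # reparameterization trick
logits = z @ W_dec                                     # decoder p(x|z)
rec = (x * logits - np.logaddexp(0, logits)).sum()     # Bernoulli log p(x|z)
kl = 0.5 * (mu**2 + np.exp(2 * log_sig) - 1 - 2 * log_sig).sum()  # KL(q || N(0, I))
print('ELBO =', rec - kl)                              # maximized during training
```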
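For item 16, the "roll": before energies are pooled across a time window, activations at time offset k are shifted k positions within their capsule, so the pooled volatility tracks a transformation unfolding over time. Window and capsule sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
T, cap = 3, 8                                  # half-window and capsule size
u = rng.normal(size=(2 * T + 1, cap))          # u_{t-T} ... u_{t+T}, one capsule

# pooled energy: u from time offset k is rolled by k within the capsule
sigma2 = sum(np.roll(u[T + k] ** 2, k) for k in range(-T, T + 1))
print(np.sqrt(sigma2))                         # shared volatility at time t
```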
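Item 18 in two lines: stacking each capsule's activations into a vector, the norm reads out presence and the normalized direction reads out pose.

```python
import numpy as np

caps = np.random.default_rng(5).normal(size=(10, 16))   # 10 capsules, 16-dim each
presence = np.linalg.norm(caps, axis=1)                 # "is the object there?"
pose = caps / presence[:, None]                         # unit pattern: "how is it posed?"
```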
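For item 20, Gaussian-process regression turns irregularly placed pixel samples into a continuous signal with pointwise uncertainty. The RBF kernel, its length scale, and the noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
xs = np.sort(rng.uniform(0, 1, size=12))                  # irregular sample locations
ys = np.sin(2 * np.pi * xs) + 0.05 * rng.normal(size=12)  # noisy pixel intensities

def rbf(a, b, ell=0.15):                                  # length scale: assumption
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

K = rbf(xs, xs) + 0.05 ** 2 * np.eye(len(xs))             # noise level: assumption
xq = np.linspace(0, 1, 200)                               # query the continuum
Kq = rbf(xq, xs)
mean = Kq @ np.linalg.solve(K, ys)                        # posterior mean signal
var = 1.0 - np.einsum('ij,ji->i', Kq, np.linalg.solve(K, Kq.T))  # pointwise uncertainty
```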
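For item 21 in miniature: one explicit Euler step of the heat equation u_t = u_xx is a convolution with the stencil [1, -2, 1]; the PDE is the grid-independent, continuum-limit description of that convolution.

```python
import numpy as np

u = np.random.default_rng(7).normal(size=64)    # signal on a periodic grid
dt, dx = 0.1, 1.0                               # step sizes (dt/dx**2 <= 0.5 for stability)
lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)    # discrete Laplacian stencil
u_next = u + dt * lap / dx**2                   # one diffusion (smoothing) step
```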
Knowledge Vault built by David Vivancos 2024