Concept Graph & Resume using Claude 3 Opus | Chat GPT4 | Gemini Adv | Llama 3:
Resume:
1.-Convolutional networks are successful for images and sounds due to grid structure, local statistics, and parameter efficiency.
2.-Limitations exist for non-grid data like 3D meshes, spectrograms, social networks, and across channels in standard architectures.
3.-Goal is to learn layers where parameter count is independent of input size by treating signals as functions on graphs.
4.-Similarity between features can come from sensing process (e.g. 3D mesh distances) or be estimated from data statistics.
5.-Locally connected networks learn neighborhoods to capture local correlation, then reduce graph resolution and repeat, but parameter count still scales with input size.
6.-Convolution on graphs can be defined through Laplacian eigenvectors that generalize the Fourier basis.
7.-Convolution is defined as any linear operator that commutes with the Laplacian, i.e. diagonal in the Laplacian eigenbasis.
8.-Learning filter coefficients directly in this spectral domain still requires parameter count proportional to input size.
9.-Analogy between spatial localization of signals and smoothness of their Fourier transforms suggests learning smooth spectral filters.
10.-Smooth spectral filters with finite spatial support require parameters proportional only to filter size, achieving constant parameter count.
11.-Challenge is relating frequencies to define spectral smoothness; even a simple 1D ordering of frequencies works, but the optimal frequency-similarity measure is an open problem.
12.-Preliminary results on subsampled MNIST and MNIST projected on 3D sphere validate the approach.
13.-Spectral CNN reduces parameters by 1-2 orders of magnitude vs fully connected nets without sacrificing performance.
14.-Learned spectral feature maps are complementary and concentrate energy in different input regions.
15.-First step in exploiting input geometry to learn with parameter count independent of input size.
16.-Computing the graph Fourier transform is still expensive compared to fast methods like the FFT.
17.-Need to handle highly irregular graphs beyond simple examples shown.
18.-Open question on how to optimally arrange frequencies in dual domain for effective spatial localization.
19.-Symmetries and structure in original data should inform parameter allocation in neural networks.
20.-Much more work needed on optimally allocating network parameters by exploiting input structure and symmetries.
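Point 5 above can be sketched in code. This is a minimal illustrative layer, not the talk's implementation: each output unit mixes only the inputs in its graph neighborhood, so weights follow the graph rather than a grid, but the total weight count still grows with graph size. The neighborhood lists and weight shapes are assumptions for the example.

```python
import numpy as np

def locally_connected(signal, neighborhoods, weights):
    """One locally connected layer: out[i] = w[i] . x[neighborhood[i]]."""
    return np.array([w @ signal[nbr]
                     for nbr, w in zip(neighborhoods, weights)])

# A path graph on 5 nodes; each node's neighborhood is itself plus
# its adjacent nodes (illustrative construction).
neighborhoods = [[0, 1], [0, 1, 2], [1, 2, 3], [2, 3, 4], [3, 4]]
rng = np.random.default_rng(1)
weights = [rng.standard_normal(len(nbr)) for nbr in neighborhoods]

x = np.ones(5)
y = locally_connected(x, neighborhoods, weights)
# One output per node, but sum(len(nbr)) weights overall: the
# parameter count grows with the graph, which is the limitation
# the spectral construction is meant to remove.
assert y.shape == (5,)
```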
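Points 6-8 can be made concrete with a small NumPy sketch, assuming an undirected graph given by its adjacency matrix: the Laplacian's eigenvectors play the role of the Fourier basis, and a "convolution" is any operator diagonal in that basis, which therefore commutes with the Laplacian. The filter choice here is illustrative.

```python
import numpy as np

def graph_laplacian(adj):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    return np.diag(adj.sum(axis=1)) - adj

# A 4-node cycle graph (example graph, not from the talk).
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)

L = graph_laplacian(adj)
# Eigenvectors of L generalize the Fourier basis; eigenvalues act as
# (squared) frequencies.
eigvals, U = np.linalg.eigh(L)

def spectral_conv(signal, filter_coeffs):
    """Filter a graph signal: x -> U diag(g) U^T x."""
    return U @ (filter_coeffs * (U.T @ signal))

x = np.array([1.0, 0.0, 0.0, 0.0])
g = np.exp(-eigvals)          # a smooth low-pass spectral filter
y = spectral_conv(x, g)

# Point 7: the operator is diagonal in the eigenbasis, so it
# commutes with the Laplacian.
H = U @ np.diag(g) @ U.T
assert np.allclose(H @ L, L @ H)
```

Point 8's limitation is visible here: learning `g` directly means one parameter per eigenvalue, i.e. parameter count proportional to the number of nodes.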
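The smooth-spectral-filter idea of points 9-10 can be sketched as follows: instead of learning one coefficient per eigenvalue, learn a small fixed number of control values and interpolate them smoothly across the frequency axis. Linear interpolation and the specific knot placement are assumptions for this sketch, not the talk's exact scheme.

```python
import numpy as np

def smooth_spectral_filter(eigvals, control_values):
    """Expand k learned control values into n spectral coefficients by
    interpolating smoothly over the eigenvalue (frequency) range."""
    knots = np.linspace(eigvals.min(), eigvals.max(), len(control_values))
    return np.interp(eigvals, knots, control_values)

n = 100                       # graph size: free to grow
k = 5                         # learned parameter count: stays fixed
eigvals = np.sort(np.random.default_rng(0).uniform(0.0, 2.0, size=n))
theta = np.array([1.0, 0.8, 0.4, 0.1, 0.0])   # the k parameters

g = smooth_spectral_filter(eigvals, theta)
# n spectral coefficients from only k parameters: smoothness in the
# spectral domain corresponds to spatial localization of the filter.
assert g.shape == (n,)
```

Because `k` is fixed while `n` grows, the layer's parameter count is independent of input size, which is the constant-parameter property claimed in point 10.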
Knowledge Vault built by David Vivancos 2024