Knowledge Vault 6/34 - ICML 2018
Building Machines that Learn and Think Like People
Josh Tenenbaum
< Resume Image >

Concept Graph & Resume using Claude 3.5 Sonnet | ChatGPT-4o | Llama 3:

graph LR
  classDef context fill:#f9d4d4, font-weight:bold, font-size:14px
  classDef human_cognition fill:#d4f9d4, font-weight:bold, font-size:14px
  classDef ai_approaches fill:#d4d4f9, font-weight:bold, font-size:14px
  classDef challenges fill:#f9f9d4, font-weight:bold, font-size:14px
  classDef future fill:#f9d4f9, font-weight:bold, font-size:14px
  Main[Building Machines that Learn and Think Like People]
  Main --> A[Building human-like AI: key research challenge 1]
  A --> B[Current AI excels at pattern recognition 2]
  A --> C[Intelligence involves modeling the world 3]
  A --> D[Center aims to reverse engineer intelligence 4]
  Main --> E[Human cognition insights]
  E --> F[Babies born with intuitive capabilities 5]
  E --> G[Intuitive physics predicts object behavior 7]
  E --> H[Intuitive psychology understands others actions 8]
  E --> I[One-shot learning from few examples 12]
  E --> J[Infant cognitive development proceeds in stages 20]
  J --> K[Language acquisition enables advanced reasoning 25]
  Main --> L[AI approaches]
  L --> M[Probabilistic programs model intuitive capabilities 6]
  L --> N[Neural networks useful for pattern recognition 9]
  L --> O[Amortized inference speeds up probabilistic models 10]
  L --> P[Inverse graphics reconstructs 3D from 2D 11]
  L --> Q[Program induction captures abstract knowledge 13]
  L --> R[Differentiable programming enables gradient-based optimization 14]
  L --> S[Neural physics engines combine symbolic and learned 15]
  L --> T[Program synthesis generates task-solving code 16]
  T --> U[DreamCoder: wake-sleep algorithm for code learning 17]
  L --> V[Neuro-symbolic models integrate networks and symbols 23]
  Main --> W[Challenges and considerations]
  W --> X[Inductive bias shapes learning and generalization 18]
  W --> Y[Evolution vs gradient descent in learning 19]
  W --> Z[Game engines model physics and behaviors 21]
  W --> AA[Probabilistic programming languages combine multiple approaches 22]
  W --> AB[Causal reasoning key for human intelligence 26]
  W --> AC[Abstract reasoning and compositionality important capabilities 27]
  W --> AD[Cognitive resource limitations explain developmental patterns 28]
  W --> AE[Balancing innate and learned capabilities 30]
  Main --> AF[Future directions]
  AF --> AG[Evolutionary algorithms may capture human learning 29]
  class A,B,C,D context
  class E,F,G,H,I,J,K human_cognition
  class L,M,N,O,P,Q,R,S,T,U,V ai_approaches
  class W,X,Y,Z,AA,AB,AC,AD,AE challenges
  class AF,AG future

Resume:

1.- Building machines that learn and think like people is a key challenge in AI research.

2.- Current AI technologies excel at pattern recognition but lack general-purpose common sense intelligence.

3.- Intelligence involves modeling the world, not just finding patterns in data.

4.- The Center for Brains, Minds and Machines aims to reverse engineer human intelligence for AI advances.

5.- Babies are born with intuitive physics and psychology capabilities that develop over time.

6.- Probabilistic programs and game engines can model intuitive physics and psychology in AI systems.
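A minimal sketch of point 6 in plain Python: a toy block-tower simulator stands in for the game engine, and adding perceptual noise plus Monte Carlo sampling turns it into a probabilistic program. The function names, the one-rule physics, and all numbers are illustrative assumptions, not the models from the talk.

```python
import random

def tower_falls(offsets, half_width=1.0):
    """Deterministic physics rule: a stack of unit-width blocks topples if,
    at any level, the center of mass of the blocks above sticks out past
    the edge of the supporting block."""
    for i in range(len(offsets) - 1):
        above = offsets[i + 1:]
        com = sum(above) / len(above)
        if abs(com - offsets[i]) > half_width:
            return True
    return False

def p_falls(offsets, noise=0.2, samples=2000, seed=0):
    """Probabilistic program: sample noisy percepts of the block positions,
    run the simulator on each sample, and average the outcomes."""
    rng = random.Random(seed)
    falls = 0
    for _ in range(samples):
        noisy = [x + rng.gauss(0, noise) for x in offsets]
        falls += tower_falls(noisy)
    return falls / samples
```

A well-aligned tower like `[0.0, 0.0, 0.0]` gets a low fall probability, a staggered one like `[0.0, 0.9, 1.8]` a high one, which is the graded "will it fall?" judgment the intuitive-physics work models.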

7.- Intuitive physics allows humans to predict object behavior and interactions in the physical world.

8.- Intuitive psychology enables understanding others' actions, beliefs, and goals.
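Point 8 is often formalized as inverse planning: assume the agent acts noisily-rationally toward a goal, then invert that model with Bayes' rule to infer the goal from observed actions. The sketch below, with a 1-D world and a softmax action model, is a hedged toy version; `beta` and the two-goal setup are assumptions for illustration.

```python
import math

def action_likelihood(action, state, goal, beta=2.0):
    """P(action | goal): a noisily-rational agent prefers actions that
    reduce distance to its goal (softmax with rationality beta)."""
    actions = [-1, +1]
    scores = [math.exp(-beta * abs((state + a) - goal)) for a in actions]
    return scores[actions.index(action)] / sum(scores)

def goal_posterior(observed_actions, goals, state=0, prior=None):
    """Inverse planning: P(goal | actions) via Bayes' rule, replaying the
    observed action sequence and renormalizing after each step."""
    prior = prior or {g: 1.0 / len(goals) for g in goals}
    post = dict(prior)
    s = state
    for a in observed_actions:
        for g in goals:
            post[g] *= action_likelihood(a, s, g)
        s += a
        z = sum(post.values())
        post = {g: p / z for g, p in post.items()}
    return post
```

Watching an agent step right twice (`[+1, +1]`) makes the rightward goal far more probable than the leftward one, i.e. beliefs about goals are read off from behavior.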

9.- Neural networks and deep learning are useful for pattern recognition within probabilistic programming frameworks.

10.- Amortized inference uses neural networks to speed up probabilistic inference in generative models.
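A stripped-down illustration of point 10, assuming a one-dimensional Gaussian model so everything stays in the standard library: instead of solving inference separately for each observation, we fit a reusable "recognition model" on samples drawn from the generative model itself. Here a linear map trained by SGD plays the role of the neural network; for this model the exact posterior mean is E[z|x] = 0.8x, which the learned weight approaches.

```python
import random

def train_recognition(samples=50000, lr=0.005, seed=0):
    """Amortized inference, minimally: sample (z, x) pairs from the
    generative model z ~ N(0,1), x = z + N(0, 0.5^2), and fit a
    recognition model z_hat = w * x by SGD on squared error. Afterwards,
    posterior means come from one multiply instead of per-case inference."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(samples):
        z = rng.gauss(0, 1)
        x = z + rng.gauss(0, 0.5)
        w -= lr * 2 * (w * x - z) * x  # SGD step on (w*x - z)^2
    return w
```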

11.- Inverse graphics uses neural networks to reconstruct 3D object models from 2D images.
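Point 11 can be caricatured without a neural network as analysis-by-synthesis: a forward "graphics" program renders scene parameters to an image, and inversion means searching for parameters whose rendering matches the observation (the trained recognition network amortizes exactly this search). The 1-D renderer below is an assumed toy, not the 3D pipeline from the talk.

```python
def render(pos, width, n=16):
    """Toy graphics program: a 1-D 'image' with a bright bar of the
    given position and width."""
    return [1 if pos <= i < pos + width else 0 for i in range(n)]

def inverse_graphics(image):
    """Analysis-by-synthesis: exhaustively search scene parameters for
    the rendering with the fewest pixel mismatches to the observation."""
    best, best_err = None, float("inf")
    for pos in range(len(image)):
        for width in range(1, len(image) - pos + 1):
            err = sum(a != b for a, b in zip(render(pos, width, len(image)), image))
            if err < best_err:
                best, best_err = (pos, width), err
    return best
```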

12.- One-shot learning allows humans to learn new concepts from just one or few examples.
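One classic Bayesian account of point 12 (in the spirit of Tenenbaum's number game) is the "size principle": under strong sampling, P(example | h) = 1/|h|, so a single example already favors the smallest consistent hypothesis. The hypothesis space below is an assumed toy; a uniform prior is used for simplicity.

```python
def one_shot_posterior(example, hypotheses):
    """Bayesian one-shot concept learning with the size principle:
    likelihood 1/|h| if the example fits hypothesis h, else 0,
    then normalize (uniform prior assumed)."""
    post = {}
    for name, members in hypotheses.items():
        post[name] = (1.0 / len(members)) if example in members else 0.0
    z = sum(post.values())
    return {name: p / z for name, p in post.items()}

# Illustrative hypothesis space over the numbers 1..100.
hypotheses = {
    "even":            {n for n in range(1, 101) if n % 2 == 0},  # 50 members
    "powers_of_two":   {1, 2, 4, 8, 16, 32, 64},                  # 7 members
    "multiples_of_10": {n for n in range(10, 101, 10)},           # 10 members
}
```

Seeing just "16" pushes most of the posterior onto the small hypothesis "powers of two" even though "even numbers" also fits, matching how people generalize from one example.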

13.- Program induction aims to learn programs that capture abstract knowledge, like physics engines.
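In its simplest form, point 13 is search over a domain-specific language: enumerate candidate programs, shortest first, and keep one consistent with the input-output examples. The three-primitive DSL below is an assumption chosen to keep the sketch tiny.

```python
from itertools import product

# A tiny DSL: each primitive is a named int -> int function.
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "double": lambda x: x * 2,
    "square": lambda x: x * x,
}

def induce_program(examples, max_len=3):
    """Program induction by enumeration: search compositions of DSL
    primitives (shortest first) for one consistent with all examples."""
    for length in range(1, max_len + 1):
        for names in product(PRIMITIVES, repeat=length):
            def run(x, names=names):
                for n in names:
                    x = PRIMITIVES[n](x)
                return x
            if all(run(i) == o for i, o in examples):
                return list(names)
    return None
```

Given `[(2, 9), (3, 16)]` the search recovers `inc` then `square`, i.e. f(x) = (x+1)^2, a compact program rather than a fitted curve.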

14.- Differentiable programming attempts to make program learning more amenable to gradient-based optimization.
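The essence of point 14 is treating a program's continuous parameters as things gradient descent can tune. In this assumed toy, the "program" is a three-iteration loop, and a finite-difference gradient stands in for the automatic differentiation a real differentiable-programming system would provide.

```python
def program(a, x):
    """A small program with a continuous parameter a: a loop that
    applies x -> a*x three times (so it computes a**3 * x)."""
    for _ in range(3):
        x = a * x
    return x

def grad_fd(f, a, eps=1e-5):
    """Finite-difference gradient, standing in for autodiff."""
    return (f(a + eps) - f(a - eps)) / (2 * eps)

def fit(data, a=0.5, lr=0.001, steps=500):
    """Gradient descent on the program's parameter to match (x, y) data."""
    loss = lambda a: sum((program(a, x) - y) ** 2 for x, y in data)
    for _ in range(steps):
        a -= lr * grad_fd(loss, a)
    return a
```

Fitting data generated with a = 1.5 (e.g. `[(1, 3.375), (2, 6.75)]`) recovers the parameter by gradient steps alone, even though the loss flows through a loop rather than a closed-form expression.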

15.- Neural physics engines combine symbolic knowledge of objects with learned physical dynamics.

16.- Program synthesis explores how to automatically generate programs to solve tasks or model knowledge.

17.- DreamCoder is a wake-sleep algorithm for learning to write code and build domain-specific libraries.
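A heavily simplified caricature of point 17's wake-sleep loop, under assumed toy primitives: the wake phase solves tasks by enumerative search over the current library, and an abstraction sleep phase promotes the most common solution fragment to a new primitive, shortening future searches. Real DreamCoder also trains a neural recognition model to guide search, which is omitted here.

```python
from itertools import product
from collections import Counter

BASE_LIB = {"inc": lambda x: x + 1, "double": lambda x: x * 2}

def run(names, lib, x):
    for n in names:
        x = lib[n](x)
    return x

def wake(tasks, lib, max_len=3):
    """Wake phase: find a program (shortest first) solving each task."""
    sols = {}
    for task, examples in tasks.items():
        for length in range(1, max_len + 1):
            for names in product(lib, repeat=length):
                if all(run(names, lib, i) == o for i, o in examples):
                    sols[task] = names
                    break
            if task in sols:
                break
    return sols

def sleep_abstraction(sols, lib):
    """Abstraction sleep: promote the most common two-step fragment in
    the found solutions to a single new library primitive."""
    pairs = Counter(p for s in sols.values() for p in zip(s, s[1:]))
    if not pairs:
        return lib
    (a, b), _ = pairs.most_common(1)[0]
    new = dict(lib)
    new[f"{a}.{b}"] = lambda x, a=a, b=b: lib[b](lib[a](x))
    return new
```

After solving tasks like "1 -> 4" and "2 -> 6" (both `inc` then `double`), the library gains an `inc.double` primitive, so that concept costs one step instead of two next time.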

18.- Inductive bias shapes how humans and machines learn and generalize from limited data.

19.- Evolution may have shaped human learning mechanisms differently than gradient descent shapes neural networks.

20.- Infant cognitive development proceeds in stages, with some capabilities present from birth.

21.- Game engines provide useful abstractions for modeling intuitive physics and basic agent behaviors.

22.- Probabilistic programming languages combine probabilistic inference, neural networks, and symbolic reasoning.
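The core move a PPL makes (point 22) can be shown in a few lines: write the model as an ordinary program, then let a generic inference routine condition it on evidence. Rejection sampling below is the simplest such routine; the two-coin model and all names are assumptions for illustration.

```python
import random

def flip_model(rng):
    """A generative program: two independent fair coin flips."""
    return rng.random() < 0.5, rng.random() < 0.5

def infer(model, condition, query, samples=20000, seed=1):
    """Generic PPL-style inference by rejection sampling: run the program
    many times, keep executions satisfying the condition, and average
    the query over the survivors."""
    rng = random.Random(seed)
    kept = []
    for _ in range(samples):
        trace = model(rng)
        if condition(trace):
            kept.append(query(trace))
    return sum(kept) / len(kept)
```

Conditioning on "at least one head" and querying the first flip estimates P(first = heads | at least one head) = 2/3; note the model code never changed, only the evidence.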

23.- Neuro-symbolic hybrid models integrate neural networks with symbolic knowledge representations.

24.- Intuitive psychology develops from simple goal-directed reasoning to full theory of mind.

25.- Language acquisition around 18 months to 3 years old enables more advanced learning and reasoning.

26.- Causal reasoning is a key aspect of human intelligence not fully captured by current AI.
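Point 26's gap between correlation and causation can be made concrete with a tiny structural causal model, an assumed toy with a hidden confounder U influencing both X and Y: observing X = 1 also tells you about U, while intervening (do(X = 1)) cuts the U -> X edge, so the two conditional probabilities of Y differ.

```python
import random

def sample(rng, do_x=None):
    """Structural causal model: U -> X, U -> Y, X -> Y.
    Passing do_x overrides X's natural mechanism (an intervention)."""
    u = rng.random() < 0.5                       # hidden common cause
    if do_x is None:
        x = rng.random() < (0.9 if u else 0.1)   # X's natural mechanism
    else:
        x = do_x                                 # intervention: U -> X edge cut
    y = rng.random() < 0.1 + 0.6 * u + 0.2 * x   # Y depends on both U and X
    return x, y

def estimate(n=50000, seed=0):
    """Compare P(Y=1 | X=1) (observational) with P(Y=1 | do(X=1))."""
    rng = random.Random(seed)
    obs = [y for x, y in (sample(rng) for _ in range(n)) if x]
    rng = random.Random(seed + 1)
    do = [y for x, y in (sample(rng, do_x=True) for _ in range(n))]
    return sum(obs) / len(obs), sum(do) / len(do)
```

With these numbers the observational estimate lands near 0.84 and the interventional one near 0.6: seeing X = 1 is evidence about U, but setting X = 1 is not, which is the distinction pattern-matching systems miss.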

27.- Abstract reasoning and compositionality are important human capabilities to model in AI.

28.- Cognitive resource limitations in infants may explain some developmental patterns.

29.- Evolutionary algorithms may be necessary to capture some aspects of human learning.

30.- Balancing innate knowledge and learned capabilities is a key challenge in cognitive modeling and AI.

Knowledge Vault built by David Vivancos 2024