Knowledge Vault 2/57 - ICLR 2014-2023
Noah Goodman ICLR 2019 - Invited Talk - Learning (from) language in context

Concept Graph & Resume using Claude 3 Opus | ChatGPT-4 | Gemini Advanced | Llama 3:

graph LR
classDef goodman fill:#f9d4d4, font-weight:bold, font-size:14px;
classDef learning fill:#d4f9d4, font-weight:bold, font-size:14px;
classDef language fill:#d4d4f9, font-weight:bold, font-size:14px;
classDef generics fill:#f9f9d4, font-weight:bold, font-size:14px;
classDef cultural_ratchet fill:#f9d4f9, font-weight:bold, font-size:14px;
A[Noah Goodman ICLR 2019] --> B[Humans learn from limited info 1]
A --> C[Concept learning: graded judgments, typicality 2]
A --> D[Bayesian model explains concept learning 3]
A --> E[Cultural ratchet: knowledge across generations 4]
E --> F[Language conveys generalizations 5]
F --> G[Language efficient for transmitting concepts 6]
G --> H[Generics used for conveying concepts 7]
H --> I[Language models perform like humans 8]
H --> J[Generics common in conveying concepts 9]
A --> K[Reference games enable transfer learning 10]
E --> L[Cultural ratchet from language generalizations 11]
L --> M[Generics: easy to understand, hard to define 12]
M --> N[Generics as minimal or intended examples 13]
M --> O[Prior expectations influence generic interpretation 14]
M --> P[Bayesian model explains generic interpretation 15]
A --> Q[Language regularizes and prepares learning 16]
Q --> R[Language decoding improves meta-learning 17]
Q --> S[Language helps reject irrelevant features 18]
E --> T[Cultural ratchet accumulates knowledge 19]
T --> U[Generics early in child speech 20]
A --> V[Agnostic about logical representations 21]
V --> W[Learning from language relevant for all formats 22]
A --> X[Language importance task-dependent 23]
X --> Y[Language powerful for abstract, causal knowledge 24]
A --> Z[Division of perception and language in children 25]
Z --> AA[Grounded language precedes abstractions 26]
M --> AB[Generics apply to causal, habitual statements 27]
AB --> AC[Procedural knowledge may use similar structures 28]
A --> AD[Research aims to understand human learning 29]
AD --> AE[Cultural ratchet key to human abilities 30]
class A,B,C,D,Q,V,W,X,Y,Z,AA,AD goodman;
class E,F,G,H,I,J,K,L,R,S,T learning;
class M,N,O,P,U,AB,AC generics;
class AE cultural_ratchet;

Resume:

1.-Noah Goodman discusses how humans learn vast amounts of knowledge from limited information, unlike other primates.

2.-Early psychological studies showed concept learning involves graded judgments and typicality effects, not just logical processes.

3.-A Bayesian learning model with a language of thought representation can explain many human concept learning experiments well.
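The talk does not give code, but the kind of Bayesian concept learning it describes can be sketched over a toy hypothesis space. The hypotheses, the "size principle" likelihood (examples drawn uniformly from the concept's extension), and the uniform prior below are all illustrative assumptions, not details from the talk:

```python
from fractions import Fraction

# Toy hypothesis space: each concept is the set of numbers it covers.
hypotheses = {
    "even": {2, 4, 6, 8, 10},
    "powers_of_2": {2, 4, 8},
    "multiples_of_4": {4, 8},
}

def posterior(observed, hypotheses, prior=None):
    """Bayes' rule with the 'size principle': each example is assumed drawn
    uniformly from the concept, so smaller consistent hypotheses score higher."""
    if prior is None:
        prior = {h: Fraction(1, len(hypotheses)) for h in hypotheses}
    scores = {}
    for h, extension in hypotheses.items():
        if all(x in extension for x in observed):
            # Likelihood of each example under h is 1/|extension|.
            scores[h] = prior[h] * Fraction(1, len(extension)) ** len(observed)
        else:
            scores[h] = Fraction(0)  # inconsistent with the data
    z = sum(scores.values())
    return {h: s / z for h, s in scores.items()}

post = posterior([4, 8], hypotheses)
# The smallest consistent hypothesis, "multiples_of_4", gets the highest
# posterior, mirroring the graded generalization seen in human judgments.
```

The size principle is what yields graded, typicality-like generalization rather than all-or-none logical rule induction.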

4.-The cultural ratchet theory proposes knowledge is transmitted across generations, allowing humanity to accumulate knowledge over time.

5.-Goodman hypothesizes the cultural ratchet relies on language constructs for conveying generalizations, like quantifiers and generics.

6.-In experiments, language alone was sufficient for transmitting concepts between people, and more efficient than learning from examples.

7.-When conveying concepts, people relied heavily on generic language constructs as opposed to concrete referential language.

8.-Modern language models performed similarly to humans in learning concepts from the language used in the experiments.

9.-Generic language was much more common when people conveyed concepts compared to concrete reference games.

10.-Reference games provide a rich source of grounded, compositional language data that enables strong transfer learning in models.

11.-Goodman hypothesizes the cultural ratchet arises specifically from language constructs that directly convey generalizations.

12.-Generics are easy to understand but difficult to formally define, as their interpretations vary based on the property.

13.-Two potential interpretations of generics are as a minimal example or a socially intended minimal example chosen by a teacher.

14.-Prior expectations about feature prevalence differ across properties, influencing the interpretation of corresponding generic statements.

15.-A Bayesian model incorporating socially intended examples and prior knowledge explains human interpretations and endorsements of generic statements well.
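The exact model is not reproduced here, but a threshold-semantics sketch of generic interpretation (in the spirit of this line of work) shows how property-specific priors shift what the same generic implies. The grid, the exponential priors, and the uniform-threshold assumption are all illustrative:

```python
import numpy as np

# Possible prevalence levels for "Ks have feature F".
prevalence = np.linspace(0.01, 1.0, 100)

def listener(prior):
    """P(prevalence | generic): the generic is true when prevalence exceeds
    an uncertain threshold; with threshold ~ Uniform(0, 1), the probability
    the generic is true given a prevalence level is the prevalence itself."""
    likelihood = prevalence
    post = likelihood * prior
    return post / post.sum()

# Prior expectations differ by property: rare-but-striking vs. common features.
rare_prior = np.exp(-10 * prevalence)
rare_prior /= rare_prior.sum()
common_prior = np.exp(-10 * (1 - prevalence))
common_prior /= common_prior.sum()

# The same generic implies lower prevalence under a rare-feature prior
# ("mosquitoes carry malaria") than under a common-feature prior ("dogs bark").
mean_rare = (listener(rare_prior) * prevalence).sum()
mean_common = (listener(common_prior) * prevalence).sum()
```

This captures point 14 as well: the listener's posterior, and hence the interpreted prevalence, is pulled toward the property-specific prior.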

16.-Learning from language regularizes and prepares learners for future learning from both language and examples.

17.-Incorporating an auxiliary task of decoding language descriptions improves the performance and data efficiency of meta-learning models.
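The architecture behind this result is not specified in the summary; the objective, however, has roughly the shape below. Function names and the weight `lam` are illustrative, not from the talk:

```python
def episode_loss(task_loss, desc_nll, lam=0.5):
    """Per-episode objective: few-shot classification loss plus a weighted
    auxiliary loss for decoding the episode's natural-language description
    from the learned representation. At test time only the classifier is
    used; the language decoder is training-time scaffolding."""
    return task_loss + lam * desc_nll

def meta_batch_loss(episodes, lam=0.5):
    """Average the combined objective over a meta-batch of
    (task_loss, description_negative_log_likelihood) pairs."""
    losses = [episode_loss(t, d, lam) for t, d in episodes]
    return sum(losses) / len(losses)
```

The auxiliary decoding term is what pressures the representation to encode the features that language mentions, which is one way to read the regularization effect in points 16 and 18.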

18.-Language helps reject irrelevant features and regularizes learning, especially in the "sweet spot" where data is sufficient but not abundant.

19.-The cultural ratchet allows humanity to accumulate knowledge over generations, with language facilitating transmission and future learning.

20.-Generics appear very early in child-directed speech and children's own productions, before other forms of generalization like quantifiers.

21.-Goodman is agnostic about the necessity of logical representations, acknowledging the potential of vector-based representations with sufficient power.

22.-Learning from language is relevant regardless of the specific representational format, as demonstrated by the success of neural network models.

23.-The importance of learning from language likely depends on the task, with some requiring extensive embodied experience before language becomes meaningful.

24.-Language is particularly powerful for transmitting abstract, causal, and high-level knowledge that is difficult to discover through direct experience alone.

25.-There is an open question regarding the division of labor between perception and language in children's acquisition of high-level concepts.

26.-Grounded, concrete language understanding is a prerequisite for learning abstractions from generics and other linguistic constructs.

27.-The semantic framework used for category generics also applies well to causal and habitual generics, with some syntactic variations.

28.-Procedural knowledge that generalizes across event scenarios may involve similar linguistic structures to those used for conveying generic information.

29.-Goodman's research aims to understand the gap between human and animal learning that enables humans to build complex technological societies.

30.-The cultural ratchet, driven by language's ability to convey generalizations, is proposed as a key factor in humans' remarkable learning abilities.

Knowledge Vault built by David Vivancos 2024