🔴 CIVILIZATIONAL APOCALYPSE: Could we accidentally destroy civilization again?
graph LR
classDef creativity fill:#ffd4a3, font-weight:bold, font-size:14px;
classDef quantum fill:#a3d4ff, font-weight:bold, font-size:14px;
classDef agi fill:#a3ffa3, font-weight:bold, font-size:14px;
classDef society fill:#ffa3ff, font-weight:bold, font-size:14px;
classDef knowledge fill:#ffffa3, font-weight:bold, font-size:14px;
classDef future fill:#a3ffff, font-weight:bold, font-size:14px;
Main["Knowledge & Intelligence Web"]
Main --> C1[Creativity for culture not problems 1]
C1 -.-> creativity
Main --> C2[Early imitation of complex acts 2]
C2 -.-> creativity
Main --> C3[Knowledge fragile can reverse 3]
C3 -.-> knowledge
Main --> C4[Creative societies self-destruct 4]
C4 -.-> society
Main --> C5[AI efficient lacks real creativity 5]
C5 -.-> agi
Main --> C6[AGI is person not copyable 6]
C6 -.-> agi
Main --> C7[Clone AGI value drops 7]
C7 -.-> agi
Main --> C8[Hardware speed not universal 8]
C8 -.-> quantum
Main --> C9[Quantum enables non-classic algorithms 9]
C9 -.-> quantum
Main --> C10[Quantum crypto single qubit works 10]
C10 -.-> quantum
Main --> C11[Govt hoards encrypted data for later 11]
C11 -.-> quantum
Main --> C12[Universe allows self-computation 12]
C12 -.-> knowledge
Main --> C13[Simulation leads to infinite regress 13]
C13 -.-> knowledge
Main --> C14[Knowledge beats physical might 14]
C14 -.-> knowledge
Main --> C15[Tools override dominance 15]
C15 -.-> knowledge
Main --> C16[Tech needed for hostile worlds 16]
C16 -.-> future
Main --> C17[Consciousness creativity co-evolved 17]
C17 -.-> creativity
Main --> C18[AGI needs qualia free will 18]
C18 -.-> agi
Main --> C19[No theory explains AGI yet 19]
C19 -.-> agi
Main --> C20[LLMs simulate not replicate thought 20]
C20 -.-> agi
Main --> C21[Brain Turing unknown software 21]
C21 -.-> agi
Main --> C22[Grover not fully understood 22]
C22 -.-> quantum
Main --> C23[Quantum may solve minimize simulate 23]
C23 -.-> quantum
Main --> C24[Crypto must resist quantum 24]
C24 -.-> quantum
Main --> C25[Quantum wont replace AI optimize 25]
C25 -.-> quantum
Main --> C26[Scientific revolution rare leap tools 26]
C26 -.-> knowledge
Main --> C27[Writing arithmetic boost efficiency 27]
C27 -.-> knowledge
Main --> C28[LLMs speed writing no meaning 28]
C28 -.-> agi
Main --> C29[AGI must pay for compute 29]
C29 -.-> agi
Main --> C30[AGI rights include hardware 30]
C30 -.-> agi
Main --> C31[Diversity raises collective value 31]
C31 -.-> society
Main --> C32[Parallel copies duplicate not multiply 32]
C32 -.-> agi
Main --> C33[Series immortality beats clones 33]
C33 -.-> agi
Main --> C34[Human creativity underused today 34]
C34 -.-> creativity
Main --> C35[Stable open society needed 35]
C35 -.-> society
Main --> C36[Slowing progress worse than collapse 36]
C36 -.-> society
Main --> C37[Fun guides worth research 37]
C37 -.-> creativity
Main --> C38[Chance not merit finds truth 38]
C38 -.-> creativity
Main --> C39[Physics allows self-simulation 39]
C39 -.-> knowledge
Main --> C40[Cellular automata must be quantum 40]
C40 -.-> quantum
Main --> C41[Only simple systems reducible 41]
C41 -.-> knowledge
Main --> C42[Tools extend mind beyond biology 42]
C42 -.-> future
Main --> C43[Sun predictable lacks knowledge 43]
C43 -.-> knowledge
Main --> C44[Future galactic or extinct 44]
C44 -.-> future
Main --> C45[Transhumanism merges bio AI 45]
C45 -.-> future
Main --> C46[Human X evolution beyond limits 46]
C46 -.-> future
Main --> C47[War division threaten survival 47]
C47 -.-> society
Main --> C48[Global mind must replace nations 48]
C48 -.-> society
Main --> C49[Spiritual growth individual duty 49]
C49 -.-> society
Main --> C50[Hybrid risk unequal species 50]
C50 -.-> future
Main --> C51[AGI must not be slave 51]
C51 -.-> agi
Main --> C52[Cross-species knowledge possible 52]
C52 -.-> future
Main --> C53[Now choices shape future 53]
C53 -.-> future
creativity --> C1
creativity --> C2
creativity --> C17
creativity --> C34
creativity --> C37
creativity --> C38
quantum --> C8
quantum --> C9
quantum --> C10
quantum --> C11
quantum --> C22
quantum --> C23
quantum --> C24
quantum --> C25
quantum --> C40
agi --> C5
agi --> C6
agi --> C7
agi --> C18
agi --> C19
agi --> C20
agi --> C21
agi --> C28
agi --> C29
agi --> C30
agi --> C32
agi --> C33
agi --> C51
society --> C4
society --> C31
society --> C35
society --> C36
society --> C47
society --> C48
society --> C49
knowledge --> C3
knowledge --> C12
knowledge --> C13
knowledge --> C14
knowledge --> C15
knowledge --> C26
knowledge --> C27
knowledge --> C39
knowledge --> C41
knowledge --> C43
future --> C16
future --> C42
future --> C44
future --> C45
future --> C46
future --> C50
future --> C52
future --> C53
class C1,C2,C17,C34,C37,C38 creativity
class C8,C9,C10,C11,C22,C23,C24,C25,C40 quantum
class C5,C6,C7,C18,C19,C20,C21,C28,C29,C30,C32,C33,C51 agi
class C4,C31,C35,C36,C47,C48,C49 society
class C3,C12,C13,C14,C15,C26,C27,C39,C41,C43 knowledge
class C16,C42,C44,C45,C46,C50,C52,C53 future
Summary:
The episode begins with Plácido Doménech introducing David Deutsch as a guest to discuss the possibility of accidental civilizational collapse, a theme Deutsch explores under the title *Civilizational Apocalypse*. Deutsch argues that humanity's unique ability to create and transmit knowledge is both its greatest strength and its greatest vulnerability. He explains that creativity did not evolve for problem-solving but for cultural transmission, and was only later repurposed for innovation. This shift allowed humans to become universal explainers, but it also left civilizations prone to self-destruction whenever they fail to maintain the conditions for knowledge growth. Deutsch warns that all previous creative societies collapsed not because of external threats but because they undermined their own epistemological frameworks.
The conversation then shifts to the role of technology in accelerating knowledge. Deutsch distinguishes between AI, which he sees as a powerful tool for enhancing human efficiency, and AGI, which he argues is fundamentally misunderstood. He emphasizes that AGI, if ever achieved, would not be a mere extension of current AI systems but a new kind of person. Unlike software that can be copied infinitely, an AGI would have individuality, rights, and agency. He cautions against the idea of mass-producing AGI clones, arguing that diversity, not duplication, drives progress. Deutsch also critiques the notion that faster hardware alone will lead to exponential growth in knowledge, noting that many real-world processes, such as physical experiments, cannot be sped up by computation alone.
The final segment of the episode includes reflections on quantum computing, the nature of consciousness, and the future of humanity. Deutsch expresses excitement about quantum computation as a fundamentally new mode of processing information, though he remains cautious about its practical timeline. He also discusses the philosophical implications of AGI and consciousness, suggesting that a true theory of AGI would need to explain not just intelligence but also qualia and free will. The episode ends with a clip from Jiddu Krishnamurti, who warns that humanity’s future is bleak if it continues on its current path of war, division, and spiritual emptiness. The host echoes this sentiment, calling for a new era of human evolution—Human X—where technology and consciousness converge to avoid extinction and embrace transcendence.
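To make the quantum-computation point above (and Key Ideas 9 and 22 below) a little more concrete, here is a minimal Python sketch, not taken from the episode, comparing rough oracle-query counts for classical unstructured search with Grover's algorithm; the item counts and the helper names classical_queries and grover_queries are illustrative assumptions.
import math

def classical_queries(n: int) -> int:
    # Classical unstructured search inspects items one by one:
    # on average about n/2 oracle queries to find a single marked item.
    return n // 2

def grover_queries(n: int) -> int:
    # Grover's algorithm finds the marked item with high probability
    # after roughly (pi/4) * sqrt(n) oracle queries, a quadratic speedup.
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"n={n:>13,}  classical ~{classical_queries(n):>11,}  Grover ~{grover_queries(n):>7,}")
Even at a billion items the Grover estimate stays below about 25,000 queries, which is the sense in which such algorithms are non-classical; it says nothing about raw hardware speed, which Deutsch treats as a separate issue (Key Idea 8).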
Key Ideas:
1.- Creativity evolved for cultural transmission, not problem-solving.
2.- Early humans used creativity to imitate complex behaviors.
3.- Knowledge growth is fragile and can be reversed.
4.- All past creative societies destroyed themselves internally.
5.- AI boosts efficiency but lacks genuine creativity.
6.- AGI would be a person, not a copyable program.
7.- Cloning AGI would reduce economic value due to sameness.
8.- Faster hardware doesn’t speed up all knowledge discovery.
9.- Quantum computers enable new, non-classical algorithms.
10.- Quantum cryptography is already usable with single qubits.
11.- Governments may already store encrypted data for future decryption.
12.- The universe allows universal computation within itself.
13.- The simulation hypothesis leads to infinite regress.
14.- Knowledge violates the natural hierarchy of physical might.
15.- Tools let humans override physical dominance.
16.- Humans survive only through technology in hostile environments.
17.- Consciousness and creativity likely evolved together.
18.- AGI theory must explain qualia and free will.
19.- No current theory explains what makes AGI possible.
20.- LLMs simulate but do not replicate human thought.
21.- The brain is a Turing machine running unknown software.
22.- Quantum algorithms like Grover’s are not yet fully understood.
23.- Quantum computing may solve minimization and simulation problems.
24.- Classical cryptography must become quantum-resistant.
25.- Quantum computers won’t replace AI for most optimization tasks.
26.- The scientific revolution was a rare leap in knowledge tools.
27.- Writing and arithmetic vastly improved knowledge efficiency.
28.- LLMs double writing speed but don’t understand meaning.
29.- AGI would need to pay for its own computational resources.
30.- AGI rights would include ownership of their hardware.
31.- Diversity among intelligences increases collective value.
32.- Parallel AGI copies would duplicate effort, not multiply insight.
33.- Series-based AGI immortality would be more useful than clones.
34.- Human creativity is underutilized in modern societies.
35.- Knowledge growth requires stable, open societies.
36.- Slowing progress is worse than civilizational collapse.
37.- Fun is a reliable guide to worthwhile research.
38.- Discoveries often come by chance, not merit.
39.- Physics allows simulation of any system within itself.
40.- Cellular automata must be quantum to model reality.
41.- Computational reducibility exists only in simple systems.
42.- Tools extend human minds beyond biological limits.
43.- The sun’s behavior is predictable because it lacks knowledge.
44.- Humanity’s future is either galactic or extinct.
45.- Transhumanism will merge biology and artificial intelligence.
46.- Human X represents evolution beyond current limitations.
47.- Krishnamurti warns war and division threaten survival.
48.- Global consciousness must replace nationalism.
49.- Spiritual evolution is an individual responsibility.
50.- Hybridization risks creating unequal human species.
51.- AGI must not be enslaved or treated as property.
52.- Knowledge transfer across species may become possible.
53.- The future depends on conscious choices made now.
Interviews by Plácido Doménech Espí & Guests - Knowledge Vault built by David Vivancos 2025