👩🏻‍🔬THE PHYSICS OF SENTIENT AI | Karl Friston interview analysis
graph LR
classDef blanket fill:#d4f9f9, font-weight:bold, font-size:12px
classDef fep fill:#f9d4d4, font-weight:bold, font-size:12px
classDef attractor fill:#d4d4f9, font-weight:bold, font-size:12px
classDef mechanics fill:#f9f9d4, font-weight:bold, font-size:12px
classDef coding fill:#f9d4f9, font-weight:bold, font-size:12px
classDef selection fill:#d4f9d4, font-weight:bold, font-size:12px
classDef inference fill:#f9d4d4, font-weight:bold, font-size:12px
classDef entropy fill:#d4f9f9, font-weight:bold, font-size:12px
classDef scale fill:#f9f9d4, font-weight:bold, font-size:12px
classDef observer fill:#d4d4f9, font-weight:bold, font-size:12px
Main[Vault7-291]
Main --> B1[Statistical boundary 1]
B1 --> B2[Separates states 1]
Main --> F1[Minimise surprise 2]
F1 --> F2[Model error 2]
Main --> A1[Non-identical revisits 3]
A1 --> A2[Probabilistic neighbourhoods 3]
Main --> M1[Bayesian mechanics 4]
M1 --> M2[Gradient log-probability 4]
Main --> E1[Notational variants 5]
E1 --> E2[Schrödinger Fokker–Planck 5]
Main --> C1[Neural variational 6]
C1 --> C2[Predictive coding 6]
Main --> S1[Density dynamics 7]
S1 --> S2[Maximise fitness 7]
Main --> I1[Perception action 8]
I1 --> I2[Blanket exchange 8]
Main --> R1[Oscillatory return 9]
R1 --> R2[Non-equilibrium steady 9]
Main --> SO1[Resist entropy 10]
SO1 --> SO2[Gradient descent 10]
Main --> SC1[Same formalism 11]
SC1 --> SC2[Cells brains 11]
Main --> CI1[Individuation without 12]
CI1 --> CI2[Causal isolation 12]
Main --> SM1[Aligns homeostasis 13]
SM1 --> SM2[Reinforcement utility 13]
Main --> EP1[Entropy production 14]
EP1 --> EP2[Solenoidal circulation 14]
Main --> OD1[Observer observee 15]
OD1 --> OD2[Markov partition 15]
Main --> ME1[Model evidence 16]
ME1 --> ME2[Fit predictions 16]
Main --> RG1[Microscopic noise 17]
RG1 --> RG2[Macroscopic order 17]
Main --> AU1[Self-creating blanket 18]
AU1 --> AU2[Active states 18]
Main --> NN1[Helmholtz decomposition 19]
NN1 --> NN2[Predictive processing 19]
Main --> GF1[Dissipative conservative 20]
GF1 --> GF2[Self-information 20]
Main --> QM1[Complex amplitudes 21]
QM1 --> QM2[Probability factorisation 21]
Main --> SDK1[Sentient agents 22]
SDK1 --> SDK2[Active-inference SDKs 22]
Main --> MA1[Replicator dynamics 23]
MA1 --> MA2[Shared blanket 23]
Main --> SMB1[Information thermodynamics 24]
SMB1 --> SMB2[Statistical bridge 24]
Main --> NE1[Energy-matter exchange 25]
NE1 --> NE2[Non-equilibrium persist 25]
Main --> AD1[Slow context 26]
AD1 --> AD2[Fast subsystem 26]
Main --> PG1[Blanket-mirror independence 27]
PG1 --> PG2[Phenotype-genotype 27]
Main --> PE1[Maximise evidence 28]
PE1 --> PE2[Generative models 28]
Main --> AC1[Minimise future 29]
AC1 --> AC2[Alter hidden 29]
Main --> TL1[Engineer artificial 30]
TL1 --> TL2[Tautological richness 30]
class B1,B2 blanket
class F1,F2 fep
class A1,A2 attractor
class M1,M2,E1,E2 mechanics
class C1,C2 coding
class S1,S2 selection
class I1,I2 inference
class SO1,SO2,EP1,EP2 entropy
class SC1,SC2 scale
class OD1,OD2 observer
Summary:
Plácido Doménech welcomes viewers to the 3 p.m. slot of XHUBAI’s fifth season, devoted to exploring the ideas of Karl Friston, a neuroscientist whose free-energy principle is reshaping artificial intelligence. After discovering Friston through a Wired interview, Benets guides the Spanish-speaking community through dense terrain: Markov blankets, Bayesian mechanics, and the physics of sentience. The live stream is simulcast on X, YouTube, LinkedIn, Twitch, Facebook, Rumble, Instagram and Kick; comments overflow with curiosity, confusion, and gratitude.
The host warns that the upcoming lecture is “infumable” (hard to digest), yet essential. He contrasts the mathematically heavy material with an evening programme featuring Sofía Vidal on synthetic consciousness, promising a gentler on-ramp. Benets invites Discord membership and support via PayPal or BuyMeACoffee, and reminds viewers that these sessions are neither monetised nor dumbed down; they are collective deep dives into theory that many will nap through but none will forget.
Karl Friston then takes the floor, beginning with the claim that biology, psychology and physics share one foundation: the statistical persistence of “things.” Using Markov blankets—statistical boundaries that separate internal, sensory and active states—he shows how any self-organising system, from a cell to a society, can be described as minimising surprise (free energy) while maximising model evidence. The blanket is both shield and interface: it renders internal dynamics conditionally independent of the external world, yet permits controlled exchange through action and perception.
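A rough numerical illustration of this blanket structure (a minimal sketch, not taken from the talk; the variable ordering and coefficients are invented for illustration): the Python snippet below builds a Gaussian model whose precision matrix has a zero between external and internal states, so the two remain marginally correlated yet become conditionally independent once the blanket's sensory and active states are given.

import numpy as np

rng = np.random.default_rng(0)

# Variable order: [external, sensory, active, internal].
# The zero precision entry between external and internal encodes the blanket.
precision = np.array([
    [ 2.0, -0.8, -0.6,  0.0],
    [-0.8,  2.0,  0.0, -0.7],
    [-0.6,  0.0,  2.0, -0.5],
    [ 0.0, -0.7, -0.5,  2.0],
])
cov = np.linalg.inv(precision)

samples = rng.multivariate_normal(np.zeros(4), cov, size=200_000)
emp_prec = np.linalg.inv(np.cov(samples.T))

# Marginal dependence survives; conditional dependence given the blanket does not.
print("corr(external, internal):", round(np.corrcoef(samples[:, 0], samples[:, 3])[0, 1], 3))
print("precision(external, internal):", round(emp_prec[0, 3], 3))   # close to zero

Exchange across the boundary happens only through the sensory and active variables, mirroring the “individuation without causal isolation” of idea 12 below.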
From this scaffold emerges a universal Bayesian mechanics. Friston recasts Schrödinger, Fokker–Planck and Langevin equations as different notations for the same gradient flows on log-probability landscapes; the brain’s predictive coding, reinforcement learning and Darwinian selection become special cases of the same inference process. Repetition—heartbeat, replication, coffee—appears as itinerant return to neighbourhoods of a pullback attractor, never the same state twice, always within probabilistic bounds. Viewers grapple with the tautology: systems that exist are those that act as if minimising surprise, so the free-energy principle is both analytic and generative for engineering agents.
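The “gradient flows on log-probability” reading can be made concrete with a toy Langevin simulation (a hedged sketch under assumed parameters, not Friston’s own derivation): the drift is split, Helmholtz-style, into a dissipative gradient part and a solenoidal circulation Q; adding the antisymmetric Q changes the trajectories but leaves the stationary density untouched, which is the sense in which the Langevin and Fokker–Planck pictures describe the same flow.

import numpy as np

rng = np.random.default_rng(1)

target_prec = np.array([[1.0, 0.4], [0.4, 1.0]])   # zero-mean Gaussian target

def grad_log_p(x):
    # Gradient of log p(x) for the Gaussian target (up to a constant).
    return -target_prec @ x

def simulate(Q, steps=200_000, dt=0.01):
    # Euler-Maruyama for dx = (I + Q) grad_log_p(x) dt + sqrt(2) dW.
    x = np.zeros(2)
    out = np.empty((steps, 2))
    for t in range(steps):
        x = x + (np.eye(2) + Q) @ grad_log_p(x) * dt + np.sqrt(2 * dt) * rng.normal(size=2)
        out[t] = x
    return out

no_circulation = simulate(Q=np.zeros((2, 2)))
with_circulation = simulate(Q=np.array([[0.0, 0.8], [-0.8, 0.0]]))

# Both empirical covariances approximate the target covariance (inverse precision),
# up to discretisation error; only the path geometry differs.
print(np.round(np.cov(no_circulation.T), 2))
print(np.round(np.cov(with_circulation.T), 2))
print(np.round(np.linalg.inv(target_prec), 2))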
The session ends with a promise of a sequel devoted to neural implementations and active-inference SDKs. Benets returns, exhilarated, acknowledging that the talk “explodes heads” yet unites physics, mathematics and AI into one tangible reality. He invites the community to reconvene later for a more conversational exploration of synthetic consciousness, carrying forward the conceptual fire sparked by Friston’s formidable synthesis.
30 Key Ideas:
1.- Markov blanket defines statistical boundary separating internal, sensory, active states.
2.- Free-energy principle frames existence as minimising surprise or model prediction error.
3.- Pullback attractor describes non-identical revisits to probabilistic state neighbourhoods.
4.- Bayesian mechanics unifies physics equations under gradient flows on log-probability.
5.- Schrödinger, Fokker–Planck, Langevin equations viewed as notational variants.
6.- Predictive coding emerges as neural implementation of variational inference (see the sketch after this list).
7.- Darwinian selection recast as density dynamics maximising adaptive fitness likelihood.
8.- Active inference couples perception and action through blanket-mediated exchange.
9.- Replication equals oscillatory return within open non-equilibrium steady-state systems.
10.- Self-organisation resists entropy via gradient descent on potential landscapes.
11.- Scale invariance permits same formalism across cells, brains, societies.
12.- Conditional independence enables individuation without causal isolation.
13.- Surprise minimisation aligns with homeostasis, reinforcement learning, utility theory.
14.- Entropy production balanced by solenoidal circulation sustaining structure.
15.- Observer–observee distinction requires explicit Markov blanket partition.
16.- Model evidence quantifies fit between internal predictions and external sensory data.
17.- Renormalisation group links microscopic noise to macroscopic order.
18.- Autopoiesis interpreted as self-creating blanket maintenance via active states.
19.- Neural networks implement Helmholtz decomposition for predictive processing.
20.- Gradient flows on self-information generate both dissipative and conservative dynamics.
21.- Quantum mechanics recovered by factorising probability densities into complex amplitudes.
22.- Cognitive architectures can leverage active-inference SDKs for sentient agents.
23.- Multiagent reinforcement learning parallels replicator dynamics under shared blanket.
24.- Statistical mechanics bridges thermodynamics and information-theoretic descriptions.
25.- Non-equilibrium systems persist through constant energy-matter exchange.
26.- Adiabatic approximation allows slow context drift for fast subsystem solutions.
27.- Phenotype–genotype separation mirrors blanket–environment statistical independence.
28.- Perception maximises evidence for internal generative models of external causes.
29.- Action minimises future surprise by altering external hidden states.
30.- Tautological richness of free-energy principle enables engineering artificial life.
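As a worked example of idea 6 (a minimal sketch with assumed values, not code from the talk or from any active-inference SDK), the snippet below runs predictive coding on a one-dimensional linear Gaussian model: gradient descent on variational free energy, driven by precision-weighted prediction errors, converges to the exact Bayesian posterior mean.

w, sigma_y2, mu_prior, sigma_p2 = 2.0, 1.0, 0.5, 1.0
y = 3.0                                    # observed sensory sample

# Exact Bayesian posterior mean (conjugate Gaussian case), for comparison.
post_prec = 1.0 / sigma_p2 + w ** 2 / sigma_y2
post_mean = (mu_prior / sigma_p2 + w * y / sigma_y2) / post_prec

# Predictive-coding loop: relax the latent estimate mu along -dF/dmu,
# where F is the variational free energy under the Gaussian assumptions above.
mu, lr = mu_prior, 0.05
for _ in range(2_000):
    eps_y = (y - w * mu) / sigma_y2        # precision-weighted sensory error
    eps_p = (mu - mu_prior) / sigma_p2     # precision-weighted prior error
    mu += lr * (w * eps_y - eps_p)

print(round(mu, 4), round(post_mean, 4))   # both report 1.3

The same error-minimising loop is what ideas 28 and 29 generalise: perception updates mu to explain the data, while action would instead change y itself so that the predictions come true.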
Interviews by Plácido Doménech Espí & Guests - Knowledge Vault built by David Vivancos 2025