
Free Energy Principle

[Interactive demo: "Passive Matter" — a particle simulation of the Second Law (entropy maximization), illustrated by ink dispersing in water. Environment parameters: temperature 298 K, thermal energy 4.1×10⁻²¹ J, ~10²¹ water molecules, fluctuation rate 1.0σ. Live readouts track the system's thermodynamic entropy, spatial spread, and particle count as it equilibrates with the environment; clicking adds particles.]

Prof. Karl Friston: "Think of a drop of ink in a glass of water..."

Nonequilibrium Steady State vs Dissipation

Figure 3.3 — Chapter 3: The High Road to Active Inference
From the Textbook
"Path taken by a 2-dimensional random dynamical system with a nonequilibrium steady state. This can be interpreted as minimizing its surprise."
"The center is the least surprising region; the circles moving away from the center represent progressively more surprising regions."
"In contrast, [the dissipating system's] dynamics bear no relation to surprise. Not only does it enter more surprising regions of space; it also fails to achieve any sort of steady state, dissipating in an unconstrained fashion over time."
"The scope of Active Inference is restricted to systems like [the steady state]—which counter random fluctuations with their average flow and thereby retain their form over time."
— Parr, Pezzulo & Friston (2022), pp. 48–49

[Interactive panels: "Steady State System" and "Dissipating System", each with a live readout of the system's current surprise and position.]

Surprise Landscape ℑ(y)

Center = low surprise (high probability).
Steady state: counters fluctuations, retains form.
Dissipating: no attractor, disperses over time.

Dynamics: dx = f(x) dt + σ dW
  Steady state: f(x) = −(Γ − Q)∇ℑ(x)
  Dissipating: f(x) = 0
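The contrast between the two flows can be sketched with a minimal Euler–Maruyama integration. This is an illustrative sketch, not the book's figure: the quadratic surprise ℑ(x) = |x|²/2, the damping rate γ, the rotation rate ω, and the noise level σ are all invented parameters.

```python
import math
import random

def simulate(gamma, omega, sigma, x0=(1.0, 0.0), dt=0.01, steps=20000, seed=0):
    """Euler-Maruyama integration of dx = f(x) dt + sigma dW in 2D, with
    f(x) = -(Gamma - Q) grad I for the quadratic surprise I(x) = |x|^2 / 2,
    Gamma = gamma * Identity (dissipative part) and
    Q = [[0, omega], [-omega, 0]] (solenoidal part).
    Returns the time-averaged surprise along the trajectory."""
    rng = random.Random(seed)
    x, y = x0
    total = 0.0
    for _ in range(steps):
        # grad I = (x, y), so -(Gamma - Q) grad I = (-g*x + w*y, -w*x - g*y)
        fx = -gamma * x + omega * y
        fy = -omega * x - gamma * y
        x += fx * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        y += fy * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        total += 0.5 * (x * x + y * y)
    return total / steps

steady = simulate(gamma=1.0, omega=1.0, sigma=0.2)       # counters fluctuations
dissipating = simulate(gamma=0.0, omega=0.0, sigma=0.2)  # f(x) = 0: pure diffusion
```

With f(x) = 0 the average surprise grows without bound, while the steady-state flow keeps it small — the same contrast the figure shows.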

Glossary of Terms

Key concepts explained in friendly language
Surprise (ℑ)
How unexpected something is. Mathematically, it's the negative log probability: ℑ(y) = −ln P(y). If something is very likely (high probability), it has low surprise. If something is unlikely, it has high surprise. Think of it like: seeing a fish in the ocean = low surprise. Seeing a fish in your living room = high surprise!
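The definition is one line of code. A quick sketch in Python (the probability values are invented for illustration):

```python
import math

def surprise(p):
    """Self-information: I(y) = -ln P(y). Likely events are unsurprising."""
    return -math.log(p)

ocean_fish = surprise(0.99)          # likely event: low surprise
living_room_fish = surprise(0.0001)  # unlikely event: high surprise
```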
Free Energy (F)
An upper bound on surprise that a system can actually compute. Living systems minimize free energy because they can't directly measure surprise. It combines two things: how wrong your predictions are (prediction error) and how complex your beliefs are (complexity). It's like a proxy for "how badly am I doing at predicting the world right now?"
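The bound can be checked on a toy discrete model. This is a sketch: the two-state prior and likelihood numbers are invented, and the point is only that F never dips below the true surprise, touching it exactly when the beliefs q equal the true posterior.

```python
import math

# Toy generative model: hidden cause x in {0, 1}; some y was observed.
prior = {0: 0.5, 1: 0.5}        # p(x)
likelihood = {0: 0.8, 1: 0.1}   # p(y | x) for the observed y

def free_energy(q):
    """Variational free energy F = sum_x q(x) [ln q(x) - ln p(y, x)].
    F upper-bounds the surprise -ln p(y); the gap is KL(q || p(x|y))."""
    return sum(q[x] * (math.log(q[x]) - math.log(prior[x] * likelihood[x]))
               for x in q if q[x] > 0)

evidence = sum(prior[x] * likelihood[x] for x in prior)  # p(y) = 0.45
surprise_y = -math.log(evidence)                         # true surprise
posterior = {x: prior[x] * likelihood[x] / evidence for x in prior}
# F at the exact posterior equals the surprise; any other q gives F > surprise.
```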
Steady State
A condition where a system maintains its overall pattern over time, even while constantly fluctuating. The system doesn't freeze—it keeps moving—but it stays within a recognizable region of possibilities. Like a fish swimming in place in a current: always moving, but staying roughly where it is.
Nonequilibrium
A system that maintains order by constantly exchanging energy/matter with its environment. Unlike a rock (equilibrium), living things are always processing, metabolizing, and adapting. A candle flame is nonequilibrium—it maintains its shape only by continuously burning fuel.
Markov Blanket
A statistical boundary that separates "inside" from "outside." It's the set of variables that mediate all interactions between a system and its environment. Everything inside the blanket is conditionally independent of everything outside, given the blanket itself. Like your skin: the outside world can only affect your internal organs through your skin (and other sensory surfaces).
Active Inference
The idea that living systems act to confirm their predictions about the world. Instead of just passively updating beliefs based on sensory input, organisms actively change the world (or sample it differently) to reduce prediction errors. If you predict it's warm outside and feel cold, you can either update your belief OR put on a coat (change the world to match your prediction).
Generative Model
An internal model of how sensory data is generated by hidden causes in the world. It captures your brain's "theory" about what's out there and how it produces the signals you receive. Your brain's implicit guess about "what kind of world would produce these sights, sounds, and feelings?"
Prediction Error (ε)
The difference between what you predicted you would sense and what you actually sensed. These errors drive learning and belief updating—and action, if you're doing active inference. Expected a sweet taste, got sour? That mismatch is prediction error.
Precision (π)
How much confidence or weight you give to certain predictions or prediction errors. High precision means "pay attention to this signal, it's reliable." Low precision means "this is noisy, don't trust it too much." Like turning up the volume on important information and muting the background noise.
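For Gaussian beliefs this weighting has a simple closed form: the updated estimate is a precision-weighted average of prior and observation. A sketch with invented temperature numbers:

```python
def precision_weighted_update(mu_prior, pi_prior, observation, pi_obs):
    """Fuse a prior belief and an observation, each weighted by its
    precision (inverse variance): reliable signals pull harder."""
    return (pi_prior * mu_prior + pi_obs * observation) / (pi_prior + pi_obs)

# Prior belief: 20 degrees. Observation: 30 degrees.
trusting = precision_weighted_update(20.0, 1.0, 30.0, 9.0)   # precise signal dominates -> 29.0
skeptical = precision_weighted_update(20.0, 9.0, 30.0, 1.0)  # noisy signal barely moves belief -> 21.0
```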
Entropy
A measure of disorder or uncertainty. In thermodynamics, it tends to increase (Second Law). In information theory, it measures how spread out a probability distribution is. Living systems are special because they resist entropy increase locally. A dropped egg spreads out (high entropy). Living systems are like eggs that somehow un-scramble themselves.
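The information-theoretic version is easy to compute. A sketch, with made-up distributions standing in for the ink drop before and after it disperses:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i ln p_i (in nats):
    how spread out a probability distribution is."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

concentrated = shannon_entropy([0.97, 0.01, 0.01, 0.01])  # ink still a droplet
uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])       # ink fully dispersed
```

The uniform distribution has the maximum possible entropy, ln 4 for four outcomes.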
Solenoidal Flow
Rotational movement that doesn't increase or decrease surprise—it just moves along contours of equal probability. The "Q" term in the dynamics equation. This is what makes the steady-state system spiral counterclockwise. Like a marble rolling around the inside of a bowl without climbing up or falling down—just orbiting.
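The "orbiting without climbing" behavior can be checked directly: under the pure solenoidal flow dx/dt = Q∇ℑ, surprise is (up to discretization error) conserved. A sketch with an illustrative quadratic surprise ℑ(x, y) = (x² + y²)/2:

```python
import math

def solenoidal_step(x, y, omega=0.5, dt=0.01):
    """One Euler step of the pure solenoidal flow dx/dt = Q grad I, with
    Q = [[0, omega], [-omega, 0]] and I(x, y) = (x^2 + y^2) / 2.
    Q grad I = (omega*y, -omega*x): motion along a circle of constant surprise."""
    return x + dt * omega * y, y - dt * omega * x

x, y = 1.0, 0.0
for _ in range(1000):
    x, y = solenoidal_step(x, y)

moved = math.hypot(x - 1.0, y)           # the position changed a lot...
surprise_now = 0.5 * (x * x + y * y)     # ...but surprise stayed near its initial 0.5
```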
Gradient Descent
Moving in the direction that most rapidly decreases some quantity (like surprise or free energy). The "−Γ∇ℑ" part of the steady-state dynamics—the system flows "downhill" toward less surprising states. Like a ball rolling down a hill, always seeking the lowest point.
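For the same illustrative quadratic surprise ℑ(x, y) = (x² + y²)/2, the gradient is simply (x, y), so the "−Γ∇ℑ" flow shrinks the state toward the origin. A minimal sketch:

```python
def gradient_descent_on_surprise(x, y, rate=0.1, steps=100):
    """Repeatedly step along -grad I for I(x, y) = (x^2 + y^2) / 2,
    whose gradient is (x, y): each step moves downhill toward the
    least surprising state at the origin."""
    for _ in range(steps):
        x -= rate * x
        y -= rate * y
    return x, y

final_x, final_y = gradient_descent_on_surprise(3.0, -4.0)  # converges near (0, 0)
```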
Temporal Grammar | Alexander Sabine | Active Inference Institute | temporalgrammar.ai