Surprise (ℑ)
How unexpected something is. Mathematically, it's the negative log probability: ℑ(y) = −ln P(y).
If something is very likely (high probability), it has low surprise. If something is unlikely, it has high surprise.
Think of it like: seeing a fish in the ocean = low surprise. Seeing a fish in your living room = high surprise!
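The definition above is a one-liner in code. A minimal sketch in Python (the function name `surprise` is my own):

```python
import math

def surprise(p):
    """Self-information of an outcome with probability p: -ln P(y), in nats."""
    return -math.log(p)

# A likely event carries little surprise; a rare one carries a lot.
print(surprise(0.9))    # fish in the ocean: ~0.105 nats
print(surprise(0.001))  # fish in your living room: ~6.9 nats
```

Note that a certain event (p = 1) has exactly zero surprise, and surprise grows without bound as p approaches 0.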
Free Energy (F)
An upper bound on surprise that a system can actually compute. Living systems minimize free energy because they can't directly measure surprise.
It combines two things: how wrong your predictions are (prediction error) and how complex your beliefs are (complexity).
It's like a proxy for "how badly am I doing at predicting the world right now?"
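The two ingredients above can be made concrete for a discrete model. A toy sketch (the model numbers and the split into complexity and accuracy terms are illustrative, assuming the standard variational form F = complexity − accuracy):

```python
import math

def free_energy(q, prior, likelihood, y):
    """Variational free energy for discrete hidden states x and observation y:
    F = KL[q(x) || p(x)]  (complexity)  -  E_q[ln p(y|x)]  (accuracy)."""
    complexity = sum(q[x] * math.log(q[x] / prior[x]) for x in q if q[x] > 0)
    accuracy = sum(q[x] * math.log(likelihood[x][y]) for x in q if q[x] > 0)
    return complexity - accuracy

# Toy model: two hidden states, two possible observations.
prior = {"a": 0.5, "b": 0.5}
likelihood = {"a": {"y1": 0.8, "y2": 0.2}, "b": {"y1": 0.3, "y2": 0.7}}

# The surprise the system cannot compute directly: -ln p(y1).
p_y1 = sum(prior[x] * likelihood[x]["y1"] for x in prior)
true_surprise = -math.log(p_y1)

# Free energy for some belief q is always an upper bound on that surprise.
F = free_energy({"a": 0.6, "b": 0.4}, prior, likelihood, "y1")
print(F >= true_surprise)  # True
```

The bound becomes exact when q equals the true posterior p(x|y), which is why minimizing F is a usable stand-in for minimizing surprise.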
Steady State
A condition where a system maintains its overall pattern over time, even while constantly fluctuating.
The system doesn't freeze—it keeps moving—but it stays within a recognizable region of possibilities.
Like a fish swimming in place in a current: always moving, but staying roughly where it is.
Nonequilibrium
A system that maintains order by constantly exchanging energy/matter with its environment.
Unlike a rock (equilibrium), living things are always processing, metabolizing, and adapting.
A candle flame is nonequilibrium—it maintains its shape only by continuously burning fuel.
Markov Blanket
A statistical boundary that separates "inside" from "outside." It's the set of variables that mediate
all interactions between a system and its environment. Everything inside the blanket is conditionally
independent of everything outside, given the blanket itself.
Like your skin: the outside world can only affect your internal organs through your skin (and other sensory surfaces).
Active Inference
The idea that living systems act to confirm their predictions about the world. Instead of just passively
updating beliefs based on sensory input, organisms actively change the world (or sample it differently)
to reduce prediction errors.
If you predict it's warm outside and feel cold, you can either update your belief OR put on a coat (change the world to match your prediction).
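The coat example above has two distinct levers, and both shrink the same error. A toy sketch with made-up numbers:

```python
# Two ways to reduce the same prediction error (all numbers illustrative).
prediction = 25.0  # predicted temperature (deg C)
sensed = 10.0      # actual sensed temperature

error = sensed - prediction  # -15.0: "colder than expected"

# Option 1: perception -- update the belief toward the evidence.
learning_rate = 0.5
updated_prediction = prediction + learning_rate * error  # belief moves to 17.5

# Option 2: action -- change the world (put on a coat) so sensation
# moves toward the prediction instead.
coat_warming = 12.0
sensed_after_acting = sensed + coat_warming  # sensation moves to 22.0

# Either route leaves prediction and sensation closer together.
print(abs(updated_prediction - sensed) < abs(error))       # True
print(abs(sensed_after_acting - prediction) < abs(error))  # True
```

Active inference treats action as just another way of minimizing the same quantity that perception minimizes.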
Generative Model
An internal model of how sensory data is generated by hidden causes in the world.
It captures your brain's "theory" about what's out there and how it produces the signals you receive.
Your brain's implicit guess about "what kind of world would produce these sights, sounds, and feelings?"
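A generative model is just "sample a hidden cause, then sample the data it produces." A hypothetical sketch (the causes, signals, and weights are invented for illustration):

```python
import random

def sample_world():
    """Sample a hidden cause p(x), then an observation from p(y|x)."""
    cause = random.choice(["fish", "cat"])  # p(x): what's out there
    if cause == "fish":
        sound = random.choices(["splash", "meow"], weights=[0.95, 0.05])[0]
    else:
        sound = random.choices(["splash", "meow"], weights=[0.05, 0.95])[0]
    return cause, sound  # p(y|x): how the cause produces the signal

print(sample_world())  # e.g. ('fish', 'splash')
```

Inference runs this model in reverse: given a "splash," which cause most plausibly generated it?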
Prediction Error (ε)
The difference between what you predicted you would sense and what you actually sensed.
These errors drive learning and belief updating—and action, if you're doing active inference.
Expected a sweet taste, got sour? That mismatch is prediction error.
Precision (π)
How much confidence or weight you give to certain predictions or prediction errors.
High precision means "pay attention to this signal, it's reliable." Low precision means "this is noisy, don't trust it too much."
Like turning up the volume on important information and muting the background noise.
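The "volume knob" intuition corresponds to a precision-weighted average. A minimal sketch, assuming precision is treated as inverse variance (the function name is my own):

```python
def precision_weighted_update(belief, observation, pi_prior, pi_obs):
    """Combine a prior belief and an observation, each weighted by its
    precision (inverse variance). High-precision signals pull harder."""
    return (pi_prior * belief + pi_obs * observation) / (pi_prior + pi_obs)

# A reliable observation (high precision) dominates the estimate:
print(precision_weighted_update(0.0, 10.0, pi_prior=1.0, pi_obs=9.0))  # 9.0
# A noisy observation (low precision) barely moves the belief:
print(precision_weighted_update(0.0, 10.0, pi_prior=9.0, pi_obs=1.0))  # 1.0
```

Attention, on this view, is the act of adjusting these weights rather than changing the signals themselves.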
Entropy
A measure of disorder or uncertainty. In thermodynamics, it tends to increase (Second Law).
In information theory, it measures how spread out a probability distribution is.
Living systems are special because they resist entropy increase locally.
A dropped egg spreads out (high entropy). Living systems are like eggs that somehow un-scramble themselves.
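The information-theoretic sense of "spread out" is directly computable. A minimal sketch of Shannon entropy in nats:

```python
import math

def entropy(dist):
    """Shannon entropy of a discrete distribution, in nats."""
    return -sum(p * math.log(p) for p in dist if p > 0)

# A spread-out distribution has high entropy; a concentrated one, low entropy.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # ln(4) ~= 1.386
print(entropy([0.97, 0.01, 0.01, 0.01]))   # ~= 0.168
```

Minimizing surprise on average is the same as keeping the entropy of the states a system visits low, which is the formal sense in which living systems "resist" entropy.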
Solenoidal Flow
Rotational movement that doesn't increase or decrease surprise—it just moves along contours of equal probability.
The "Q" term in the dynamics equation. This is what makes the steady-state system spiral around its most probable states rather than flow straight into them.
Like a marble rolling around the inside of a bowl without climbing up or falling down—just orbiting.
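The "no change in surprise" property follows from Q being antisymmetric: Q∇ℑ is always perpendicular to ∇ℑ. A toy 2D sketch (my own construction, using the quadratic surprise ℑ(x) = (x₁² + x₂²)/2):

```python
def grad_I(x):
    """Gradient of the toy surprise I(x) = (x1^2 + x2^2) / 2."""
    return [x[0], x[1]]

def solenoidal_velocity(x):
    """v = Q @ grad_I with the antisymmetric Q = [[0, -1], [1, 0]]."""
    g = grad_I(x)
    return [-g[1], g[0]]

x = [3.0, 4.0]
v = solenoidal_velocity(x)

# The flow is orthogonal to the gradient, so it moves along a contour
# of constant surprise: neither uphill nor downhill.
dot = v[0] * grad_I(x)[0] + v[1] * grad_I(x)[1]
print(dot)  # 0.0
```

With this particular Q the orbit is a circle around the origin; a different antisymmetric Q would change the speed or direction of circulation but never the surprise.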
Gradient Descent
Moving in the direction that most rapidly decreases some quantity (like surprise or free energy).
The "−Γ∇ℑ" part of the steady-state dynamics—the system flows "downhill" toward less surprising states.
Like a ball rolling down a hill, always seeking the lowest point.
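The downhill flow is a two-line loop in code. A toy sketch using the same quadratic surprise as above, with a scalar gamma standing in for Γ:

```python
def grad_I(x):
    """Gradient of the toy surprise I(x) = x^2 / 2."""
    return x

x, gamma = 5.0, 0.1
for _ in range(100):
    x -= gamma * grad_I(x)  # flow "downhill": x <- x - gamma * grad(I)

print(x)  # very close to 0.0, the least surprising state
```

In the full steady-state dynamics this descent term runs alongside the solenoidal term: one pulls toward probable states, the other circulates among them.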