ATTENTION FIELD THEORY (AFT)

Riemannian Geometry of Free-Energy-Driven Attention Dynamics

Thomas Orr Anderson · Preprint · Published November 24, 2025

DOI: 10.5281/zenodo.17703957

Overview

Attention Field Theory (AFT) models subjective experience as a probability-density field evolving on a hyperspherical manifold of attendable distinctions. A radial coordinate encodes subjective degree of belief, separating fully real experience from latent possibilities such as memory, imagination, and prediction. Attention moves by salience-gradient drift with metric-consistent diffusion, and the framework yields concrete predictions for neural gain, behavioral variability, and learning-driven representational change.
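
As a minimal numerical sketch of the drift-diffusion picture described above (an illustration only, not the paper's implementation): the toy salience function, the inverse temperature beta, and the tangent-space projection below are assumptions chosen to keep the example self-contained.

import numpy as np

def sphere_langevin_step(x, grad_salience, beta, dt, rng):
    # One Euler-Maruyama step of salience-gradient drift plus diffusion,
    # kept on the unit hypersphere by tangent projection and renormalization.
    g = grad_salience(x)                      # ambient gradient of the salience potential
    g_tan = g - (g @ x) * x                   # project the drift onto the tangent space at x
    noise = rng.standard_normal(x.shape)
    noise_tan = noise - (noise @ x) * x       # tangent-space noise
    x_new = x - dt * g_tan + np.sqrt(2.0 * dt / beta) * noise_tan
    return x_new / np.linalg.norm(x_new)      # retract back onto the sphere

# Toy example: salience S(x) = -<s, x>, so attention drifts toward the direction s.
rng = np.random.default_rng(0)
s = np.array([0.0, 0.0, 1.0])
grad_salience = lambda x: -s
x = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    x = sphere_langevin_step(x, grad_salience, beta=20.0, dt=1e-2, rng=rng)

Here a single particle stands in for a sample from the attention density; the paper's field-level dynamics evolve the full density, and its salience potential is built from precision-weighted prediction error rather than the toy inner product used here.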

Abstract

Attention Field Theory (AFT) models subjective experience as a continuous density field evolving on a hyperspherical manifold of attendable distinctions. Attention is represented as a probability density over Model-space, with a radial coordinate encoding subjective degree of belief: ρ = 1 marks fully real distinctions; ρ < 1 marks latent possibilities (memory, imagination, prediction). A scalar salience potential combines prediction error and precision weighting to drive attention down salience gradients, with diffusion set by a global inverse-temperature parameter β. Precision adapts via a Hebbian–Bayesian stochastic differential equation, and the metric on Model-space slowly warps toward sensory Fisher information, capturing long-term representational remapping.
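
In symbols, and only as a schematic (the paper defines the exact functional forms), salience-gradient drift with diffusion set by the inverse temperature β has the standard Langevin and Fokker-Planck form on Model-space:

dz_t = -\nabla_g S(z_t)\,dt + \sqrt{2\beta^{-1}}\,dW_t,
\qquad
\partial_t p = \nabla_g \cdot \bigl(p\,\nabla_g S\bigr) + \beta^{-1}\,\Delta_g p,

where S is the salience potential built from precision-weighted prediction error, ∇_g and Δ_g are the gradient and Laplace-Beltrami operator of the Model-space metric, and p is the attention density.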

Under a single forward-prediction assumption, these dynamics realize a Fisher-metric descent of variational free energy, rendering AFT formally equivalent to the perceptual component of continuous-time Gaussian Active Inference while retaining direct phenomenological meaning. The theory predicts measurable signatures in neural gain modulation, behavioral variability, and long-term representational remapping, and it sets the stage for future extensions to action and multi-agent interaction.
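
For reference, Fisher-metric (natural-gradient) descent of a variational free energy F over belief parameters μ takes the standard form shown below; this is the generic construction, not an excerpt from the paper:

\dot{\mu} = -\,G^{-1}(\mu)\,\nabla_{\mu} F(\mu),
\qquad
G_{ij}(\mu) = \mathbb{E}_{p(x\mid\mu)}\bigl[\partial_{\mu_i}\log p(x\mid\mu)\;\partial_{\mu_j}\log p(x\mid\mu)\bigr],

where G is the Fisher information metric. The stated equivalence is that, under the forward-prediction assumption, AFT's drift reduces to this natural-gradient flow, matching the perceptual dynamics of continuous-time Gaussian Active Inference.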

At a glance

• Experience as geometry: Model-space is a manifold of attendable distinctions, with a belief-radius that separates “now-real” from “possible.”

• Attention as a field: a density over Model-space whose peaks correspond to focused attention and whose spread captures diffuse states.

• Quantitative salience: prediction error weighted by precision drives attention down salience gradients, with diffusion controlled by the global inverse-temperature parameter β.

• Plasticity on multiple time-scales: precision adaptation (gain) plus slow remapping of representational geometry toward sensory Fisher information (see the sketch after this list).

• Formal equivalence: under an explicit forward-prediction assumption on the evidence surface, AFT matches perception-only continuous-time Gaussian Active Inference.

• Empirical predictions: signatures in gamma-band gain, reaction-time variability, representational geometry, and pupil-linked arousal.
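
As a minimal sketch of the two-timescale plasticity bullet above (illustrative only; the learning rates, the precision target, and the interpolation toward Fisher information are assumptions, not the paper's stated update rules):

import numpy as np

def plasticity_step(precision, metric, pred_error, fisher_sensory,
                    eta_fast=0.1, eta_slow=1e-3, noise_std=0.01, rng=None):
    # Fast timescale: precision (gain) relaxes toward a reliability estimate
    # derived from recent prediction error, with a small stochastic term
    # standing in for the Hebbian-Bayesian SDE's noise.
    if rng is None:
        rng = np.random.default_rng()
    target = 1.0 / (pred_error ** 2 + 1e-6)
    precision = precision + eta_fast * (target - precision) \
        + noise_std * rng.standard_normal(np.shape(precision))
    precision = np.clip(precision, 1e-6, None)
    # Slow timescale: the Model-space metric warps gradually toward the
    # sensory Fisher information matrix.
    metric = metric + eta_slow * (fisher_sensory - metric)
    return precision, metric

Run repeatedly, the precision variable tracks fast changes in prediction-error reliability while the metric drifts only over many updates, which is the qualitative separation of timescales the theory relies on.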

Downloads and citation

Download the PDF

Zenodo record (archival DOI): 10.5281/zenodo.17703957

Suggested citation

Anderson, Thomas Orr. (2025). Attention Field Theory: Riemannian Geometry of Free-Energy-Driven Attention Dynamics. Zenodo. DOI: 10.5281/zenodo.17703957

License

Creative Commons Attribution 4.0 International (CC BY 4.0)

Keywords

attention, phenomenology, Riemannian geometry, Fisher information, variational free energy, active inference, Markov blanket, precision, diffusion, plasticity