New Model Derives 3D Color Discrimination Metric from Visual Cortex Neural Information
Preprint derives 17-parameter Riemannian color metric from V1 Fisher information and fits it jointly to 96 conditions from four classic threshold datasets, achieving STRESS values of 20.8-30.8; not yet peer-reviewed.
A preprint posted on arXiv describes a mathematical model that quantifies how humans perceive color differences in three-dimensional space (hue, saturation, and brightness) by drawing on the information processing of neuron populations in the brain's primary visual cortex (V1). The authors derive a Riemannian metric on color space from the Fisher information contained in these neural population codes. In plain terms, they translate known stages of visual processing - photoreceptor adaptation in the eye, opponent-color channels in the retina, and cortical population encoding - into the components of a geometric distance measure whose parameters correspond to measurable neural properties.
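The core idea, deriving a perceptual distance measure from neural population codes, can be illustrated with a toy sketch. The snippet below is not the authors' model; it assumes a generic Poisson population with Gaussian tuning curves over 3D color coordinates and computes the standard population Fisher information matrix, which then serves as a Riemannian metric: discrimination thresholds along a direction scale inversely with the metric length of that direction.

```python
import numpy as np

def population_rates(x, centers, width, gain=50.0, baseline=1.0):
    # Toy stand-in for V1 units: Gaussian tuning curves over 3D color coordinates
    d2 = ((x - centers) ** 2).sum(axis=1)
    return baseline + gain * np.exp(-d2 / (2 * width ** 2))

def fisher_metric(x, centers, width, eps=1e-4):
    # Poisson population Fisher information: G_jk = sum_i (df_i/dx_j)(df_i/dx_k) / f_i
    f = population_rates(x, centers, width)
    J = np.zeros((len(centers), 3))
    for j in range(3):
        dx = np.zeros(3)
        dx[j] = eps
        J[:, j] = (population_rates(x + dx, centers, width)
                   - population_rates(x - dx, centers, width)) / (2 * eps)
    return (J / f[:, None]).T @ J

rng = np.random.default_rng(0)
centers = rng.uniform(-1, 1, size=(200, 3))  # random tuning-curve centers (illustrative)
G = fisher_metric(np.zeros(3), centers, width=0.5)

# A just-noticeable difference along unit direction u scales as 1 / sqrt(u^T G u),
# so directions with more Fisher information need smaller color changes to detect.
u = np.array([1.0, 0.0, 0.0])
threshold = 1.0 / np.sqrt(u @ G @ u)
```

In this framing, threshold ellipsoids like MacAdam's fall out as the unit balls of the metric G at each point in color space; the preprint's contribution is tying the parameters of such a metric to measurable properties of V1 processing.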
The resulting 17-parameter model was fitted simultaneously to four independent historical psychophysical datasets covering 96 distinct discrimination conditions across different colors and luminance levels: MacAdam's 1942 chromaticity ellipses, Koenderink et al.'s (2026) three-dimensional ellipsoids, Wright's 1941 wavelength discrimination function, and Huang et al.'s (2012) threshold color-difference ellipses. Goodness-of-fit was reported using the STRESS metric, a standard measure in color science where lower values indicate better agreement; the joint fit produced STRESS scores of 23.9 on MacAdam, 20.8 on Koenderink, 30.1 on Wright, and 30.8 on Huang et al.
This is a preprint (arXiv:2603.24356v1) and has not undergone peer review. The work is computational and theoretical; it does not include new neural recordings or fresh human experiments but instead optimizes the model against existing threshold data. Original sample sizes from the cited studies are not specified in the abstract. Limitations noted include the moderate STRESS values (indicating imperfect fits) and reliance on specific assumptions about population coding in V1 that remain to be directly validated with physiological data.
Source: https://arxiv.org/abs/2603.24356
HELIX: If the model holds up under peer review and physiological validation, it could eventually inform how phone screens, TVs, and photo apps reproduce color differences the way the visual system actually measures them, making everything from online shopping to family videos look more natural and accurate.
Sources (1)
- [1] A Metric for Three-Dimensional Color Discrimination Derived from V1 Population Fisher Information (https://arxiv.org/abs/2603.24356)