An attempt has been made to describe brightness and color discrimination in the framework of a fluctuation theory. The fluctuation theory states that a difference between two stimuli will be just noticeable if it exceeds, by some factor, the average of the fluctuations in the stimuli.
If the statistical fluctuations in the number of quanta absorbed are taken as the determining factor, the de Vries–Rose law (ΔB/√B = const) follows naturally. The transition from this law to Weber’s law (ΔB/B = const) is understood in the same framework, provided the statistics are applied to the neural coding rather than to the quantum incidence, taking into account a refractory period in the nervous transmission.
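The square-root law can be sketched numerically. This is an illustration under assumptions not stated in the abstract, not the authors' own computation: the quantum catch is taken as Poisson, so its fluctuation (standard deviation) at mean level B is √B, and a difference is "just noticeable" when it exceeds an assumed reliability factor k times that fluctuation. The threshold increment ΔB = k√B then gives ΔB/√B = const, while the Weber fraction ΔB/B falls as 1/√B.

```python
import math

K = 2.0  # assumed reliability factor ("some factor" in the text)

def threshold_increment(mean_quanta):
    """Just-noticeable increment for a Poisson catch of mean `mean_quanta`:
    K times the standard deviation sqrt(mean_quanta)."""
    return K * math.sqrt(mean_quanta)

for B in (100, 400, 1600):
    dB = threshold_increment(B)
    # de Vries-Rose ratio dB/sqrt(B) stays constant (= K),
    # while the Weber fraction dB/B shrinks as 1/sqrt(B).
    print(B, dB, dB / math.sqrt(B), dB / B)
```

At high levels the abstract attributes the flattening into Weber's law to statistics of the neural coding (refractory period) rather than of the quantum catch itself; that saturation step is not modeled in this sketch.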
Overfilling of the nervous channels, adaptation of the receptors, and fluctuations in the refractory period are indicated as possible causes of the fall in contrast sensitivity at high brightness levels. Adaptive effects, such as shrinking of the information-collecting time and area or changes in coincidence mechanisms with increasing luminance, may account for deviations from the basic laws when large areas or long exposure times are involved.
The application of the fluctuation theory to chromaticity discrimination is elaborated for the case of tritanopia and illustrated for the general case of trichromacy. The agreement between theory and experiment is satisfactory.
© 1963 Optical Society of America