Abstract

Where observers concentrate their gaze during visual search depends on several factors. The aim here was to determine how much of the variance in observers’ fixations in natural scenes can be explained by local scene color and how that variance is related to viewing bias. Fixation data were taken from an experiment in which observers searched images of 20 natural rural and urban scenes for a small target. The proportion R² of the variance explained in a regression on local color properties (lightness and the red–green and yellow–blue chromatic components) ranged from 1% to 85%, depending mainly on how consistent those properties were with observers’ viewing bias. When viewing bias was included in the regression, values of R² increased, ranging from 62% to 96%. By comparison, local lightness, local lightness contrast, edge density, and entropy each explained less variance than the local color properties. Local scene color may have a much stronger influence on gaze position than is generally recognized, capturing significant aspects of the effect of scene structure on target search behavior.

© 2014 Optical Society of America
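To make the regression summarized above concrete, the following minimal sketch shows how a proportion of variance explained (R²) might be computed for a fixation-density measure regressed on local color properties, first alone and then together with a viewing-bias term. Everything in it is an illustrative assumption: the data are synthetic, the predictor names (lightness, red_green, yellow_blue, viewing_bias) are placeholders, and a plain linear fit via scikit-learn stands in for whatever regression procedure the study actually used.

# Illustrative sketch only: R^2 for fixation density regressed on local color
# properties, with and without a viewing-bias predictor. Synthetic data stand
# in for real fixation maps and scene measurements.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 5000  # hypothetical number of sampled scene locations

# Hypothetical local color properties at each location: lightness plus
# red-green and yellow-blue chromatic components.
lightness = rng.uniform(0, 100, n)
red_green = rng.normal(0, 10, n)
yellow_blue = rng.normal(0, 10, n)

# Hypothetical viewing bias: weight falling off with distance from image center.
x, y = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
viewing_bias = np.exp(-(x**2 + y**2) / 0.5)

# Synthetic fixation density constructed so that both color and bias contribute.
fixation_density = (0.01 * lightness + 0.03 * red_green
                    + 1.5 * viewing_bias + rng.normal(0, 0.3, n))

color_only = np.column_stack([lightness, red_green, yellow_blue])
color_and_bias = np.column_stack([lightness, red_green, yellow_blue, viewing_bias])

r2_color = LinearRegression().fit(color_only, fixation_density).score(color_only, fixation_density)
r2_full = LinearRegression().fit(color_and_bias, fixation_density).score(color_and_bias, fixation_density)

print(f"R^2, color properties only:           {r2_color:.2f}")
print(f"R^2, color properties + viewing bias: {r2_full:.2f}")

The linear fit is used only to keep the example short; the R² values it prints depend entirely on the synthetic data and bear no relation to the 1%–96% range reported in the abstract.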



Tolhurst, D. J.

T. Troscianko, C. P. Benton, P. G. Lovell, D. J. Tolhurst, and Z. Pizlo, “Camouflage and visual perception,” Phil. Trans. R. Soc. B 364, 449–461 (2009).
[CrossRef]

Torralba, A.

A. Torralba, A. Oliva, M. S. Castelhano, and J. M. Henderson, “Contextual guidance of eye movements and attention in real-world scenes: the role of global features in object search,” Psycholog. Rev. 113, 766–786 (2006).

T. Judd, F. Durand, and A. Torralba, “A benchmark of computational models of saliency to predict human fixations,” in Computer Science and Artificial Intelligence Laboratory Technical Report MIT-CSAIL-TR-2012-001 (MIT, 2012).

Troscianko, T.

H.-P. Frey, K. Wirz, V. Willenbockel, T. Betz, C. Schreiber, T. Troscianko, and P. König, “Beyond correlation: do color features influence attention in rainforest?” Front. Hum. Neurosci. 5:36, 1–13 (2011).

T. Troscianko, C. P. Benton, P. G. Lovell, D. J. Tolhurst, and Z. Pizlo, “Camouflage and visual perception,” Phil. Trans. R. Soc. B 364, 449–461 (2009).
[CrossRef]

Tsotsos, J. K.

N. D. B. Bruce and J. K. Tsotsos, “Saliency, attention, and visual search: an information theoretic approach,” J. Vis. 9(3):5, 1–24 (2009).
[CrossRef]

van der Linde, I.

I. van der Linde, U. Rajashekar, A. C. Bovik, and L. K. Cormack, “DOVES: a database of visual eye movements,” Spatial Vis. 22, 161–177 (2009).
[CrossRef]

Vig, E.

E. Vig, M. Dorr, and E. Barth, “Efficient visual coding and the predictability of eye movements on natural movies,” Spatial Vis. 22, 397–408 (2009).

Võ, M. L.-H.

J. M. Wolfe, M. L.-H. Võ, K. K. Evans, and M. R. Greene, “Visual search in scenes involves selective and nonselective pathways,” Trends Cogn. Sci. 15, 77–84 (2011).
[CrossRef]

von der Heydt, R.

A. F. Russell, S. Mihalaş, R. von der Heydt, E. Niebur, and R. Etienne-Cummings, “A model of proto-object based saliency,” Vis. Res. 94, 1–15 (2014).
[CrossRef]

von Wartburg, R.

T. Jost, N. Ouerhani, R. von Wartburg, R. Müri, and H. Hügli, “Assessing the contribution of color in visual attention,” Comput. Vis. Image Underst. 100, 107–123 (2005).
[CrossRef]

Weeks, P. A.

J. M. Henderson, P. A. Weeks, and A. Hollingworth, “The effects of semantic consistency on eye movements during complex scene viewing,” J. Exp. Psychol. Hum. Percept. Perform. 25, 210–228 (1999).
[CrossRef]

Wichmann, F. A.

W. Kienzle, M. O. Franz, B. Schölkopf, and F. A. Wichmann, “Center–surround patterns emerge as optimal predictors for human saccade targets,” J. Vis. 9(5):7, 1–15 (2009).
[CrossRef]

W. Kienzle, F. A. Wichmann, B. Schölkopf, and M. O. Franz, “A nonparametric approach to bottom-up visual saliency,” in Advances in Neural Information Processing Systems 19, B. Schölkopf, J. Platt, and T. Hoffman, eds. (MIT, 2007), pp. 689–696.

Willenbockel, V.

H.-P. Frey, K. Wirz, V. Willenbockel, T. Betz, C. Schreiber, T. Troscianko, and P. König, “Beyond correlation: do color features influence attention in rainforest?” Front. Hum. Neurosci. 5:36, 1–13 (2011).

Wilming, N.

N. Wilming, T. Betz, T. C. Kietzmann, and P. König, “Measures and limits of models of fixation selection,” PLoS ONE 6(9), e24038 (2011).
[CrossRef]

Wirz, K.

H.-P. Frey, K. Wirz, V. Willenbockel, T. Betz, C. Schreiber, T. Troscianko, and P. König, “Beyond correlation: do color features influence attention in rainforest?” Front. Hum. Neurosci. 5:36, 1–13 (2011).

Wolfe, J.

J. Wolfe, P. O’Neill, and S. Bennett, “Why are there eccentricity effects in visual search? Visual and attentional hypotheses,” Percept. Psychophys. 60, 140–156 (1998).

Wolfe, J. M.

J. M. Wolfe, M. L.-H. Võ, K. K. Evans, and M. R. Greene, “Visual search in scenes involves selective and nonselective pathways,” Trends Cogn. Sci. 15, 77–84 (2011).
[CrossRef]

J. M. Wolfe, N. Klempen, and K. Dahlen, “Postattentive vision,” J. Exp. Psychol. Hum. Percept. Perform. 26, 693–716 (2000).

Wooding, D. S.

S. K. Mannan, K. H. Ruddock, and D. S. Wooding, “The relationship between the locations of spatial features and those of fixations made during visual examination of briefly presented images,” Spatial Vis. 10, 165–188 (1996).
[CrossRef]

Yamashita, N.

P. W. Lucas, N. J. Dominy, P. Riba-Hernandez, K. E. Stoner, N. Yamashita, E. Loría-Calderón, W. Petersen-Pereira, Y. Rojas-Durán, R. Salas-Pena, S. Solis-Madrigal, D. Osorio, and B. W. Darvell, “Evolution and function of routine trichromatic vision in primates,” Evolution 57, 2636–2643 (2003).
[CrossRef]

Zador, A. M.

P. Reinagel and A. M. Zador, “Natural scene statistics at the centre of gaze,” Netw. Comput. Neural Syst. 10, 341–350 (1999).

Zetzsche, C.

G. Krieger, I. Rentschler, G. Hauske, K. Schill, and C. Zetzsche, “Object and scene analysis by saccadic eye-movements: an investigation with higher-order statistics,” Spatial Vis. 13, 201–214 (2000).

Zhao, Q.

Q. Zhao and C. Koch, “Learning a saliency map using fixated locations in natural scenes,” J. Vis. 11(3):9, 1–15 (2011).
[CrossRef]

Figures (5)

Fig. 1.

Examples of natural scene images with observed and fitted spatial distributions of observers’ fixations. The first row shows four of the 20 scenes used in the experiment, including those producing the best fit (leftmost image) and worst fit (rightmost image). In the second image the target is arrowed and is also shown at the center of the close-up at the bottom right. The middle row shows the smoothed observed fixation distributions. The bottom row shows the estimated fixation distributions obtained by fitting the distributions of local color properties in each scene [see Eq. (1) in text], with the resulting percentage R2 of variance explained given at the bottom left of each panel. Values of R2 were adjusted for the degrees of freedom in each fit. The smoothing was based on a locally weighted quadratic regression (loess) with a Gaussian kernel of 2.5-deg standard deviation. Higher frequencies of fixations are indicated by darker contours. Positions of maxima are indicated by crosses.
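For readers who want to construct a map like those in the middle row, a minimal sketch follows. It is not the article's loess procedure: it simply bins fixations and applies a Gaussian kernel of 2.5-deg standard deviation, and the pixels-per-degree value and function name are assumptions introduced only for illustration.

```python
# Illustrative sketch only: the article smooths fixation maps with a locally
# weighted quadratic regression (loess); a plain Gaussian kernel is used here
# as a simpler stand-in. The pixels_per_degree value is an assumption.
import numpy as np
from scipy.ndimage import gaussian_filter

def smoothed_fixation_map(fix_x, fix_y, width, height, pixels_per_degree=40):
    """Bin fixation coordinates (in pixels) into an image-sized grid and
    smooth with a Gaussian kernel of 2.5-deg standard deviation."""
    counts, _, _ = np.histogram2d(fix_y, fix_x,
                                  bins=(height, width),
                                  range=[[0, height], [0, width]])
    sigma_px = 2.5 * pixels_per_degree      # 2.5 deg expressed in pixels
    density = gaussian_filter(counts, sigma=sigma_px)
    return density / density.sum()          # normalize to a spatial distribution
```

The same smoothing applied to fixations pooled over all scenes and observers would give a map of the kind shown as the common viewing-bias distribution in Fig. 2.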

Fig. 2.

Observed and fitted viewing-bias distributions for the scenes shown in Fig. 1. The first row shows the bias distribution estimated by pooling fixations over scenes and observers (and therefore common to all scenes). The second row shows the viewing-bias distribution estimated by fitting the distributions of local color properties in each scene [see Eq. (2) in text]. Other details as for Fig. 1.

Fig. 3.

Percentage R2 of variance in fixations explained by local color properties as a function of percentage R2 of variance in viewing bias explained by local color properties. Each symbol represents a pair of R2 values for a single scene out of the 20 tested. The dotted line is a linear regression. Data for the scenes in Fig. 1 are shown labeled. Values of R2 were adjusted for the degrees of freedom in each fit.

Fig. 4.

Observed spatial distributions of successive fixations for the scenes shown in Fig. 1. The first, second, third, and fourth rows show the smoothed first, second, third, and fourth fixation distributions. The values of the coefficient of determination R2, averaged over all six possible pairings of the four fixation distributions within a scene (i.e., 3 + 2 + 1), are shown at the bottom left of the bottom row. Other details as for Fig. 1.
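As a rough illustration of this pairwise comparison (not the article's own code), the sketch below treats R2 between two smoothed fixation maps as the squared Pearson correlation of their values; that interpretation, and the function name, are assumptions.

```python
# Sketch under an assumption: R2 between two smoothed maps is taken here as
# the squared Pearson correlation of their pixel values.
from itertools import combinations
import numpy as np

def mean_pairwise_r2(maps):
    """`maps` is a sequence of the four smoothed successive-fixation maps."""
    r2_values = []
    for a, b in combinations(maps, 2):       # all six pairings of four maps
        r = np.corrcoef(a.ravel(), b.ravel())[0, 1]
        r2_values.append(r ** 2)
    return float(np.mean(r2_values))
```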

Fig. 5.

Percentage R2 of variance in fixations explained by different scene properties. Mean values of R2 are shown for fits to the fixation distributions in each scene by local lightness entropy, local lightness contrast, edge density, local lightness, local color, and local color combined with viewing bias (for definitions, see Section 2.G). Symbols represent values averaged over the 20 scenes, and horizontal bars ±1 standard error of the mean. Values of R2 were adjusted for the different degrees of freedom in each fit.

Equations (4)


(1)  $\hat{f}_i(x,y) = \alpha_{1,i}\,J_i(x,y) + \alpha_{2,i}\,a_{C,i}(x,y) + \alpha_{3,i}\,b_{C,i}(x,y) + \beta_i,$

(2)  $\hat{o}_i(x,y) = \alpha_{1,i}\,J_i(x,y) + \alpha_{2,i}\,a_{C,i}(x,y) + \alpha_{3,i}\,b_{C,i}(x,y) + \beta_i,$

(3)  $\hat{f}_i(x,y) = \alpha_{1,i}\,J_i(x,y) + \alpha_{2,i}\,a_{C,i}(x,y) + \alpha_{3,i}\,b_{C,i}(x,y) + \alpha_{4,i}\,o(x,y) + \beta_i,$

(4)  $\mathrm{RSS}_0 = \sum_{x,y}\bigl[f_i(x,y) - \bar{f}_i\bigr]^2,\qquad \mathrm{RSS}_1 = \sum_{x,y}\bigl[f_i(x,y) - \hat{f}_i(x,y)\bigr]^2,\qquad \mathrm{RSS}_2 = \sum_{x,y}\bigl[o(x,y) - \hat{o}_i(x,y)\bigr]^2.$
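A minimal sketch of how Eqs. (1) and (3) might be fit and scored follows. It assumes ordinary least squares on the flattened maps, computes R2 as 1 − RSS1/RSS0 from Eq. (4), and applies the standard adjustment for degrees of freedom; the figure captions state that such an adjustment was made but do not spell out the formula, so that step is an assumption. All variable names are placeholders for the local lightness, red–green and yellow–blue components, viewing bias, and smoothed fixation distribution.

```python
# Minimal sketch, not the authors' implementation: least-squares fit of
# Eq. (1) (or Eq. (3) when a viewing-bias map is supplied) and the adjusted
# percentage of variance explained. J, aC, bC, bias, fix are assumed names.
import numpy as np

def fit_fixation_map(fix, J, aC, bC, bias=None):
    predictors = [J, aC, bC] + ([bias] if bias is not None else [])
    X = np.column_stack([p.ravel() for p in predictors] +
                        [np.ones(fix.size)])            # constant term beta_i
    y = fix.ravel()
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss1 = np.sum((y - X @ coeffs) ** 2)                # RSS_1 in Eq. (4)
    rss0 = np.sum((y - y.mean()) ** 2)                  # RSS_0 in Eq. (4)
    r2 = 1.0 - rss1 / rss0
    n, p = y.size, len(predictors)
    r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)   # adjust for d.f. (assumed form)
    return coeffs, 100.0 * r2_adj                       # percentage, as reported in Fig. 1
```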
