A neural network model of visual motion perception and speed discrimination is developed to simulate data concerning the conditions under which components of moving stimuli do or do not cohere into a global direction of motion, as in barberpole and plaid patterns (both type 1 and type 2). The model also simulates how the perceived speed of lines moving in a prescribed direction depends on their orientation, length, duration, and contrast. Motion direction and speed both emerge as part of an interactive motion grouping or segmentation process. The model proposes a solution to the global aperture problem by showing how information from feature-tracking points, namely, locations from which unambiguous motion directions can be computed, can propagate to ambiguous motion direction points and capture the motion signals there. The model does this without computing intersections of constraints or parallel Fourier and non-Fourier pathways. Instead, the model uses orientationally unselective cell responses to activate directionally tuned transient cells. These transient cells, in turn, activate spatially short-range filters and competitive mechanisms over multiple spatial scales to generate speed-tuned and directionally tuned cells. Spatially long-range filters and top–down feedback from grouping cells are then used to track the motion of featural points and to select and propagate correct motion directions to ambiguous motion points. Top–down grouping can also prime the system to attend to a particular motion direction. The model hereby links low-level automatic motion processing with attention-based motion processing. Homologs of model mechanisms have been used in models of other brain systems to simulate data about visual grouping, figure–ground separation, and speech perception.
Earlier versions of the model have simulated data about short-range and long-range apparent motion, second-order motion, and the effects of parvocellular and magnocellular lateral geniculate nucleus lesions on motion perception.
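The processing stages named in the abstract (orientationally unselective transient responses, directionally tuned signals, short-range filtering over multiple scales, directional competition, and long-range grouping that propagates feature-tracking signals to ambiguous points) can be illustrated in a minimal 1-D sketch. All function names, window sizes, and the onset/offset pairing scheme below are illustrative assumptions, not the paper's actual equations:

```python
import numpy as np

def transient_cells(luminance, dt=1.0):
    """Orientationally unselective transient responses: the temporal
    derivative of luminance, half-wave rectified into onset and offset
    channels (frames x positions)."""
    d = np.diff(luminance, axis=0) / dt
    return np.maximum(d, 0.0), np.maximum(-d, 0.0)

def directional_signals(on, off):
    """Directionally tuned transient cells (illustrative pairing scheme):
    rightward evidence pairs an offset at x-1 with an onset at x;
    leftward evidence pairs an onset at x with an offset at x+1."""
    right = on[:, 1:] * off[:, :-1]
    left = on[:, :-1] * off[:, 1:]
    return right, left

def short_range_filter(signal, scales=(1, 2, 4)):
    """Spatially short-range filters: pool each directional channel over
    windows of several sizes (multiple spatial scales)."""
    outs = []
    for s in scales:
        kernel = np.ones(s) / s
        outs.append(np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, signal))
    return np.stack(outs)  # scales x frames x positions

def compete(right, left):
    """Directional competition: opponent subtraction, rectified, so the
    locally dominant direction suppresses its opponent."""
    return np.maximum(right - left, 0.0), np.maximum(left - right, 0.0)

def long_range_grouping(evidence, iters=5, spread=3):
    """Spatially long-range filter with feedback: strong (feature-tracking)
    signals spread outward over iterations and capture weaker, ambiguous
    neighboring locations."""
    kernel = np.ones(spread) / spread
    g = evidence.copy()
    for _ in range(iters):
        g = np.maximum(g, np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), -1, g))
    return g
```

As a usage check, a bright point stepping rightward across frames drives only the rightward channel after the onset/offset pairing, and a single strong grouping signal spreads to neighboring positions while retaining its own amplitude. This sketch is feedforward and 1-D; the model itself is a recurrent 2-D network, so the example only conveys the ordering of stages.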
© 1997 Optical Society of America