2022
DOI: 10.1167/jov.22.14.3773

Toward modeling visual routines of object segmentation with biologically inspired recurrent vision models

Cited by 1 publication (1 citation statement, published 2024); references 0 publications.
“…It is an interesting possibility that incorporating such recurrence in deep networks could bring human-like robustness to recognition of objects in unusual poses. Some efforts to incorporate recurrence in deep learning models of the visual system exist (Wyatte et al., 2012; O'Reilly et al., 2013; Spoerer et al., 2017; Tang et al., 2018; Kietzmann et al., 2019; Rajaei et al., 2019; Nayebi et al., 2022; Goetschalckx et al., 2023), but to our knowledge not in the context of recognizing objects in unusual poses. We note that evidence accumulation and recurrence are not mutually exclusive and, in fact, recurrent circuits are one possible neural implementation to allow evidence accumulation.…”
Section: Why Do Humans Need the Extra Time?
Citation type: mentioning (confidence: 99%)
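The citing passage refers to incorporating recurrence into deep vision networks. As a rough illustration only, and not the model from this paper or from any of the cited works, the sketch below shows one common way this is done: a convolutional GRU cell whose hidden feature map is iteratively refined over several timesteps before a classification readout. All names here (ConvGRUCell, RecurrentVisionModel, hidden_channels, n_steps) and the choice of PyTorch are assumptions made for illustration.

```python
# Illustrative sketch only: a convolutional recurrent block added to a
# feedforward vision model. Not the architecture of the cited paper.
import torch
import torch.nn as nn


class ConvGRUCell(nn.Module):
    """Convolutional GRU cell: the hidden state is a feature map that is
    refined across timesteps, one common way to add recurrence to vision models."""

    def __init__(self, in_channels, hidden_channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2
        # Gates and candidate state take the concatenated input and hidden state.
        self.gates = nn.Conv2d(in_channels + hidden_channels,
                               2 * hidden_channels, kernel_size, padding=padding)
        self.candidate = nn.Conv2d(in_channels + hidden_channels,
                                   hidden_channels, kernel_size, padding=padding)

    def forward(self, x, h):
        zr = torch.sigmoid(self.gates(torch.cat([x, h], dim=1)))
        z, r = zr.chunk(2, dim=1)                      # update and reset gates
        h_tilde = torch.tanh(self.candidate(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde               # gated state update


class RecurrentVisionModel(nn.Module):
    """Feedforward encoder followed by a recurrent block unrolled for a fixed
    number of timesteps before readout."""

    def __init__(self, n_classes=10, hidden_channels=32, n_steps=4):
        super().__init__()
        self.n_steps = n_steps
        self.hidden_channels = hidden_channels
        self.encoder = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU())
        self.recurrent = ConvGRUCell(32, hidden_channels)
        self.readout = nn.Linear(hidden_channels, n_classes)

    def forward(self, images):
        x = self.encoder(images)
        h = torch.zeros(images.size(0), self.hidden_channels,
                        x.size(2), x.size(3), device=images.device)
        for _ in range(self.n_steps):                  # iterative refinement
            h = self.recurrent(x, h)
        return self.readout(h.mean(dim=(2, 3)))        # global average pool


if __name__ == "__main__":
    model = RecurrentVisionModel()
    logits = model(torch.randn(2, 3, 64, 64))
    print(logits.shape)  # torch.Size([2, 10])
```

In a sketch like this, increasing n_steps gives the network more recurrent iterations on the same input, loosely analogous to the "extra time" discussed in the citing passage.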