2002
DOI: 10.1016/s0736-5748(02)00042-4
Agent computing themes in biologically inspired models of learning and development

Abstract: After evaluating general features and attributes of the agent notion, the overlap of features in candidate (attribute) cores, and several less central features, the paper addresses agent and related theory in neuroscience, observing how agent notions have penetrated portions of this field and how the field itself emphasizes and further develops some agent themes via, e.g. schema theory, neural net-artificial intelligence (AI) comparisons, and other research. In remaining sections, models for development of mem…

Cited by 4 publications (9 citation statements); references 14 publications.
“…The prior knowledge of both "unchanged" and "changed" regions was modeled by energy functions, which represented the potential of a pixel being in the corresponding status ("unchanged"/"changed"). Another approach, based on the iterative conditional modes algorithm [21] and employed in [18], has a low computational cost but may converge to a local extremum. We believe this is a powerful framework for modeling the uncertainty and minimizing the error rate in detecting changes.…”
Section: Discussion
confidence: 99%
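The snippet above describes labeling pixels as "changed"/"unchanged" by minimizing energy functions with iterative conditional modes (ICM). The following is a minimal sketch of that idea, not the cited papers' actual formulation: the energy form (a squared data term plus a Potts smoothness term), the parameter names, and the function `icm_change_detection` are all assumptions for illustration.

```python
import numpy as np

def icm_change_detection(diff, beta=1.0, n_iter=10, tau=0.5):
    """Label each pixel 'changed' (1) or 'unchanged' (0) with ICM:
    greedily pick, per pixel, the label minimizing a local energy
    = data term + beta * (number of disagreeing 4-neighbours).
    Hypothetical sketch; the energy in [18]/[21] may differ."""
    labels = (diff > tau).astype(int)  # initial labeling by thresholding
    H, W = diff.shape
    for _ in range(n_iter):
        changed = False
        for i in range(H):
            for j in range(W):
                best, best_e = labels[i, j], np.inf
                for lab in (0, 1):
                    # data term: squared distance to the class value
                    data = (diff[i, j] - lab) ** 2
                    # smoothness term: disagreeing 4-neighbours
                    smooth = sum(
                        labels[ni, nj] != lab
                        for ni, nj in ((i - 1, j), (i + 1, j),
                                       (i, j - 1), (i, j + 1))
                        if 0 <= ni < H and 0 <= nj < W
                    )
                    e = data + beta * smooth
                    if e < best_e:
                        best, best_e = lab, e
                if best != labels[i, j]:
                    labels[i, j] = best
                    changed = True
        if not changed:  # converged (possibly to a local extremum)
            break
    return labels
```

Because each pixel update only ever lowers the local energy, the scheme is cheap but, as the snippet notes, can settle in a local extremum rather than the global minimum.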
“…We select the neuron c whose output is the largest in the network, c = arg max_j y_j^(k) (18). If several neurons j = 1, 2, …, K satisfy Eq. (18), then the neuron with the largest sum of weights is selected.…”
Section: Unsupervised Competitive Learning
confidence: 99%
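The winner-selection rule quoted above can be sketched as follows. This is a minimal illustration under assumptions: the function name `select_winner`, the weight-matrix layout (one row of incoming weights per neuron), and the use of NumPy are all hypothetical, not from the cited work.

```python
import numpy as np

def select_winner(y, W):
    """Competitive-learning winner: the neuron with the largest output
    y_j (Eq. 18); ties broken by the largest sum of incoming weights.
    y: outputs, shape (K,); W: weights, shape (K, n_inputs)."""
    y = np.asarray(y, dtype=float)
    W = np.asarray(W, dtype=float)
    best = np.flatnonzero(y == y.max())   # all arg-max candidates
    if len(best) == 1:
        return int(best[0])
    sums = W[best].sum(axis=1)            # tie-break on weight sums
    return int(best[np.argmax(sums)])
```

For example, if two neurons share the maximal output, the one whose incoming weights sum to the larger value is chosen.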