2015
DOI: 10.5194/isprsannals-ii-3-w5-467-2015
Gaussian Process for Activity Modeling and Anomaly Detection

Abstract: Complex activity modeling and anomaly identification are among the most interesting and desired capabilities for automated video behavior analysis. A number of different approaches have been proposed in the past to tackle this problem. There are two main challenges for activity modeling and anomaly detection: 1) most existing approaches require sufficient data and supervision for learning; 2) the most interesting abnormal activities arise rarely and are ambiguous among typical activities, i.e. hard…
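The paper's own model is not reproduced here, but the general idea of Gaussian-process-based anomaly scoring can be sketched: fit a GP to typical trajectories, then flag test observations with low predictive likelihood. This is a minimal illustrative sketch (plain GP regression with an RBF kernel, not the authors' method); the function names and the 1-D trajectory setup are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between 1-D input arrays a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_anomaly_scores(t_train, y_train, t_test, y_test, noise=0.1):
    """Fit a GP to a typical 1-D trajectory and return, for each test point,
    the negative log predictive density: higher score = more anomalous."""
    K = rbf_kernel(t_train, t_train) + noise**2 * np.eye(len(t_train))
    K_s = rbf_kernel(t_train, t_test)           # cross-covariance, shape (n_train, n_test)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train                # GP predictive mean
    var = np.clip(                               # GP predictive variance + noise
        rbf_kernel(t_test, t_test).diagonal() - np.sum(K_s * (K_inv @ K_s), axis=0),
        1e-9, None) + noise**2
    # Negative log of the Gaussian predictive density at the observed values.
    return 0.5 * np.log(2 * np.pi * var) + 0.5 * (y_test - mu) ** 2 / var
```

A point far from the learned trajectory (e.g. an offset of 3 from a sine curve the GP was fit to) receives a much larger score than an on-trajectory point, which is the anomaly-detection signal.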

Cited by 9 publications (5 citation statements) | References 22 publications
“…This addresses weaknesses of previous methods that struggle either due to restrictions of overly simplified data representations or limited distribution models and enables our method to set state-of-the-art performance on MVTec AD and MTD. In the future, the concept could be refined for video anomaly detection [40,25].…”
Section: Discussion
confidence: 99%
“…In the future we plan to refine the concept in order to find anomalies in video data comparable to [29,20].…”
Section: Discussion
confidence: 99%
“…It is also recommended to multiply the random initial values by a small scalar so that the activation units remain active, in regions where the activation functions' derivatives are not close to zero. After forward propagation, the cost function L is calculated, which is the mean squared error (MSE) between the prediction Y′ and the ground-truth labels Y (19):

L = (1/n) Σ (Y − Y′)²   (2)

The goal of the neural network is to make L close to zero, hence making Y and Y′ nearly the same, which means making the network classify the input cases correctly.…”
Section: Neural Network
confidence: 99%
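The MSE cost described in the quoted passage can be sketched directly; this is a generic implementation of the formula, not code from the cited work:

```python
import numpy as np

def mse_loss(y_true, y_pred):
    """Mean squared error: L = (1/n) * sum((Y - Y')^2)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)
```

Driving this loss toward zero makes the predictions Y′ match the labels Y, which is the training objective the passage describes.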