2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DOI: 10.1109/cvprw.2018.00293

Markov Chain Neural Networks

Abstract: In this work we present a modified neural network model which is capable of simulating Markov chains. We show how to express and train such a network, how to ensure that given statistical properties are reflected in the training data, and we demonstrate several applications where the network produces non-deterministic outcomes. One example is a random walker model, e.g. useful for the simulation of Brownian motion, or a natural Tic-Tac-Toe network which ensures non-deterministic game behavior. Foundations and Stochastic Neura…
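The mechanism summarized in the abstract is commonly read as follows: a standard feed-forward network receives one extra, uniformly random input value, and its training targets are chosen by inverse-CDF sampling of the desired transition probabilities, so that re-evaluating the same state with fresh noise reproduces the chain's statistics. Below is a minimal sketch of that reading, not the authors' code; the 3-state transition matrix, layer sizes, and training schedule are invented for illustration.

```python
# Illustrative sketch only: a small network learns to emulate a Markov chain
# by receiving an extra uniform random input alongside the current state.
import torch
import torch.nn as nn
import torch.nn.functional as F

P = torch.tensor([[0.1, 0.6, 0.3],
                  [0.5, 0.2, 0.3],
                  [0.3, 0.3, 0.4]])        # toy transition probabilities
cdf = torch.cumsum(P, dim=1)               # per-state cumulative distribution
n_states = P.shape[0]

net = nn.Sequential(nn.Linear(n_states + 1, 32), nn.ReLU(),
                    nn.Linear(32, n_states))   # input: one-hot state + noise
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(3000):
    s = torch.randint(n_states, (256,))        # random current states
    u = torch.rand(256, 1)                     # the extra random input
    nxt = (u > cdf[s]).sum(dim=1)              # inverse-CDF target: depends on u
    x = torch.cat([F.one_hot(s, n_states).float(), u], dim=1)
    loss = F.cross_entropy(net(x), nxt)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Re-evaluating the same state with fresh noise yields varying successors whose
# empirical frequencies should approximate the corresponding row of P.
with torch.no_grad():
    s0 = torch.zeros(10000, dtype=torch.long)
    x = torch.cat([F.one_hot(s0, n_states).float(), torch.rand(10000, 1)], dim=1)
    freq = torch.bincount(net(x).argmax(dim=1), minlength=n_states) / 10000.0
    print(freq)   # expected to be close to P[0] = [0.1, 0.6, 0.3]
```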

Cited by 20 publications (11 citation statements). References 19 publications.
“…The Markov chain is a particular type of Markov process in which the system state takes only discrete values, and it is based on the transition probability matrix. The future of this process does not depend on the path it took in the past, but only on its current position (Awiszus and Rosenhahn, 2018; Kratochvíl, 2018). Accordingly, a random sequence of variables {X_t, t} is called a Markov chain if for all state values i_0, i_1, i_2, ..., i_t, then…”
Section: Markov Chain Theory
confidence: 99%
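As a concrete illustration of the property described in this quote, the toy simulation below (an invented example, not code from either cited work) draws X_{t+1} using only the current state and the corresponding row of a transition probability matrix, so the earlier history i_0, ..., i_{t-1} never enters the draw:

```python
# Toy Markov chain simulation: the next state depends only on the current one.
import numpy as np

P = np.array([[0.7, 0.2, 0.1],    # row i gives P(X_{t+1} = j | X_t = i)
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

rng = np.random.default_rng(seed=0)
state, path = 0, [0]
for t in range(10):
    state = rng.choice(len(P), p=P[state])   # uses only the current state's row
    path.append(int(state))
print(path)                                  # one sampled trajectory X_0 .. X_10
```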
“…While Machine Learning (ML) solutions have demonstrated an ability to perform successfully in system modeling tasks [98, 99, 100, 101], only a minimal number of studies have applied them to the modeling of blockchain-based systems. For instance, the Markov chain neural networks concept introduced by Awiszus and Rosenhahn [102] has great potential for applying Markov chains to blockchain modeling. At the same time, the Markov Decision Process Extraction Network (MPEN) [103] could potentially be utilized to automatically extract the minimal relevant aspects of the dynamics from observations in order to model a Markov decision process.…”
Section: Current Challenges and Future Prospects
confidence: 99%
“…In this work, we propose a novel end-to-end model to incorporate attention and semantic correlations for visual relationship detection. The novelties of our work are: (1) We design a spatial attention model which constrains the network to focus more on the most important regions. (2) We introduce a new spatial feature extraction model which significantly improves the detection performance.…”
Section: Contributions
confidence: 99%
“…In recent years, deep learning technology has achieved great success in computer vision tasks such as object detection [13, 26, 27], pose estimation [32], tracking [10], and AI for games [1, 29]. However, visual scene understanding remains an open and challenging task.…”
Section: Introduction
confidence: 99%