2019
DOI: 10.1007/978-3-030-30484-3_3
Discrete Stochastic Search and Its Application to Feature-Selection for Deep Relational Machines

Cited by 13 publications (22 citation statements)
References 12 publications
“…6 would seem to suggest that the answer is "yes" (Table 3). However, we caution against drawing such a conclusion for at least the following reasons: (a) Inclusion of propositions based on stochastic sampling of complex relational features in (Dash et al, 2019) can result in significantly better DRM models. It is also possible that BCP could be augmented with the same sampling methods to yield more informative propositions; (b) It is also possible that a BotGNN obtained with access to sampled relational features could improve its performance over what is shown here.…”
Section: Some Additional Results
confidence: 99%
“…These features can form the input feature-vectors for standard statistical models (as in Saha et al. (2012)) or for multilayer perceptrons (MLPs). When used with MLPs, the resulting model is called a "deep relational machine" or DRM, as introduced by Lodhi (2013) and studied extensively in (Dash et al, 2018) and (Dash et al, 2019). In Fig.…”
Section: Results
confidence: 99%
“…It is a technique to transform a relational representation into a propositional single-table representation, where each column in the table corresponds to a feature that represents a relation constructed from data and domain-knowledge. Propositionalisation is the core technique in the construction of deep relational machines [34,35,36]: these are multi-layered perceptrons constructed from a propositionalised representation of relational data and domain-knowledge. Recent studies on domain-knowledge inclusion include construction of graph neural networks (GNNs) that can learn not only from relational (graph-structured) data but also symbolic domain-knowledge.…”
Section: Related Work
confidence: 99%
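The propositionalisation step described in the statement above can be illustrated with a small sketch. Everything here (the ground facts, the feature definitions, the example names) is invented for illustration and is not taken from the cited papers: each relational feature is a boolean test over an example's facts, and the resulting 0/1 vectors form the single-table representation.

```python
# Illustrative sketch of propositionalisation: relational data is flattened
# into a single table with one boolean column per relational feature.
# The facts and features below are hypothetical.

# Relational data: each example is a set of ground facts (tuples).
examples = {
    "m1": {("bond", "c", "c"), ("ring", 6)},
    "m2": {("bond", "c", "o"), ("ring", 5)},
}

# Relational features: boolean tests over one example's facts.
features = [
    lambda facts: ("ring", 6) in facts,                             # has a six-membered ring
    lambda facts: any(f[0] == "bond" and "o" in f for f in facts),  # has a bond involving oxygen
]

def propositionalise(facts):
    """Map one relational example to a 0/1 vector (one table column per feature)."""
    return [int(f(facts)) for f in features]

# The single-table representation: rows are examples, columns are features.
table = {name: propositionalise(facts) for name, facts in examples.items()}
```

A multilayer perceptron trained on the rows of `table` would be a deep relational machine in the sense used in these statements.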
“…Again, stochastic feature selection of relational features used as input to deep relational machines can improve performance and interpretability (Dash et al. 2019).…”
Section: Related Work
confidence: 99%
“…Again, the sparsity and size of the propositionalized representation are a problem for deep neural networks. Stochastic feature selection of relational features used as input to deep relational machines can improve performance and interpretability (Dash et al. 2019).…”
Section: Deep Relational Machines
confidence: 99%
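The stochastic feature selection these statements refer to can be sketched as a search over the discrete space of feature subsets. This is a simplified stand-in for the discrete stochastic search of Dash et al. (2019), not the paper's actual algorithm: the scoring function below is invented, where in the real setting it would be the validation performance of a model trained on the selected features.

```python
import random

# Rough sketch of stochastic search over the discrete space of feature
# subsets: draw random fixed-size subsets and keep the best-scoring one.

def stochastic_feature_selection(n_features, score, n_draws=200, subset_size=3, seed=0):
    """Sample random fixed-size feature subsets; return the best subset and its score."""
    rng = random.Random(seed)
    best_subset, best_score = None, float("-inf")
    for _ in range(n_draws):
        subset = tuple(sorted(rng.sample(range(n_features), subset_size)))
        s = score(subset)
        if s > best_score:
            best_subset, best_score = subset, s
    return best_subset, best_score

# Toy score: counts how many of the (pretend) informative features are chosen.
informative = {0, 2, 5}
score = lambda subset: len(informative & set(subset))

best, val = stochastic_feature_selection(10, score)
```

The sampled-and-scored loop is the essential shape of the technique; richer variants bias the sampling distribution toward subsets that have scored well so far.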