2004
DOI: 10.1023/b:mach.0000035472.73496.0c

Multi-Relational Learning, Text Mining, and Semi-Supervised Learning for Functional Genomics

Abstract: We focus on the problem of predicting functional properties of the proteins corresponding to genes in the yeast genome. Our goal is to study the effectiveness of approaches that utilize all data sources that are available in this problem setting, including relational data, abstracts of research papers, and unlabeled data. We investigate a propositionalization approach which uses relational gene interaction data. We study the benefit of text classification and information extraction for utilizing a co…
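
The propositionalization approach mentioned in the abstract flattens relational gene interaction data into one fixed-length feature vector per gene, so that a standard propositional learner can be applied. A minimal sketch of this idea (the tuple format, aggregate features, and names below are illustrative assumptions, not details from the paper):

```python
from collections import defaultdict

# Hypothetical relational facts: (gene_a, gene_b, interaction_type).
interactions = [
    ("YAL001C", "YBR123W", "physical"),
    ("YAL001C", "YCR044C", "genetic"),
    ("YBR123W", "YCR044C", "physical"),
]

def propositionalize(interactions, types=("physical", "genetic")):
    """Aggregate relational interaction data into per-gene feature
    vectors: one count per interaction type plus the total degree."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b, t in interactions:
        counts[a][t] += 1
        counts[b][t] += 1
    return {
        gene: [by_type[t] for t in types] + [sum(by_type.values())]
        for gene, by_type in counts.items()
    }

features = propositionalize(interactions)
print(features["YAL001C"])  # [1, 1, 2]
```

Each gene's vector here is just its interaction counts per type plus its total degree; the paper's actual propositionalization may use richer aggregates.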

Citing statements span the years 2008–2021.

Cited by 52 publications (27 citation statements)
References 31 publications
“…For inferring annotation rules, a machine learning algorithm takes these feature vectors and known annotations as input to train a model [31]. To adapt topic models that have been developed for text mining, we set up a parallelism between text documents and proteins in our framework.…”
Section: The BOW of Protein Sequence (mentioning)
confidence: 99%
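
The parallelism described above treats each protein as a "document" whose "words" can be, for example, short amino-acid subsequences. A minimal bag-of-words sketch along these lines (the k-mer choice and names are assumptions for illustration, not taken from the citing paper):

```python
from collections import Counter

def protein_bow(sequence: str, k: int = 3) -> Counter:
    """Treat a protein like a text document: every overlapping
    k-mer of amino acids is a 'word', and the bag-of-words is
    the count of each k-mer."""
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

# Toy amino-acid sequence (one-letter codes).
bow = protein_bow("MKTAYIAKQRMKT")
print(bow.most_common(3))  # [('MKT', 2), ('KTA', 1), ('TAY', 1)]
```

The resulting counts play the role that term frequencies play for text, which is what lets topic models developed for text mining be reused for proteins.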
“…Another recent theoretical analysis treats co-training as a combinative label propagation over multiple views and provides a necessary and sufficient condition for co-training to succeed [163]. However, performance can degrade dramatically if the classifiers do not complement each other or the independence assumption does not hold [88]. Though co-training is conceptually treated as a semi-supervised learning paradigm because of the way unlabeled data is incorporated, the classifier training procedure is often supervised [22].…”
Section: Co-training (mentioning)
confidence: 99%
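
Co-training, as characterized in this statement, trains one classifier per view and lets each view pseudo-label its most confident unlabeled examples for the other. A minimal single-round sketch with scikit-learn (the view split, classifier choice, and names are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def cotrain_round(X1, X2, y, X1_u, X2_u, n_add=5):
    """One co-training round: train a classifier per view on the
    labeled data, then let each view pseudo-label its most
    confident unlabeled examples."""
    clf1 = LogisticRegression(max_iter=1000).fit(X1, y)
    clf2 = LogisticRegression(max_iter=1000).fit(X2, y)
    new_labels = {}
    for clf, X_u in ((clf1, X1_u), (clf2, X2_u)):
        proba = clf.predict_proba(X_u)
        # Indices of the n_add most confident unlabeled examples.
        confident = np.argsort(proba.max(axis=1))[-n_add:]
        for i in confident:
            new_labels[int(i)] = int(proba[i].argmax())
    return new_labels  # unlabeled-pool index -> pseudo-label
```

In a full loop the pseudo-labeled examples are moved into the labeled set and the round is repeated; the degradation noted above [88] occurs when the views are not complementary, so both classifiers reinforce the same errors.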
“…Krogel and Scheffer [22] have explored the effectiveness of using co-training on functional genomic data that includes relational information. The authors perform an experimental analysis in which they show that co-training fails to improve classification results.…”
Section: Potential Bronchovascular Pair Detection (mentioning)
confidence: 99%
“…A model L built by the relational learner using an initial training set is used in the parameter-tuning algorithm. The examples corresponding to the highest F1 measure [22] of completeness and correctness are used as positive examples. Completeness is also known as recall and sensitivity, while correctness is also known as precision in the pattern recognition literature.…”
Section: Image Preprocessing (mentioning)
confidence: 99%
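
The F1 measure cited here is the harmonic mean of correctness (precision) and completeness (recall). A small sketch of the computation (the function name is ours):

```python
def f1_score(correctness: float, completeness: float) -> float:
    """Harmonic mean of correctness (precision) and
    completeness (recall): F1 = 2PR / (P + R)."""
    if correctness + completeness == 0:
        return 0.0
    return 2 * correctness * completeness / (correctness + completeness)

print(f1_score(0.8, 0.6))  # 0.6857...
```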