Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval 2005
DOI: 10.1145/1076034.1076094

On the collective classification of email "speech acts"

Abstract: We consider classification of email messages as to whether or not they contain certain "email acts", such as a request or a commitment. We show that exploiting the sequential correlation among email messages in the same thread can improve email-act classification. More specifically, we describe a new text-classification algorithm based on a dependency-network-based collective classification method, in which the local classifiers are maximum entropy models based on words and certain relational features. We show …
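To make the approach sketched in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: a maximum entropy local classifier (logistic regression here) whose features combine message words with one relational feature, the predicted act of the parent message in the thread, plus a simple iterative re-classification loop standing in for dependency-network inference. It assumes scikit-learn; the toy thread, act labels and feature names are invented for illustration.

```python
# Hedged sketch: maxent local classifier + iterative collective inference over a thread.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

def features(text, parent_act):
    feats = {f"word={w}": 1.0 for w in text.lower().split()}   # bag-of-words features
    feats[f"parent_act={parent_act}"] = 1.0                    # relational feature
    return feats

# toy thread: (text, index of parent message or None, gold act) - invented data
thread = [
    ("could you send me the report by friday", None, "request"),
    ("sure i will send it tomorrow", 0, "commit"),
    ("here is the report you asked for", 1, "deliver"),
]

vec = DictVectorizer()
train_feats = [features(t, thread[p][2] if p is not None else "none")
               for t, p, _ in thread]
X, y = vec.fit_transform(train_feats), [a for _, _, a in thread]
clf = LogisticRegression(max_iter=1000).fit(X, y)

# collective inference: start from a neutral guess, then let each message see its
# parent's current predicted act and re-classify until predictions stop changing
pred = ["none"] * len(thread)
for _ in range(10):
    feats = [features(t, pred[p] if p is not None else "none") for t, p, _ in thread]
    new_pred = list(clf.predict(vec.transform(feats)))
    if new_pred == pred:
        break
    pred = new_pred
print(pred)
```

The actual paper's relational features and inference procedure are richer; the point of the sketch is only that each message's classification can depend on its neighbours' current predictions rather than on its own words alone.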

Cited by 114 publications (87 citation statements)
References 10 publications
“…Yang, Slattery, & Ghani [63] conducted an in-depth investigation over multiple datasets commonly used for document classification experiments and identified different patterns. Since then, collective classification has also been applied to various other applications such as part-of-speech tagging [33], classification of hypertext documents using hyperlinks [55], link prediction in friend-of-a-friend networks [56], optical character recognition [58], entity resolution in sensor networks [9], predicting disulphide bonds in protein molecules [57], segmentation of 3D scan data [2] and classification of email "speech acts" [7].…”
Section: Related Work (mentioning)
confidence: 99%
“…For instance, in the webpage classification problem where webpages are interconnected with hyperlinks and the task is to assign each webpage with a label that best indicates its topic, it is common to assume that the labels on interconnected webpages are correlated. Such interconnections occur naturally in data from a variety of applications such as bibliographic data [10,16], email networks [7] and social networks [41]. Traditional classification techniques would ignore the correlations represented by these interconnections and would be hard pressed to produce the classification accuracies possible using a collective classification approach.…”
Section: Introduction (mentioning)
confidence: 99%
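To illustrate the point of the quoted passage, here is a toy, invented example (not from any of the cited papers) of the simplest relational-neighbour form of collective classification: unlabelled "webpages" repeatedly adopt the majority label of their hyperlinked neighbours, so labels known for a few seed pages propagate through the link structure that a traditional per-page classifier would ignore. The graph and labels are made up.

```python
# Toy label-propagation pass over an invented hyperlink graph.
from collections import Counter

links = {                      # undirected hyperlink graph (page -> neighbours)
    "a": {"b", "c"}, "b": {"a", "c", "d"}, "c": {"a", "b", "d"},
    "d": {"b", "c", "e"}, "e": {"d", "f", "g"}, "f": {"e"}, "g": {"e"},
}
labels = {"a": "sports", "b": "sports", "f": "politics", "g": "politics"}  # seeds

current = dict(labels)
for _ in range(10):                          # iterate until labels stabilise
    updated = dict(current)
    for page, neigh in links.items():
        if page in labels:                   # keep seed labels fixed
            continue
        votes = Counter(current[n] for n in neigh if n in current)
        if votes:
            updated[page] = votes.most_common(1)[0][0]
    if updated == current:
        break
    current = updated

print(current)   # unlabelled pages c, d, e pick up labels from their neighbours
```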
“…There have been numerous approaches to automatically classifying speech acts, including neural network classification [12], maximum entropy model classification [1], and Hidden Markov Model (HMM) speech act classification [21]. Our classifier is composed of three independent HMM classifiers, one for each axis (speech act, content, and referent).…”
Section: Classifier Implementation (mentioning)
confidence: 99%
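As a rough illustration of the HMM-based classification the quoted passage describes, the sketch below trains a single supervised HMM tagger by counting transitions and emissions over labelled dialogues and decodes new dialogues with Viterbi; the citing paper uses three such classifiers, one per axis. The data, label names and smoothing scheme here are invented, and each utterance is reduced to a single token for brevity.

```python
# Hedged sketch of a supervised HMM speech-act tagger (counts + Viterbi decoding).
import math
from collections import defaultdict

def train_hmm(sequences, smooth=0.1):
    """sequences: list of dialogues, each a list of (utterance_word, act) pairs."""
    trans = defaultdict(lambda: defaultdict(float))   # act -> next act -> count
    emit = defaultdict(lambda: defaultdict(float))    # act -> word -> count
    start = defaultdict(float)                        # act -> count at dialogue start
    for seq in sequences:
        prev = None
        for word, act in seq:
            emit[act][word] += 1
            if prev is None:
                start[act] += 1
            else:
                trans[prev][act] += 1
            prev = act
    states = list(emit)

    def lp(table, key):                               # add-smoothed log probability
        total = sum(table.values())
        return math.log((table.get(key, 0.0) + smooth) / (total + smooth * (len(table) + 1)))

    return states, start, trans, emit, lp

def viterbi(obs, model):
    """Most probable act sequence for an observed dialogue (list of words)."""
    states, start, trans, emit, lp = model
    score = {s: lp(start, s) + lp(emit[s], obs[0]) for s in states}
    back = [{}]
    for word in obs[1:]:
        new, bp = {}, {}
        for s in states:
            best_prev = max(states, key=lambda p: score[p] + lp(trans[p], s))
            new[s] = score[best_prev] + lp(trans[best_prev], s) + lp(emit[s], word)
            bp[s] = best_prev
        score, back = new, back + [bp]
    last = max(score, key=score.get)
    path = [last]
    for bp in reversed(back[1:]):                     # follow back-pointers
        path.append(bp[path[-1]])
    return list(reversed(path))

# toy usage with a single made-up labelled dialogue
train = [[("request", "REQUEST"), ("commit", "COMMIT"), ("deliver", "DELIVER")]]
model = train_hmm(train)
print(viterbi(["request", "commit"], model))          # -> ['REQUEST', 'COMMIT']
```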
“…We also apply Speech Act Theory [6] to provide a more adequate representation of email action items. Although the same theory has been applied to email before, particularly to address the problems of email task management [7][8] and email classification [9][10][11], it was never combined with semantic technologies. We represent action items by speech act instances from within the sMail Ontology.…”
Section: Related Work (mentioning)
confidence: 99%