2004
DOI: 10.1007/978-3-540-28633-2_34

An Anytime Algorithm for Interpreting Arguments

Abstract: The problem of interpreting Natural Language (NL) discourse is generally of exponential complexity. However, since interactions with users must be conducted in real time, an exhaustive search is not a practical option. In this paper, we present an anytime algorithm that generates "good enough" interpretations of probabilistic NL arguments in the context of a Bayesian network (BN). These interpretations consist of: BN nodes that match the sentences in a given argument, assumptions that justify the bel…


Cited by 4 publications (4 citation statements)
References 7 publications
“…A preliminary algorithm for proposing explanatory extensions was presented in Zukerman et al (2004), but it was not integrated with our probabilistic process. Our anytime algorithm was described in George et al (2004), and our procedure for positing suppositions in George et al (2005).…”
Section: Related Research
confidence: 99%
“…In Zukerman and George (2005), we presented an anytime algorithm (Dean and Boddy 1998; Horvitz et al 1989) that generated only interpretation graphs, and a probabilistic formalism for selecting the best interpretation. Our algorithm was generalized in George et al (2004), where we generated interpretations composed of node configurations (which contain BN nodes that match Natural Language sentences in an argument), supposition configurations, and interpretation graphs. In this article, we apply our anytime algorithm to generate interpretations comprising supposition configurations, interpretation graphs and explanatory extensions (as stated above, in our current implementation we obtain a user's input from a menu, and hence do not posit node configurations).…”
Section: Proposing Interpretations
confidence: 99%
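The excerpt above names the components an interpretation is built from. As a rough sketch only, with hypothetical field names that are not drawn from the paper, the composition could be modelled as:

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class Interpretation:
    """Hypothetical container for the components named in the excerpt above."""
    node_config: Optional[Any]    # BN nodes matched to the argument's sentences (not posited when input comes from a menu)
    supposition_config: Any       # supposition configuration (SC)
    interpretation_graph: Any     # interpretation graph (IG)
    explanatory_extension: Any    # explanatory extension (EE)
```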
“…In each iteration, we generate candidates for a component, calculate the probability of these candidates in the context of the selections made in the previous steps, and probabilistically select one of these candidates. That is, higher probability candidates have a better chance of being selected than lower probability ones (our selection procedures are described in George et al, 2004). For example, say that in Step 1, we selected supposition configuration SC_a.…”
Section: Proposing Interpretations
confidence: 99%
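The excerpt above describes a weighted selection step: each candidate is picked with probability proportional to its score, so higher-probability candidates are favoured without excluding the rest. A minimal sketch of one such selection, with illustrative candidate names; this is not the paper's actual selection procedure (which is described in George et al, 2004):

```python
import random

def weighted_choice(candidates):
    """Sample one candidate with probability proportional to its weight.

    `candidates` maps each candidate to its (unnormalised) probability,
    computed in the context of the components selected in earlier steps.
    Higher-probability candidates are more likely to be chosen, but
    lower-probability ones still have a chance.
    """
    items = list(candidates.items())
    total = sum(p for _, p in items)
    threshold = random.uniform(0, total)
    cumulative = 0.0
    for candidate, p in items:
        cumulative += p
        if threshold <= cumulative:
            return candidate
    return items[-1][0]  # guard against floating-point round-off

# e.g. choosing among supposition configurations given earlier selections
chosen = weighted_choice({"SC_a": 0.6, "SC_b": 0.3, "SC_c": 0.1})
```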
“…In previous work, we proposed an anytime algorithm to propose interpretation graphs and supposition configurations until time runs out (George et al, 2004). Here we apply our algorithm to generate interpretations comprising supposition configurations (SC), interpretation graphs (IG) and explanatory extensions (EE) (Figure 2).…”
Section: Proposing Interpretations
confidence: 99%
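The excerpt above summarises the anytime control loop: keep proposing candidate interpretations and retain the best one found so far until the time budget runs out. A minimal sketch under that reading, with `propose` and `probability` as hypothetical stand-ins for the paper's own generation and evaluation procedures:

```python
import time

def anytime_interpret(argument, time_budget_s, propose, probability):
    """Propose candidate interpretations until time runs out and return
    the most probable one found so far.

    Each proposal is assumed to be a (supposition configuration,
    interpretation graph, explanatory extension) triple, following the
    components named in the excerpt above.
    """
    deadline = time.monotonic() + time_budget_s
    best, best_prob = None, float("-inf")
    while time.monotonic() < deadline:
        candidate = propose(argument)      # (SC, IG, EE)
        p = probability(candidate, argument)
        if p > best_prob:                  # keep the best interpretation so far
            best, best_prob = candidate, p
    return best, best_prob
```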