<?xml version="1.0" standalone="yes"?>
<Paper uid="C90-2049">
  <Title>Dependency Analyzer: A Knowledge-Based Approach to Structural Disambiguation</Title>
  <Section position="5" start_page="286" end_page="286" type="relat">
    <SectionTitle>
6 Related Work
</SectionTitle>
    <Paragraph position="0"> There are several approaches to structural disambiguation, including resolution of prepositional phrase attachment. Wilks et al. \[12\] discussed some strategies for disambiguation based on preference semantics. Our framework is closely related to their ideas. While their strategies need hand-coded semantic formulas called preplates to decide preferences, our system can construct dependency knowledge semi-automatically. Dahlgren and Mc-Dowell \[2\] proposed another preference strategy for prepositional phrase disambiguation. It is based on ontological knowledge, which is manually constructed. Whereas this framework (and also that of Wilks et al.) was aimed at disambiguating single prepositional phrases in sentences, our approach can handle the attachments of multiple prepositional phrases in sentences, ttirst \[3\] developed a mechanism for structural disambiguation, called the Semantic Enquiry Desk, which is based on Chraniak's marker passing paradigm \[1\]. Our path Search is partially equivalent to marker passing. While marker passing involves a high computational cost and finds ninny meaningless relations, our path search is restricted and finds only paths that inelude synonym/taxonym relationships and dependencies.</Paragraph>
    <Paragraph position="1"> Our system can reduce the computational cost by using a limited knowledge search. Jensen and Binot \[6\] developed a heuristic method of prepositional phrase disambiguation usinp, on-line dictionary definitions. Our approach is sire-liar t,o theirs in the sense that both use dictiouaries as knowledge sources. The differences are in tile ways in which dictionary definitions are used. While their method sear{:hes for knowledge by phrasal pattern matching and calculates certainty factors by complex procedures, ours uses knowledge it: a simt)le and efficient way, searching tree:: and traversing nodes, and calculates t)referenees by afe, w simplified processes. Wernlter \[11\] t)rop{}sed a e:}nneeliol:ist approach to 8lrllctllra.\[ disan:biguation of noun phrases. He integrated syntaclic and semantic conslraints on lhe relaxation network. ~el:laI:tic {2OllS'{l'ailltS ol: prepositional reh:tionshil)s betweetl words are learned by a backl}ro\]}agation algorithm. Learned semantics is often very t:seful for natural language processing, when sexnantic rehtti,mshit)s cannot be represented explicitly. \\2: represm:t semantic relationships between words by explicit relationship chains, al:d therefore do not need learning by backpropagation. We integrate sem.mti{: preferences and syntactic eonstrailllS t}y using e(mstraint t}ropagathm. }n:t it is a sequential {:o::ue{'tion and does not allow their iilterac-.</Paragraph>
    <Paragraph position="2"> ti{m. \\k! are thii:king of desigIfinp a frau:ework that deals wilh both syntactic and semanli{: constraints simultam'ousty. null</Paragraph>
  </Section>
</Paper>