<?xml version="1.0" standalone="yes"?>
<Paper uid="N06-2010">
<Title>Gesture Improves Coreference Resolution</Title>
<Section position="6" start_page="39" end_page="39" type="relat">
<SectionTitle>5 Related Work</SectionTitle>
<Paragraph position="0">Research on multimodality in the NLP community has usually focused on multimodal dialogue systems (e.g., Oviatt, 1999). These systems differ fundamentally from ours in that they address human-computer interaction, whereas we address human-human interaction.</Paragraph>
<Paragraph position="1">Multimodal dialogue systems tackle interesting and difficult challenges, but their grammar, vocabulary, and recognized gestures are often pre-specified, and the dialogue is controlled at least in part by the computer. In our data, all of these are unconstrained.</Paragraph>
<Paragraph position="2">Prosody has been shown to improve performance on several NLP problems, such as topic and sentence segmentation (e.g., Shriberg et al., 2000). We are aware of no equivalent work showing a statistically significant improvement on unconstrained speech using hand-gesture features. Nakano et al. (2003) show that body posture predicts turn boundaries, but do not show that these features improve performance beyond a text-only system.</Paragraph>
<Paragraph position="3">Chen et al. (2004) show that gesture may improve sentence segmentation; however, in that study the improvement afforded by gesture is not statistically significant, and evaluation was performed on a subset of the original corpus chosen to include only the three speakers who gestured most frequently. Still, this work provides a valuable starting point for the integration of gesture features into NLP systems.</Paragraph>
</Section>
</Paper>