<?xml version="1.0" standalone="yes"?>
<Paper uid="N04-1004">
  <Title>A Salience-Based Approach to Gesture-Speech Alignment</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
Abstract
</SectionTitle>
    <Paragraph position="0"> One of the first steps towards understanding natural multimodal language is aligning gesture and speech, so that the appropriate gestures ground referential pronouns in the speech.</Paragraph>
    <Paragraph position="1"> This paper presents a novel technique for gesture-speech alignment, inspired by salience-based approaches to anaphoric pronoun resolution. We use a hybrid between data-driven and knowledge-based methods: the basic structure is derived from a set of rules about gesture salience, but the salience weights themselves are learned from a corpus. Our system achieves 95% recall and precision on a corpus of transcriptions of unconstrained multimodal monologues, significantly outperforming a competitive baseline.</Paragraph>
  </Section>
</Paper>