<?xml version="1.0" standalone="yes"?>
<Paper uid="W97-1412">
  <Title>A Syndetic Approach to Referring Phenomena in Multimodal Interaction</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
1 Introduction
</SectionTitle>
    <Paragraph position="0"> User interfaces of many application systems have begun to include multiple devices which can be used together to input single expressions. Such interfaces (and even the whole application systems) are widely labelled multi-modal, since they use different types of communication channels to acquire information.</Paragraph>
    <Paragraph position="1"> These emerging devices and recognition systems potentially allow users to express their intentions more naturally, in ways similar to those used by humans to communicate with each other. However, very few works have concentrated on the integration and synergistic use of multimodal input capabilities within the same system. Most systems simply take almost no account of how the different modes interact so that the interdependence of modalities contributing to information processing is not capitalized upon. Moreover, the close interaction and interdependency between input and output is still a largely unexplored area. For example, the capability of referring directly to the content of a rich multimodal presentation while formulating multimodal input requires the processing of a body of knowledge that largely extend the information content that can be conveyed by a simple pick operation.</Paragraph>
    <Paragraph position="2"> Underlying practical use of these new technologies is the question of their suitability: are they appropriate for the tasks users need to perform, and what is their comparative ease of use? In order to build artifacts that are to be both useful and usable, the development of interactive systems must address user-oriented requirements and accommodate different perspectives in the (formal) design process. Novel interaction techniques may interfere with the functional and task-oriented requirements that a system is intended to support. Potential conflicts between these types of requirements can be identified early in the design process through the use of appropriate specification techniques using mathematical structures able to represent perceivable elements of the system and allowing for multidisciplinary insight into the design problem.</Paragraph>
    <Paragraph position="3"> This work describes an approach to evaluating the usability of devices that accounts for the cognitive resources needed to use a device to perform particular tasks. The framework draws its expressive power from a technique called syndetic modelling that allows the description of both the device and cognitive resources to be captured in a common representation. In this paper syndesis provides a foundation for examining the interplay occurring betwen an operator and a computer system when performing tasks involving deltic references made through speech and gestures. It is the relationship between users and systems, and the transformations that are necessary to move from one to another, that provides novel insight into usability.</Paragraph>
  </Section>
class="xml-element"></Paper>