<?xml version="1.0" standalone="yes"?>
<Paper uid="C02-1061">
  <Title>Antonymy and Conceptual Vectors</Title>
  <Section position="3" start_page="0" end_page="0" type="intro">
    <SectionTitle>
1 Introduction
</SectionTitle>
    <Paragraph position="0"> Meaning representation remains an important open problem in NLP, addressed through several approaches. The NLP team at LIRMM currently works on thematic and lexical disambiguation in text analysis (Laf01). To this end, we built a system with automated learning capabilities, based on conceptual vectors for meaning representation. Vectors are supposed to encode the 'ideas' associated with words or expressions. The conceptual vector learning system automatically defines or revises its vectors according to the following procedure. It takes as input definitions in natural language contained in electronic dictionaries intended for human usage. These definitions are fed to a morpho-syntactic parser that provides tagging and analysis trees. The trees are then used as input to a procedure that computes vectors from tree geometry and syntactic functions. A kernel of manually indexed terms is therefore necessary for bootstrapping the analysis. Transversal relationships, such as synonymy (LP01), antonymy and hyperonymy, which are more or less explicitly mentioned in definitions, can be used to globally increase the coherence of the vectors. In this paper, we describe a vectorial function of antonymy. It can help improve the learning system by handling negation and antonym tags, which are often present in definition texts. The antonymy function can also help to find an opposite thema for use in generative text applications: the search for opposite ideas, paraphrase (by negation of the antonym), summary, etc.</Paragraph>
  </Section>
</Paper>