<?xml version="1.0" standalone="yes"?>
<Paper uid="H05-1009">
  <Title>Proceedings of Human Language Technology Conference and Conference on Empirical Methods in Natural Language Processing (HLT/EMNLP), pages 65-72, Vancouver, October 2005. (c) 2005 Association for Computational Linguistics. NeurAlign: Combining Word Alignments Using Neural Networks</Title>
  <Section position="9" start_page="70" end_page="71" type="evalu">
    <SectionTitle>
5.4.2 Results for English-Chinese
</SectionTitle>
    <Paragraph position="0"> The results of the input alignments to NeurAlign, i.e., GIZA++ alignments in two different directions, NeurAlign1 (i.e., no partitioning) and variations of NeurAlign2 with different features for partitioning (English POS tag, Chinese POS tag, and POS tags on both sides) are shown in Table 6. For comparsion, we also include the results for RA in the table. For brevity, we include only the features resulting in the best configurations from the English-Spanish experiments, i.e., POS tags, dependency relations, word fertilities, and neighborhood links (the features in the third row of Table 5). The ground truth used during the training phase consisted of all the alignment links with equal weight.</Paragraph>
    <Paragraph position="1">  Without any partitioning, NeurAlign achieves an alignment error rate of 22.2%--a significant relative error reduction of 25.3% over RA. Partitioning the data according to POS tags results in significantly better results over no partitioning. When the data is partitioned according to both POS tags, NeurAlign reduces AER to 19.7%--a significant relative error reduction of 33.7% over RA. Compared to the input  alignments, the best version of NeurAlign achieves a relative error reduction of 35.8% and 38.8%, respectively. null</Paragraph>
  </Section>
class="xml-element"></Paper>