<?xml version="1.0" standalone="yes"?> <Paper uid="P04-1015"> <Title>Incremental Parsing with the Perceptron Algorithm</Title> <Section position="7" start_page="88" end_page="88" type="concl"> <SectionTitle> 6 Conclusions </SectionTitle> <Paragraph position="0"> In this paper we have presented a discriminative training approach, based on the perceptron algorithm with a couple of effective refinements, that provides a model capable of effective heuristic search over a very difficult search space. In such an approach, the unnormalized discriminative parsing model can be applied without either an external model to present it with candidates or potentially expensive dynamic programming. (When POS tagging is integrated directly into the generative parsing process, the baseline performance is 87.0; for comparison with the perceptron model, results are shown with pre-tagged input.)</Paragraph> <Paragraph position="1"> When the training algorithm is provided with the generative model scores as an additional feature, the resulting parser is quite competitive on this task. The improvement derived from the additional punctuation features demonstrates the flexibility of the approach in incorporating novel features into the model.</Paragraph> <Paragraph position="2"> Future research will look in two directions. First, we will look to include more useful features that are difficult for a generative model to include. This paper was intended to compare search with the generative model and with the perceptron model under roughly similar feature sets; much improvement could potentially be had by looking for other features that could improve the models. Second, combining with the generative model can be done in several ways. Some of the constraints on the search technique that were required in the absence of the generative model can be relaxed if the generative model score is included as another feature.
In the current paper, the generative score was simply added as another feature.</Paragraph> <Paragraph position="3"> Another approach might be to use the generative model to produce candidates at each word, then assign perceptron features to those candidates. Such variants deserve investigation. Overall, these results show much promise in the use of discriminative learning techniques such as the perceptron algorithm to help perform heuristic search in difficult domains such as statistical parsing.</Paragraph> </Section> </Paper>
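The combination the conclusion describes, simply adding the generative model's score as one more feature in the perceptron's linear model, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature names, the `"GEN"` feature key, and the candidate dictionary layout are hypothetical, and the update shown is the standard structured-perceptron update (promote gold features, demote the features of the model's incorrect choice).

```python
# Hedged sketch of a structured perceptron whose feature vector includes
# the generative model's log-score as an ordinary feature ("GEN").
# Feature names and the candidate representation are illustrative only.

def score(weights, features):
    """Linear model score: dot product of a sparse weight map and a
    sparse feature map."""
    return sum(weights.get(f, 0.0) * v for f, v in features.items())

def features_of(candidate, gen_score):
    """Sparse feature map for a candidate parse; the generative model
    score is simply added as another feature, keyed "GEN"."""
    feats = dict(candidate["feats"])  # e.g. tree-fragment indicator features
    feats["GEN"] = gen_score          # generative log-score as a feature
    return feats

def perceptron_update(weights, gold, best, lr=1.0):
    """Standard perceptron update when `best` (the highest-scoring
    candidate) differs from `gold`: add the gold features, subtract the
    features of the incorrect candidate."""
    for f, v in features_of(gold, gold["gen"]).items():
        weights[f] = weights.get(f, 0.0) + lr * v
    for f, v in features_of(best, best["gen"]).items():
        weights[f] = weights.get(f, 0.0) - lr * v
    return weights
```

Because the generative score enters only as a weighted feature, training is free to learn how strongly to trust it relative to the discriminative features, which is what allows some search constraints to be relaxed when it is present.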