<?xml version="1.0" standalone="yes"?> <Paper uid="P02-1003"> <Title>Generation as Dependency Parsing</Title> <Section position="8" start_page="5" end_page="5" type="concl"> <SectionTitle> 7 Conclusion </SectionTitle> <Paragraph position="0"> Generation from flat semantics is an NP-complete problem. In this paper, we have first given an alternative proof of this fact, which works even for a fixed grammar and makes the connection to the complexity of free word order parsing clearly visible. Then we have shown how to translate the realization problem of TAG into parsing problems of topological dependency grammar, and argued how the optimizations in the dependency parser - which were originally developed for free word order parsing - help reduce the runtime of the generation system. In passing, this reduction shows that the parsing problem for TDG is NP-complete as well, which had been conjectured but never proved.</Paragraph> <Paragraph position="1"> The NP-completeness result for the realization problem immediately explains why all existing complete generation algorithms have exponential worst-case runtimes. As our proof shows, the main source of the combinatorics is the interaction of lexical ambiguity and tree configuration with the completely unordered nature of the input. Modification is important and deserves careful treatment (and indeed, our system deals very gracefully with it), but it is not as intrinsically important as some of the literature suggests; our proof gets by without modification. If we allow the grammar to be part of the input, we can even modify the proof to show NP-hardness of the case where semantic atoms can be verbalized more often than they appear in the input, and of the case where they can be verbalized less often. 
The case where every atom can be used arbitrarily often remains open.</Paragraph> <Paragraph position="2"> By using techniques from constraint programming, the dependency parser seems to cope rather well with the combinatorics of generation. Propagators can rule out impossible local structures on the grounds of global information, and selection constraints greatly alleviate the proliferation of lexical ambiguity in large TAG grammars by making shared information available without committing to specific lexical entries. Initial experiments with the XTAG grammar indicate that we can generate practical examples in polynomial time, and that we may be competitive with state-of-the-art realization systems in terms of raw runtime.</Paragraph> <Paragraph position="3"> In the future, it will first of all be necessary to lift the restrictions we have placed on the TAG grammar: so far, the nodes of the elementary trees are equipped only with nonterminal labels and indices, not with general feature structures, and we allow only a restricted form of adjunction constraints. It should be possible either to encode these constructions directly in the dependency grammar (which also allows user-defined features) or to filter out wrong realizations in a post-processing step. The effect of such extensions on the runtime remains to be seen.</Paragraph> <Paragraph position="4"> Finally, we expect that despite the general NP-completeness, there are restricted generation problems which can be solved in polynomial time but still contain all problems that actually arise for natural language. The results of this paper open up a new perspective from which such restrictions can be sought, especially considering that all the natural-language examples we tried are indeed processed in polynomial time. 
Such a polynomial realization algorithm would be the ideal starting point for algorithms that compute not just any, but the best possible realization - a problem which e.g. Bangalore and Rambow (2000) approximate using stochastic methods.</Paragraph> <Paragraph position="6"> Acknowledgments. We are grateful to Tilman Becker, Chris Brew, Ann Copestake, Ralph Debusmann, Gerald Penn, Stefan Thater, and our reviewers for helpful comments and discussions.</Paragraph> </Section></Paper>