<?xml version="1.0" standalone="yes"?>
<Paper uid="W06-3123">
<Title>Constraining the Phrase-Based, Joint Probability Statistical Translation Model</Title>
<Section position="6" start_page="156" end_page="156" type="concl">
<SectionTitle>5 Conclusion</SectionTitle>
<Paragraph position="0">We presented the first attempt at creating a systematic framework that uses word alignment constraints to guide phrase-based EM training. This approach yields competitive results, to within 0.66 BLEU points for the basic systems, suggesting that a rigorous probabilistic framework is preferable to heuristics for extracting phrase pairs and their probabilities.</Paragraph>
<Paragraph position="1">By introducing constraints on the alignment space, we can reduce the complexity of the joint model and increase its performance, allowing it to train on larger corpora and making the model more widely applicable.</Paragraph>
<Paragraph position="2">In future work, the joint model would benefit from lexical weighting like that used in the standard model (Koehn et al., 2003). Using IBM Model 1 to extract a lexical alignment weight for each phrase pair would decrease the impact of data sparseness, and other kinds of smoothing techniques will be investigated. Better search algorithms for Viterbi phrasal alignments during EM would increase the number and quality of the model parameters.</Paragraph>
<Paragraph position="3">This work was supported in part under the GALE program of the Defense Advanced Research Projects Agency, Contract No. HR0011-06-C-0022.</Paragraph>
</Section>
</Paper>
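For illustration only (not part of the original paper): the future-work paragraph mentions using IBM Model 1 to assign a lexical alignment weight to each phrase pair. A minimal sketch of that idea, assuming a hypothetical word translation table `t[(f, e)]` and a small smoothing floor for unseen word pairs, might look like this:

```python
# Sketch, not the authors' implementation: an IBM Model 1 score for a phrase
# pair, usable as a lexical weight to soften sparse phrase counts.

def model1_lexical_weight(f_phrase, e_phrase, t, floor=1e-7):
    """Approximate p(f_phrase | e_phrase) under IBM Model 1: for each
    target-side word, average its translation probabilities over the
    source-side words (plus the empty word NULL), then multiply."""
    e_words = ["NULL"] + list(e_phrase)          # Model 1 allows alignment to NULL
    weight = 1.0
    for f in f_phrase:
        # Average over all possible source words; `floor` smooths unseen pairs.
        weight *= sum(t.get((f, e), floor) for e in e_words) / len(e_words)
    return weight

# Hypothetical usage with a toy translation table.
t = {("maison", "house"): 0.8, ("la", "the"): 0.7}
print(model1_lexical_weight(["la", "maison"], ["the", "house"], t))
```

Such a score would be interpolated with, or used to smooth, the joint model's phrase translation probabilities; the exact combination is left open in the paper.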