Please use this identifier to cite or link to this item:
|dc.title||Improving statistical machine translation for a resource-poor language using related resource-rich languages|
|dc.contributor.author||Nakov, P.|
|dc.contributor.author||Tou Ng, H.|
|dc.identifier.citation||Nakov, P., Tou Ng, H. (2012). Improving statistical machine translation for a resource-poor language using related resource-rich languages. Journal of Artificial Intelligence Research 44 : 179-222. ScholarBank@NUS Repository.|
|dc.description.abstract||We propose a novel language-independent approach for improving machine translation for resource-poor languages by exploiting their similarity to resource-rich ones. More precisely, we improve the translation from a resource-poor source language X1 into a resource-rich language Y, given a bi-text containing a limited number of parallel sentences for X1-Y and a larger bi-text for X2-Y, for some resource-rich language X2 that is closely related to X1. This is achieved by taking advantage of the opportunities offered by the vocabulary overlap and the similarities between the languages X1 and X2 in spelling, word order, and syntax: (1) we improve the word alignments for the resource-poor language, (2) we further augment it with additional translation options, and (3) we take care of potential spelling differences through appropriate transliteration. The evaluation for Indonesian→English using Malay, and for Spanish→English using Portuguese while pretending Spanish is resource-poor, shows an absolute gain of up to 1.35 and 3.37 BLEU points, respectively, which is an improvement over the best rivaling approaches, while using much less additional data. Overall, our method cuts the amount of necessary "real" training data by a factor of 2-5. © 2012 AI Access Foundation. All rights reserved.|
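The abstract's core data-combination idea can be illustrated with a minimal sketch: transliterate sentences from the related-language bi-text X2-Y toward X1 spelling via character-rewrite rules, then concatenate the result with the small X1-Y bi-text before training. The rules, sentences, and function names below are purely illustrative assumptions, not the paper's actual implementation (which operates at the level of word alignments and phrase tables).

```python
# Illustrative sketch only: simple character-rewrite "transliteration"
# plus bi-text concatenation. All rules and sentences are hypothetical.

def transliterate(sentence, rules):
    """Apply character-level rewrite rules mapping X2 spelling toward X1."""
    for src, tgt in rules:
        sentence = sentence.replace(src, tgt)
    return sentence

def build_training_bitext(small_bitext, aux_bitext, rules):
    """Concatenate the small X1-Y bi-text with the adapted X2-Y bi-text."""
    adapted = [(transliterate(x2, rules), y) for x2, y in aux_bitext]
    return small_bitext + adapted

# Toy example: two made-up spelling rules and tiny bi-texts.
rules = [("nh", "ny"), ("ç", "s")]
small = [("x1 sentence", "english sentence")]   # limited X1-Y data
aux = [("senhor", "sir"), ("caça", "hunt")]     # larger X2-Y data
combined = build_training_bitext(small, aux, rules)
```

The combined list would then feed a standard SMT training pipeline in place of the small bi-text alone.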
|dc.description.sourcetitle||Journal of Artificial Intelligence Research|
|Appears in Collections:||Staff Publications|
Files in This Item:
There are no files associated with this item.