Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/42123
Title: Comparison of smoothing techniques for robust Context Dependent acoustic modelling in hybrid NN/HMM systems
Authors: Wang, G.
Sim, K.C. 
Keywords: Context dependent acoustic modelling
Discriminative training
Hybrid system
Smoothing
Issue Date: 2011
Citation: Wang, G., Sim, K.C. (2011). Comparison of smoothing techniques for robust Context Dependent acoustic modelling in hybrid NN/HMM systems. Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH : 457-460. ScholarBank@NUS Repository.
Abstract: Hybrid Neural Network/Hidden Markov Model (NN/HMM) systems have been found to yield high-quality phone recognition performance. One issue with modelling the Context Dependent (CD) NN/HMM is the robust estimation of the NN parameters to reliably predict the large number of CD state posteriors. Previously, factorization based on conditional probabilities has been commonly adopted to circumvent this problem. This paper proposes two factorization schemes based on the product-of-experts framework, depending on the choice of the experts. In addition, smoothing and interpolation schemes were introduced to improve robustness. Experimental results on the WSJCAM0 corpus reveal that the proposed CD NN/HMM parameter estimation techniques achieved consistent improvements compared to CI hybrid systems. The best hybrid system achieves a 21.7% relative phone error rate reduction and a 17.6% relative word error rate reduction compared to a discriminatively trained context dependent triphone GMM/HMM system. Copyright © 2011 ISCA.
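The abstract's core idea — factorising context-dependent state posteriors and smoothing them back towards context-independent estimates — can be illustrated with a minimal numerical sketch. This is a hypothetical illustration of the general technique, not the paper's actual implementation; the function names, the linear-interpolation form, and the weight `lam` are assumptions for exposition.

```python
import numpy as np

def factorised_cd_posterior(p_ci, p_ctx_given_ci):
    """Factorise CD posteriors as P(cd|x) = P(ci|x) * P(ctx|ci, x).

    p_ci: (n_ci,) context-independent state posteriors.
    p_ctx_given_ci: (n_ci, n_ctx) context posteriors, each row sums to 1.
    Returns an (n_ci, n_ctx) matrix of CD state posteriors.
    """
    return p_ci[:, None] * p_ctx_given_ci

def interpolate(p_cd, p_ci, lam=0.8):
    """Smooth CD posteriors towards the CI estimate (hypothetical scheme).

    The CI mass of each state is spread evenly over its contexts so the
    result remains a valid distribution over all CD states.
    """
    n_ctx = p_cd.shape[1]
    return lam * p_cd + (1.0 - lam) * p_ci[:, None] / n_ctx

# Toy example: 2 CI states, 2 contexts per state.
p_ci = np.array([0.7, 0.3])
p_ctx = np.array([[0.5, 0.5],
                  [0.9, 0.1]])
p_cd = factorised_cd_posterior(p_ci, p_ctx)       # [[0.35, 0.35], [0.27, 0.03]]
p_smooth = interpolate(p_cd, p_ci, lam=0.8)

# Both factorised and smoothed posteriors still sum to one.
assert np.isclose(p_cd.sum(), 1.0)
assert np.isclose(p_smooth.sum(), 1.0)
```

The smoothing step pulls sparsely observed CD posteriors towards the better-estimated CI distribution, which is the robustness motivation the abstract describes.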
Source Title: Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
URI: http://scholarbank.nus.edu.sg/handle/10635/42123
ISSN: 1990-9772
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.
