Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/99608
DC Field | Value
---|---
dc.title | Two-level TDNN (TLTDNN) technique for large vocabulary Mandarin FINAL recognition
dc.contributor.author | Poo, Gee-Swee
dc.date.accessioned | 2014-10-27T06:05:48Z
dc.date.available | 2014-10-27T06:05:48Z
dc.date.issued | 1994
dc.identifier.citation | Poo, Gee-Swee (1994). Two-level TDNN (TLTDNN) technique for large vocabulary Mandarin FINAL recognition. IEEE International Conference on Neural Networks - Conference Proceedings 7 : 4396-4399. ScholarBank@NUS Repository.
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/99608
dc.description.abstract | A Two-Level Time-Delay Neural Network (TLTDNN) technique has been developed to recognize all Mandarin Finals across the entire set of Chinese syllables. The first level discriminates the vowel group (a, e, i, o, u, v) and the nasal group by nasal ending (-n, -ng, other). The orthogonal combination of these two first-level groupings enables the second level to discriminate all 35 Mandarin Finals. The technique was thoroughly tested on 8 sets of 1265 isolated Hanyu Pinyin syllables, with 6 sets used for training and 2 sets used for testing. The overall results show that high recognition rates of 95.3% for inside testing and 93.9% for outside testing are achievable.
dc.source | Scopus
dc.type | Conference Paper
dc.contributor.department | INFORMATION SYSTEMS & COMPUTER SCIENCE
dc.description.sourcetitle | IEEE International Conference on Neural Networks - Conference Proceedings
dc.description.volume | 7
dc.description.page | 4396-4399
dc.description.coden | 176
dc.identifier.isiut | NOT_IN_WOS
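
The abstract describes a two-level scheme: a first level that classifies the vowel group and the nasal-ending group, and a second level that combines the two groupings to discriminate all 35 Mandarin Finals. The sketch below (PyTorch) is a minimal, hypothetical illustration of one way such a grouping-then-combination classifier could be wired up; the layer sizes, feature dimensions, the use of 1-D convolutions as time-delay layers, and the outer-product combination are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a two-level "group then combine" classifier,
# loosely following the abstract above. All dimensions are assumptions.
import torch
import torch.nn as nn


class TwoLevelTDNN(nn.Module):
    def __init__(self, n_feats=16,
                 n_vowel_groups=6,   # (a, e, i, o, u, v)
                 n_nasal_groups=3,   # (-n, -ng, other)
                 n_finals=35):       # all Mandarin Finals
        super().__init__()
        # Shared front end: 1-D convolutions over frames act as time-delay layers.
        self.frontend = nn.Sequential(
            nn.Conv1d(n_feats, 32, kernel_size=3), nn.Tanh(),
            nn.Conv1d(32, 32, kernel_size=5), nn.Tanh(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        # First level: vowel-group and nasal-group classifiers.
        self.vowel_head = nn.Linear(32, n_vowel_groups)
        self.nasal_head = nn.Linear(32, n_nasal_groups)
        # Second level: one cell per (vowel group, nasal group) pair
        # feeds the discrimination of all finals.
        self.final_head = nn.Linear(n_vowel_groups * n_nasal_groups, n_finals)

    def forward(self, x):
        # x: (batch, n_feats, n_frames) acoustic feature frames
        h = self.frontend(x)
        vowel_logits = self.vowel_head(h)
        nasal_logits = self.nasal_head(h)
        vowel = torch.softmax(vowel_logits, dim=-1)
        nasal = torch.softmax(nasal_logits, dim=-1)
        # Outer product of the two group posteriors, flattened for the second level.
        combined = torch.einsum('bi,bj->bij', vowel, nasal).flatten(1)
        return vowel_logits, nasal_logits, self.final_head(combined)


if __name__ == "__main__":
    model = TwoLevelTDNN()
    dummy = torch.randn(2, 16, 40)  # 2 utterances, 16 features, 40 frames
    v, n, f = model(dummy)
    print(v.shape, n.shape, f.shape)  # (2, 6) (2, 3) (2, 35)
```

In this sketch the second level sees only the combination of the two first-level groupings; in practice one might also feed the acoustic features forward, but the paper's exact wiring is not specified in the record.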
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.