Please use this identifier to cite or link to this item:
https://doi.org/10.1007/978-3-642-03983-6_34
Title: Equivalent relationship of feedforward neural networks and real-time face detection system
Authors: Ge, S.S.; Pan, Y.; Zhang, Q.; Chen, L.
Issue Date: 2009
Citation: Ge, S.S., Pan, Y., Zhang, Q., Chen, L. (2009). Equivalent relationship of feedforward neural networks and real-time face detection system. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 5744 LNCS: 301-310. ScholarBank@NUS Repository. https://doi.org/10.1007/978-3-642-03983-6_34
Abstract: In this paper, we investigate the Extreme Learning Machine (ELM), a fast learning algorithm, with respect to its equivalence relationship, approximation capability, and real-time face detection application. First, an equivalence relationship is established between neural networks without orthonormalization (ELM) and orthonormal neural networks. Second, based on this equivalence and the universal approximation property of orthonormal neural networks, we prove that neural networks trained with ELM are universal approximators, and that neither adjustable hidden-neuron parameters nor an orthonormal transformation is necessary. Finally, exploiting the fast learning characteristic of ELM, we combine ELM with the Viola-Jones AdaBoost algorithm in face detection applications, so that the whole system retains real-time learning speed while achieving high face detection accuracy. © 2009 Springer.
Source Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
URI: http://scholarbank.nus.edu.sg/handle/10635/51157
ISBN: 3642039820
ISSN: 03029743
DOI: 10.1007/978-3-642-03983-6_34
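The ELM training procedure the abstract refers to can be illustrated with a minimal sketch: hidden-layer weights and biases are drawn at random and never adjusted, and only the output weights are obtained in closed form by least squares. This is an assumption-laden toy illustration of the standard ELM recipe, not code from the paper; all names, sizes, and the toy regression task are illustrative.

```python
import numpy as np

# Minimal ELM regressor sketch: the hidden layer is random and fixed
# (no iterative tuning of hidden-neuron parameters, as the abstract
# notes), and only the output weights are solved by least squares.
rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Return random hidden-layer parameters (W, b) and the
    least-squares output weights beta."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden outputs
    beta = np.linalg.pinv(H) @ y                  # closed-form solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy regression task (illustrative): approximate sin(x) on [0, pi].
X = np.linspace(0.0, np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
y_hat = elm_predict(X, W, b, beta)
```

Because the only trained parameters come from a single pseudo-inverse, fitting is a one-shot linear solve, which is the source of the "real-time learning speed" the abstract claims for the combined detection system.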
Appears in Collections: Staff Publications