Title: An adaptive model for multi-modal biometrics decision fusion
Keywords: multi-modal biometrics, decision fusion, biometrics verification, recursive least squares, parameter estimation
Issue Date: 18-Jul-2005
Citation: TRAN QUOC LONG (2005-07-18). An adaptive model for multi-modal biometrics decision fusion. ScholarBank@NUS Repository.
Abstract: Multi-modal biometric verification has gained increasing attention recently because of the high security level it provides and the non-universality of uni-modal biometrics. Multi-modal biometrics decision fusion can be treated as a classification task, since the output is either a genuine user or an impostor; this treatment allows many existing classifiers to be applied in the field. In this thesis, two problems related to multi-modal biometrics decision fusion are considered. The first is new-user registration: frequent registration not only requires storing new patterns in the biometric database but also requires updating the combination module efficiently. The second is sensor decay, which causes matching scores to change over time; the performance of a fixed classifier may suffer in such cases. This thesis proposes an adaptive algorithm to address both problems. The algorithm can update the combination module whenever new training patterns become available, without retraining the module from scratch. The new algorithm is demonstrated through experiments on physical application data, addressing both the registration problem and the changing matching-score distribution problem, using three biometrics: fingerprint, speech, and hand-geometry.
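The abstract's core idea, updating a score-combination module incrementally as new training patterns arrive, can be illustrated with a standard recursive least squares (RLS) update. This is a generic sketch, not the thesis's actual algorithm: the linear fusion model, the forgetting factor `lam`, and the simulated genuine/impostor score distributions are all assumptions made for illustration.

```python
import numpy as np

def rls_update(w, P, x, y, lam=0.99):
    """One recursive least squares step: fold a single new training
    pattern (x, y) into the weight vector w and inverse-correlation
    matrix P, without retraining on the full history.
    lam is a forgetting factor that discounts old patterns, which
    lets the model track a drifting matching-score distribution."""
    x = x.reshape(-1, 1)
    Px = P @ x
    k = Px / (lam + float(x.T @ Px))   # gain vector
    e = y - float(w.T @ x)             # prediction error on the new pattern
    w = w + k * e                      # weight update
    P = (P - k @ Px.T) / lam           # inverse-correlation update
    return w, P

# Illustrative use: fuse two matchers' scores into one decision score.
# Genuine-user scores are simulated near 0.7, impostor scores near 0.3
# (hypothetical values, chosen only for this demonstration).
rng = np.random.default_rng(0)
w = np.zeros((2, 1))                   # fusion weights, one per matcher
P = np.eye(2) * 1e3                    # large initial P = uninformed prior
for _ in range(200):
    genuine = rng.random() < 0.5
    x = rng.normal(0.7 if genuine else 0.3, 0.1, size=2)
    w, P = rls_update(w, P, x, 1.0 if genuine else 0.0)

def fused_score(x):
    """Combined decision score for a pair of matcher outputs."""
    return float(w.T @ x.reshape(-1, 1))
```

After training, `fused_score` should rank a genuine-like score pair above an impostor-like one; thresholding it yields the accept/reject decision. The same single-pattern update handles both scenarios the abstract names: registering a new user (feed in that user's patterns) and sensor decay (the forgetting factor gradually discounts stale score distributions).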
Appears in Collections:Master's Theses (Open)

Files in This Item:
File: TRAN QUOC LONG MEng ECE 2005 thesis.pdf (9.08 MB, Adobe PDF)




Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.