Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/243787
Title: A SELECTIVE GAN-BASED DATA AUGMENTATION METHOD FOR MEDICAL IMAGE CLASSIFICATION
Authors: LIU ZHANHONG
ORCID iD: orcid.org/0000-0002-1068-419X
Keywords: Deep learning, Selective Data Augmentation, Generative adversarial networks, Image classification, Label certainty, Triplet loss
Issue Date: 15-Jun-2023
Citation: LIU ZHANHONG (2023-06-15). A SELECTIVE GAN-BASED DATA AUGMENTATION METHOD FOR MEDICAL IMAGE CLASSIFICATION. ScholarBank@NUS Repository.
Abstract: Although deep learning has significantly improved classification tasks, the available medical image datasets are often inadequate and imbalanced, hindering the development of accurate models for disease diagnosis and treatment. In this research, a selective GAN-based data augmentation method for imbalanced classification tasks is proposed. The method involves two selection steps, based on label certainty and triplet loss, to control the quality of the generated synthetic samples, and it balances the dataset by generating more synthetic samples for the minority classes. The method is evaluated on the HyperKvasir dataset, where it achieves better classification performance than the baseline method, with an accuracy of 86.80%, a precision of 76.64%, a recall of 77.39%, an F1-score of 76.92%, and an AUC-ROC of 0.8966. Overall, the results demonstrate significant improvements in classification performance, particularly for the minority classes, indicating the method's potential for enhancing classification on imbalanced medical image datasets.
URI: https://scholarbank.nus.edu.sg/handle/10635/243787
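The thesis itself is not reproduced on this record page, so the following is only a minimal, illustrative Python sketch of the kind of two-step sample selection the abstract describes: a label-certainty check with an auxiliary classifier, followed by a triplet-loss-style check in an embedding space. All model names, function signatures, and thresholds here are assumptions for illustration, not the author's implementation.

    import torch
    import torch.nn.functional as F

    def select_synthetic_samples(
        synthetic_images,      # GAN-generated images for one minority class (N x C x H x W)
        target_class,          # integer label of that minority class
        classifier,            # trained auxiliary classifier returning logits (hypothetical)
        embedder,              # network producing feature embeddings (hypothetical)
        anchor_embedding,      # embedding of a real sample from target_class
        negative_embedding,    # embedding of a real sample from a different class
        certainty_threshold=0.9,
        triplet_margin=0.2,
    ):
        """Keep only synthetic samples that pass both selection steps.

        Step 1 (label certainty): the classifier must assign the intended
        class with probability >= certainty_threshold.
        Step 2 (triplet criterion): in embedding space the sample must lie
        closer to the same-class anchor than to the other-class negative
        by at least triplet_margin.
        """
        classifier.eval()
        embedder.eval()
        kept = []
        with torch.no_grad():
            probs = F.softmax(classifier(synthetic_images), dim=1)
            embeddings = embedder(synthetic_images)
            for img, p, emb in zip(synthetic_images, probs, embeddings):
                # Step 1: label-certainty filter
                if p[target_class] < certainty_threshold:
                    continue
                # Step 2: triplet-loss-style filter
                d_pos = torch.norm(emb - anchor_embedding)
                d_neg = torch.norm(emb - negative_embedding)
                if d_pos + triplet_margin > d_neg:
                    continue
                kept.append(img)
        # Return the retained samples (possibly empty) for augmenting the minority class
        return torch.stack(kept) if kept else synthetic_images[:0]

In a pipeline following this idea, only the samples returned by such a filter would be added to the training set for the minority class, which is how the quality control and class rebalancing described in the abstract would interact.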
Appears in Collections: Master's Theses (Open)
Files in This Item:
File | Description | Size | Format | Access Settings | Version
---|---|---|---|---|---
LiuZhanhong.pdf | | 4.64 MB | Adobe PDF | OPEN | None