Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/248175
DC Field	Value
dc.title	TOWARD EFFECTIVE AND EFFICIENT GRAPH NEURAL NETWORKS WITH IMPLICIT LAYERS
dc.contributor.author	LIU JUNCHENG
dc.date.accessioned	2024-04-30T18:01:20Z
dc.date.available	2024-04-30T18:01:20Z
dc.date.issued	2023-08-24
dc.identifier.citation	LIU JUNCHENG (2023-08-24). TOWARD EFFECTIVE AND EFFICIENT GRAPH NEURAL NETWORKS WITH IMPLICIT LAYERS. ScholarBank@NUS Repository.
dc.identifier.uri	https://scholarbank.nus.edu.sg/handle/10635/248175
dc.description.abstract	Motivated by limitations in effectiveness and memory efficiency, in the first work we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN), to efficiently capture very long-range dependencies. In the second work, we identify and justify two limitations of EIGNN and earlier implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and an inability to capture multiscale information on graphs at multiple resolutions. Lastly, we focus on making implicit GNNs trainable on large graphs: despite their memory efficiency and better ability to capture long-range dependencies, existing implicit GNNs still suffer from scalability and training-efficiency issues on large graphs.
dc.language.iso	en
dc.subject	Graph Neural Networks, Graph Representation Learning, Implicit Models, Deep Learning, Implicit Layers
dc.type	Thesis
dc.contributor.department	COMPUTER SCIENCE
dc.contributor.supervisor	Xiaokui Xiao
dc.description.degree	Ph.D
dc.description.degreeconferred	DOCTOR OF PHILOSOPHY (SOC)
dc.identifier.orcid	0000-0002-3054-4629
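To illustrate the implicit-layer idea described in the abstract: instead of stacking a fixed number of propagation layers, an implicit GNN defines node embeddings as the equilibrium of a propagation equation and solves for it, which lets information travel arbitrarily far along the graph. The sketch below is a generic fixed-point iteration for a linear implicit layer of the form Z = γ·A·Z·W + X; it is a minimal illustration of the general technique, not the thesis's actual EIGNN formulation, and the function name, γ value, and toy graph are assumptions for the example.

```python
import numpy as np

def implicit_gnn_layer(A, X, W, gamma=0.8, tol=1e-6, max_iter=200):
    """Illustrative implicit layer: iterate Z <- gamma * A @ Z @ W + X
    until the embeddings Z reach a fixed point (equilibrium).

    Convergence requires the update to be a contraction, i.e.
    gamma * ||A|| * ||W|| < 1 in spectral norm.
    """
    Z = np.zeros_like(X)
    for _ in range(max_iter):
        Z_next = gamma * A @ Z @ W + X
        if np.linalg.norm(Z_next - Z) < tol:
            return Z_next
        Z = Z_next
    return Z

# Toy example (assumed for illustration): 3-node path graph, 2-dim features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
A = A / A.sum(axis=1, keepdims=True)  # row-normalized adjacency
X = np.random.default_rng(0).normal(size=(3, 2))
W = 0.5 * np.eye(2)                   # small norm keeps the map contractive
Z = implicit_gnn_layer(A, X, W)       # equilibrium node embeddings
```

In practice, implicit GNNs backpropagate through the equilibrium via implicit differentiation rather than through the unrolled iterations, which is the source of the memory-efficiency advantage mentioned in the abstract.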
Appears in Collections:Ph.D Theses (Open)

Files in This Item:
File	Description	Size	Format	Access Settings	Version
Final_Thesis_Juncheng.pdf		1.35 MB	Adobe PDF	OPEN	None


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.