Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/248175
Title: TOWARD EFFECTIVE AND EFFICIENT GRAPH NEURAL NETWORKS WITH IMPLICIT LAYERS
Authors: LIU JUNCHENG
ORCID iD: orcid.org/0000-0002-3054-4629
Keywords: Graph Neural Networks, Graph Representation Learning, Implicit Models, Deep Learning, Implicit Layers
Issue Date: 24-Aug-2023
Citation: LIU JUNCHENG (2023-08-24). TOWARD EFFECTIVE AND EFFICIENT GRAPH NEURAL NETWORKS WITH IMPLICIT LAYERS. ScholarBank@NUS Repository.
Abstract: Motivated by limitations in effectiveness and memory efficiency, in the first work we propose a GNN model with infinite depth, called Efficient Infinite-Depth Graph Neural Networks (EIGNN), to efficiently capture very long-range dependencies. In the second work, we identify and justify two limitations of EIGNN and previous implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and an inability to capture multiscale information on graphs at multiple resolutions. Lastly, despite their memory efficiency and better ability to capture long-range dependencies, existing implicit GNNs still suffer from scalability and training-efficiency issues on large graphs; the third work therefore focuses on making implicit GNNs trainable on large graphs.
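To make the "infinite depth" idea concrete, the sketch below illustrates the general implicit-layer formulation that this line of work builds on: instead of stacking K explicit propagation layers, the node representation Z is defined as the fixed point of a single layer applied repeatedly, and is computed here by naive iteration. All names (S, W, gamma) and the iterative solver are illustrative assumptions for exposition; this is not the EIGNN model itself, which instead derives the fixed point in closed form.

```python
import numpy as np

def implicit_gnn_fixed_point(S, X, W, gamma=0.8, tol=1e-6, max_iter=1000):
    """Iterate Z <- gamma * S @ Z @ W + X to (approximate) convergence.

    S: (n, n) normalized adjacency matrix, X: (n, d) input features,
    W: (d, d) weight matrix. Choosing gamma * ||S|| * ||W|| < 1 makes the
    update a contraction, so a unique fixed point exists and the iteration
    converges to it. Equivalent to an "infinitely deep" propagation layer.
    """
    Z = np.zeros_like(X)
    for _ in range(max_iter):
        Z_next = gamma * S @ Z @ W + X  # one implicit-layer update
        if np.linalg.norm(Z_next - Z) < tol:
            return Z_next
        Z = Z_next
    return Z

# Tiny example: path graph on 3 nodes with symmetric normalization.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
deg = A.sum(axis=1)
S = A / np.sqrt(np.outer(deg, deg))  # D^{-1/2} A D^{-1/2}
X = np.eye(3)
W = 0.5 * np.eye(3)  # contraction: 0.8 * ||S|| * ||W|| = 0.4 < 1
Z = implicit_gnn_fixed_point(S, X, W)
```

Because information propagates until convergence rather than through a fixed number of hops, every node's representation can depend on arbitrarily distant nodes, which is the long-range-dependency property the abstract refers to.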
URI: https://scholarbank.nus.edu.sg/handle/10635/248175
Appears in Collections: Ph.D Theses (Open)

Files in This Item:
File: Final_Thesis_Juncheng.pdf (1.35 MB, Adobe PDF) — Access: Open

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.