Table 5. Architecture of the hierarchical graph neural network (GNN)
| Layer | Function | Parameters | Activation/Dropout | Input → Output dimensions |
|---|---|---|---|---|
| Feature compression layer | Reduces ResNet-50 features to lower-dimensional graph nodes | Linear layer | — | 2048 → 512 |
| Graph convolutional layer 1 | Aggregates neighborhood features via message passing | GCNConv (k = 10 neighbors) | ReLU + Dropout (rate unspecified) | 512 → 512 |
| Graph convolutional layer 2 | Further refines node embeddings with hierarchical spatial dependencies | GCNConv (k = 10 neighbors) | ReLU + Dropout (rate unspecified) | 512 → 512 |
| Global mean pooling layer | Aggregates node-level features into a slide-level embedding | Global graph pooling | — | 512 → 512 (slide-level) |
| Classification head | Predicts LNM probability from the slide embedding | Linear layer | Softmax | 512 → 2 (binary classes) |
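
For concreteness, the stack in Table 5 could be expressed as follows with PyTorch Geometric. This is a minimal sketch, not the authors' released code: the dropout rate (0.5 here) is an assumption since the source leaves it unspecified, and the class name `HierarchicalGNN` is ours; the k = 10 neighbor graph is assumed to be built upstream and passed in as `edge_index`.

```python
# Minimal sketch of the Table 5 architecture (assumptions: dropout=0.5, class name ours).
import torch
import torch.nn.functional as F
from torch.nn import Linear
from torch_geometric.nn import GCNConv, global_mean_pool


class HierarchicalGNN(torch.nn.Module):
    def __init__(self, in_dim=2048, hidden_dim=512, num_classes=2, dropout=0.5):
        super().__init__()
        self.compress = Linear(in_dim, hidden_dim)          # feature compression, 2048 -> 512
        self.conv1 = GCNConv(hidden_dim, hidden_dim)        # message passing, 512 -> 512
        self.conv2 = GCNConv(hidden_dim, hidden_dim)        # hierarchical refinement, 512 -> 512
        self.classifier = Linear(hidden_dim, num_classes)   # slide-level head, 512 -> 2
        self.dropout = dropout                              # rate unspecified in the source; 0.5 assumed

    def forward(self, x, edge_index, batch):
        # x: [num_nodes, 2048] ResNet-50 patch features
        # edge_index: [2, num_edges] k-NN graph over patches (k = 10 in the source)
        # batch: [num_nodes] slide assignment for each node
        x = self.compress(x)
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=self.dropout, training=self.training)
        x = F.relu(self.conv2(x, edge_index))
        x = F.dropout(x, p=self.dropout, training=self.training)
        x = global_mean_pool(x, batch)                      # node -> slide embedding, 512
        return F.softmax(self.classifier(x), dim=-1)        # LNM probability over 2 classes
```

The explicit softmax mirrors the table's final row; in practice one would typically return the raw logits and apply cross-entropy loss during training, reserving the softmax for inference.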