Table 5. Architecture of the hierarchical graph neural network (GNN)

From: Artificial intelligence-driven prediction of lymph node metastasis in T1 esophageal squamous cell carcinoma using whole slide images

| Layer | Function | Parameters | Activation/Dropout | Input → Output dimensions |
| --- | --- | --- | --- | --- |
| Feature compression layer | Reduces ResNet-50 features to lower-dimensional graph nodes | Linear layer | – | 2048 → 512 |
| Graph convolutional layer 1 | Aggregates neighborhood features via message passing | GCNConv (k = 10 neighbors) | ReLU + Dropout (rate unspecified) | 512 → 512 |
| Graph convolutional layer 2 | Further refines node embeddings with hierarchical spatial dependencies | GCNConv (k = 10 neighbors) | ReLU + Dropout (rate unspecified) | 512 → 512 |
| Global mean pooling layer | Aggregates node-level features into slide-level embeddings | Global graph pooling | – | 512 → 512 (slide-level) |
| Classification head | Predicts LNM probability from slide embeddings | Linear layer | Softmax | 512 → 2 (binary classes) |
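To make the data flow in the table concrete, below is a minimal NumPy sketch of the forward pass: feature compression (2048 → 512), two GCN layers over a k = 10 nearest-neighbor patch graph, global mean pooling to a slide-level embedding, and a softmax classification head. The patch count, coordinate-based k-NN graph construction, random weights, and the omission of dropout (as at inference) are illustrative assumptions; the layer name GCNConv suggests the authors used a GNN library such as PyTorch Geometric, not a hand-rolled implementation like this one.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_adjacency(coords, k=10):
    """Symmetric k-nearest-neighbor adjacency matrix from patch coordinates."""
    n = coords.shape[0]
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)          # exclude self from neighbors
    a = np.zeros((n, n))
    for i in range(n):
        a[i, np.argsort(dist[i])[:k]] = 1.0
    return np.maximum(a, a.T)               # symmetrize the graph

def gcn_layer(h, a, w):
    """One GCN layer: symmetrically normalized adjacency, linear map, ReLU."""
    a_hat = a + np.eye(a.shape[0])          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ h @ w, 0.0)  # message passing + ReLU

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical slide: 64 patches, each a 2048-dim ResNet-50 feature vector.
n_patches = 64
features = rng.standard_normal((n_patches, 2048))
coords = rng.uniform(0.0, 1.0, (n_patches, 2))

# Randomly initialized weights stand in for trained parameters.
w_compress = rng.standard_normal((2048, 512)) * 0.02  # feature compression
w_gcn1 = rng.standard_normal((512, 512)) * 0.02       # GCN layer 1
w_gcn2 = rng.standard_normal((512, 512)) * 0.02       # GCN layer 2
w_cls = rng.standard_normal((512, 2)) * 0.02          # classification head

a = knn_adjacency(coords, k=10)
h = features @ w_compress               # 2048 -> 512
h = gcn_layer(h, a, w_gcn1)             # 512 -> 512 (dropout omitted here)
h = gcn_layer(h, a, w_gcn2)             # 512 -> 512
slide_embedding = h.mean(axis=0)        # global mean pooling: slide-level 512
probs = softmax(slide_embedding @ w_cls)  # 512 -> 2 LNM class probabilities
```

Note that dropout is active only during training; at inference the two GCN layers reduce to the normalized message passing shown above.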