Table 4 Holistic model architecture overview with NASNet backbone, PABR module, and T-block feature extractors.

From: Pyramidal attention-based T network for brain tumor classification: a comprehensive analysis of transfer learning approaches for clinically reliable and reliable AI hybrid approaches

| Block name | Output shape | Parameters |
|---|---|---|
| Input layer | (512, 512, 3) | 0 |
| NASNet-Mobile backbone | (8, 8, 1056) | 4,269,716 |
| Transition conv layer | (8, 8, 512) | 543,232 |
| PABR module | (8, 8, 512) | 525,824 |
| T-Block 1 | (8, 8, 192) | 623,552 |
| T-Block 2 | (8, 8, 384) | 468,864 |
| T-Block 3 | (8, 8, 768) | 1,871,616 |
| Feature-abstracted fusion conv | (8, 8, 512) | 393,728 |
| Residual concatenation | (8, 8, 512) | 0 |
| Dense (128 units) | (128,) | 65,664 |
| Dropout | (128,) | 0 |
| Dense | (4,) | 387 |
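Several of the parameter counts above can be sanity-checked with the standard formulas for convolutional, batch-normalization, and dense layers. The sketch below reproduces three rows of the table; note that the specific layer configurations it assumes (a 1×1 transition convolution followed by batch normalization, a 1×1 fusion convolution without normalization, and a 512-dimensional pooled feature vector feeding the first dense layer) are inferences from the counts, not details stated in the table.

```python
def dense_params(in_features, units):
    """Fully connected layer: weight matrix plus bias vector."""
    return in_features * units + units

def conv2d_params(k, in_ch, out_ch, bias=True):
    """Standard 2-D convolution with a k x k kernel."""
    return k * k * in_ch * out_ch + (out_ch if bias else 0)

def bn_params(ch):
    """Batch normalization: gamma, beta, moving mean, moving variance."""
    return 4 * ch

# Transition conv: assumed 1x1 conv (1056 -> 512) followed by batch norm.
transition = conv2d_params(1, 1056, 512) + bn_params(512)
print(transition)  # 543232, matching the table

# Fusion conv: assumed 1x1 conv (768 -> 512) with no normalization.
fusion = conv2d_params(1, 768, 512)
print(fusion)  # 393728, matching the table

# Dense (128 units): consistent with a 512-dim input vector
# (e.g. the (8, 8, 512) tensor after global average pooling).
dense = dense_params(512, 128)
print(dense)  # 65664, matching the table
```

This kind of arithmetic is a quick way to confirm which blocks carry learnable parameters; the residual concatenation and dropout rows correctly report 0, since neither operation introduces weights.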