Table 5 Impact of removing hierarchical attention components.

From: English-focused CL-HAMC with contrastive learning and hierarchical attention for multiple-choice reading comprehension

| Configuration | RACE-M | RACE-H | RACE | Overall |
|---|---|---|---|---|
| Full model | 92.3 | 89.0 | 90.1 | 90.3 |
| – Passage-question pair & options | 89.8 | 87.3 | 89.0 | 88.7 (\(\downarrow\)1.6) |
| – Question-option pair & passage | 90.3 | 87.0 | 89.8 | 89.0 (\(\downarrow\)1.3) |
| – All hierarchical attention | 89.2 | 86.0 | 87.4 | 87.5 (\(\downarrow\)2.8) |

\(\downarrow\) indicates the performance drop relative to the full model.
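
To make the footnote concrete, the sketch below recomputes the \(\downarrow\) values in the Overall column from the scores reported in Table 5. The dictionary keys and numbers are taken directly from the table; the helper itself is purely illustrative and not part of the CL-HAMC codebase.

```python
# Reproduce the "Overall" drop values reported in Table 5.
full_overall = 90.3  # Overall score of the full model

# Ablated configurations and their Overall scores, as listed in the table.
ablations = {
    "- Passage-question pair & options": 88.7,
    "- Question-option pair & passage": 89.0,
    "- All hierarchical attention": 87.5,
}

for name, overall in ablations.items():
    drop = round(full_overall - overall, 1)  # drop relative to the full model
    print(f"{name}: {overall} (down {drop})")
# Expected drops: 1.6, 1.3, and 2.8 points, matching the table.
```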