Table 1 Benchmarking MultiCauseNet against baseline methods.

| Method (year) | Deep learning technique | Dataset | Notable aspects |
|---|---|---|---|
| DialogueGCN (2019)^17 | Graph Convolutional Network (GCN) | IEMOCAP | Models interrelations among dialogue turns |
| DialogueRNN (2019)^18 | Recurrent Neural Network (RNN) | IEMOCAP, MELD | Captures sequential dynamics of dialogue |
| MMGCN (2019)^19 | Multimodal GCN | IEMOCAP | Enhances recognition of Sadness and Excitement |
| IterativeERC (2020)^20 | Iterative method | IEMOCAP | Refines predictions through multiple iterations |
| QMNN (2021)^21 | Quantum-inspired techniques | Various | Integrates techniques across modalities |
| MM-DFN (2022)^22 | Deep Fusion Network | IEMOCAP | Addresses complex emotional expressions |
| MVN (2022)^23 | Multi-view approach | Various | Extracts diverse emotional signals |
| UniMSE (2022)^24 | Self-supervised learning | Various | Unified multimodal strategy |
| EmoCaps (2022)^2 | Various | Various | Detects nuanced emotional expressions |
| GA2MIF (2023)^25 | Facial and contextual information | Various | Enhances emotion recognition |
| MALN (2023)^26 | Multimodal learning network | Various | Excels at recognizing multiple emotions |
| MultiEMO (2023)^27 | Advanced methodology | Various | Excels at detecting Sad emotions |
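
Several of the graph-based baselines in Table 1 (e.g., DialogueGCN and MMGCN) share the idea of treating each utterance as a node and propagating context between nearby dialogue turns with a graph convolution. The sketch below is only a minimal illustration of that shared idea, not the code of any listed method or of MultiCauseNet; the window size, feature dimension, and class names (`DialogueGraphConv`, `window_adjacency`) are illustrative assumptions.

```python
# Minimal sketch of graph-based dialogue context modelling (illustrative only).
import torch
import torch.nn as nn


class DialogueGraphConv(nn.Module):
    """One graph-convolution step over utterance nodes: H' = ReLU(A_norm @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Row-normalize the adjacency so each utterance averages its neighbours.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return torch.relu((adj / deg) @ self.linear(feats))


def window_adjacency(num_utts: int, window: int = 2) -> torch.Tensor:
    """Connect each utterance to turns within +/- `window`, plus a self-loop."""
    idx = torch.arange(num_utts)
    return ((idx[:, None] - idx[None, :]).abs() <= window).float()


# Usage: a dialogue of 10 utterances with 128-dim fused multimodal features each.
feats = torch.randn(10, 128)
adj = window_adjacency(10, window=2)
layer = DialogueGraphConv(128, 64)
context_aware = layer(feats, adj)  # shape: (10, 64)
print(context_aware.shape)
```

In practice the listed methods differ mainly in how the adjacency is built (speaker-aware edges, cross-modal edges, relation types) and how the resulting context-aware utterance representations are fused and classified.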