Table 9 Comparative analysis of BERT and GNN-based approaches for fake news detection.
Author | Method | Dataset | Result |
---|---|---|---|
Singh and Jain [31] | Hybrid model combining BERT for semantic embedding with GNN to model relational dependencies among socio-political news elements | RSS feeds | Achieved higher accuracy, precision, and recall than traditional ML and standalone DL baselines |
Bhowmik, Mondal, and Arifuzzaman [32] | GNN and transformer-based framework for Bangla sentiment analysis using BERT, Word2Vec, and FastText embeddings | Bangla sentiment dataset (15,114 samples) | Accuracy: 0.8957, Precision: 0.9056, Recall: 0.8515, F1-score: 0.8635 |
Zhang [33] | GBCA (Graph BERT Co-Attention) integrating GCN and BERT for feature fusion | Three benchmark fake news datasets | Outperformed baselines in both accuracy and training efficiency |
Proposed Study | Dual-Stream Graph-Augmented Transformer integrating BERT with GNN through semantic-guided graph fusion and co-attention | GossipCop, PolitiFact, FakeNewsNet | Accuracy: 0.99, Precision: 0.99, Recall: 0.987, F1-score: 0.98 |
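The "Proposed Study" row describes a dual-stream design in which a BERT-based text stream and a GNN-based graph stream are fused through co-attention before classification. The PyTorch sketch below is purely illustrative of that general architecture: the lightweight transformer encoder stands in for BERT, and the hand-rolled GCN layer, all dimensions, layer counts, and the toy adjacency matrix are assumptions for demonstration; it does not reproduce the proposed model's exact implementation, hyperparameters, or semantic-guided graph construction.

```python
# Illustrative sketch of a dual-stream text + graph classifier with co-attention fusion.
# All module names, sizes, and the graph construction here are assumptions, not the
# configuration reported in the paper.
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj is assumed to be a normalized adjacency matrix with self-loops.
        return torch.relu(adj @ self.linear(h))


class DualStreamFusionClassifier(nn.Module):
    """Text stream (stand-in for BERT) + graph stream (GCN), fused by co-attention."""
    def __init__(self, vocab_size=30522, dim=128, node_feat_dim=32, num_classes=2):
        super().__init__()
        # Text stream: embedding + small transformer encoder as a lightweight BERT stand-in.
        self.token_emb = nn.Embedding(vocab_size, dim)
        encoder_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.text_encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Graph stream: two GCN layers over the article's propagation/entity graph.
        self.gcn1 = SimpleGCNLayer(node_feat_dim, dim)
        self.gcn2 = SimpleGCNLayer(dim, dim)
        # Co-attention: each stream attends over the other before pooling.
        self.text_to_graph = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.graph_to_text = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, token_ids, node_feats, adj):
        text_h = self.text_encoder(self.token_emb(token_ids))        # (B, T, dim)
        graph_h = self.gcn2(self.gcn1(node_feats, adj), adj)         # (B, N, dim)
        # Cross-stream attention, then mean-pool each attended stream.
        text_attn, _ = self.text_to_graph(text_h, graph_h, graph_h)  # text queries graph
        graph_attn, _ = self.graph_to_text(graph_h, text_h, text_h)  # graph queries text
        fused = torch.cat([text_attn.mean(dim=1), graph_attn.mean(dim=1)], dim=-1)
        return self.classifier(fused)                                 # (B, num_classes)


if __name__ == "__main__":
    model = DualStreamFusionClassifier()
    tokens = torch.randint(0, 30522, (2, 16))     # batch of 2 articles, 16 tokens each
    nodes = torch.randn(2, 10, 32)                # 10 graph nodes per article
    adj = torch.eye(10).expand(2, 10, 10)         # toy normalized adjacency (self-loops only)
    print(model(tokens, nodes, adj).shape)        # torch.Size([2, 2])
```

The two cross-attention calls capture the co-attention idea: text tokens attend over graph nodes and graph nodes attend over text tokens, so each stream's pooled representation is conditioned on the other before the concatenated features reach the classifier.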