Table 1 Summary of related works.
| Category | Objectives and features | Limitations |
|---|---|---|
| Backpropagation4 | Learning via backpropagation, which consists of two passes through the network: the forward pass produces a prediction, and the backward pass updates the weights to reduce the error using derivatives. | Computationally inefficient; biologically implausible; suffers from the vanishing-gradient problem; requires large amounts of labeled data; sensitive to hyperparameter settings. |
| Forward-Forward algorithm5 | Aims to bridge the gap between learning in the human brain and learning in machines, mitigating the downsides of backpropagation through greedy, layer-by-layer learning (a minimal sketch follows this table). | Implementation complexity; still relatively unexplored. |
| Self-supervised learning | Utilization of large unlabeled datasets and learning of representations with pseudo-labels; examples include SimCLR, MoCo, BYOL, SwAV, Barlow Twins, and VICReg (a VICReg loss sketch follows this table). | Prone to mapping all inputs to identical output representations, a failure mode known as 'collapse'. |
| Layer-wise learning | Attenuates the drawbacks of backpropagation by training each layer with its own local objective. | Requires careful choice and management of the layer-wise loss so that it still leads to good generalization. |
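
To make the Forward-Forward entry above concrete, the following is a minimal, hypothetical PyTorch sketch of a single greedily trained layer in the spirit of Hinton's Forward-Forward algorithm. The class name `FFLayer`, the goodness threshold, and the optimizer settings are illustrative assumptions, not the configuration used in this work.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    """One layer trained greedily with the Forward-Forward goodness objective (sketch)."""

    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize the input so a layer cannot exploit the magnitude of the
        # previous layer's goodness.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return F.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = sum of squared activations; push it above the threshold for
        # positive (real) data and below the threshold for negative data.
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        loss = torch.cat([
            F.softplus(self.threshold - g_pos),
            F.softplus(g_neg - self.threshold),
        ]).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Hand detached activations to the next layer: no gradients flow between layers.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

Because each layer optimizes only its own goodness and passes detached activations onward, no error signal is backpropagated through the whole network, which is the property the table contrasts with standard backpropagation.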
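
The 'collapse' limitation noted for self-supervised methods is exactly what VICReg's variance and covariance terms are designed to prevent. The sketch below is a minimal, assumption-laden PyTorch rendering of the VICReg objective of Bardes et al.; the function name `vicreg_loss` and the coefficient values are illustrative defaults, not those of the present paper.

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z_a, z_b, lambda_inv=25.0, mu_var=25.0, nu_cov=1.0, gamma=1.0, eps=1e-4):
    """VICReg loss sketch: invariance + variance + covariance terms on two embeddings."""
    n, d = z_a.shape

    # Invariance: mean-squared error between the embeddings of two views of the same input.
    inv = F.mse_loss(z_a, z_b)

    # Variance: hinge loss keeping the std of every embedding dimension above gamma,
    # which prevents all inputs from collapsing onto the same point.
    std_a = torch.sqrt(z_a.var(dim=0) + eps)
    std_b = torch.sqrt(z_b.var(dim=0) + eps)
    var = torch.mean(F.relu(gamma - std_a)) + torch.mean(F.relu(gamma - std_b))

    # Covariance: decorrelate embedding dimensions by penalizing off-diagonal covariance.
    z_a_c = z_a - z_a.mean(dim=0)
    z_b_c = z_b - z_b.mean(dim=0)
    cov_a = (z_a_c.T @ z_a_c) / (n - 1)
    cov_b = (z_b_c.T @ z_b_c) / (n - 1)
    off_diag = lambda m: m - torch.diag(torch.diag(m))
    cov = off_diag(cov_a).pow(2).sum() / d + off_diag(cov_b).pow(2).sum() / d

    return lambda_inv * inv + mu_var * var + nu_cov * cov
```

Attaching such a loss to the output of each block, rather than using a single end-to-end objective, is the layer-wise direction indicated by the title of this paper.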