Table 4 Computational overhead analysis of RNN-Bi-LSTM in spectrum sensing.
From: RNN-Bi-LSTM spectrum sensing algorithm for NOMA waveform with diverse channel conditions
| References | Computational complexity | Overhead discussion |
|---|---|---|
| | Complexity increases with feature dimensions and sequence length: \(O(T \times (H^{2} + I \times H))\) for RNN | Bi-LSTM doubles the RNN overhead due to its forward and backward passes over the sequence; effective but computationally expensive |
| | Bi-LSTM: \(O(2 \times T \times (H^{2} + I \times H))\) | Ranks Bi-LSTM as superior in accuracy but notes significant training time and resource demands |
| | \(O(T \times H^{2})\) for a single direction | Highlights better accuracy with moderate complexity; does not include the bidirectional variant |
| | \(O(n^{2})\) to \(O(n \log n)\) | Emphasizes low complexity but lacks the learning capability of Bi-LSTM |
| | Negligible computational overhead | Not DL-based; used for comparison with ML/DL models |
| | \(O(T \times H^{2})\), less than LSTM | Bi-LSTM overhead exceeds that of GRU; this paper optimizes sensing by reducing the model size |
| | \(O(n \log n)\), low overhead | Non-DL models: efficient but less adaptive to channel changes than Bi-LSTM |
| | Varies by model | Discusses trade-offs; confirms that Bi-LSTM has a higher compute cost but better temporal modeling |
| | Dependent on clustering complexity | Moderate computational cost; Bi-LSTM not used but suggested as a future enhancement |
| | \(O(T \times H^{2})\) | Discusses practical deployment; Bi-LSTM adds complexity that requires acceleration |
| | \(O(nk)\), where \(n\) = samples and \(k\) = components | Efficient for cooperative sensing but lacks the sequential modeling strength of Bi-LSTM |
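To make the scaling relationships in Table 4 concrete, the short sketch below (an illustrative addition, not taken from the referenced works) plugs example dimensions into the listed complexity expressions. The symbols \(T\), \(I\), and \(H\) follow the table's notation for sequence length, input feature size, and hidden units; the specific values and function names are assumed purely for illustration.

```python
# Minimal sketch (not from the paper): evaluate the asymptotic cost expressions
# listed in Table 4 for example dimensions. Symbols follow the table's notation:
#   T = sequence length, I = input feature size, H = hidden units.
# Big-O hides constant factors (e.g. the four LSTM gates), so these numbers
# only illustrate relative scaling, not measured FLOPs.

def rnn_cost(T: int, I: int, H: int) -> int:
    """Unidirectional RNN: O(T * (H^2 + I*H)) operations per sequence."""
    return T * (H * H + I * H)

def bilstm_cost(T: int, I: int, H: int) -> int:
    """Bi-LSTM: forward plus backward pass, O(2 * T * (H^2 + I*H))."""
    return 2 * T * (H * H + I * H)

def gru_cost(T: int, H: int) -> int:
    """GRU-style recurrence as listed in the table: O(T * H^2)."""
    return T * H * H

if __name__ == "__main__":
    # Example dimensions, assumed for illustration only.
    T, I, H = 128, 64, 128
    print(f"RNN     ~ {rnn_cost(T, I, H):,} ops")
    print(f"Bi-LSTM ~ {bilstm_cost(T, I, H):,} ops")
    print(f"GRU     ~ {gru_cost(T, H):,} ops")
```

With these assumed dimensions the Bi-LSTM expression evaluates to twice the RNN count, which is the 2x factor the table attributes to the forward and backward passes.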