Table 4 Computational overhead analysis of RNN-Bi-LSTM in Spectrum Sensing.

From: RNN-Bi-LSTM spectrum sensing algorithm for NOMA waveform with diverse channel conditions

| Reference | Computational complexity | Overhead discussion |
| --- | --- | --- |
| 9 | \(O(T \times (H^{2} + I \times H))\) for RNN; complexity increases with feature dimensions and sequence length | Bi-LSTM doubles the RNN overhead due to its forward and backward passes; effective but computationally expensive |
| 10 | Bi-LSTM: \(O(2 \times T \times (H^{2} + I \times H))\) | Rates Bi-LSTM as superior in accuracy but notes significant training time and resource demands |
| 11 | \(O(T \times H^{2})\) for a single direction | Highlights better accuracy with moderate complexity; does not include the bidirectional variant |
| 12 | \(O(n^{2})\) to \(O(n \log n)\) | Emphasizes low complexity but lacks Bi-LSTM's learning capability |
| 13 | Negligible computational overhead | Not DL-based; used for comparison with ML/DL models |
| 14 | \(O(T \times H^{2})\), less than LSTM | Bi-LSTM overhead exceeds GRU's; this paper optimizes sensing by reducing model size |
| 15 | \(O(n \log n)\), low overhead | Non-DL models: efficient but less adaptive to channel changes than Bi-LSTM |
| 16 | Varies by model | Discusses trade-offs; confirms Bi-LSTM has higher compute cost but better temporal modeling |
| 17 | Dependent on clustering complexity | Moderate computational cost; Bi-LSTM not used but suggested as a future enhancement |
| 18 | \(O(T \times H^{2})\) | Discusses practical deployment; Bi-LSTM adds complexity requiring acceleration |
| 19 | \(O(nk)\), where \(n\) = samples and \(k\) = components | Efficient for cooperation but lacks the sequential modeling strength of Bi-LSTM |
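As a rough illustration of the table's leading entries, the per-sequence multiply counts implied by the RNN and Bi-LSTM complexity expressions can be compared directly. This is a minimal sketch, not code from the paper; the dimensions \(T\), \(I\), and \(H\) below are assumed example values.

```python
# Illustrative sketch of the complexity expressions in Table 4:
# input size I, hidden size H, sequence length T (all assumed values).

def rnn_ops(T, I, H):
    """Vanilla RNN: O(T * (H^2 + I*H)) multiplies per sequence (row 9)."""
    return T * (H * H + I * H)

def bi_lstm_ops(T, I, H):
    """Bi-LSTM: roughly twice the RNN cost, one factor per direction (row 10)."""
    return 2 * T * (H * H + I * H)

T, I_dim, H = 128, 32, 64  # example dimensions, not taken from the paper
print(rnn_ops(T, I_dim, H))      # 786432
print(bi_lstm_ops(T, I_dim, H))  # 1572864, exactly double the RNN count
```

The doubling captures only the forward/backward structure; per-cell constants (an LSTM cell's four gates versus a plain RNN's single transform) are absorbed into the big-O and would widen the gap in practice.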