Fig. 12: Attention schematic for feature importance evaluation and multi-objective performance optimization.

The attention mechanism computes per-feature weights through Query-Key-Value interactions; because the resulting attention scores form a normalized distribution over the input features, they provide an interpretable, weighted aggregation for performance prediction.
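The Query-Key-Value computation described above can be sketched with standard scaled dot-product attention. This is a minimal NumPy illustration, not the model from the figure: the projection matrices are random stand-ins for learned weights, and the feature count, dimensions, and function names are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_feature_weights(X, d_k=8, seed=0):
    """Scaled dot-product attention over a set of feature vectors.

    X: (n_features, d_model) matrix, one row per input feature.
    Returns the attended output and the attention weight matrix;
    each row of the weight matrix sums to 1 and can be read as an
    importance distribution over the input features.
    """
    rng = np.random.default_rng(seed)
    d_model = X.shape[1]
    # Random projections stand in for the learned W_Q, W_K, W_V (hypothetical).
    W_q = rng.standard_normal((d_model, d_k))
    W_k = rng.standard_normal((d_model, d_k))
    W_v = rng.standard_normal((d_model, d_k))
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise feature similarity
    weights = softmax(scores, axis=-1)   # normalized -> interpretable weights
    return weights @ V, weights

# Toy input: 5 features, each embedded in a 16-dimensional space.
X = np.random.default_rng(1).standard_normal((5, 16))
out, w = attention_feature_weights(X)
```

Inspecting `w` row by row shows how strongly each feature attends to the others, which is the interpretability property the caption refers to.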