Fig. 1: Overview of the ERNIE-RNA model architecture and application.

From: ERNIE-RNA: an RNA language model with structure-enhanced representations

ERNIE-RNA incorporates RNA structural information into the self-attention mechanism. a In the pre-training stage, ERNIE-RNA, consisting of 12 transformer layers, was pre-trained on 20.4 million non-coding RNA sequences from RNAcentral via self-supervised learning. b In the fine-tuning stage, ERNIE-RNA provides attention maps and token embeddings that encode rich structural and semantic RNA features, achieving state-of-the-art performance on diverse downstream tasks spanning structure prediction and functional annotation.
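To illustrate the idea of injecting RNA structural information into self-attention, the sketch below shows one common way to do this: a pairwise bias matrix, derived here from simple base-pairing rules, is added to the attention logits before the softmax. This is a minimal, hypothetical sketch, not the authors' implementation; the class name `StructureBiasedSelfAttention`, the helper `pairing_bias`, the bias value of 2.0, and the choice to apply the bias to every head are all assumptions made for illustration.

```python
# Minimal sketch (assumed design, not ERNIE-RNA's actual code): self-attention
# whose logits are shifted by a structure-derived pairwise bias.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StructureBiasedSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 12):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, pair_bias: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); pair_bias: (batch, seq_len, seq_len)
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape to (batch, heads, seq_len, head_dim)
        q = q.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        logits = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
        # add the structural prior to every head's attention logits (assumption)
        logits = logits + pair_bias.unsqueeze(1)
        attn = F.softmax(logits, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)


def pairing_bias(seq: str, paired: float = 2.0) -> torch.Tensor:
    """Toy pairwise bias: reward Watson-Crick and wobble-compatible positions.

    The value 2.0 is an illustrative placeholder, not a published hyperparameter.
    """
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    bias = torch.zeros(n, n)
    for i in range(n):
        for j in range(n):
            if (seq[i], seq[j]) in pairs:
                bias[i, j] = paired
    return bias


if __name__ == "__main__":
    seq = "GGGAAACCC"
    x = torch.randn(1, len(seq), 768)            # dummy token embeddings
    bias = pairing_bias(seq).unsqueeze(0)        # (1, seq_len, seq_len)
    layer = StructureBiasedSelfAttention(dim=768, num_heads=12)
    print(layer(x, bias).shape)                  # torch.Size([1, 9, 768])
```

The design choice sketched here (adding a structural prior to attention logits rather than to the token embeddings) keeps the bias position-pair specific, which matches the caption's point that attention maps carry structural signal; the exact form and placement of the bias in ERNIE-RNA should be taken from the paper itself.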