Fig. 4
From: CalliFormer: a structure-aware transformer for Chinese calligraphy generation

Detailed workflow of the dynamic structural bias generation for the character “好” (hǎo). This two-stage process translates the high-level structural code “⿰” into a position-specific pos_weights vector, which then seeds several factor matrices. A learnable combination of these factors produces the final, detailed attention bias.
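The two-stage flow in the caption can be sketched in code. Everything below is an illustrative assumption, not the paper's implementation: the mapping from the structural code to pos_weights, the choice of three outer-product factor matrices, and the names `pos_weights_for`, `structural_bias`, and `alphas` are all hypothetical stand-ins for the components shown in Fig. 4.

```python
from typing import List

def pos_weights_for(code: str, n: int) -> List[float]:
    """Stage 1 (toy version): map a structural code to a weight per position.

    For "⿰" (left-right composition) we emphasize the first half of the
    positions; this rule is a placeholder for the paper's learned mapping.
    """
    if code == "⿰":
        return [1.0 if i < n // 2 else 0.5 for i in range(n)]
    return [1.0] * n  # fallback: uniform weights

def outer(u: List[float], v: List[float]) -> List[List[float]]:
    """Outer product of two vectors, as a nested list."""
    return [[ui * vj for vj in v] for ui in u]

def structural_bias(code: str, n: int, alphas: List[float]) -> List[List[float]]:
    """Stage 2 (toy version): seed factor matrices from pos_weights and
    combine them with learnable scalar coefficients `alphas`."""
    w = pos_weights_for(code, n)
    factors = [
        outer(w, w),              # symmetric factor
        outer(w, [1.0] * n),      # row-broadcast factor
        outer([1.0] * n, w),      # column-broadcast factor
    ]
    # Learnable combination: bias[i][j] = sum_k alphas[k] * factors[k][i][j]
    return [[sum(a * f[i][j] for a, f in zip(alphas, factors))
             for j in range(n)] for i in range(n)]

bias = structural_bias("⿰", n=4, alphas=[0.5, 0.3, 0.2])
```

In a transformer, such a bias matrix would be added to the pre-softmax attention logits; here the combination weights `alphas` stand in for the learnable parameters mentioned in the caption.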