JourneyToCoding

Code for Fun

Position embeddings let self-attention perceive sequence order. Without them, self-attention treats a sentence containing the same words in a different order as the same input.
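As a concrete illustration (a minimal sketch, not taken from the post), here are the sinusoidal position embeddings from the original Transformer paper, computed with NumPy and added to token embeddings before self-attention:

```python
import numpy as np

def sinusoidal_position_embeddings(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position embeddings.

    Assumes d_model is even, as in the "Attention Is All You Need" formulation.
    """
    positions = np.arange(seq_len)[:, None]            # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]           # (1, d_model / 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                       # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)                       # odd dimensions: cosine
    return pe

# Random stand-ins for token embeddings; adding the position embeddings gives
# each position a distinct offset, so attention can distinguish word order.
token_embeddings = np.random.randn(16, 64)             # (seq_len, d_model)
inputs = token_embeddings + sinusoidal_position_embeddings(16, 64)
```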

Read more »

Diffusion, Stable Diffusion, Diffusion Transformer.

Read more »

Variational Auto-encoder (VAE).

Read more »