
The Self-Attention Mechanism

Sophisticated input


Vector Set as Input


What is the output?


Sequence Labeling


Self-attention

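The per-position computation covered in this section can be sketched as follows: each input vector a^i is projected into a query q^i, a key k^i, and a value v^i; the attention score between positions i and j is the dot product q^i · k^j, the scores are normalized with a softmax, and the output b^i is the weighted sum of all values. A minimal numpy sketch — the dimensions and random inputs are toy assumptions, not values from the lecture:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Toy setup (assumption): a sequence of 4 input vectors, each of dimension 3.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))    # input vectors a^1 ... a^4
W_q = rng.standard_normal((3, 3))  # learned projection matrices (random here)
W_k = rng.standard_normal((3, 3))
W_v = rng.standard_normal((3, 3))

# For one position i: its query q^i attends over every key k^j.
i = 0
q_i = X[i] @ W_q
scores = np.array([q_i @ (X[j] @ W_k) for j in range(len(X))])  # dot-product scores
alpha = softmax(scores)            # attention weights alpha_{i,1..4}, sum to 1
b_i = sum(alpha[j] * (X[j] @ W_v) for j in range(len(X)))       # output b^i
```

Repeating this loop for every position i produces one output vector per input vector, which is what the matrix form in the next section computes in a single pass.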

Matrix Computation

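In matrix form, the same computation runs for all positions at once: stacking the inputs row-wise into X gives Q = XW^q, K = XW^k, V = XW^v; the full score matrix is QKᵀ, the softmax is taken row by row to get the attention matrix A', and the outputs are O = A'V. A sketch with toy dimensions (the 1/√d scaling follows the original Transformer formulation and is an added assumption):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(1)
N, d = 4, 3                                  # toy sequence length and dimension
X = rng.standard_normal((N, d))              # inputs, one row per vector
W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))

Q = X @ W_q                                  # all queries in one matrix product
K = X @ W_k
V = X @ W_v
A = softmax(Q @ K.T / np.sqrt(d))            # N x N attention matrix, rows sum to 1
O = A @ V                                    # row i of O is the output b^i
```

Every step is a plain matrix multiplication (plus one softmax), which is why self-attention parallelizes so well on GPUs.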

Multi-head Self-attention

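Multi-head self-attention splits the query, key, and value projections into h independent heads, runs attention within each head separately, then concatenates the per-head outputs and mixes them with one more learned matrix. A sketch assuming 2 heads and toy sizes (all dimensions and weights are illustrative):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(2)
N, d_model, h = 4, 8, 2                      # assumption: 2 heads of dimension 4
d_head = d_model // h
X = rng.standard_normal((N, d_model))
W_q, W_k, W_v = (rng.standard_normal((d_model, d_model)) for _ in range(3))
W_o = rng.standard_normal((d_model, d_model))  # output mixing matrix

def split_heads(M):
    # (N, d_model) -> (h, N, d_head): each head sees its own slice of the features
    return M.reshape(N, h, d_head).transpose(1, 0, 2)

Q, K, V = split_heads(X @ W_q), split_heads(X @ W_k), split_heads(X @ W_v)
A = softmax(Q @ K.transpose(0, 2, 1) / np.sqrt(d_head))  # one attention matrix per head
heads = A @ V                                            # (h, N, d_head)
O = heads.transpose(1, 0, 2).reshape(N, d_model) @ W_o   # concatenate heads, then mix
```

Each head can learn a different notion of "relevance" between positions; the final W_o lets the model combine them.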

Positional Encoding

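Self-attention by itself is order-invariant, so a positional vector e^i is added to each input a^i before attention. One common hand-crafted choice is the sinusoidal encoding from the original Transformer paper, sketched below; other (including learned) encodings are possible:

```python
import numpy as np

def sinusoidal_pe(max_len, d_model):
    # Transformer-paper encoding:
    #   PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    #   PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = np.arange(max_len)[:, None]            # positions 0 .. max_len-1
    i = np.arange(0, d_model, 2)[None, :]        # even feature indices
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dims get sine
    pe[:, 1::2] = np.cos(angles)                 # odd dims get cosine
    return pe

PE = sinusoidal_pe(10, 8)  # a distinct vector e^i to add to each input a^i
```

Because each position gets a unique, smoothly varying vector, the model can recover ordering information that the attention computation itself throws away.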

Self-attention for Speech


Self-attention for Image


Self-Attention GAN


Self-attention vs. CNN


Self-attention vs. RNN


Self-attention vs. GNN


To Learn More
