논문,thesis

Difference between r1.4 and the current revision

@@ -9,7 +9,7 @@
{
Attention Is All You Need
https://namu.wiki/w/Attention%20Is%20All%20You%20Need
-The paper that introduced the transformer (ANN).
+The paper that introduced the [[트랜스포머,transformer]] ([[인공신경망,artificial_neural_network,ANN]]).

MKL [[BERT,Bidirectional_Encoder_Representations_from_Transformers]]