Attention_Is_All_You_Need
{
Attention Is All You Need
https://namu.wiki/w/Attention%20Is%20All%20You%20Need
The paper that introduced the [[트랜스포머,transformer]] ([[인공신경망,artificial_neural_network,ANN]]).
MKL [[BERT,Bidirectional_Encoder_Representations_from_Transformers]]
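The paper's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal numpy sketch of that formula; the function name, shapes, and the tiny usage example are illustrative assumptions, not anything defined on this page.

# Minimal sketch of scaled dot-product attention from the paper:
# Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.
# Variable names and shapes below are illustrative assumptions.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q: (seq_len_q, d_k), K: (seq_len_k, d_k), V: (seq_len_k, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)              # (seq_len_q, seq_len_k)
    # numerically stable softmax over the key axis
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # (seq_len_q, d_v)

# tiny usage example (hypothetical sizes)
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))    # 3 query positions, d_k = 8
K = rng.normal(size=(5, 8))    # 5 key positions
V = rng.normal(size=(5, 16))   # d_v = 16
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)               # (3, 16)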