Difference between r1.8 and the current
Opp. [[디코더,decoder]]
[[encoder]]
Sub:
priority_encoder
{
priority encoder
'''priority encoder'''
} // priority encoder ...
A decoder has n input terminals and (at most?) 2^n output terminals.
An encoder has (at most?) 2^n input terminals and n output terminals.
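The n vs. 2^n terminal counts can be illustrated with a small truth-table sketch in Python (the names `decode`/`encode` are mine, not from this entry):

```python
# Sketch only: a decoder (n inputs -> 2^n one-hot outputs) and an
# encoder (2^n one-hot inputs -> n output bits), as pure functions.

def decode(bits):
    """n input bits -> one-hot list of 2**n output lines."""
    n = len(bits)
    index = int("".join(str(b) for b in bits), 2)   # binary code -> line number
    return [1 if i == index else 0 for i in range(2 ** n)]

def encode(lines):
    """One-hot list of 2**n input lines -> n output bits (inverse of decode)."""
    n = (len(lines) - 1).bit_length()
    index = lines.index(1)                          # which line is active
    return [int(b) for b in format(index, "0%db" % n)]
```

For example, `decode([1, 0])` activates output line 2 of 4, giving `[0, 0, 1, 0]`, and `encode` maps that back to `[1, 0]`.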
{
=부호화기 (encoder), ?
Opp. 디코더,decoder
Sub:
priority_encoder
{
priority encoder
} // priority encoder ...
}
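A priority encoder differs from a plain encoder in that several inputs may be active at once; only the highest-priority active line determines the output. A minimal sketch, assuming higher-numbered lines have higher priority (a common but not universal convention):

```python
def priority_encode(lines):
    """Return (index, valid): index of the highest-numbered active input
    line, and a valid flag that is False when no line is active."""
    # Scan from the highest-priority (highest-numbered) line downward.
    for i in range(len(lines) - 1, -1, -1):
        if lines[i]:
            return i, True
    return 0, False   # no active input: index output is not meaningful
```

For example, `priority_encode([1, 0, 1, 0])` returns `(2, True)`: line 2 wins even though line 0 is also active.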
MKL
BERT,Bidirectional_Encoder_Representations_from_Transformers |=,BERT BERT
{
Bidirectional Encoder Representations from Transformers (BERT)
MKL
Attention_Is_All_You_Need
인코더,encoder
트랜스포머,transformer
표현,representation
임베딩,embedding
BERT_(언어_모델)
BERT_(language_model)
...
BERT
Up: 언어모형,language_model
}