#noindex
= 임베딩,embedding =
A word used in many different fields: ML/NLP, mathematics, typography ([[폰트,font]]), etc.
수학 (math):
[[묻기,embedding]]
https://en.wikipedia.org/wiki/Embedding
graph_embedding - see below
ML/NLP:
{
임베딩,embedding
언임베딩,unembedding
어텐션,attention
MLP,multilayer_perceptron or multi-layer_perceptron
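The embedding/unembedding pair listed above can be sketched as a plain row lookup and a projection back to per-token scores. The vocabulary, dimensions, and numbers below are made up for illustration; this is a minimal sketch, not any particular library's API:

```python
# Minimal sketch (toy sizes, made-up numbers): embedding maps a token id to a
# dense vector; unembedding maps a hidden vector back to per-token scores (logits).
vocab = ["the", "cat", "sat"]          # |V| = 3
d = 2                                  # embedding dimension

# Embedding matrix W_E: one d-dimensional row per vocabulary token.
W_E = [[0.1, 0.3],   # "the"
       [0.7, 0.2],   # "cat"
       [0.4, 0.9]]   # "sat"

def embed(token_id):
    # Embedding is just a row lookup: equivalent to one_hot(token_id) @ W_E.
    return W_E[token_id]

def unembed(vec, W_U):
    # Unembedding: dot the hidden vector with each token's row -> logits.
    return [sum(v * w for v, w in zip(vec, row)) for row in W_U]

# Tying the unembedding to the embedding (W_U = W_E) is a common choice; then
# a token's own embedding tends to score highest for that token.
h = embed(1)                  # vector for "cat"
logits = unembed(h, W_E)      # one score per vocabulary token
print(vocab[logits.index(max(logits))])  # → cat
```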
word_embedding
{
word embedding
rel
원핫,one-hot
단어,word
특징,feature?
Is this 전처리,preprocessing? Always? For a machine to understand 자연어,natural_language, it has to be embedded into a 벡터공간,vector_space (i.e. 단어,word s, sentences, and so on must be turned into 벡터,vector s, elements of a vector space)... chk
links en
Deep Learning, NLP, and Representations - colah's blog
https://colah.github.io/posts/2014-07-NLP-RNNs-Representations/
Explains it in connection with 심층학습,deep_learning
} // word embedding ...
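Related to the 원핫,one-hot point above: one-hot vectors make every pair of distinct words equidistant, while a learned word embedding can place related words near each other, which cosine similarity exposes. A minimal sketch with hand-made (not trained) vectors:

```python
import math

# Illustrative, hand-made 3-d word vectors (assumptions for the sketch, not trained).
emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.0, 0.9],
}

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# One-hot vectors would give cosine 0 for every distinct pair; dense
# embeddings can show that "king" is closer to "queen" than to "apple".
print(cosine(emb["king"], emb["queen"]) > cosine(emb["king"], emb["apple"]))  # → True
```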
} // embedding in ML/NLP
graph_embedding
{
graph embedding
https://en.wikipedia.org/wiki/Graph_embedding - a term from topological graph theory ( https://en.wikipedia.org/wiki/Topological_graph_theory )
rel. 평면그래프,planar_graph
Up: 그래프,graph
} // "graph embedding" ....
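Graph embedding in this topological sense asks whether a graph can be drawn on a surface without edge crossings; for the plane this relates to 평면그래프,planar_graph. A quick necessary condition (a sketch, not a full planarity test) follows from Euler's formula v - e + f = 2:

```python
# Necessary (not sufficient) check for planar embeddability: a simple planar
# graph with v >= 3 vertices has at most 3v - 6 edges, a corollary of
# Euler's formula v - e + f = 2 for planar embeddings.
def may_be_planar(num_vertices, edges):
    v, e = num_vertices, len(edges)
    if v < 3:
        return True
    return e <= 3 * v - 6

# K4 (complete graph on 4 vertices): 6 <= 3*4 - 6 edges, and K4 is in fact planar.
k4 = [(a, b) for a in range(4) for b in range(a + 1, 4)]
print(may_be_planar(4, k4))   # → True

# K5: 10 > 3*5 - 6 = 9 edges, so K5 admits no planar embedding.
k5 = [(a, b) for a in range(5) for b in range(a + 1, 5)]
print(may_be_planar(5, k5))   # → False
```

Note the bound only rules graphs out; a graph passing it may still be non-planar (a full test needs e.g. Kuratowski's theorem).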
order_embedding
{
order embedding
https://en.wikipedia.org/wiki/Order_embedding
Up: 순서,order
} // order embedding ....
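An order embedding is a map f between posets with a ≤ b iff f(a) ≤ f(b); the condition can be checked by brute force on a small example. The example below (divisibility embedding into subset inclusion) is illustrative, chosen here, not from the source:

```python
# Sketch of the order-embedding condition: f: (A, <=_A) -> (B, <=_B) with
#   a <=_A b  <=>  f(a) <=_B f(b)   for all a, b in A.
# Illustrative example: divisibility on the squarefree numbers {1, 2, 3, 6}
# order-embeds into subset inclusion via n |-> its set of prime divisors.
def prime_divisors(n):
    # Naive trial division; fine for tiny illustrative inputs.
    return frozenset(p for p in range(2, n + 1)
                     if n % p == 0 and all(p % q for q in range(2, p)))

A = [1, 2, 3, 6]
f = {n: prime_divisors(n) for n in A}   # 1->{}, 2->{2}, 3->{3}, 6->{2,3}

def divides(a, b):
    return b % a == 0

# Verify: a | b  <=>  f(a) ⊆ f(b) for every pair - the embedding condition.
print(all(divides(a, b) == (f[a] <= f[b]) for a in A for b in A))  # → True
```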
see also and merge: 수학,math#s-1
Other uses: embedded
embedded - used on the computer 하드웨어,hardware side (embedded systems)
https://en.wikipedia.org/wiki/Embedded
https://foldoc.org/embedded system
AKA imbedding
TMP WORKING 2023-08-27
embedding - tentatively '매장, 매입'; but 묻기, 매장, 매입, 끼워넣기, 몰입 all appear as translations.
embedding - 끼워넣기
embedding - one Korean-English dictionary: "a homeomorphic mapping (同相寫像) from one topological space into another"
embedding - coverage varies by dictionary; some have a good entry, others none at all.