
Information processing device, learning method, and storage medium

Patent Number
US11176327B2
Publication Date
2021-11-16
Applicant
FUJITSU LIMITED (Kawasaki, JP)
Inventor
Yuji Mizobuchi
IPC Classification
G06F40/58; G06F40/30; G06F16/00; G06F40/45; G06F40/216; G06F40/284; G06N20/00
Technical Field
word, learning, language, words, parameter, in, section, target, space, vector
Region: Kawasaki

Abstract

A non-transitory computer-readable storage medium storing a program that causes a computer to execute a process, the process includes learning distributed representations of words included in a word space of a first language using a learner for learning the distributed representations; classifying words included in a word space of a second language different from the first language into words common to words included in the word space of the first language and words not common to words included in the word space of the first language; and replacing distributed representations of the common words included in the word space of the second language with distributed representations of the words, corresponding to the common words, in the first language and adjusting a parameter of the learner.
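The classification and replacement steps described in the abstract can be pictured with a short sketch. This is a minimal illustration only, not the patented implementation; the function and variable names (align_embeddings, source_vectors, target_vectors, bilingual_dict) are hypothetical.

```python
def align_embeddings(source_vectors, target_vectors, bilingual_dict):
    """Sketch of the abstract's steps (hypothetical names throughout).

    source_vectors: {first-language word: vector}
    target_vectors: {second-language word: vector}
    bilingual_dict: {second-language word: first-language word}
    """
    common, not_common = {}, {}
    for word, vec in target_vectors.items():
        counterpart = bilingual_dict.get(word)
        if counterpart is not None and counterpart in source_vectors:
            # Common word: replace its representation with the
            # corresponding first-language distributed representation.
            common[word] = source_vectors[counterpart]
        else:
            # Not common: keep its own representation for further learning.
            not_common[word] = vec
    return common, not_common
```

In this sketch the replaced vectors for common words would then anchor the second-language word space while the learner's parameters are adjusted for the remaining words.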

Description

The alignment dictionary 23 includes correspondence relationships between words, relating representations in the reference language to representations in the target language. For example, when the reference language is English and the target language is Japanese, the alignment dictionary 23 includes correspondence relationships between English representations and Japanese representations of words. An example of the alignment dictionary 23 is Weblio, an online English dictionary service in Japan.
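For illustration only, the alignment dictionary can be thought of as a simple word-to-word mapping; the entries below are a hypothetical stand-in, not data from the patent or from Weblio.

```python
# Hypothetical stand-in for the alignment dictionary 23: each
# reference-language (English) word maps to its target-language
# (Japanese) counterpart.
alignment_dictionary = {
    "dog": "犬",
    "cat": "猫",
    "information": "情報",
}

# Look up the target-language representation of a reference-language word.
print(alignment_dictionary["information"])  # -> 情報
```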

The distributed representation learning section 11 uses a technique for producing distributed representations of words to learn a distributed representation of each word included in a word space of the reference language. For example, the distributed representation learning section 11 receives the reference language learning corpus 21 and uses the Skip-gram model of Word2Vec to learn distributed representations of the words included in the reference language learning corpus 21. The learning using the Skip-gram model of Word2Vec is executed using an existing technique. The learning may be executed using a technique disclosed in “Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient Estimation of Word Representations in Vector Space. In Proceedings of Workshop at ICLR, 2013”. Alternatively, the learning may be executed using a technique disclosed in “Xin Rong. word2vec Parameter Learning Explained”.
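As an illustration of this step, the sketch below trains Skip-gram embeddings with the gensim library. gensim is not named in the patent, and the toy corpus and parameter values are assumptions; the vector_size keyword follows gensim 4.x.

```python
from gensim.models import Word2Vec

# Toy stand-in for the reference language learning corpus 21
# (hypothetical sentences, already tokenized).
corpus = [
    ["the", "quick", "brown", "fox", "jumps", "over", "the", "lazy", "dog"],
    ["information", "processing", "device", "and", "learning", "method"],
]

# sg=1 selects the Skip-gram model, as in the description above.
model = Word2Vec(
    sentences=corpus,
    vector_size=100,   # dimensionality of the distributed representations
    window=5,          # context window size
    min_count=1,       # keep every word in this tiny corpus
    sg=1,              # 1 = Skip-gram, 0 = CBOW
)

# Distributed representation (vector) learned for a word.
vector = model.wv["learning"]
print(vector.shape)  # (100,)
```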

Claims

1