
Information processing device, learning method, and storage medium

Patent No.
US11176327B2
Publication date
2021-11-16
Applicant
FUJITSU LIMITED (JP Kawasaki)
Inventor
Yuji Mizobuchi
IPC classification
G06F40/58; G06F40/30; G06F16/00; G06F40/45; G06F40/216; G06F40/284; G06N20/00
Technical field
word, learning, language, words, parameter, in, section, target, space, vector
Region: Kawasaki

Abstract

A non-transitory computer-readable storage medium storing a program that causes a computer to execute a process, the process includes learning distributed representations of words included in a word space of a first language using a learner for learning the distributed representations; classifying words included in a word space of a second language different from the first language into words common to words included in the word space of the first language and words not common to words included in the word space of the first language; and replacing distributed representations of the common words included in the word space of the second language with distributed representations of the words, corresponding to the common words, in the first language and adjusting a parameter of the learner.

Description

Then, the word classifying section 12 classifies words into a group of words appearing in common between the reference language and the target language and a group of words uncommon between the reference language and the target language (in step S30). For example, the word classifying section 12 executes morphological analysis on the target language learning corpus 22 and outputs words indicated by results of the analysis. The word classifying section 12 uses the words indicated by the results of the analysis and the alignment dictionary 23 to produce correspondence relationships between words in the target language and words in the reference language. Then, the word classifying section 12 uses the correspondence relationships to classify the words included in the word space of the target language into a group of words common to words included in the word space of the reference language and a group of words not common to words included in the word space of the reference language.
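The following is a minimal sketch of the classification in step S30, assuming that morphological analysis has already produced the list of target-language words and that the alignment dictionary 23 maps each target-language word to a reference-language word. All names here (classify_words, alignment_dict, reference_vocab) are illustrative assumptions, not identifiers from the patent.

def classify_words(target_words, reference_vocab, alignment_dict):
    """Split target-language words into those that correspond to a word in the
    reference-language word space (common) and those that do not (uncommon)."""
    common, uncommon = {}, set()
    for word in set(target_words):
        ref_word = alignment_dict.get(word)            # target -> reference correspondence
        if ref_word is not None and ref_word in reference_vocab:
            common[word] = ref_word                    # keep the correspondence relationship
        else:
            uncommon.add(word)
    return common, uncommon

# Toy usage (the Japanese/English word pairs are assumptions for illustration):
target_words = ["猫", "犬", "縁側(cè)"]
reference_vocab = {"cat", "dog", "house"}
alignment_dict = {"猫": "cat", "犬": "dog"}            # e.g. built from a bilingual dictionary
common, uncommon = classify_words(target_words, reference_vocab, alignment_dict)
# common  -> {"猫": "cat", "犬": "dog"}; uncommon -> {"縁側(cè)"}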

Subsequently, the parameter adjusting section 13 adjusts a learning parameter W′ for the distributed representation learning for the group of the words appearing in common and included in the word space of the target language (in step S40). For example, the parameter adjusting section 13 sequentially selects the words from among the group of the words appearing in common and replaces distributed representations of the selected words with distributed representations of the words in the reference language. For example, the parameter adjusting section 13 replaces the hidden layer of the Skip-gram model with the distributed representations of the words in the reference language. Then, the parameter adjusting section 13 adjusts the weight W′ (an N×V matrix) that is the parameter between the hidden layer and the output layer.
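Below is a minimal NumPy sketch of step S40, assuming a Skip-gram model whose hidden layer is given by the input embedding matrix W (V×N rows of word vectors) and whose hidden-to-output parameter is W′ (N×V). The plain softmax cross-entropy update and every name in the sketch are simplifying assumptions, not the patent's actual procedure.

import numpy as np

def adjust_output_weights(W, W_prime, ref_embeddings, common, vocab, pairs, lr=0.025):
    # Replace the hidden-layer (input embedding) rows of the common words with the
    # reference-language distributed representations.
    for tgt_word, ref_word in common.items():
        W[vocab[tgt_word]] = ref_embeddings[ref_word]

    # Adjust only W_prime, the N x V parameter between the hidden layer and the
    # output layer, from (center word, context word) training pairs.
    for center, context in pairs:
        h = W[vocab[center]]                 # hidden-layer vector, length N
        scores = h @ W_prime                 # one score per vocabulary word, length V
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()                 # softmax over the output layer
        probs[vocab[context]] -= 1.0         # gradient of the loss w.r.t. the scores
        W_prime -= lr * np.outer(h, probs)   # update the output-side weight only
    return W, W_prime

# Toy usage (all sizes and words are assumptions): 4 target words, embedding size N = 3.
vocab = {"猫": 0, "犬": 1, "縁側(cè)": 2, "庭": 3}
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
W_prime = rng.normal(size=(3, 4))
ref_embeddings = {"cat": rng.normal(size=3), "dog": rng.normal(size=3)}
common = {"猫": "cat", "犬": "dog"}
pairs = [("猫", "庭"), ("犬", "縁側(cè)")]
W, W_prime = adjust_output_weights(W, W_prime, ref_embeddings, common, vocab, pairs)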

Claims

1