Studying Sentence Generating Function using Neural Network
Author: Jau-Chi Huang, 黃昭綺
Year of publication: 2002
Document type: thesis (學位論文)
Description: A sentence generating function is a function that outputs words in sequence such that those words form sentences. Generally speaking, the human brain itself is a sentence generating function, and producing such a function is the ultimate goal of language processing; current techniques, however, still perform poorly. Building on language-processing results to date, this thesis studies whether it is practicable to use a neural network as a sentence generating function.

Continuing earlier work of our lab [1][2], we process language with three methods: Non-negative Matrix Factorization, Self-Organizing Maps, and the Elman network. Applying each method to eight works of Mark Twain, we obtain three different semantic codes, perform semantic retrieval with each, and compare the results. Exploiting the Elman network's ability to predict the next word, we feed a query sentence to the trained network and append the predicted word to the query sentence before doing semantic retrieval. Furthermore, we can feed an initial word (word1) to the trained Elman network, which outputs a predicted next word (word2); feeding word1 and word2 in sequence yields another next word (word3). Repeating these steps produces a sequence of words. If the network predicts each next word correctly, these words can make up sentences, which is the concept of a sentence generating function. Therefore, using an Elman network as a sentence generating function is practicable.

This work was partially supported by the National Science Council, ROC, under contract number NSC 90-2213-E-002-092.
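The iterative generation loop described above (feed the sequence so far, take the predicted next word, append, repeat) can be sketched as follows. This is a minimal illustration only: the network sizes, weight initialization, toy vocabulary, and the names `ElmanNetwork`, `predict_next`, and `generate` are assumptions for the sketch, not the thesis's actual configuration, and the network here is untrained.

```python
import numpy as np

class ElmanNetwork:
    """Toy Elman (simple recurrent) net: the hidden state feeds back as context."""
    def __init__(self, vocab_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        # Illustrative random weights; the thesis uses a trained network.
        self.W_in = rng.normal(0.0, 0.1, (hidden_size, vocab_size))
        self.W_ctx = rng.normal(0.0, 0.1, (hidden_size, hidden_size))
        self.W_out = rng.normal(0.0, 0.1, (vocab_size, hidden_size))
        self.hidden_size = hidden_size
        self.vocab_size = vocab_size

    def predict_next(self, word_ids):
        """Feed the word sequence so far; return the id of the predicted next word."""
        h = np.zeros(self.hidden_size)
        for w in word_ids:
            x = np.zeros(self.vocab_size)
            x[w] = 1.0                                   # one-hot input word
            h = np.tanh(self.W_in @ x + self.W_ctx @ h)  # Elman recurrence
        return int(np.argmax(self.W_out @ h))            # most likely next word

def generate(net, vocab, start_word, length=5):
    """Repeatedly append the predicted next word, as the abstract describes."""
    ids = [vocab.index(start_word)]
    for _ in range(length):
        ids.append(net.predict_next(ids))
    return [vocab[i] for i in ids]

vocab = ["the", "cat", "sat", "on", "mat", "."]
net = ElmanNetwork(vocab_size=len(vocab), hidden_size=8)
print(generate(net, vocab, "the"))  # untrained net, so the words are arbitrary
```

With a well-trained network in place of the random one, each predicted word would be a plausible continuation, and the emitted sequence would read as a sentence — the sentence generating function the abstract describes.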
Database: Networked Digital Library of Theses & Dissertations
External link: