Sequence-to-sequence Bangla Sentence Generation with LSTM Recurrent Neural Networks
Author: Syed Akhter Hossain, Sadia Sultana Sharmin Mousumi, Md. Sanzidul Islam, Sheikh Abujar
Year of publication: 2019
Subject: closed captioning, machine translation, language identification, speech recognition, deep learning, natural language generation, image captioning, video captioning, Bengali, recurrent neural network (RNN), Long Short-Term Memory (LSTM), artificial intelligence, word prediction
Source: Procedia Computer Science, 152:51-58
ISSN: 1877-0509
Description: Sequence-to-sequence text generation is an efficient approach for automatically converting the script of a word from a source sequence to a target sequence. Text generation is an application of natural language generation used in sequence-modeling tasks such as machine translation, speech recognition, image captioning, language identification, and video captioning. In this paper we discuss Bangla text generation using a deep learning approach, Long Short-Term Memory (LSTM), a special kind of recurrent neural network (RNN). LSTM networks are well suited to analyzing sequences of text data and predicting the next word, and they are a sound choice when the goal is to predict the very next point of a given time sequence. We propose an LSTM-based Bangla text generator, one of the earliest for this language, and validate the model with a satisfactory accuracy rate.
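The abstract frames the task as next-word prediction over word sequences. As a minimal sketch of how such a model is typically trained (the paper does not publish its preprocessing code, so the function name, window size, and the transliterated placeholder corpus below are illustrative assumptions, not the authors' implementation), the corpus is first sliced into fixed-length contexts paired with the word that follows each context:

```python
def build_next_word_pairs(text, seq_len=3):
    """Slice a corpus into (context, next-word) training pairs,
    the supervision format used to train a next-word LSTM
    language model. Hypothetical helper, not the paper's code."""
    words = text.split()
    pairs = []
    for i in range(len(words) - seq_len):
        context = words[i:i + seq_len]   # seq_len consecutive words
        target = words[i + seq_len]      # the word the model must predict
        pairs.append((context, target))
    return pairs

# Placeholder transliterated corpus (illustrative only)
corpus = "ami bhat khai ami school jai"
pairs = build_next_word_pairs(corpus, seq_len=2)
# First pair: context ['ami', 'bhat'] with target 'khai'
```

Each context would then be mapped to integer word indices and fed to an embedding layer followed by an LSTM layer, with a softmax over the vocabulary selecting the predicted next word.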
Database: OpenAIRE
External link: