Transformer Based Implementation for Automatic Book Summarization
Author: Porwal, Siddhant; Bewoor, Laxmi; Deshpande, Vivek
Language: English
Year of publication: 2022
Subject: FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Computer Science - Computation and Language (cs.CL); Computer Science - Artificial Intelligence (cs.AI); Transformer; Summarization; Extractive; Abstractive
Source: International Journal of Intelligent Systems and Applications in Engineering; Vol. 10 No. 3s (2022); 123-128
ISSN: 2147-6799
Description: Document summarization is the procedure of generating a meaningful and concise summary of a given document that includes its relevant and topic-important points. There are two approaches: one picks the most relevant statements from the document itself and adds them to the summary (Extractive Summarization); the other generates new sentences for the summary (Abstractive Summarization). Training a machine learning model to perform tasks that are time-consuming or very difficult for humans to evaluate is a major challenge, and book abstract generation is one such complex task. Traditional machine learning models are being replaced by pre-trained transformers. Transformer-based language models trained in a self-supervised fashion are gaining a lot of attention when fine-tuned for Natural Language Processing (NLP) downstream tasks such as text summarization. This work is an attempt to use transformer-based techniques for abstract generation. Published at - https://ijisae.org/index.php/IJISAE/article/view/2421
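The extractive approach described above (selecting the most relevant sentences verbatim from the document) can be sketched with a simple word-frequency scorer. This is an illustrative sketch only, not the method used in the paper; the function name and the frequency-based scoring heuristic are assumptions for demonstration.

```python
import re
from collections import Counter

def extractive_summary(document: str, k: int = 2) -> str:
    """Illustrative extractive summarizer (not the paper's method):
    pick the k highest-scoring sentences from the document itself,
    scoring each sentence by the average corpus frequency of its words."""
    sentences = re.split(r"(?<=[.!?])\s+", document.strip())
    freq = Counter(re.findall(r"[a-z']+", document.lower()))

    def score(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        # Average frequency rewards sentences dense in topic-important words.
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = set(sorted(sentences, key=score, reverse=True)[:k])
    # Preserve the original document order in the summary.
    return " ".join(s for s in sentences if s in top)
```

By contrast, an abstractive summarizer (the paper's focus) would generate new sentences with a fine-tuned transformer rather than copy sentences from the source.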
Database: OpenAIRE