Author: |
Harouna Soumare, Alia Benkahla, Nabil Gmati |
Language: |
English |
Year of publication: |
2021 |
Subject: |
|
Source: |
Array, Vol 11, Pp 100068 (2021) |
Document type: |
article |
ISSN: |
2590-0056 |
DOI: |
10.1016/j.array.2021.100068 |
Description: |
Deep Learning algorithms have achieved great success in many domains where large-scale datasets are available. However, training these algorithms on high-dimensional data requires adjusting many parameters, and avoiding overfitting is difficult. Regularization techniques such as L1 and L2 are used to keep the parameters of the training model from growing large. Another commonly used regularization method, Dropout, randomly removes some hidden units during the training phase. In this work, we describe some architectures of Deep Learning algorithms, explain the optimization process used to train them, and attempt to establish a theoretical relationship between L2 regularization and Dropout. We experimentally compare the effect of these techniques on the learning model using genomics datasets. |
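The abstract contrasts L2 regularization (penalizing large weights) with Dropout (randomly removing hidden units during training). A minimal NumPy sketch of both ideas follows; the penalty coefficient `lam`, keep probability `p`, and the inverted-dropout rescaling convention are illustrative assumptions, not details taken from the article itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weight matrix of one hidden layer and its input activations.
W = rng.normal(size=(4, 3))
h = rng.normal(size=(1, 4))

# L2 regularization adds lam * ||W||^2 to the training loss,
# which discourages the weights from growing large.
lam = 0.01  # assumed regularization strength
l2_penalty = lam * np.sum(W ** 2)

# Dropout: each hidden unit is kept with probability p; surviving
# activations are rescaled by 1/p (inverted dropout) so the expected
# activation is unchanged at test time.
p = 0.5  # assumed keep probability
mask = rng.random(h.shape) < p
h_dropped = (h * mask) / p
```

At test time no units are dropped and no rescaling is applied; the rescaling during training is what keeps the two phases statistically consistent.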
Database: |
Directory of Open Access Journals |
External link: |
|