Showing 1 - 10 of 46 for search: '"Hajabdollahi, Mohsen"'
In many practical few-shot learning problems, even though labeled examples are scarce, there are abundant auxiliary datasets that potentially contain useful information. We propose the problem of extended few-shot learning to study these scenarios. …
External link:
http://arxiv.org/abs/2012.07176
Author:
Abbasi, Sajjad, Hajabdollahi, Mohsen, Khadivi, Pejman, Karimi, Nader, Roshandel, Roshanak, Shirani, Shahram, Samavi, Shadrokh
Knowledge distillation allows transferring knowledge from a pre-trained model to another. However, it suffers from limitations and constraints related to the two models; for example, they need to be architecturally similar. Knowledge distillation addresses some of the …
External link:
http://arxiv.org/abs/2009.00982
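The entry above notes that classical knowledge distillation assumes the two models are architecturally similar. As a point of reference only (not the paper's method), the sketch below shows one common way around that constraint: a learned 1x1 projection that maps student features into the teacher's channel space before matching them. The module name and layer sizes are illustrative assumptions.

```python
# Generic sketch of feature-level distillation across dissimilar architectures.
# A learned 1x1 projection aligns the student's channels with the teacher's,
# so the two networks do not need to share a structure.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureAdapter(nn.Module):
    def __init__(self, student_channels: int, teacher_channels: int):
        super().__init__()
        # 1x1 convolution maps student channels into the teacher's channel space
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
        projected = self.proj(student_feat)
        # Resize spatially if the two feature maps differ in resolution
        if projected.shape[-2:] != teacher_feat.shape[-2:]:
            projected = F.interpolate(projected, size=teacher_feat.shape[-2:],
                                      mode="bilinear", align_corners=False)
        # Match the (detached) teacher features with an MSE loss
        return F.mse_loss(projected, teacher_feat.detach())

# Example with hypothetical intermediate features of different shapes
adapter = FeatureAdapter(student_channels=64, teacher_channels=256)
loss = adapter(torch.randn(8, 64, 28, 28), torch.randn(8, 256, 14, 14))
```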
Convolutional neural networks (CNNs) have a large number of variables and hence suffer from a complexity problem in their implementation. Different methods and techniques have been developed to alleviate the problem of CNN complexity, such as quantization …
External link:
http://arxiv.org/abs/2003.12621
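The abstract above lists quantization among the complexity-reduction techniques. As a minimal, generic illustration (a textbook affine scheme, not the paper's specific method), the sketch below quantizes a layer's float weights to 8-bit integers and measures the reconstruction error.

```python
# Minimal sketch of uniform 8-bit post-training weight quantization.
import torch

def quantize_uint8(w: torch.Tensor):
    """Map float weights to uint8 with an affine scale and zero point."""
    w_min, w_max = w.min(), w.max()
    scale = (w_max - w_min).clamp(min=1e-8) / 255.0
    zero_point = torch.round(-w_min / scale)
    q = torch.clamp(torch.round(w / scale + zero_point), 0, 255).to(torch.uint8)
    return q, scale, zero_point

def dequantize(q: torch.Tensor, scale: torch.Tensor, zero_point: torch.Tensor) -> torch.Tensor:
    return (q.float() - zero_point) * scale

w = torch.randn(64, 3, 3, 3)                         # e.g. a conv layer's weights
q, scale, zp = quantize_uint8(w)                     # 4x smaller storage than float32
error = (w - dequantize(q, scale, zp)).abs().max()   # quantization error stays small
```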
Unlabeled Data Deployment for Classification of Diabetic Retinopathy Images Using Knowledge Transfer
Convolutional neural networks (CNNs) are extensively beneficial for medical image processing. Medical images are plentiful, but there is a lack of annotated data. Transfer learning is used to address the lack of labeled data and grants CNNs …
External link:
http://arxiv.org/abs/2002.03321
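The entry above relies on transfer learning to cope with scarce annotations. Below is a sketch of the standard recipe that idea usually refers to, not necessarily the paper's pipeline: freeze a backbone pre-trained on a large dataset and retrain only a small classification head on the limited labeled medical images. The class count and optimizer settings are placeholders.

```python
# Sketch of standard transfer learning: frozen pre-trained backbone, new head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pre-trained backbone
for p in model.parameters():
    p.requires_grad = False                      # keep the transferred knowledge fixed
model.fc = nn.Linear(model.fc.in_features, 5)    # e.g. 5 retinopathy grades (assumed)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimization step on a small labeled batch; only the head is updated."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```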
For convolutional neural networks (CNNs) that have a large volume of input data, memory management becomes a major concern. Memory cost reduction can be an effective way to deal with these problems, and it can be realized through different techniques such as …
External link:
http://arxiv.org/abs/2002.03302
Author:
Mousa-Pasandi, Morteza, Hajabdollahi, Mohsen, Karimi, Nader, Samavi, Shadrokh, Shirani, Shahram
Filters are the essential elements in convolutional neural networks (CNNs). Filters correspond to the feature maps and form the main part of the computational and memory requirements of CNN processing. In filter pruning methods, a filter with …
External link:
http://arxiv.org/abs/2002.03299
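The entry above concerns filter pruning. The sketch below shows the general magnitude-based idea that family of methods builds on (not this paper's specific criterion): rank each convolutional filter by its L1 norm and zero out the weakest fraction, which removes the corresponding feature maps. The pruning ratio is an arbitrary example value.

```python
# Sketch of L1-magnitude filter pruning for a single conv layer.
import torch
import torch.nn as nn

def prune_filters_l1(conv: nn.Conv2d, ratio: float = 0.3) -> torch.Tensor:
    """Zero out the `ratio` fraction of output filters with the smallest L1 norm."""
    with torch.no_grad():
        norms = conv.weight.abs().sum(dim=(1, 2, 3))   # one L1 norm per output filter
        n_prune = int(ratio * norms.numel())
        prune_idx = torch.argsort(norms)[:n_prune]     # indices of the weakest filters
        conv.weight[prune_idx] = 0.0                   # remove their feature maps
        if conv.bias is not None:
            conv.bias[prune_idx] = 0.0
    return prune_idx

conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
removed = prune_filters_l1(conv, ratio=0.3)            # 30% of filters zeroed
```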
Convolutional Neural Networks (CNNs) suffer from different issues, such as computational complexity and the number of parameters. In recent years, pruning techniques have been employed to reduce the number of operations and the model size in CNNs. Different pruning …
External link:
http://arxiv.org/abs/2001.04062
There are many research works on designing architectures for deep neural networks (DNNs), known as neural architecture search (NAS) methods. Although there are many automatic and manual techniques for NAS problems, there is no unifying …
External link:
http://arxiv.org/abs/1912.13183
Knowledge distillation (KD) is a new method for transferring knowledge of a structure under training to another one. The typical application of KD is in the form of learning a small model (named the student) from soft labels produced by a complex model …
External link:
http://arxiv.org/abs/1912.13179
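The abstract above describes the classic teacher-student setup: a small student trained on soft labels from a larger model. The sketch below is the standard temperature-softened distillation loss that setup commonly uses; the temperature and blending weight are illustrative choices, not values taken from the paper.

```python
# Minimal sketch of soft-label knowledge distillation (Hinton-style KD loss).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 4.0,
                      alpha: float = 0.7) -> torch.Tensor:
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    # Hard targets: ordinary cross-entropy on the ground-truth labels
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example with random logits for a 10-class problem, batch of 16
loss = distillation_loss(torch.randn(16, 10), torch.randn(16, 10),
                         torch.randint(0, 10, (16,)))
```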
Author:
Hajabdollahi, Mohsen, Esfandiarpoor, Reza, Sabeti, Elyas, Karimi, Nader, Najarian, Kayvan, Soroushmehr, S. M. Reza, Samavi, Shadrokh
Automating the classification and segmentation of abnormal regions in different body organs plays a crucial role in most medical imaging applications, such as funduscopy, endoscopy, and dermoscopy. Detecting multiple abnormalities in each type of …
External link:
http://arxiv.org/abs/1809.05831