Showing 1 - 10 of 7,506
for search: '"Knyazev, P."'
Published in:
Терапевтический архив, Vol 91, Iss 4, Pp 53-61 (2019)
Aim. To compare fecal calprotectin (FC) concentration with laboratory and diagnostic methods in patients with inflammatory bowel diseases (IBD). Materials and methods. The level of FC was measured in 110 patients with established IBD. Crohn diseases
External link:
https://doaj.org/article/52e63ef23b5443098e7bf7431ca382e5
Author:
Knyazev, Boris, Moudgil, Abhinav, Lajoie, Guillaume, Belilovsky, Eugene, Lacoste-Julien, Simon
Neural network training can be accelerated when a learnable update rule is used in lieu of classic adaptive optimizers (e.g. Adam). However, learnable update rules can be costly and unstable to train and use. Recently, Jang et al. (2023) proposed a s
External link:
http://arxiv.org/abs/2409.04434
Author:
Knyazev, Yu. V., Balaev, D. A., Stolyar, S. V., Shokhrina, A. O., Velikanov, D. A., Pankrats, A. I., Vorotynov, A. M., Krasikov, A. A., Skorobogatov, S. A., Volochaev, M. N., Bayukov, O. A., Iskhakov, R. S.
The relation between the magnetically dead layer and structural defects in ultrafine interacting NiFe2O4 nanoparticles ( = 4 nm) has been investigated using transmission electron microscopy, X-ray diffraction, ^57Fe Mössbauer spectroscopy, and dc ma
External link:
http://arxiv.org/abs/2408.16203
Generating novel molecules is challenging, with most representations leading to generative models producing many invalid molecules. Spanning Tree-based Graph Generation (STGG) is a promising approach to ensure the generation of valid molecules, outpe
External link:
http://arxiv.org/abs/2407.09357
Author:
Thérien, Benjamin, Joseph, Charles-Étienne, Knyazev, Boris, Oyallon, Edouard, Rish, Irina, Belilovsky, Eugene
Learned optimizers (LOs) can significantly reduce the wall-clock training time of neural networks, substantially reducing training costs. However, they can struggle to optimize unseen tasks (meta-generalize), especially when training networks much la
External link:
http://arxiv.org/abs/2406.00153
LoGAH: Predicting 774-Million-Parameter Transformers using Graph HyperNetworks with 1/100 Parameters
A good initialization of deep learning models is essential since it can help them converge better and faster. However, pretraining large models is unaffordable for many researchers, which makes a desired prediction for initial parameters more necessa
External link:
http://arxiv.org/abs/2405.16287
Author:
Kofinas, Miltiadis, Knyazev, Boris, Zhang, Yan, Chen, Yunlu, Burghouts, Gertjan J., Gavves, Efstratios, Snoek, Cees G. M., Zhang, David W.
Neural networks that process the parameters of other neural networks find applications in domains as diverse as classifying implicit neural representations, generating neural network weights, and predicting generalization errors. However, existing ap
External link:
http://arxiv.org/abs/2403.12143
Author:
Manenti, Laura, Pepe, Carlo, Sarnoff, Isaac, Ibrayev, Tengiz, Oikonomou, Panagiotis, Knyazev, Artem, Monticone, Eugenio, Garrone, Hobey, Alder, Fiona, Fawwaz, Osama, Millar, Alexander J., Morå, Knut Dundas, Shams, Hamad, Arneodo, Francesco, Rajteri, Mauro
Superconducting transition-edge sensors (TESs) are a type of quantum sensor known for their high single-photon detection efficiency and low background. This makes them ideal for particle physics experiments searching for rare events. In this work, we p
External link:
http://arxiv.org/abs/2402.03073
Author:
Joseph, Charles-Étienne, Thérien, Benjamin, Moudgil, Abhinav, Knyazev, Boris, Belilovsky, Eugene
Communication-efficient variants of SGD, specifically local SGD, have received a great deal of interest in recent years. These approaches compute multiple gradient steps locally, that is on each worker, before averaging model parameters, helping reli
External link:
http://arxiv.org/abs/2312.02204