Showing 1 - 10 of 655 for search: '"distributed training"'
Published in:
Proceedings of the XXth Conference of Open Innovations Association FRUCT, Vol 36, Iss 1, Pp 649-654 (2024)
Federated learning is an increasingly common technique used within machine learning that allows multiple devices to collectively train a model without necessitating the centralization of data. This approach is highly valuable within medical tasks … (an illustrative sketch follows this entry)
External link:
https://doaj.org/article/87e080f3f3c642f7ab8908bc71f2c5ee
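To make the federated setup described in this entry concrete, the following is a minimal federated-averaging (FedAvg) sketch on a toy linear model. The model, client data, learning rate, and round count are illustrative assumptions and are not taken from the cited article.

    # Minimal FedAvg sketch on a toy linear model; all data and sizes are assumed.
    import numpy as np

    rng = np.random.default_rng(0)

    def local_sgd(w, X, y, lr=0.1, epochs=5):
        # One client's local update: plain gradient descent on squared error.
        w = w.copy()
        for _ in range(epochs):
            grad = 2.0 * X.T @ (X @ w - y) / len(y)
            w -= lr * grad
        return w

    # Three clients, each holding private data that never leaves the device.
    true_w = np.array([2.0, -1.0])
    clients = []
    for _ in range(3):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + 0.1 * rng.normal(size=50)
        clients.append((X, y))

    global_w = np.zeros(2)
    for _ in range(10):                                   # communication rounds
        local_ws = [local_sgd(global_w, X, y) for X, y in clients]
        sizes = [len(y) for _, y in clients]
        # The server aggregates by data-size-weighted averaging of weights only.
        global_w = np.average(local_ws, axis=0, weights=sizes)

    print("global model after FedAvg:", global_w)

Only model weights travel between clients and the server in this sketch; the raw per-client arrays stay local, which is the property the abstract highlights for medical data.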
Published in:
BMC Medical Education, Vol 24, Iss 1, Pp 1-12 (2024)
Introduction: To ensure that pre-final year medical students at Stellenbosch University were able to resume clinical training during the COVID-19 pandemic, a 12-week integrated rotation was introduced, during which students were distributed …
External link:
https://doaj.org/article/7a9168874c124dee97432ff030e7fcb0
Author:
Hatsapon Teparrukkul, Pravej Serichetaphongse, Wareerat Chengprapakorn, Sirida Arunjaroensuk, Nikos Mattheos, Atiphan Pimkhaokham
Published in:
Journal of Dental Sciences, Vol 19, Iss , Pp S122-S127 (2024)
Background/Purpose: The increasing importance of computer assisted implant surgery (CAIS) in the practice of implant dentistry calls for adequate education and training of clinicians. However, limited evidence exists to support optimal educational …
External link:
https://doaj.org/article/45d69a9083c74655b4d7314d97c9479a
Published in:
Frontiers in High Performance Computing, Vol 2 (2024)
This manuscript presents the library AI4HPC with its architecture and components. The library enables large-scale training of AI models on High-Performance Computing systems. It addresses challenges in handling non-uniform datasets through data … (a generic data-parallel sketch follows this entry)
External link:
https://doaj.org/article/1de2ecdf8f244483a492e7adc9d72a3a
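The snippet does not show AI4HPC's own API, so as a generic illustration of large-scale data-parallel training on an HPC node, here is a PyTorch DistributedDataParallel sketch. The model, dataset, and hyperparameters are placeholders, and a torchrun launch is assumed; this is not the AI4HPC interface.

    # Generic data-parallel training sketch (PyTorch DDP), assumed to be launched with
    # `torchrun --nproc_per_node=<N> script.py`. Not the AI4HPC API; model and data are placeholders.
    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP
    from torch.utils.data import TensorDataset, DataLoader, DistributedSampler

    def main():
        use_cuda = torch.cuda.is_available()
        dist.init_process_group("nccl" if use_cuda else "gloo")
        local_rank = int(os.environ.get("LOCAL_RANK", 0))
        device = torch.device(f"cuda:{local_rank}") if use_cuda else torch.device("cpu")

        model = torch.nn.Linear(128, 10).to(device)
        model = DDP(model, device_ids=[local_rank] if use_cuda else None)

        data = TensorDataset(torch.randn(1024, 128), torch.randint(0, 10, (1024,)))
        sampler = DistributedSampler(data)            # each rank reads a disjoint shard
        loader = DataLoader(data, batch_size=32, sampler=sampler)

        opt = torch.optim.SGD(model.parameters(), lr=0.01)
        loss_fn = torch.nn.CrossEntropyLoss()
        for epoch in range(2):
            sampler.set_epoch(epoch)                  # reshuffle shards each epoch
            for x, y in loader:
                x, y = x.to(device), y.to(device)
                opt.zero_grad()
                loss_fn(model(x), y).backward()       # DDP all-reduces gradients here
                opt.step()

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()

The same script scales across nodes as long as the launcher sets the usual rendezvous environment variables for the process group.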
Published in:
Dianxin kexue, Vol 40, Pp 146-159 (2024)
Large AI models are set to lead the ICT (information and communications technology) industry in the next decade. The intelligent computing center network is the communication foundation supporting distributed training of large AI models, and it is one of the key … (an illustrative traffic estimate follows this entry)
External link:
https://doaj.org/article/9ed4a4712dc942038b2c3013542bcb49
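As a rough sense of why the interconnect matters for such networks, the following back-of-the-envelope calculation estimates per-step gradient-synchronization traffic under ring all-reduce; the model size, gradient precision, and worker count are assumptions for illustration only.

    # Illustrative estimate of per-step gradient traffic in data-parallel training.
    # Model size, gradient precision, and worker count are assumed, not from the article.
    params = 70e9            # 70 B parameters (assumption)
    bytes_per_grad = 2       # fp16/bf16 gradients (assumption)
    workers = 1024

    grad_bytes = params * bytes_per_grad                    # ~140 GB of gradients per step
    # A ring all-reduce moves about 2*(N-1)/N of the buffer in and out of each worker.
    per_worker = 2 * (workers - 1) / workers * grad_bytes
    print(f"per-worker synchronization traffic per step: {per_worker / 1e9:.0f} GB")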
Author:
Chanhee Yu, Kyongseok Park
Published in:
IEEE Access, Vol 12, Pp 165653-165662 (2024)
Recently, the adoption of deep learning models in several domains and for various tasks has increased, correspondingly amplifying the number of model layers and parameters needed to achieve the required performance. Accordingly, the amount of memory … (a rough memory estimate follows this entry)
External link:
https://doaj.org/article/159075dcd334439b8be5c1e8322127a3
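To illustrate how quickly memory demand grows with parameter count, the following rough estimate uses the common 16-bytes-per-parameter rule of thumb for mixed-precision Adam training; the model size and the breakdown are assumptions, not figures from the article.

    # Rough training-memory estimate (mixed-precision Adam), excluding activations.
    # The model size and per-parameter byte breakdown are illustrative assumptions.
    params = 7e9                          # 7 B-parameter model (assumption)
    bytes_per_param = 2 + 2 + 4 + 4 + 4   # fp16 weight + fp16 grad + fp32 master + fp32 m + fp32 v
    total_gib = params * bytes_per_param / 2**30
    print(f"weights + gradients + optimizer states: ~{total_gib:.0f} GiB")   # ~104 GiB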
Published in:
IEEE Access, Vol 12, Pp 96017-96050 (2024)
In recent years, large language models (LLMs) have achieved remarkable success in natural language processing (NLP). LLMs require an extremely large number of parameters to attain high performance. As models grow into the trillion-parameter range, … (a capacity back-of-the-envelope follows this entry)
External link:
https://doaj.org/article/10c9f434db39404ea87a85f203724bcd
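A quick capacity check shows why trillion-parameter models cannot live on a single accelerator and must be sharded across devices; the parameter count, precision, and device memory below are assumptions for illustration.

    # Why trillion-parameter models force sharding: weights alone exceed any single device.
    # Parameter count, precision, and device memory are illustrative assumptions.
    import math
    params = 1e12                 # 1 T parameters (assumption)
    weight_bytes = params * 2     # fp16 weights only: ~2 TB
    device_mem = 80 * 2**30       # one 80 GiB accelerator (assumption)
    print("devices needed just to hold the weights:", math.ceil(weight_bytes / device_mem))   # 24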
Author:
Seyed Mahmoud Sajjadi Mohammadabadi, Mahmoudreza Entezami, Aidin Karimi Moghaddam, Mansour Orangian, Shayan Nejadshamsi
Published in:
International Journal of Intelligent Networks, Vol 5, Iss , Pp 267-274 (2024)
Machine learning models are the backbone of smart grid optimization, but their effectiveness hinges on access to vast amounts of training data. However, smart grids face critical communication bottlenecks due to the ever-increasing volume of data … (an illustrative gradient-compression sketch follows this entry)
External link:
https://doaj.org/article/53f2452b95a64c0b9d61560946f5080e
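One generic way to relieve the communication bottleneck named in this entry is gradient compression; the sketch below shows top-k sparsification of a gradient before transmission. This is an illustrative technique, not necessarily the method the cited article proposes, and all sizes are assumed.

    # Illustrative top-k gradient sparsification: send only the largest-magnitude entries.
    # Not necessarily the cited article's method; array sizes are assumptions.
    import numpy as np

    def topk_sparsify(grad, k_frac=0.01):
        # Keep the top k% of entries by magnitude; transmit (indices, values, shape).
        flat = grad.ravel()
        k = max(1, int(k_frac * flat.size))
        idx = np.argpartition(np.abs(flat), -k)[-k:]
        return idx, flat[idx], grad.shape

    def densify(idx, vals, shape):
        # Receiver rebuilds a dense gradient with zeros in the untransmitted positions.
        out = np.zeros(np.prod(shape))
        out[idx] = vals
        return out.reshape(shape)

    grad = np.random.default_rng(0).normal(size=(1000, 100))
    idx, vals, shape = topk_sparsify(grad)
    print("values transmitted:", idx.size, "of", grad.size)     # ~100x fewer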
Published in:
Revista Colombiana de Computación, Vol 25, Iss 1 (2024)
The application of artificial intelligence to solve complex problems is growing in popularity. The appearance of chat systems based on artificial intelligence and natural language processing has driven the creation of increasingly large and …
External link:
https://doaj.org/article/16ad31fe3b8544d0b61d641d3b58b32b
Published in:
工程科学学报, Vol 45, Iss 8, Pp 1400-1416 (2023)
With the rapid arrival of the Internet of Everything era, massive data resources are generated at the edge, causing problems such as heavy network load, high energy consumption, and privacy and security risks in traditional distributed training based on the cloud …
External link:
https://doaj.org/article/6c1030f11ca74b74b681c0e1baf8660c