Author: |
A.I. Ivanov, A.P. Ivanov, K.N. Savinov, R.V. Eremenko |
Language: |
English<br />Russian |
Year of publication: |
2022 |
Subject: |
|
Source: |
Надежность и качество сложных систем (Reliability and Quality of Complex Systems), Iss 4 (2022) |
Document type: |
article |
ISSN: |
2307-4205 |
DOI: |
10.21685/2307-4205-2022-4-10 |
Description: |
Background. The problem of parallelizing neural network computations in an implicit form is considered. The problem arises mainly when attempting to speed up computations on multi-core processors. A similar situation arises when a neural network combines several classical statistical criteria. Materials and methods. The combination of five classical statistical criteria for testing the hypothesis of normal distribution of small samples of 16 experiments is considered. The classical tests of Anderson-Darling, the normalized range, Vasicek, Frosini, and the fourth statistical moment are considered. Unfortunately, their neural network counterparts have a low decision confidence of only 0.75; five binary neurons are not enough. For this reason, the result of combining up to 1000 binary neurons was simulated. Results. Binary neurons cannot provide a confidence level greater than 0.93. A thousand ternary neurons can provide a confidence level of 0.98. The transition to 5-ary artificial neurons should allow a confidence level of 0.997 to be reached when combining 40 neurons. Conclusions. We observe a significant increase in the quality of decisions made by neural networks as the number of levels in their output quantizers increases. Natural neurons of living beings exchange bursts of impulses, which indirectly indicates that they have multilevel quantizers. There is no need to synthesize new statistical criteria; it is enough to switch to the use of q-ary artificial neurons, which are analogues of already known statistical criteria. |
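The abstract's central effect (pooling many weak binary decisions raises overall decision confidence) can be sketched with a simple majority-vote calculation. This is an illustrative sketch only, not the authors' method: it assumes the binary "neurons" err independently, whereas the paper's 0.93 ceiling for binary neurons reflects the correlation between real statistical criteria. The function name `majority_vote_confidence` is a hypothetical helper introduced here for illustration.

```python
from math import comb

def majority_vote_confidence(n: int, p: float) -> float:
    """Probability that a majority of n independent binary 'neurons',
    each correct with probability p, yields the correct decision (n odd).

    Sums the binomial tail P(X > n/2) for X ~ Binomial(n, p).
    """
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# A single binary neuron at the confidence level reported in the abstract:
print(round(majority_vote_confidence(1, 0.75), 3))  # 0.75

# Under the idealized independence assumption, confidence grows quickly
# with the number of combined neurons (real correlated criteria saturate
# far earlier, per the paper):
for n in (5, 15, 41):
    print(n, round(majority_vote_confidence(n, 0.75), 3))
```

The qualitative takeaway matches the abstract: confidence grows with the number of combined decisions, and the gap between this idealized curve and the reported 0.93 ceiling is exactly what motivates moving from binary to q-ary neuron outputs.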
Database: |
Directory of Open Access Journals |
External link: |
|