On Reducing the Amount of Samples Required for Training of QNNs: Constraints on the Linear Structure of the Training Data

Author: Mandl, Alexander; Barzen, Johanna; Leymann, Frank; Vietz, Daniel
Publication year: 2023
Subject:
Document type: Working Paper
Description: Training classical neural networks generally requires a large number of training samples. By using entangled training samples, Quantum Neural Networks (QNNs) have the potential to significantly reduce the number of training samples required in the training process. However, to minimize the number of incorrect predictions made by the resulting QNN, the structure of the training samples must meet certain requirements. On the one hand, the exact degree of entanglement must be fixed for the whole set of training samples. On the other hand, the training samples must be linearly independent and non-orthogonal. How failing to meet these requirements affects the resulting QNN has not been fully studied. To address this, we extend the proof of the quantum no-free-lunch (QNFL) theorem to (i) provide a generalization of the theorem for varying degrees of entanglement. This generalization shows that the average degree of entanglement in the set of training samples can be used to predict the expected quality of the QNN. Furthermore, we (ii) introduce new estimates for the expected accuracy of QNNs trained on moderately entangled training samples that are linearly dependent or orthogonal. Our analytical results are (iii) experimentally validated by simulating QNN training and analyzing the quality of the QNN after training.
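The "degree of entanglement" of a bipartite training state is commonly quantified by its Schmidt rank, i.e., the number of nonzero singular values of the state's coefficient matrix. A minimal sketch of how this quantity can be computed (the function name `schmidt_rank` and the numerical tolerance are illustrative choices, not taken from the paper):

```python
import numpy as np

def schmidt_rank(state, dim_a, dim_b, tol=1e-10):
    """Schmidt rank of a bipartite pure state in C^{dim_a} (x) C^{dim_b}.

    The state vector is reshaped into its dim_a x dim_b coefficient
    matrix; the Schmidt rank is the number of singular values of that
    matrix above the tolerance.
    """
    coeffs = np.reshape(np.asarray(state, dtype=complex), (dim_a, dim_b))
    singular_values = np.linalg.svd(coeffs, compute_uv=False)
    return int(np.sum(singular_values > tol))

# Product state |00>: Schmidt rank 1 (unentangled)
product = np.array([1, 0, 0, 0], dtype=complex)

# Bell state (|00> + |11>)/sqrt(2): Schmidt rank 2 (maximally entangled
# for two qubits)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
```

Averaging this rank over a training set gives one concrete reading of the "average degree of entanglement" that the generalized theorem relates to the expected quality of the trained QNN.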
Comment: 35 pages, 6 figures; changed layout and fixed typos
Database: arXiv