Popis: |
Deep neural networks (DNNs) require large-scale labeled training data to avoid convergence to poor local minima and to maintain satisfactory performance. In the direction-of-arrival (DOA) estimation problem, however, acquiring sufficient measured data is challenging and demanding due to constraints on manpower, cost, and other resources. In this paper, a DNN initialization technique, unsupervised pretraining, is presented to deal with small training data sets. Unsupervised pretraining is implemented through the design of a restricted Boltzmann machine (RBM), which is able to capture the underlying features of the input data and provide better weight initialization for the DNN. We demonstrate the efficacy of the proposed method by comparing its DOA estimation performance with that of a randomly initialized DNN. Results show that the DNN with unsupervised pretraining surpasses the randomly initialized one when fed with the same training data, and thus it can provide superior performance in scenarios where training sample support is limited. |
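
The abstract describes initializing a DNN from an RBM trained without labels. Below is a minimal sketch of that general idea, assuming a Bernoulli RBM trained with one-step contrastive divergence (CD-1); the layer sizes, learning rate, epoch count, and placeholder input features are illustrative assumptions, not details taken from the paper.

```python
# Sketch: RBM-based unsupervised pretraining used to initialize the first DNN layer.
# All hyperparameters and data shapes below are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Restricted Boltzmann machine trained with one-step contrastive divergence."""
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def fit(self, X, epochs=10, batch_size=32):
        for _ in range(epochs):
            for start in range(0, len(X), batch_size):
                v0 = X[start:start + batch_size]
                # Positive phase: hidden activations given the data.
                p_h0 = sigmoid(v0 @ self.W + self.b_h)
                h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
                # Negative phase: one Gibbs step (reconstruction), using
                # probabilities for the visible units as a common simplification.
                p_v1 = sigmoid(h0 @ self.W.T + self.b_v)
                p_h1 = sigmoid(p_v1 @ self.W + self.b_h)
                # CD-1 parameter updates.
                self.W += self.lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(v0)
                self.b_v += self.lr * (v0 - p_v1).mean(axis=0)
                self.b_h += self.lr * (p_h0 - p_h1).mean(axis=0)
        return self

# Placeholder unlabeled inputs; in DOA estimation these would be features
# derived from the array measurements (assumed shape for illustration).
X_unlabeled = rng.random((1000, 64))

# Pretrain the RBM on unlabeled data, then copy its weights and hidden biases
# into the first DNN layer instead of starting from a random initialization.
rbm = RBM(n_visible=64, n_hidden=128).fit(X_unlabeled)
dnn_layer1_W = rbm.W.copy()
dnn_layer1_b = rbm.b_h.copy()
```

In this sketch the supervised DNN training would then proceed as usual (backpropagation on the available labeled samples), with only the starting point of the weights changed; this is the sense in which pretraining is an initialization technique rather than a new network architecture.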