Popis: |
In this paper, we propose an improved Hierarchical Temporal Memory (HTM) that can capture long-term dependencies. HTM is a temporal-sequence prediction model that imitates the structure and learning mechanism of the cerebral cortex. The model consists of cells arranged in a two-dimensional map, representing neurons in the brain, and it represents data as a set of cells in the active state. The data at the next time step is predicted from the set of cells in the predictive state. HTM learns time-series data by updating the synapses connecting cells according to Hebb's rule, thereby preserving the temporal relationships in the data. The conventional model learns only connections to the immediately preceding data, whereas the proposed model can also learn connections to data from several time steps earlier. The proposed HTM modifies both the structure and the learning algorithm: structurally, we introduce a time axis into the segments, which are collections of synapses; in the learning algorithm, connections to activity from several time steps ago can drive a cell into the predictive state. Evaluation experiments confirmed that the proposed model captures longer-term dependencies than the conventional model in temporal sequence prediction.
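
Below is a minimal, hypothetical Python sketch of the mechanism described above, not the authors' implementation: each segment carries a time offset (the proposed time axis), a cell becomes predictive when a segment matches the activity from that many steps back, and synapse permanences are updated with a simple Hebbian rule. All names and thresholds (Segment, PERMANENCE_INC, CONNECTED_THRESHOLD, etc.) are assumptions made for illustration only.

# Illustrative sketch only; names and constants are hypothetical, not from the paper.
from collections import deque

PERMANENCE_INC = 0.1       # Hebbian reinforcement for synapses that predicted correctly
PERMANENCE_DEC = 0.05      # decay for synapses that did not
CONNECTED_THRESHOLD = 0.5  # permanence above which a synapse counts as connected
ACTIVATION_THRESHOLD = 1   # connected active synapses needed to activate a segment

class Segment:
    """A collection of synapses with a time offset (the proposed time axis)."""
    def __init__(self, time_offset, synapses):
        self.time_offset = time_offset   # how many steps back this segment looks
        self.synapses = dict(synapses)   # presynaptic cell id -> permanence

    def is_active(self, history):
        """Active if enough connected synapses match the activity `time_offset` steps ago."""
        if len(history) < self.time_offset:
            return False
        past_active = history[-self.time_offset]
        hits = sum(1 for cell, perm in self.synapses.items()
                   if perm >= CONNECTED_THRESHOLD and cell in past_active)
        return hits >= ACTIVATION_THRESHOLD

    def reinforce(self, history):
        """Hebbian update: strengthen synapses to cells active `time_offset` steps ago, weaken the rest."""
        if len(history) < self.time_offset:
            return
        past_active = history[-self.time_offset]
        for cell in self.synapses:
            if cell in past_active:
                self.synapses[cell] = min(1.0, self.synapses[cell] + PERMANENCE_INC)
            else:
                self.synapses[cell] = max(0.0, self.synapses[cell] - PERMANENCE_DEC)

# Usage: the cell owning this segment becomes predictive because cell 3 was active two steps ago,
# i.e. a connection reaching further back than the immediately preceding input.
history = deque(maxlen=10)   # recent sets of active cells, oldest first
history.append({3})          # activity at t-2
history.append({5})          # activity at t-1
segment = Segment(time_offset=2, synapses={3: 0.6})
print("cell predictive:", segment.is_active(list(history)))  # True
segment.reinforce(list(history))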