A predictive processing model of episodic memory and time perception

Authors: Kyriacos Nikiforou, Anastasia Sylaidi, Murray Shanahan, Zafeirios Fountas, Anil K. Seth, Warrick Roseboom
Year of publication: 2020
ISSN: 0899-7667
Description: Human perception and experience of time is strongly influenced by ongoing stimulation, memory of past experiences, and the required task context. When paying attention to time, time experience seems to expand; when distracted, it seems to contract. When considering time based on memory, the experience may differ from experience in the moment, as exemplified by sayings like "time flies when you're having fun". Experience of time also depends on the content of perceptual experience – rapidly changing or complex perceptual scenes seem longer in duration than less dynamic ones. The complexity of the interactions between attention, memory, and perceptual stimulation is a likely reason that an overarching theory of time perception has been difficult to achieve. Here, we introduce a model of perceptual processing and episodic memory that makes use of hierarchical predictive coding, short-term plasticity, spatio-temporal attention, and episodic memory formation and recall, and apply this model to the problem of human time perception. In an experiment with ~13,000 human participants we investigated the effects of memory, cognitive load, and stimulus content on duration reports of dynamic natural scenes up to ~1 minute long. Using our model to generate duration estimates, we compared human and model performance. Model-based estimates replicated key qualitative biases, including differences by cognitive load (attention), scene type (stimulation), and whether the judgement was made based on current or remembered experience (memory). Our work provides a comprehensive model of human time perception and a foundation for exploring the computational basis of episodic memory within a hierarchical predictive coding framework.

Author summary: Experience of the duration of present or past events is a central aspect of human experience, the underlying mechanisms of which are not yet fully understood.
In this work, we combine insights from machine learning and neuroscience to propose a combination of mathematical models that replicate human perceptual processing, long-term memory, attention, and duration perception. Our computational implementation of this framework can process information from video clips of ordinary life scenes, record and recall important events, and report the duration of these clips. To assess the validity of our proposal, we conducted an experiment with ~13,000 human participants. Each was shown a video between 1 and 64 seconds long and reported how long they believed it was. Reports of duration by our computational model qualitatively matched these human reports, made about the exact same videos. This was true regardless of the video content, whether time was actively judged or based on memory of the video, and whether the participants focused on a single task or were distracted: all factors known to influence human time perception. Our work provides the first model of human duration perception to incorporate these diverse and complex factors, and provides a basis to probe the deep links between memory and time in human experience.
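The abstract describes a model in which duration reports are derived from perceptual processing: salient changes in the input (e.g. large prediction errors in a hierarchical predictive-coding network) are registered as events, and the accumulated events are mapped to a duration estimate. The following is a minimal illustrative sketch of that accumulation idea only, not the paper's actual implementation; the dynamic threshold, decay constants, and events-to-seconds mapping here are all simplifying assumptions.

```python
def estimate_duration(prediction_errors,
                      base_threshold=1.0,
                      decay=0.95,
                      boost=1.5,
                      seconds_per_event=0.5):
    """Sketch of salient-event accumulation for duration estimation.

    prediction_errors: per-frame prediction-error magnitudes (floats).
    A frame counts as a salient event when its error exceeds a dynamic
    threshold; after an event the threshold is raised, then relaxes
    back toward base_threshold. The event count is mapped linearly to
    a duration report. All parameter values are illustrative.
    """
    threshold = base_threshold
    events = 0
    for err in prediction_errors:
        if err > threshold:
            events += 1
            threshold = err * boost  # raise threshold after a salient event
        else:
            # relax threshold back toward its baseline
            threshold = base_threshold + decay * (threshold - base_threshold)
    return events * seconds_per_event


# Usage: two clearly salient spikes among quiet frames yield two events.
errors = [0.5, 2.0, 0.5, 0.5, 3.0, 0.5]
print(estimate_duration(errors))  # → 1.0 (2 events × 0.5 s/event)
```

In a less dynamic scene the error trace would cross the threshold less often, producing fewer events and a shorter reported duration, which is the direction of the content-dependent bias the abstract describes.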
Database: OpenAIRE