Patterned wireless transcranial optogenetics generates artificial perception.

Author: Wu M, Yang Y, Zhang J, Efimov AI, Vazquez-Guardado A, Li X, Zhang K, Wang Y, Gu J, Zeng L, Liu J, Riahi M, Yoon H, Kim M, Zhang H, Lee M, Kang J, Ting K, Cheng S, Zhang W, Banks A, Good CH, Cox JM, Pinto L, Huang Y, Kozorovitskiy Y, Rogers JA
Language: English
Source: bioRxiv : the preprint server for biology [bioRxiv] 2024 Sep 20. Date of Electronic Publication: 2024 Sep 20.
DOI: 10.1101/2024.09.20.613966
Abstract: Synthesizing perceivable artificial neural inputs independent of typical sensory channels remains a fundamental challenge in the development of next-generation brain-machine interfaces. Establishing a minimally invasive, wirelessly effective, and miniaturized platform with long-term stability is crucial for creating a clinically meaningful interface capable of mediating artificial perceptual feedback. In this study, we demonstrate a miniaturized, fully implantable wireless transcranial optogenetic encoder designed to generate artificial perceptions through digitized optogenetic manipulation of large cortical ensembles. This platform enables the spatiotemporal orchestration of large-scale cortical activity for remote perception genesis via real-time wireless communication and control, with device performance optimized through simulation-guided methods that address light and heat propagation during operation. Cue discrimination during operant learning demonstrates the wireless genesis of artificial percepts sensed by mice, and analyses of discrimination performance based on spatial distance across large cortical networks and on sequential order reveal principles that adhere to general perceptual rules. These conceptual and technical advancements expand our understanding of artificial neural syntax and its perception by the brain, guiding the evolution of next-generation brain-machine communication.
Database: MEDLINE