Antiviral Peptide-Generative Pre-Trained Transformer (AVP-GPT): A Deep Learning-Powered Model for Antiviral Peptide Design with High-Throughput Discovery and Exceptional Potency

Author: Huajian Zhao, Gengshen Song
Language: English
Year of publication: 2024
Subject:
Source: Viruses, Vol 16, Iss 11, p 1673 (2024)
Document type: article
ISSN: 1999-4915
DOI: 10.3390/v16111673
Description: Traditional antiviral peptide (AVP) discovery is a time-consuming and expensive process. This study introduces AVP-GPT, a novel deep learning method utilizing transformer-based language models and multimodal architectures specifically designed for AVP design. AVP-GPT demonstrated exceptional efficiency, generating 10,000 unique peptides and identifying potential AVPs within two days on a GPU system. Pre-trained on a respiratory syncytial virus (RSV) dataset, AVP-GPT successfully adapted to influenza A virus (INFVA) and other respiratory viruses. Compared to state-of-the-art models such as LSTM and SVM, AVP-GPT achieved significantly lower perplexity (2.09 vs. 16.13) and higher AUC (0.90 vs. 0.82), indicating superior peptide sequence prediction and AVP classification. AVP-GPT generated a diverse set of peptides with excellent novelty and identified candidates with markedly higher antiviral success rates than conventional design methods. Notably, AVP-GPT generated novel peptides against RSV and INFVA with exceptional potency, including four peptides exhibiting EC50 values around 0.02 µM, the strongest anti-RSV activity reported to date. These findings highlight AVP-GPT's potential to revolutionize AVP discovery and development, accelerating the creation of novel antiviral drugs. Future studies could explore the application of AVP-GPT to other viral targets and investigate alternative AVP design strategies.
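The abstract compares models by perplexity, which measures how well a language model predicts held-out peptide sequences (lower is better). As a minimal sketch of what this metric means, the toy example below trains a Laplace-smoothed bigram model over the 20 standard amino acids and computes sequence perplexity; AVP-GPT itself is a transformer, and all function names here are illustrative, not from the paper.

```python
import math
from collections import defaultdict

# 20 standard amino-acid one-letter codes (the peptide "vocabulary")
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
START = "^"  # start-of-sequence token

def train_bigram(seqs):
    """Count amino-acid bigram transitions over a list of peptide strings."""
    counts = defaultdict(lambda: defaultdict(int))
    for s in seqs:
        prev = START
        for aa in s:
            counts[prev][aa] += 1
            prev = aa
    return counts

def bigram_prob(counts, prev, aa):
    """Laplace-smoothed conditional probability p(aa | prev)."""
    total = sum(counts[prev].values())
    return (counts[prev][aa] + 1) / (total + len(AMINO_ACIDS))

def perplexity(counts, seq):
    """PPL = exp(-(1/N) * sum_i log p(x_i | x_{i-1})); lower = more predictable."""
    logp = 0.0
    prev = START
    for aa in seq:
        logp += math.log(bigram_prob(counts, prev, aa))
        prev = aa
    return math.exp(-logp / len(seq))

# An untrained model assigns uniform probability 1/20, so PPL is exactly 20;
# training on related sequences drives the perplexity below that baseline.
untrained = train_bigram([])
trained = train_bigram(["AAAA", "AAKA"])
print(round(perplexity(untrained, "ACDE"), 2))  # 20.0 (uniform baseline)
print(perplexity(trained, "AAAA") < 20.0)       # True
```

The same idea scales up in the paper's setting: a transformer replaces the bigram table, and a perplexity of 2.09 vs. 16.13 means AVP-GPT finds real peptide sequences far more predictable than the LSTM baseline does.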
Database: Directory of Open Access Journals
The full text is not displayed to unauthenticated users