Popis: |
The lack of robust, quantifiable behavioral assessments of pain in pre-clinical models, especially of ongoing pain, has been a major limitation for both pain neurobiology research and the development of novel analgesics. Current methods typically require extensive interaction between the animal and a human observer and are often followed by manual, and therefore subjective, scoring, which is susceptible to bias and variability as well as labor-intensive and time-consuming. Here we demonstrate that, by leveraging modern machine vision and machine learning tools, quantifiable features of an animal's spontaneous behavior, its "body language", recorded from a bottom-up view in a dark, observer-free environment, can be used to automatically and objectively detect and infer the internal states of pain and analgesia and, from surface contact measurements, the presence of mechanical hypersensitivity. This approach minimizes animal handling, reduces the number of animals needed, and limits the potential for biased assessment, making it a useful tool for exploring the biology of pain and other neurological disorders, and one readily scalable for pre-clinical drug efficacy assessment.