Description: |
To determine whether the monocyte-to-lymphocyte ratio (MLR), as a systemic inflammation index, predicts malnutrition risk during the early stages of cirrhosis.

We conducted a single-center prospective cohort study, enrolling patients from June 2016 to September 2020. Patients underwent malnutrition risk assessment upon admission and were classified into five clinical stages according to portal hypertension. Malnutrition risk was scored using the Royal Free Hospital-Nutritional Prioritizing Tool (RFH-NPT) and validated with the Nutritional Risk Screening 2002 (NRS-2002) or the Liver Disease Undernutrition Screening Tool (LDUST). Routine clinical laboratory measurements were used to calculate the MLR, Child-Turcotte-Pugh (CTP) class, and Model for End-Stage Liver Disease (MELD) score. Patients were followed up for 2 years.

Among the 154 patients with cirrhosis, 60 had compensated and 94 had decompensated cirrhosis. The optimal MLR cutoff value, 0.4, was effective in predicting malnutrition-related death or liver transplantation. Patients with a high malnutrition risk defined by the NRS-2002 or RFH-NPT had a higher MLR than those with a low malnutrition risk. For patients with CTP class A cirrhosis or a MELD score of 10, the MLR cutoff of 0.4 significantly distinguished patients with a low malnutrition risk from those with a high malnutrition risk. Both the RFH-NPT score and the MLR increased significantly across the decompensated cirrhosis substages. Interestingly, the MLR exhibited a positive correlation with the RFH-NPT score until varices appeared, but the correlation was highest at the substage of a history of variceal bleeding (r = 0.714, P = 0.009). Multivariable analysis demonstrated that an MLR of 0.4 was an independent factor for malnutrition risk screened with the RFH-NPT, and this was confirmed using the LDUST and NRS-2002.

Immune-related inflammatory dysfunction predicts malnutrition risk during the early stages of cirrhosis.
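As a concrete illustration (not study code), the MLR is simply the absolute monocyte count divided by the absolute lymphocyte count, and the 0.4 threshold reported above can be used to dichotomize patients. The sketch below assumes hypothetical column names and toy values from a routine complete blood count; whether the cutoff was applied as >= 0.4 or > 0.4 is not stated in the abstract, so the comparison used here is an assumption.

```python
# Minimal sketch, not from the study: compute the monocyte-to-lymphocyte ratio
# (MLR) from routine absolute blood counts and flag patients above the 0.4
# cutoff reported in the abstract. Column names and values are hypothetical.
import pandas as pd

cbc = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "monocytes_abs": [0.55, 0.30, 0.72],    # absolute monocyte count, 10^9/L
    "lymphocytes_abs": [1.10, 1.50, 1.20],  # absolute lymphocyte count, 10^9/L
})

# MLR = absolute monocyte count / absolute lymphocyte count
cbc["MLR"] = cbc["monocytes_abs"] / cbc["lymphocytes_abs"]

# Dichotomize at the reported optimal cutoff of 0.4 (high vs. low MLR);
# the ">=" direction is an assumption for illustration only.
cbc["high_MLR"] = cbc["MLR"] >= 0.4

print(cbc[["patient_id", "MLR", "high_MLR"]])
```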