Griffon: Reasoning about Job Anomalies with Unlabeled Data in Cloud-based Platforms

Author: Kristin Lieber, Liqun Shao, Janhavi Mahajan, Siqi Liu, Abhiram Eswaran, Sudhir Darbha, Konstantinos Karanasos, Soundar Srinivasan, Subru Krishnan, Carlo Curino, Yiwen Zhu, Minsoo Thigpen
Publication year: 2019
Source: SoCC
DOI: 10.48550/arxiv.1908.09048
Description: Microsoft's internal big data analytics platform comprises hundreds of thousands of machines, serving over half a million jobs daily from thousands of users. The majority of these jobs are recurring and are crucial for the company's operation. Although administrators spend significant effort tuning system performance, some jobs inevitably experience slowdowns, i.e., their execution time degrades over previous runs. Currently, the investigation of such slowdowns is a labor-intensive and error-prone process, which costs Microsoft significant human and machine resources and negatively impacts several lines of business. In this work, we present Griffon, a system we built and have deployed in our production analytics clusters since last year to automatically discover the root cause of job slowdowns. Most existing solutions rely on labeled data (i.e., resolved incidents with labeled reasons for job slowdowns), which in most practical scenarios is non-existent or non-trivial to acquire. Others rely on time-series analysis of individual metrics that do not target specific jobs holistically. In contrast, in Griffon we cast the problem as a regression task that predicts the runtime of a job, and we show how the relative contributions of the features used to train our interpretable model can be exploited to rank the potential causes of job slowdowns (see the illustrative sketch after this record). Evaluated over historical incidents, we show that Griffon discovers slowdown causes that are consistent with the ones validated by domain-expert engineers, in a fraction of the time they require.
Database: OpenAIRE
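
The abstract's core idea, predicting a job's runtime with an interpretable regression model and ranking per-feature contributions to explain a slow run, can be illustrated with a minimal sketch. The sketch below is not Griffon's actual implementation: the gradient-boosted model, the use of the SHAP library for per-prediction contributions, and all feature names and data are assumptions made for illustration only.

```python
# Hypothetical illustration of diagnosing a job slowdown via regression
# plus per-prediction feature contributions. Not the paper's code.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
import shap  # assumed available; any per-prediction attribution method would do

# Invented telemetry for past runs of one recurring job.
FEATURES = ["input_bytes", "container_count", "queue_wait_s", "cluster_load"]
rng = np.random.default_rng(0)
history = pd.DataFrame(rng.uniform(size=(500, 4)), columns=FEATURES)
history["runtime_s"] = (
    300 * history["input_bytes"]
    + 120 * history["cluster_load"]
    + rng.normal(0, 5, 500)
)

# Step 1: cast diagnosis as regression -- predict runtime from job features.
model = GradientBoostingRegressor().fit(history[FEATURES], history["runtime_s"])

# Step 2: decompose predictions into per-feature contributions, comparing a
# degraded run against a "normal" baseline run of the same job.
explainer = shap.TreeExplainer(model)
baseline = history[FEATURES].median().to_frame().T   # typical past run
slow_run = baseline.assign(cluster_load=0.99)        # this run was slow

delta = explainer.shap_values(slow_run)[0] - explainer.shap_values(baseline)[0]

# Step 3: rank candidate causes -- features whose contribution to predicted
# runtime grew the most are the most likely slowdown reasons.
for name, d in sorted(zip(FEATURES, delta), key=lambda t: -t[1]):
    print(f"{name:16s} {d:+8.1f} s of predicted runtime")
```

On this synthetic data, `cluster_load` dominates the ranking, mirroring how a contribution-based ranking surfaces the feature that changed between a normal run and a slow one without requiring any labeled incidents.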