Fault Detection of Single and Interval Valued Data Using Statistical Process Monitoring Techniques
Authors: | Hazem Nounou, Muhammad Nazmul Karim, Mohamed Nounou, Mohammed Ziyan Sheriff, Nour Basha |
---|---|
Language: | English |
Year of publication: | 2019 |
Subject: | Computer science; Principal component analysis; Statistical hypothesis testing; Fault detection and isolation; Data modeling; Noise (signal processing); Pattern recognition; Interval (mathematics); Constant false alarm rate; Benchmark (computing); Artificial intelligence |
Description: | Principal component analysis (PCA) is a linear data analysis technique widely used for fault detection and isolation, data modeling, and noise filtration. PCA may be combined with statistical hypothesis testing methods, such as the generalized likelihood ratio (GLR) technique, to detect faults. GLR uses maximum likelihood estimation (MLE) to maximize the detection rate for a fixed false alarm rate. The benchmark Tennessee Eastman Process (TEP) is used to examine the performance of the different techniques, and the results show that, for processes experiencing shifts in the mean and/or variance, the best performance is achieved by monitoring the mean and variance independently using two separate GLR charts, rather than monitoring them simultaneously with a single chart. Moreover, single-valued data can be aggregated into interval form to provide a more robust model with improved fault detection performance using PCA and GLR. The TEP example is used once more to demonstrate the effectiveness of interval-valued data over single-valued data. |
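As a rough illustration of the PCA-plus-GLR scheme the abstract describes, the following is a minimal sketch on synthetic multivariate data (the paper itself uses the Tennessee Eastman Process, which is not reproduced here). A PCA model is fit on fault-free data, a scalar residual is monitored, and a GLR statistic for a mean shift is computed by substituting the MLE of the unknown shift, as in the GLR construction. All variable names and the fault magnitude are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fault-free training data (illustrative synthetic data, not the TEP benchmark)
X_train = rng.normal(size=(500, 5))

# --- PCA model: retain k principal components of the mean-centered data ---
mu = X_train.mean(axis=0)
_, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
k = 2
P = Vt[:k].T                                  # (5, k) loading matrix

def residual_norm(x):
    """Norm of the part of x not explained by the PCA model."""
    xc = x - mu
    return np.linalg.norm(xc - P @ (P.T @ xc))

# Baseline residual statistics estimated from the fault-free data
r0 = np.array([residual_norm(x) for x in X_train])
mu0, sigma0 = r0.mean(), r0.std()

def glr_mean_shift(window):
    """GLR statistic for a mean shift in the residual sequence: the unknown
    shift magnitude is replaced by its MLE (the window's sample mean)."""
    theta_hat = np.mean(window - mu0)          # MLE of the shift
    return len(window) * theta_hat**2 / (2 * sigma0**2)

# A fault-free test window vs. a window with an additive mean fault
X_normal = rng.normal(size=(50, 5))
X_faulty = rng.normal(size=(50, 5)) + 3.0      # mean shift on all variables

glr_normal = glr_mean_shift(np.array([residual_norm(x) for x in X_normal]))
glr_faulty = glr_mean_shift(np.array([residual_norm(x) for x in X_faulty]))
print(glr_normal, glr_faulty)   # the faulty window yields a far larger statistic
```

A second GLR chart for a variance shift (with the MLE of the variance substituted) would be run alongside this one, reflecting the paper's finding that monitoring the mean and variance with separate charts outperforms a single combined chart.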
Database: | OpenAIRE |
External link: |