Using Applicability to Quantifying Octave Resonance in Deep Neural Networks
Author: | Supratik Mukhopadhyay, Robert DiBiano, Edward Collier |
---|---|
Year of publication: | 2020 |
Subject: | Artificial neural network, Deep neural networks, Octave, Pattern recognition, Robustness (computer science), Invariant (mathematics), Artificial intelligence, Image processing, Computer science |
Source: | Communications in Computer and Information Science, ISBN 9783030638221; ICONIP (5) |
Description: | Features in a deep neural network are only as robust as those present in the data provided for training. The robustness of features applies not just to the types of features and how they apply to various classes, known or unknown, but also to how those features apply at different octaves, or scales. Neural networks trained at one octave have been shown to be invariant to other octaves, while neural networks trained on large, robust datasets operate optimally only at the octaves that resonate best with the learned features. This can still discard features that were present in the data. Not knowing the octave to which a trained neural network is most applicable can lead to sub-optimal predictions due to poor preprocessing. Recent work has shown good results in quantifying how the learned features in a neural network apply to objects. In this work, we follow up on work in feature applicability, using it to quantify which octaves the features in a trained neural network resonate best with. |
Database: | OpenAIRE |
External link: |
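
The record gives no implementation details for the applicability measure the abstract refers to. As a purely illustrative sketch of the general idea of octave resonance, and not the method from the paper, the snippet below probes a feature extractor with the same input resampled across several octaves and compares the mean feature activation at each scale; the stand-in network, the octave range, and the activation statistic are all assumptions made for the example.

```python
# Illustrative sketch only (not the paper's applicability measure): probe a feature
# extractor with the same image resampled across octaves and compare the mean
# feature-map activation at each scale. The stand-in network, the octave range,
# and the activation statistic are assumptions chosen for the example.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in feature extractor; in practice this would be a trained network.
features = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
)
features.eval()

image = torch.rand(1, 3, 224, 224)   # placeholder input image
octaves = [-2, -1, 0, 1, 2]          # resample by a factor of 2**octave

with torch.no_grad():
    for octave in octaves:
        scale = 2.0 ** octave
        resampled = F.interpolate(image, scale_factor=scale,
                                  mode="bilinear", align_corners=False)
        response = features(resampled).abs().mean().item()
        print(f"octave {octave:+d} (scale {scale:g}): mean |activation| = {response:.4f}")
```

In the paper's setting, such raw per-octave activation statistics would presumably be replaced by the feature-applicability measure the authors build on; the sketch only shows the kind of per-octave probing the abstract describes.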