Showing 1 - 10 of 69 for the search: '"Thomas, Janek"'
We present a model-agnostic framework for jointly optimizing the predictive performance and interpretability of supervised machine learning models for tabular data. Interpretability is quantified via three measures: feature sparsity, interaction sparsity…
External link:
http://arxiv.org/abs/2307.08175
Hyperparameter optimization (HPO) is a powerful technique for automating the tuning of machine learning (ML) models. However, in many real-world applications, accuracy is only one of multiple performance criteria that must be considered. Optimizing t…
External link:
http://arxiv.org/abs/2305.04502
Author:
Schneider, Lennart, Pfisterer, Florian, Kent, Paul, Branke, Juergen, Bischl, Bernd, Thomas, Janek
Neural architecture search (NAS) has been studied extensively and has grown to become a research field with substantial impact. While classical single-objective NAS searches for the architecture with the best performance, multi-objective NAS considers…
External link:
http://arxiv.org/abs/2208.00204
Author:
Gijsbers, Pieter, Bueno, Marcos L. P., Coors, Stefan, LeDell, Erin, Poirier, Sébastien, Thomas, Janek, Bischl, Bernd, Vanschoren, Joaquin
Comparing different AutoML frameworks is notoriously challenging and often done incorrectly. We introduce an open and extensible benchmark that follows best practices and avoids common mistakes when comparing AutoML frameworks. We conduct a thorough…
External link:
http://arxiv.org/abs/2207.12560
Author:
Karl, Florian, Pielok, Tobias, Moosbauer, Julia, Pfisterer, Florian, Coors, Stefan, Binder, Martin, Schneider, Lennart, Thomas, Janek, Richter, Jakob, Lang, Michel, Garrido-Merchán, Eduardo C., Branke, Juergen, Bischl, Bernd
Published in:
ACM Transactions on Evolutionary Learning and Optimization 3.4 (2023): 1-50
Hyperparameter optimization constitutes a large part of typical modern machine learning workflows. This arises from the fact that machine learning methods and corresponding preprocessing steps often only yield optimal performance when hyperparameters…
External link:
http://arxiv.org/abs/2206.07438
The goal of Quality Diversity Optimization is to generate a collection of diverse yet high-performing solutions to a given problem at hand. Typical benchmark problems are, for example, finding a repertoire of robot arm configurations or a collection…
External link:
http://arxiv.org/abs/2204.14061
Author:
Bischl, Bernd, Binder, Martin, Lang, Michel, Pielok, Tobias, Richter, Jakob, Coors, Stefan, Thomas, Janek, Ullmann, Theresa, Becker, Marc, Boulesteix, Anne-Laure, Deng, Difan, Lindauer, Marius
Most machine learning algorithms are configured by one or several hyperparameters that must be carefully chosen and often considerably impact performance. To avoid a time-consuming and unreproducible manual trial-and-error process to find well-performing…
External link:
http://arxiv.org/abs/2107.05847
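The survey above concerns hyperparameter optimization in general. As a point of reference, the simplest widely used baseline is random search over the hyperparameter space; the sketch below is our own minimal illustration (the loss surface and all names are invented for the example, not taken from the paper):

```python
import random

# Hypothetical validation-loss surface of two hyperparameters
# (learning rate and regularization strength), with an optimum
# near lr=0.1, reg=0.01. A real workflow would train and
# evaluate a model here instead.
def validation_loss(lr, reg):
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

def random_search(n_trials=200, seed=0):
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        # Sample log-uniformly, the usual choice for scale-type
        # hyperparameters spanning several orders of magnitude.
        lr = 10 ** rng.uniform(-4, 0)
        reg = 10 ** rng.uniform(-4, 0)
        loss = validation_loss(lr, reg)
        if loss < best_loss:
            best_cfg, best_loss = (lr, reg), loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search()
print(best_cfg, best_loss)
```

Bayesian optimization, evolutionary methods, and multi-fidelity approaches discussed in the survey replace the uniform sampling step with a model- or population-guided proposal, but the evaluate-and-keep-the-best loop stays the same.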
Since most machine learning (ML) algorithms are designed for numerical inputs, efficiently encoding categorical variables is a crucial aspect of data analysis. A common problem is high-cardinality features, i.e. unordered categorical predictor variables…
External link:
http://arxiv.org/abs/2104.00629
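One standard family of encoders for such high-cardinality features is target (mean) encoding with shrinkage. The sketch below is our own minimal illustration of the idea, not the specific regularized encoders benchmarked in the paper; the function name, data, and smoothing scheme are all assumptions for the example:

```python
from collections import defaultdict

# Replace each category by a smoothed mean of the target:
# rare categories are shrunk toward the global mean so that
# categories seen only a few times do not get extreme values.
def target_encode(categories, targets, smoothing=10.0):
    global_mean = sum(targets) / len(targets)
    sums, counts = defaultdict(float), defaultdict(int)
    for c, y in zip(categories, targets):
        sums[c] += y
        counts[c] += 1
    encoding = {}
    for c in counts:
        n = counts[c]
        # Shrinkage: with n small, the estimate stays near global_mean.
        encoding[c] = (sums[c] + smoothing * global_mean) / (n + smoothing)
    return [encoding[c] for c in categories], encoding

# Toy usage: a high-cardinality feature such as a ZIP code.
values, mapping = target_encode(
    ["zip_a", "zip_a", "zip_b", "zip_c"], [1, 1, 0, 1]
)
```

In practice the per-category means must be computed on training folds only (or with leave-one-out schemes) to avoid target leakage, which is exactly the kind of pitfall such benchmark studies examine.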
Author:
Goschenhofer, Jann, Hvingelby, Rasmus, Rügamer, David, Thomas, Janek, Wagner, Moritz, Bischl, Bernd
Published in:
2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)
While semi-supervised learning has gained much attention in computer vision on image data, limited research exists on its applicability in the time series domain. In this work, we investigate the transferability of state-of-the-art deep semi-supervised…
External link:
http://arxiv.org/abs/2102.03622
Both feature selection and hyperparameter tuning are key tasks in machine learning. Hyperparameter tuning is often useful to increase model performance, while feature selection is undertaken to attain sparse models. Sparsity may yield better model interpretability…
External link:
http://arxiv.org/abs/1912.12912
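The core idea of treating feature selection and hyperparameter tuning as one joint search can be sketched in a few lines. The toy below is our own illustration under invented assumptions (the feature names, the informative set, and the score function are all made up; the paper evaluates this with real models and resampling, not a closed-form score):

```python
from itertools import combinations

# Joint search space: a feature subset plus a hyperparameter alpha.
# The toy score rewards informative features and penalizes subset
# size, scaled by alpha, mimicking an accuracy/sparsity trade-off.
FEATURES = ["x1", "x2", "x3", "x4"]
INFORMATIVE = {"x1", "x3"}

def score(subset, alpha):
    hits = len(set(subset) & INFORMATIVE)
    return hits - alpha * len(subset)

# Exhaustive search over all subsets and a small alpha grid;
# in practice this space is explored with the same optimizers
# used for ordinary hyperparameter tuning.
best = max(
    ((subset, alpha)
     for r in range(1, len(FEATURES) + 1)
     for subset in combinations(FEATURES, r)
     for alpha in (0.1, 0.5, 0.9)),
    key=lambda cfg: score(*cfg),
)
```

The point of the joint formulation is that the best feature subset can depend on the hyperparameter setting and vice versa, so optimizing the two separately can miss the jointly optimal configuration.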