
Uncertainty-aware automated machine learning toolbox

  • Tanja Dorst
  • Tizian Schneider
  • Sascha Eichstädt
  • Andreas Schütze
Published/Copyright: September 22, 2022

Abstract

Measurement data can be considered complete only with an associated measurement uncertainty that expresses knowledge about the spread of values reasonably attributed to the measurand. Measurement uncertainty also makes it possible to assess the comparability and reliability of measurement results and to evaluate decisions based on them. Artificial Intelligence (AI) methods, and especially Machine Learning (ML), are often based on measurements, but so far uncertainty has been widely neglected in this field. We propose applying uncertainty propagation in ML to estimate the uncertainty of ML results and, furthermore, to optimize ML methods so as to minimize this uncertainty. Here, we present an extension of a previously published automated ML toolbox (AMLT), which performs feature extraction, feature selection and classification in an automated way without any expert knowledge. To this end, we apply the principles described in the “Guide to the Expression of Uncertainty in Measurement” (GUM) and its supplements to carry out uncertainty propagation for every step in the AMLT. In previous publications we presented the uncertainty propagation for some of the feature extraction methods in the AMLT. In this contribution, we extend this concept by also including statistical moments as a feature extraction method, adding uncertainty propagation to the feature selection methods, and extending it to the classification method, linear discriminant analysis combined with the Mahalanobis distance. For these methods, analytical approaches for uncertainty propagation are derived in detail, and the uncertainty propagation for the other feature extraction and selection methods is briefly revisited. Finally, the use of the uncertainty-aware AMLT is demonstrated for a data set consisting of uncorrelated measurement data and associated uncertainties.
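The GUM-style propagation the abstract refers to can be sketched for the simplest possible feature, the segment mean. This is a hypothetical minimal example, not the toolbox implementation: `propagate_mean` is an illustrative helper, and because the mean is linear in the samples, the linearized propagation $U_y = J\,U_d\,J^\top$ is exact here.

```python
import numpy as np

# Hypothetical helper (illustration only): propagate the covariance U_d of a
# signal segment d through the mean, a linear feature, via U_y = J U_d J^T.
def propagate_mean(d, U_d):
    N = len(d)
    J = np.full((1, N), 1.0 / N)   # Jacobian of the mean w.r.t. each sample
    y = d.mean()
    U_y = J @ U_d @ J.T            # GUM law of propagation of uncertainty
    return y, U_y[0, 0]

d = np.array([1.0, 2.0, 4.0])
U_d = np.diag([0.01, 0.04, 0.01])  # uncorrelated input uncertainties
y, u2 = propagate_mean(d, U_d)
# For uncorrelated inputs, u^2(mean) = sum of variances / N^2
```

For nonlinear features (standard deviation, skewness, kurtosis), the same formula applies with the sensitivity coefficients derived in Appendix A, but it is then only a first-order approximation.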


Funding source: Horizon 2020

Award Identifier / Grant number: 17IND12

Award Identifier / Grant number: 16ES0419K

Funding statement: Part of this work has received funding within the project 17IND12 Met4FoF from the EMPIR program co-financed by the Participating States and from the European Union’s Horizon 2020 research and innovation program. The basic version of the automated ML toolbox was developed at ZeMA as part of the MoSeS-Pro research project funded by the German Federal Ministry of Education and Research in the call “Sensor-based electronic systems for applications for Industry 4.0 – SElekt I 4.0”, funding code 16ES0419K, within the framework of the German Hightech Strategy.

About the authors

Tanja Dorst

Tanja Dorst studied Mathematics at Saarland University and received her Master of Science degree in November 2013. She then studied Mechanical Engineering at the University of Applied Sciences in Saarbrücken and received her Bachelor of Engineering degree in September 2017. Since July 2020 she has been working as a scientific researcher at the Center for Mechatronics and Automation Technology (ZeMA) gGmbH. Her research interests include measurement uncertainties in ML for condition monitoring of technical systems.

Tizian Schneider

Tizian Schneider studied Microtechnologies and Nanostructures at Saarland University and received his Master of Science degree in January 2016. Since then, he has been working at the Lab for Measurement Technology (LMT) of Saarland University and at the Center for Mechatronics and Automation Technology (ZeMA) gGmbH, where he leads the research group Data Engineering & Smart Sensors. His research interests include ML methods for condition monitoring of technical systems, automatic ML model building, and interpretable AI.

Sascha Eichstädt

Dr. Sascha Eichstädt heads the Physikalisch-Technische Bundesanstalt (PTB) department “Metrology for digital transformation”. He received his Diploma in Mathematics in 2008 from HU Berlin and his PhD in Theoretical Physics in 2012 from TU Berlin. From 2008 to 2017 he was a member of the group “Mathematical modelling and data analysis” at PTB. His main research areas are signal processing and sensor networks.

Andreas Schütze

Andreas Schütze received his diploma in physics from RWTH Aachen in 1990 and his doctorate in Applied Physics from Justus-Liebig-Universität in Gießen in 1994 with a thesis on microsensors and sensor systems for the detection of reducing and oxidizing gases. From 1994 until 1998 he worked for VDI/VDE-IT, Teltow, Germany, mainly in the field of microsystems technology. From 1998 until 2000 he was professor for Sensors and Microsystem Technology at the University of Applied Sciences in Krefeld, Germany. Since April 2000 he has been professor for Measurement Technology in the Department of Systems Engineering at Saarland University, Saarbrücken, Germany, and head of the Laboratory for Measurement Technology (LMT). His research interests include smart gas sensor systems as well as data engineering methods for industrial applications.

Appendix A Derivations of the sensitivity coefficients and the covariance matrix for statistical moments

A.1 Standard deviation

Using $\frac{\partial (d_i - \bar{d}_p)}{\partial d_j} = \delta_{ij} - \frac{1}{N_p}$, the sensitivity coefficient of the sample standard deviation $\sigma_p$ with respect to a single sample $d_j$ is

$$
\begin{aligned}
\beta_{p,j} = \frac{\partial \sigma_p}{\partial d_j}
&= \frac{1}{2} \left( \frac{1}{N_p - 1} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right)^{-\frac{1}{2}} \cdot \frac{2}{N_p - 1} \cdot \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right) \frac{\partial \left( d_i - \bar{d}_p \right)}{\partial d_j} \\
&= \frac{1}{2}\, \sigma_p^{-1} \cdot \frac{2}{N_p - 1} \cdot \left( \left( d_j - \bar{d}_p \right) \left( 1 - \frac{1}{N_p} \right) + \sum_{\substack{i=a_p \\ i \neq j}}^{e_p} \left( d_i - \bar{d}_p \right) \left( -\frac{1}{N_p} \right) \right) \\
&= \frac{1}{2}\, \sigma_p^{-1} \cdot \frac{2}{N_p - 1} \cdot \left( \left( d_j - \bar{d}_p \right) - \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right) \right) \\
&= \frac{1}{2}\, \sigma_p^{-1} \cdot \frac{2}{N_p - 1} \cdot \left( d_j - \bar{d}_p - \frac{1}{N_p} \left( \sum_{i=a_p}^{e_p} d_i - N_p\, \bar{d}_p \right) \right) \\
&= \frac{1}{2}\, \sigma_p^{-1} \cdot \frac{2}{N_p - 1} \cdot \left( d_j - \bar{d}_p \right)
= \frac{d_j - \bar{d}_p}{\left( N_p - 1 \right) \sigma_p}
\end{aligned}
$$
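The closed form $\beta_{p,j} = (d_j - \bar{d}_p) / ((N_p - 1)\,\sigma_p)$ can be cross-checked against a central finite-difference derivative of the sample standard deviation. A sketch with made-up segment data, not taken from the paper:

```python
import numpy as np

d = np.array([2.0, 3.5, 1.0, 4.2, 2.8])   # one hypothetical segment d_{a_p}..d_{e_p}
N = len(d)
sigma = d.std(ddof=1)                      # sample standard deviation (N-1 in denominator)

# Analytical sensitivity coefficients from the derivation above
beta = (d - d.mean()) / ((N - 1) * sigma)

# Central finite-difference check of each partial derivative
eps = 1e-7
for j in range(N):
    dp = d.copy(); dp[j] += eps
    dm = d.copy(); dm[j] -= eps
    numeric = (dp.std(ddof=1) - dm.std(ddof=1)) / (2 * eps)
    assert abs(beta[j] - numeric) < 1e-6
```

Note that the sensitivities sum to zero: shifting all samples by a constant leaves the standard deviation unchanged.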

A.2 Skewness

With the sample skewness written as the ratio $v_p = v_p^{\text{nom}} / v_p^{\text{denom}}$ of the third central moment $v_p^{\text{nom}} = \frac{1}{N_p} \sum_{i=a_p}^{e_p} (d_i - \bar{d}_p)^3$ and the denominator $v_p^{\text{denom}} = \left( \frac{1}{N_p} \sum_{i=a_p}^{e_p} (d_i - \bar{d}_p)^2 \right)^{3/2}$, the partial derivatives with respect to a single sample $d_j$ are

$$
\begin{aligned}
\frac{\partial v_p^{\text{nom}}}{\partial d_j}
&= \frac{3}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \frac{\partial \left( d_i - \bar{d}_p \right)}{\partial d_j} \\
&= \frac{3}{N_p} \left( \left( d_j - \bar{d}_p \right)^2 \left( 1 - \frac{1}{N_p} \right) + \sum_{\substack{i=a_p \\ i \neq j}}^{e_p} \left( d_i - \bar{d}_p \right)^2 \left( -\frac{1}{N_p} \right) \right) \\
&= \frac{3}{N_p} \left( \left( d_j - \bar{d}_p \right)^2 - \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right)
\end{aligned}
$$

and

$$
\begin{aligned}
\frac{\partial v_p^{\text{denom}}}{\partial d_j}
&= \frac{3}{2} \left( \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right)^{\frac{1}{2}} \cdot \frac{2}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right) \frac{\partial \left( d_i - \bar{d}_p \right)}{\partial d_j} \\
&= \frac{3}{2} \left( \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right)^{\frac{1}{2}} \cdot \frac{2}{N_p} \left( \left( d_j - \bar{d}_p \right) - \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right) \right) \\
&= \frac{3}{N_p} \left( \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right)^{\frac{1}{2}} \left( d_j - \bar{d}_p \right),
\end{aligned}
$$

where $\sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right) = 0$ was used in the last step.
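Assuming skewness is formed as the third central moment over the $3/2$ power of the second, both derivatives can be verified numerically. The segment data below are illustrative, not from the paper:

```python
import numpy as np

d = np.array([1.2, 0.7, 2.9, 1.5, 0.4])   # hypothetical segment
N = len(d)
c = d - d.mean()                           # centered samples
m2 = (c**2).mean()                         # second central moment

# Analytical derivatives of numerator and denominator (from the derivation)
d_nom = 3.0 / N * (c**2 - m2)              # d/dd_j of (1/N) sum (d_i - mean)^3
d_den = 3.0 / N * np.sqrt(m2) * c          # d/dd_j of m2**1.5

nom = lambda x: ((x - x.mean())**3).mean()
den = lambda x: (((x - x.mean())**2).mean())**1.5

eps = 1e-7
for j in range(N):
    dp = d.copy(); dp[j] += eps
    dm = d.copy(); dm[j] -= eps
    assert abs(d_nom[j] - (nom(dp) - nom(dm)) / (2 * eps)) < 1e-6
    assert abs(d_den[j] - (den(dp) - den(dm)) / (2 * eps)) < 1e-6
```

As with the standard deviation, both derivative vectors sum to zero, reflecting the translation invariance of central moments.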

A.3 Kurtosis

Analogously, with the sample kurtosis written as $w_p = w_p^{\text{nom}} / w_p^{\text{denom}}$, where $w_p^{\text{nom}} = \frac{1}{N_p} \sum_{i=a_p}^{e_p} (d_i - \bar{d}_p)^4$ and $w_p^{\text{denom}} = \left( \frac{1}{N_p} \sum_{i=a_p}^{e_p} (d_i - \bar{d}_p)^2 \right)^2$:

$$
\begin{aligned}
\frac{\partial w_p^{\text{nom}}}{\partial d_j}
&= \frac{4}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^3 \frac{\partial \left( d_i - \bar{d}_p \right)}{\partial d_j} \\
&= \frac{4}{N_p} \left( \left( d_j - \bar{d}_p \right)^3 \left( 1 - \frac{1}{N_p} \right) + \sum_{\substack{i=a_p \\ i \neq j}}^{e_p} \left( d_i - \bar{d}_p \right)^3 \left( -\frac{1}{N_p} \right) \right) \\
&= \frac{4}{N_p} \left( \left( d_j - \bar{d}_p \right)^3 - \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^3 \right)
\end{aligned}
$$

and

$$
\begin{aligned}
\frac{\partial w_p^{\text{denom}}}{\partial d_j}
&= 2 \left( \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right) \cdot \frac{2}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right) \frac{\partial \left( d_i - \bar{d}_p \right)}{\partial d_j} \\
&= 2 \left( \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right) \cdot \frac{2}{N_p} \left( \left( d_j - \bar{d}_p \right) - \frac{1}{N_p} \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right) \right) \\
&= \frac{4}{N_p^2} \left( \sum_{i=a_p}^{e_p} \left( d_i - \bar{d}_p \right)^2 \right) \left( d_j - \bar{d}_p \right)
\end{aligned}
$$
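The same finite-difference check applies to the kurtosis derivatives, here assuming kurtosis is the fourth central moment over the squared second moment (illustrative data again):

```python
import numpy as np

d = np.array([0.9, 2.1, 1.4, 3.3, 0.2])   # hypothetical segment
N = len(d)
c = d - d.mean()
m2, m3 = (c**2).mean(), (c**3).mean()      # second and third central moments

# Analytical derivatives of numerator and denominator (from the derivation)
d_nom = 4.0 / N * (c**3 - m3)              # d/dd_j of (1/N) sum (d_i - mean)^4
d_den = 4.0 / N * m2 * c                   # d/dd_j of m2**2, = 4/N^2 * sum(c^2) * c_j

nom = lambda x: ((x - x.mean())**4).mean()
den = lambda x: (((x - x.mean())**2).mean())**2

eps = 1e-6
for j in range(N):
    dp = d.copy(); dp[j] += eps
    dm = d.copy(); dm[j] -= eps
    assert abs(d_nom[j] - (nom(dp) - nom(dm)) / (2 * eps)) < 1e-5
    assert abs(d_den[j] - (den(dp) - den(dm)) / (2 * eps)) < 1e-5
```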

A.4 Covariance matrix

$$
\begin{aligned}
U_q &= J^q_{\alpha,\beta,\gamma,\delta} \cdot U_c \cdot \left( J^q_{\alpha,\beta,\gamma,\delta} \right)^\top
= \begin{pmatrix} A \\ B \\ \Gamma \\ \Delta \end{pmatrix} \cdot U_c \cdot \begin{pmatrix} A^\top & B^\top & \Gamma^\top & \Delta^\top \end{pmatrix} \\
&= \begin{pmatrix}
A U_c A^\top & A U_c B^\top & A U_c \Gamma^\top & A U_c \Delta^\top \\
B U_c A^\top & B U_c B^\top & B U_c \Gamma^\top & B U_c \Delta^\top \\
\Gamma U_c A^\top & \Gamma U_c B^\top & \Gamma U_c \Gamma^\top & \Gamma U_c \Delta^\top \\
\Delta U_c A^\top & \Delta U_c B^\top & \Delta U_c \Gamma^\top & \Delta U_c \Delta^\top
\end{pmatrix} \\
&= \begin{pmatrix}
A U_c A^\top & A U_c B^\top & A U_c \Gamma^\top & A U_c \Delta^\top \\
\left( A U_c B^\top \right)^\top & B U_c B^\top & B U_c \Gamma^\top & B U_c \Delta^\top \\
\left( A U_c \Gamma^\top \right)^\top & \left( B U_c \Gamma^\top \right)^\top & \Gamma U_c \Gamma^\top & \Gamma U_c \Delta^\top \\
\left( A U_c \Delta^\top \right)^\top & \left( B U_c \Delta^\top \right)^\top & \left( \Gamma U_c \Delta^\top \right)^\top & \Delta U_c \Delta^\top
\end{pmatrix}
\end{aligned}
$$
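The block structure of $U_q$ can be reproduced numerically by stacking the four Jacobian rows. Here $A$, $B$, $\Gamma$, $\Delta$ (the sensitivities of mean, standard deviation, skewness and kurtosis) are replaced by random stand-ins, purely for illustration; the point is that $U_q$ comes out symmetric because the off-diagonal blocks are mutual transposes:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6                                      # samples in the segment
U_c = np.diag(rng.uniform(0.01, 0.05, N))  # uncorrelated input covariance

# Random stand-ins for the Jacobian rows A, B, Gamma, Delta (illustration only)
A, B, G, D = (rng.standard_normal((1, N)) for _ in range(4))
J = np.vstack([A, B, G, D])                # stacked Jacobian J^q

U_q = J @ U_c @ J.T                        # 4x4 covariance of the four moments

# Symmetry: e.g. the (2,1) block equals the transpose of the (1,2) block
assert np.allclose(U_q, U_q.T)
assert np.isclose(U_q[0, 1], (A @ U_c @ B.T)[0, 0])
```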


Received: 2022-03-29
Accepted: 2022-09-13
Published Online: 2022-09-22
Published in Print: 2023-03-28

© 2022 Walter de Gruyter GmbH, Berlin/Boston
