Data mining with Random Forests as a methodology for biomedical signal classification

Klaudia Proniewska
Published/Copyright: May 24, 2016

Abstract

As the contribution of specific parameters is not known and significant intersubject variability is expected, a decision system that allows adaptation to subject and environment conditions has to be designed for biomedical signal classification. Such a decision support system has to be trained for its intended functionality before it is used to evaluate patient monitoring data. This paper describes a decision system based on data mining with Random Forests that allows adaptation to subject and environment conditions. This methodology may lead to specific system scoring by an artificial intelligence-supported patient monitoring evaluation system, which may help guide decisions concerning future treatment and influence patients' quality of life.
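
The paper itself does not present code; the following is a minimal sketch, assuming pre-extracted feature vectors and scikit-learn's RandomForestClassifier, of the kind of "train first, then score" workflow the abstract describes. The data, feature names, and parameters below are illustrative assumptions, not the study's actual material.

# Minimal sketch (not from the paper): training a Random Forest classifier
# on pre-extracted biomedical signal features with scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Hypothetical per-subject data: each row is a feature vector derived from
# a monitored signal segment; each label marks the state to be recognized
# (0 = normal, 1 = event). Both are synthetic here.
X = rng.normal(size=(500, 12))                    # 500 segments, 12 features
y = (X[:, 0] + 0.5 * X[:, 3] > 0.8).astype(int)   # synthetic labels

# Train on part of the subject's data, then evaluate on held-out segments,
# mirroring the "train before use in monitoring" workflow described above.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)

print(classification_report(y_test, forest.predict(X_test)))

# Feature importances indicate which signal-derived parameters contribute
# most to the decision, which matters when their contribution is unknown.
print(forest.feature_importances_)

In practice such a forest would be retrained or recalibrated per subject, so that the resulting scoring adapts to intersubject variability and environment conditions rather than relying on a single population-wide model.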

  1. Author contributions: The author has accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Research funding: None declared.

  3. Employment or leadership: None declared.

  4. Honorarium: None declared.

  5. Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.

Received: 2016-4-8
Accepted: 2016-4-29
Published Online: 2016-5-24
Published in Print: 2016-6-1

©2016 by De Gruyter
