{"title":"Information theoretic clustering using minimal spanning trees","scopus_import":1,"date_updated":"2021-01-12T07:41:14Z","conference":{"start_date":"2012-08-28","name":"DAGM: German Association For Pattern Recognition","location":"Graz, Austria","end_date":"2012-08-31"},"alternative_title":["LNCS"],"department":[{"_id":"ChLa"}],"volume":7476,"date_published":"2012-08-14T00:00:00Z","quality_controlled":"1","_id":"3126","user_id":"3E5EF7F0-F248-11E8-B48F-1D18A9856A87","publication_status":"published","citation":{"apa":"Müller, A., Nowozin, S., & Lampert, C. (2012). Information theoretic clustering using minimal spanning trees (Vol. 7476, pp. 205–215). Presented at the DAGM: German Association For Pattern Recognition, Graz, Austria: Springer. https://doi.org/10.1007/978-3-642-32717-9_21","ieee":"A. Müller, S. Nowozin, and C. Lampert, “Information theoretic clustering using minimal spanning trees,” presented at the DAGM: German Association For Pattern Recognition, Graz, Austria, 2012, vol. 7476, pp. 205–215.","short":"A. Müller, S. Nowozin, C. Lampert, in:, Springer, 2012, pp. 205–215.","chicago":"Müller, Andreas, Sebastian Nowozin, and Christoph Lampert. “Information Theoretic Clustering Using Minimal Spanning Trees,” 7476:205–15. Springer, 2012. https://doi.org/10.1007/978-3-642-32717-9_21.","ista":"Müller A, Nowozin S, Lampert C. 2012. Information theoretic clustering using minimal spanning trees. DAGM: German Association For Pattern Recognition, LNCS, vol. 7476, 205–215.","ama":"Müller A, Nowozin S, Lampert C. Information theoretic clustering using minimal spanning trees. In: Vol 7476. Springer; 2012:205-215. doi:10.1007/978-3-642-32717-9_21","mla":"Müller, Andreas, et al. Information Theoretic Clustering Using Minimal Spanning Trees. Vol. 7476, Springer, 2012, pp. 205–15, doi:10.1007/978-3-642-32717-9_21."},"type":"conference","doi":"10.1007/978-3-642-32717-9_21","page":"205 - 215","intvolume":" 7476","language":[{"iso":"eng"}],"month":"08","publisher":"Springer","author":[{"first_name":"Andreas","last_name":"Müller","full_name":"Müller, Andreas"},{"full_name":"Nowozin, Sebastian","last_name":"Nowozin","first_name":"Sebastian"},{"full_name":"Lampert, Christoph","last_name":"Lampert","id":"40C20FD2-F248-11E8-B48F-1D18A9856A87","first_name":"Christoph","orcid":"0000-0001-8622-7887"}],"status":"public","year":"2012","date_created":"2018-12-11T12:01:32Z","day":"14","oa_version":"None","publist_id":"3573","abstract":[{"lang":"eng","text":"In this work we propose a new information-theoretic clustering algorithm that infers cluster memberships by direct optimization of a non-parametric mutual information estimate between data distribution and cluster assignment. Although the optimization objective has a solid theoretical foundation it is hard to optimize. We propose an approximate optimization formulation that leads to an efficient algorithm with low runtime complexity. The algorithm has a single free parameter, the number of clusters to find. We demonstrate superior performance on several synthetic and real datasets.\r\n"}]}