{"title":"Renyi entropy estimation revisited","publication_identifier":{"issn":["18688969"]},"file_date_updated":"2020-07-14T12:47:49Z","pubrep_id":"888","language":[{"iso":"eng"}],"date_created":"2018-12-11T11:48:04Z","project":[{"name":"Teaching Old Crypto New Tricks","_id":"258AA5B2-B435-11E9-9278-68D0E5697425","grant_number":"682815","call_identifier":"H2020"}],"intvolume":" 81","day":"01","article_number":"20","file":[{"access_level":"open_access","file_id":"4991","content_type":"application/pdf","file_name":"IST-2017-888-v1+1_LIPIcs-APPROX-RANDOM-2017-20.pdf","relation":"main_file","checksum":"89225c7dcec2c93838458c9102858985","creator":"system","date_updated":"2020-07-14T12:47:49Z","file_size":604813,"date_created":"2018-12-12T10:13:10Z"}],"publisher":"Schloss Dagstuhl - Leibniz-Zentrum für Informatik","publist_id":"6979","tmp":{"legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","image":"/images/cc_by.png","short":"CC BY (4.0)","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"},"type":"conference","abstract":[{"text":"We revisit the problem of estimating entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh and Tyagi (SODA 2015), improving their upper and lower bounds on the necessary sample size n. For estimating Renyi entropy of order alpha, up to constant accuracy and error probability, we show the following * Upper bounds n = O(1) 2^{(1-1/alpha)H_alpha} for integer alpha>1, as the worst case over distributions with Renyi entropy equal to H_alpha. * Lower bounds n = Omega(1) K^{1-1/alpha} for any real alpha>1, with the constant being an inverse polynomial of the accuracy, as the worst case over all distributions on K elements. Our upper bounds essentially replace the alphabet size by a factor exponential in the entropy, which offers improvements especially in low or medium entropy regimes (interesting for example in anomaly detection). As for the lower bounds, our proof explicitly shows how the complexity depends on both alphabet and accuracy, partially solving the open problem posted in previous works. The argument for upper bounds derives a clean identity for the variance of falling-power sum of a multinomial distribution. Our approach for lower bounds utilizes convex optimization to find a distribution with possibly worse estimation performance, and may be of independent interest as a tool to work with Le Cam’s two point method. ","lang":"eng"}],"ec_funded":1,"oa":1,"date_published":"2017-08-01T00:00:00Z","oa_version":"Published Version","conference":{"location":"Berkeley, USA","start_date":"2017-08-18","end_date":"2017-08-18","name":"20th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX"},"department":[{"_id":"KrPi"}],"user_id":"3E5EF7F0-F248-11E8-B48F-1D18A9856A87","author":[{"last_name":"Obremski","first_name":"Maciej","full_name":"Obremski, Maciej"},{"full_name":"Skórski, Maciej","last_name":"Skórski","id":"EC09FA6A-02D0-11E9-8223-86B7C91467DD","first_name":"Maciej"}],"has_accepted_license":"1","date_updated":"2021-01-12T08:11:50Z","quality_controlled":"1","doi":"10.4230/LIPIcs.APPROX-RANDOM.2017.20","alternative_title":["LIPIcs"],"_id":"710","year":"2017","scopus_import":1,"volume":81,"ddc":["005","600"],"citation":{"mla":"Obremski, Maciej, and Maciej Skórski. Renyi Entropy Estimation Revisited. Vol. 
81, 20, Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2017, doi:10.4230/LIPIcs.APPROX-RANDOM.2017.20.","ista":"Obremski M, Skórski M. 2017. Renyi entropy estimation revisited. 20th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX, LIPIcs, vol. 81, 20.","short":"M. Obremski, M. Skórski, in:, Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2017.","ieee":"M. Obremski and M. Skórski, “Renyi entropy estimation revisited,” presented at the 20th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX, Berkeley, USA, 2017, vol. 81.","ama":"Obremski M, Skórski M. Renyi entropy estimation revisited. In: Vol 81. Schloss Dagstuhl - Leibniz-Zentrum für Informatik; 2017. doi:10.4230/LIPIcs.APPROX-RANDOM.2017.20","chicago":"Obremski, Maciej, and Maciej Skórski. “Renyi Entropy Estimation Revisited,” Vol. 81. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2017. https://doi.org/10.4230/LIPIcs.APPROX-RANDOM.2017.20.","apa":"Obremski, M., & Skórski, M. (2017). Renyi entropy estimation revisited (Vol. 81). Presented at the 20th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX, Berkeley, USA: Schloss Dagstuhl - Leibniz-Zentrum für Informatik. https://doi.org/10.4230/LIPIcs.APPROX-RANDOM.2017.20"},"month":"08","publication_status":"published","status":"public"}
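
To make the abstract's setting concrete, here is a minimal Python sketch of the standard unbiased falling-power estimator of the power sum P_alpha = sum_i p_i^alpha for integer alpha > 1, the quantity whose variance identity the abstract mentions. It is an illustration under that assumption, not necessarily the exact estimator analyzed in the paper; the names renyi_entropy_estimate and falling_power are ours.

from collections import Counter
from math import log2

def falling_power(x, k):
    # x^(k) = x * (x - 1) * ... * (x - k + 1)
    out = 1
    for i in range(k):
        out *= x - i
    return out

def renyi_entropy_estimate(samples, alpha):
    # For multinomial counts n_i out of n samples,
    # sum_i n_i^(alpha) / n^(alpha) is an unbiased estimator of sum_i p_i^alpha.
    # Plugging it into H_alpha = log2(P_alpha) / (1 - alpha) gives the
    # Renyi entropy estimate in bits (matching the 2^{(1-1/alpha)H_alpha} scaling).
    if not (isinstance(alpha, int) and alpha > 1):
        raise ValueError("this sketch covers integer alpha > 1 only")
    n = len(samples)
    if n < alpha:
        raise ValueError("need at least alpha samples")
    counts = Counter(samples)
    p_alpha_hat = sum(falling_power(c, alpha) for c in counts.values()) / falling_power(n, alpha)
    if p_alpha_hat <= 0:  # every symbol appeared fewer than alpha times
        raise ValueError("too few collisions to estimate the power sum")
    return log2(p_alpha_hat) / (1 - alpha)

For alpha = 2 this reduces to the familiar collision-entropy estimate, since sum_i n_i(n_i - 1) / (n(n - 1)) is the empirical pairwise-collision probability.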