{"oa_version":"Published Version","title":"Unifying two views on multiple mean-payoff objectives in Markov decision processes","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","citation":{"ama":"Chatterjee K, Komarkova Z, Kretinsky J. *Unifying Two Views on Multiple Mean-Payoff Objectives in Markov Decision Processes*. IST Austria; 2015. doi:10.15479/AT:IST-2015-318-v1-1","ista":"Chatterjee K, Komarkova Z, Kretinsky J. 2015. Unifying two views on multiple mean-payoff objectives in Markov decision processes, IST Austria, 41p.","mla":"Chatterjee, Krishnendu, et al. *Unifying Two Views on Multiple Mean-Payoff Objectives in Markov Decision Processes*. IST Austria, 2015, doi:10.15479/AT:IST-2015-318-v1-1.","chicago":"Chatterjee, Krishnendu, Zuzana Komarkova, and Jan Kretinsky. *Unifying Two Views on Multiple Mean-Payoff Objectives in Markov Decision Processes*. IST Austria, 2015. https://doi.org/10.15479/AT:IST-2015-318-v1-1.","ieee":"K. Chatterjee, Z. Komarkova, and J. Kretinsky, *Unifying two views on multiple mean-payoff objectives in Markov decision processes*. IST Austria, 2015.","short":"K. Chatterjee, Z. Komarkova, J. Kretinsky, Unifying Two Views on Multiple Mean-Payoff Objectives in Markov Decision Processes, IST Austria, 2015.","apa":"Chatterjee, K., Komarkova, Z., & Kretinsky, J. (2015). *Unifying two views on multiple mean-payoff objectives in Markov decision processes*. IST Austria. 
https://doi.org/10.15479/AT:IST-2015-318-v1-1"},"related_material":{"record":[{"relation":"later_version","status":"public","id":"466"},{"id":"1657","status":"public","relation":"later_version"},{"id":"5435","status":"public","relation":"later_version"}]},"language":[{"iso":"eng"}],"author":[{"full_name":"Chatterjee, Krishnendu","first_name":"Krishnendu","last_name":"Chatterjee","orcid":"0000-0002-4561-241X","id":"2E5DCA20-F248-11E8-B48F-1D18A9856A87"},{"last_name":"Komarkova","full_name":"Komarkova, Zuzana","first_name":"Zuzana"},{"first_name":"Jan","full_name":"Kretinsky, Jan","orcid":"0000-0002-8122-2881","last_name":"Kretinsky","id":"44CEF464-F248-11E8-B48F-1D18A9856A87"}],"page":"41","date_updated":"2021-01-12T08:02:15Z","_id":"5429","publisher":"IST Austria","year":"2015","has_accepted_license":"1","publication_identifier":{"issn":["2664-1690"]},"oa":1,"day":"12","doi":"10.15479/AT:IST-2015-318-v1-1","alternative_title":["IST Austria Technical Report"],"month":"01","publication_status":"published","status":"public","date_published":"2015-01-12T00:00:00Z","file_date_updated":"2020-07-14T12:46:52Z","type":"technical_report","date_created":"2018-12-12T11:39:17Z","department":[{"_id":"KrCh"}],"abstract":[{"lang":"eng","text":"We consider Markov decision processes (MDPs) with multiple limit-average (or mean-payoff) objectives. \r\nThere have been two different views: (i) the expectation semantics, where the goal is to optimize the expected mean-payoff objective, and (ii) the satisfaction semantics, where the goal is to maximize the probability of runs such that the mean-payoff value stays above a given vector. 
\r\nWe consider the problem where the goal is to optimize the expectation under the constraint that the satisfaction semantics is ensured, and thus consider a generalization that unifies the existing semantics.\r\nOur problem captures the notion of optimization with respect to strategies that are risk-averse (i.e., that ensure a certain probabilistic guarantee).\r\nOur main results are algorithms for the decision problem that are always polynomial in the size of the MDP. We also show that an approximation of the Pareto curve can be computed in time polynomial in the size of the MDP and the approximation factor, but exponential in the number of dimensions.\r\nFinally, we present a complete characterization of the strategy complexity (in terms of memory bounds and randomization) required to solve our problem."}],"file":[{"content_type":"application/pdf","file_size":689863,"date_created":"2018-12-12T11:54:11Z","relation":"main_file","date_updated":"2020-07-14T12:46:52Z","file_name":"IST-2015-318-v1+1_main.pdf","checksum":"e4869a584567c506349abda9c8ec7db3","creator":"system","file_id":"5533","access_level":"open_access"}],"ddc":["004"],"pubrep_id":"318"}