{"intvolume":"80","oa":1,"oa_version":"Preprint","publication_status":"published","isi":1,"author":[{"full_name":"Kuzborskij, Ilja","last_name":"Kuzborskij","first_name":"Ilja"},{"orcid":"0000-0001-8622-7887","full_name":"Lampert, Christoph","last_name":"Lampert","first_name":"Christoph","id":"40C20FD2-F248-11E8-B48F-1D18A9856A87"}],"year":"2018","type":"conference","date_created":"2019-02-14T14:51:57Z","volume":80,"month":"02","date_updated":"2023-10-17T09:51:13Z","page":"2815-2824","title":"Data-dependent stability of stochastic gradient descent","_id":"6011","department":[{"_id":"ChLa"}],"user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","abstract":[{"text":"We establish a data-dependent notion of algorithmic stability for Stochastic Gradient Descent (SGD), and employ it to develop novel generalization bounds. This is in contrast to previous distribution-free algorithmic stability results for SGD which depend on the worst-case constants. By virtue of the data-dependent argument, our bounds provide new insights into learning with SGD on convex and non-convex problems. In the convex case, we show that the bound on the generalization error depends on the risk at the initialization point. In the non-convex case, we prove that the expected curvature of the objective function around the initialization point has crucial influence on the generalization error. In both cases, our results suggest a simple data-driven strategy to stabilize SGD by pre-screening its initialization. As a corollary, our results allow us to show optimistic generalization bounds that exhibit fast convergence rates for SGD subject to a vanishing empirical risk and low noise of stochastic gradient.","lang":"eng"}],"publisher":"ML Research Press","language":[{"iso":"eng"}],"article_processing_charge":"No","citation":{"short":"I. Kuzborskij, C. Lampert, in: Proceedings of the 35th International Conference on Machine Learning, ML Research Press, 2018, pp. 2815–2824.","apa":"Kuzborskij, I., & Lampert, C. (2018). Data-dependent stability of stochastic gradient descent. In Proceedings of the 35th International Conference on Machine Learning (Vol. 80, pp. 2815–2824). Stockholm, Sweden: ML Research Press.","ista":"Kuzborskij I, Lampert C. 2018. Data-dependent stability of stochastic gradient descent. Proceedings of the 35th International Conference on Machine Learning. ICML: International Conference on Machine Learning vol. 80, 2815–2824.","mla":"Kuzborskij, Ilja, and Christoph Lampert. “Data-Dependent Stability of Stochastic Gradient Descent.” Proceedings of the 35th International Conference on Machine Learning, vol. 80, ML Research Press, 2018, pp. 2815–24.","chicago":"Kuzborskij, Ilja, and Christoph Lampert. “Data-Dependent Stability of Stochastic Gradient Descent.” In Proceedings of the 35th International Conference on Machine Learning, 80:2815–24. ML Research Press, 2018.","ieee":"I. Kuzborskij and C. Lampert, “Data-dependent stability of stochastic gradient descent,” in Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, 2018, vol. 80, pp. 2815–2824.","ama":"Kuzborskij I, Lampert C. Data-dependent stability of stochastic gradient descent. In: Proceedings of the 35th International Conference on Machine Learning. Vol 80. ML Research Press; 2018:2815-2824."},"conference":{"location":"Stockholm, Sweden","end_date":"2018-07-15","name":"ICML: International Conference on Machine Learning","start_date":"2018-07-10"},"scopus_import":"1","date_published":"2018-02-01T00:00:00Z","project":[{"name":"Lifelong Learning of Visual Scene Understanding","call_identifier":"FP7","_id":"2532554C-B435-11E9-9278-68D0E5697425","grant_number":"308036"}],"main_file_link":[{"url":"https://arxiv.org/abs/1703.01678","open_access":"1"}],"ec_funded":1,"status":"public","day":"01","publication":"Proceedings of the 35th International Conference on Machine Learning","quality_controlled":"1","external_id":{"isi":["000683379202095"],"arxiv":["1703.01678"]}}