{"abstract":[{"lang":"eng","text":"Quantization converts neural networks into low-bit fixed-point computations which can be carried out by efficient integer-only hardware, and is standard practice for the deployment of neural networks on real-time embedded devices. However, like their real-numbered counterpart, quantized networks are not immune to malicious misclassification caused by adversarial attacks. We investigate how quantization affects a network’s robustness to adversarial attacks, which is a formal verification question. We show that neither robustness nor non-robustness are monotonic with changing the number of bits for the representation and, also, neither are preserved by quantization from a real-numbered network. For this reason, we introduce a verification method for quantized neural networks which, using SMT solving over bit-vectors, accounts for their exact, bit-precise semantics. We built a tool and analyzed the effect of quantization on a classifier for the MNIST dataset. We demonstrate that, compared to our method, existing methods for the analysis of real-numbered networks often derive false conclusions about their quantizations, both when determining robustness and when detecting attacks, and that existing methods for quantized networks often miss attacks. Furthermore, we applied our method beyond robustness, showing how the number of bits in quantization enlarges the gender bias of a predictor for students’ grades."}],"volume":12079,"file_date_updated":"2020-07-14T12:48:03Z","date_created":"2020-05-10T22:00:49Z","publication_status":"published","language":[{"iso":"eng"}],"publication":"International Conference on Tools and Algorithms for the Construction and Analysis of Systems","_id":"7808","scopus_import":1,"oa":1,"file":[{"relation":"main_file","file_size":2744030,"checksum":"f19905a42891fe5ce93d69143fa3f6fb","access_level":"open_access","date_created":"2020-05-26T12:48:15Z","date_updated":"2020-07-14T12:48:03Z","content_type":"application/pdf","file_name":"2020_TACAS_Giacobbe.pdf","creator":"dernst","file_id":"7893"}],"publisher":"Springer Nature","day":"17","department":[{"_id":"ToHe"}],"user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","article_processing_charge":"No","oa_version":"Published Version","alternative_title":["LNCS"],"date_updated":"2023-06-23T07:01:11Z","status":"public","conference":{"name":"TACAS: Tools and Algorithms for the Construction and Analysis of Systems","end_date":"2020-04-30","start_date":"2020-04-25","location":"Dublin, Ireland"},"author":[{"first_name":"Mirco","orcid":"0000-0001-8180-0904","last_name":"Giacobbe","full_name":"Giacobbe, Mirco","id":"3444EA5E-F248-11E8-B48F-1D18A9856A87"},{"id":"40876CD8-F248-11E8-B48F-1D18A9856A87","full_name":"Henzinger, Thomas A","last_name":"Henzinger","first_name":"Thomas A","orcid":"0000-0002-2985-7724"},{"full_name":"Lechner, Mathias","id":"3DC22916-F248-11E8-B48F-1D18A9856A87","first_name":"Mathias","last_name":"Lechner"}],"ddc":["000"],"doi":"10.1007/978-3-030-45237-7_5","type":"conference","has_accepted_license":"1","page":"79-97","tmp":{"name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)","image":"/images/cc_by.png","short":"CC BY (4.0)","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode"},"intvolume":" 12079","date_published":"2020-04-17T00:00:00Z","citation":{"ama":"Giacobbe M, Henzinger TA, Lechner M. How many bits does it take to quantize your neural network? 
In: International Conference on Tools and Algorithms for the Construction and Analysis of Systems. Vol 12079. Springer Nature; 2020:79-97. doi:10.1007/978-3-030-45237-7_5","ieee":"M. Giacobbe, T. A. Henzinger, and M. Lechner, “How many bits does it take to quantize your neural network?,” in International Conference on Tools and Algorithms for the Construction and Analysis of Systems, Dublin, Ireland, 2020, vol. 12079, pp. 79–97.","short":"M. Giacobbe, T.A. Henzinger, M. Lechner, in:, International Conference on Tools and Algorithms for the Construction and Analysis of Systems, Springer Nature, 2020, pp. 79–97.","mla":"Giacobbe, Mirco, et al. “How Many Bits Does It Take to Quantize Your Neural Network?” International Conference on Tools and Algorithms for the Construction and Analysis of Systems, vol. 12079, Springer Nature, 2020, pp. 79–97, doi:10.1007/978-3-030-45237-7_5.","apa":"Giacobbe, M., Henzinger, T. A., & Lechner, M. (2020). How many bits does it take to quantize your neural network? In International Conference on Tools and Algorithms for the Construction and Analysis of Systems (Vol. 12079, pp. 79–97). Dublin, Ireland: Springer Nature. https://doi.org/10.1007/978-3-030-45237-7_5","ista":"Giacobbe M, Henzinger TA, Lechner M. 2020. How many bits does it take to quantize your neural network? International Conference on Tools and Algorithms for the Construction and Analysis of Systems. TACAS: Tools and Algorithms for the Construction and Analysis of Systems, LNCS, vol. 12079, 79–97.","chicago":"Giacobbe, Mirco, Thomas A Henzinger, and Mathias Lechner. “How Many Bits Does It Take to Quantize Your Neural Network?” In International Conference on Tools and Algorithms for the Construction and Analysis of Systems, 12079:79–97. Springer Nature, 2020. https://doi.org/10.1007/978-3-030-45237-7_5."},"title":"How many bits does it take to quantize your neural network?","related_material":{"record":[{"id":"11362","status":"public","relation":"dissertation_contains"}]},"quality_controlled":"1","project":[{"call_identifier":"FWF","name":"Rigorous Systems Engineering","_id":"25832EC2-B435-11E9-9278-68D0E5697425","grant_number":"S 11407_N23"},{"name":"The Wittgenstein Prize","call_identifier":"FWF","_id":"25F42A32-B435-11E9-9278-68D0E5697425","grant_number":"Z211"}],"publication_identifier":{"eissn":["16113349"],"issn":["03029743"],"isbn":["9783030452360"]},"month":"04","year":"2020"}
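
As a rough illustration of the approach described in the abstract above (checking robustness of a quantized network against its exact, bit-precise semantics with SMT solving over bit-vectors), the sketch below encodes a tiny two-logit, 8-bit "network" and asks whether any input within a small L-infinity ball flips its prediction. Everything here is assumed for illustration: Z3's Python bindings (z3-solver), the word sizes, weights, biases, input, and radius are invented, and the encoding omits rounding and saturation, so it is a simplified sketch, not the paper's actual tool or encoding.

# Minimal sketch (assumptions as stated above): local-robustness check for a
# hypothetical quantized two-logit layer, using Z3 over bit-vectors.
from z3 import BitVec, BitVecVal, SignExt, If, Solver, And, unsat

BITS, ACC = 8, 32          # assumed word sizes: 8-bit values, 32-bit accumulator
W = [[3, -7], [-2, 5]]     # hypothetical integer weights (one row per output)
B = [10, -4]               # hypothetical integer biases
X0 = [17, -23]             # concrete quantized input whose neighbourhood we check
EPS = 2                    # L-infinity radius, in quantized input units

def widen(bv):
    # sign-extend an 8-bit value to the accumulator width
    return SignExt(ACC - BITS, bv)

# symbolic quantized inputs
xs = [BitVec(f'x{i}', BITS) for i in range(2)]

s = Solver()

# perturbation constraint: |x_i - X0_i| <= EPS, computed at accumulator width
for x, x0 in zip(xs, X0):
    dx = widen(x) - BitVecVal(x0, ACC)
    s.add(And(dx <= EPS, dx >= -EPS))

def logit(j):
    # integer-only forward pass: ReLU(sum_i W[j][i] * x_i + B[j])
    acc = BitVecVal(B[j], ACC)
    for i, x in enumerate(xs):
        acc = acc + BitVecVal(W[j][i], ACC) * widen(x)
    return If(acc < 0, BitVecVal(0, ACC), acc)

# attack condition: the runner-up logit ties or overtakes the winner at X0 (class 0)
s.add(logit(1) >= logit(0))

if s.check() == unsat:
    print('robust: no adversarial input within the epsilon ball')
else:
    print('attack found:', s.model())

An unsat answer certifies that no quantized input in the ball changes the predicted class, while a satisfying model is a concrete, bit-precise adversarial input; the paper applies this idea to whole networks with their exact fixed-point semantics rather than to a single hand-written layer as in this sketch.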