[{"publication_status":"published","publication_identifier":{"isbn":["978-1-4503-8635-7"]},"language":[{"iso":"eng"}],"file":[{"date_updated":"2021-10-18T07:36:03Z","file_size":29796364,"creator":"bbickel","date_created":"2021-10-18T07:36:03Z","file_name":"degraen-UIST2021_Texture_Appropriation_CR_preprint.pdf","content_type":"application/pdf","access_level":"open_access","relation":"main_file","checksum":"b0b26464df79b3a59e8ed82e4e19ab15","file_id":"10149"}],"ec_funded":1,"abstract":[{"text":"Tactile feedback of an object’s surface enables us to discern its material properties and affordances. This understanding is used in digital fabrication processes by creating objects with high-resolution surface variations to influence a user’s tactile perception. As the design of such surface haptics commonly relies on knowledge from real-life experiences, it is unclear how to adapt this information for digital design methods. In this work, we investigate replicating the haptics of real materials. Using an existing process for capturing an object’s microgeometry, we digitize and reproduce the stable surface information of a set of 15 fabric samples. In a psychophysical experiment, we evaluate the tactile qualities of our set of original samples and their replicas. From our results, we see that direct reproduction of surface variations is able to influence different psychophysical dimensions of the tactile perception of surface textures. While the fabrication process did not preserve all properties, our approach underlines that replication of surface microgeometries benefits fabrication methods in terms of haptic perception by covering a large range of tactile variations. Moreover, by changing the surface structure of a single fabricated material, its material perception can be influenced. 
We conclude by proposing strategies for capturing and reproducing digitized textures to better resemble the perceived haptics of the originals.","lang":"eng"}],"oa_version":"Preprint","month":"10","date_updated":"2021-10-19T19:29:06Z","ddc":["000"],"file_date_updated":"2021-10-18T07:36:03Z","department":[{"_id":"BeBi"}],"_id":"10148","conference":{"start_date":"2021-10-10","location":"Virtual","end_date":"2021-10-14","name":"UIST: User Interface Software and Technology"},"type":"conference","status":"public","year":"2021","has_accepted_license":"1","publication":"34th Annual ACM Symposium","day":"10","page":"954-971","date_created":"2021-10-18T07:36:11Z","date_published":"2021-10-10T00:00:00Z","doi":"10.1145/3472749.3474798","acknowledgement":"Our gratitude goes out to Kamila Mushkina, Akhmajon Makhsadov, Jordan Espenshade, Bruno Fruchard, Roland Bennewitz, and Robert Drumm. This project has received funding from the EU’s Horizon 2020 research and innovation programme, under the Marie Skłodowska-Curie grant agreement No 642841 (DISTRO).","oa":1,"quality_controlled":"1","publisher":"Association for Computing Machinery","citation":{"chicago":"Degraen, Donald, Michael Piovarci, Bernd Bickel, and Antonio Kruger. “Capturing Tactile Properties of Real Surfaces for Haptic Reproduction.” In 34th Annual ACM Symposium, 954–71. Association for Computing Machinery, 2021. https://doi.org/10.1145/3472749.3474798.","ista":"Degraen D, Piovarci M, Bickel B, Kruger A. 2021. Capturing tactile properties of real surfaces for haptic reproduction. 34th Annual ACM Symposium. UIST: User Interface Software and Technology, 954–971.","mla":"Degraen, Donald, et al. “Capturing Tactile Properties of Real Surfaces for Haptic Reproduction.” 34th Annual ACM Symposium, Association for Computing Machinery, 2021, pp. 954–71, doi:10.1145/3472749.3474798.","ama":"Degraen D, Piovarci M, Bickel B, Kruger A. Capturing tactile properties of real surfaces for haptic reproduction. 
In: 34th Annual ACM Symposium. Association for Computing Machinery; 2021:954-971. doi:10.1145/3472749.3474798","apa":"Degraen, D., Piovarci, M., Bickel, B., & Kruger, A. (2021). Capturing tactile properties of real surfaces for haptic reproduction. In 34th Annual ACM Symposium (pp. 954–971). Virtual: Association for Computing Machinery. https://doi.org/10.1145/3472749.3474798","short":"D. Degraen, M. Piovarci, B. Bickel, A. Kruger, in:, 34th Annual ACM Symposium, Association for Computing Machinery, 2021, pp. 954–971.","ieee":"D. Degraen, M. Piovarci, B. Bickel, and A. Kruger, “Capturing tactile properties of real surfaces for haptic reproduction,” in 34th Annual ACM Symposium, Virtual, 2021, pp. 954–971."},"user_id":"8b945eb4-e2f2-11eb-945a-df72226e66a9","article_processing_charge":"No","author":[{"full_name":"Degraen, Donald","last_name":"Degraen","first_name":"Donald"},{"last_name":"Piovarci","full_name":"Piovarci, Michael","first_name":"Michael","id":"62E473F4-5C99-11EA-A40E-AF823DDC885E"},{"first_name":"Bernd","id":"49876194-F248-11E8-B48F-1D18A9856A87","last_name":"Bickel","full_name":"Bickel, Bernd","orcid":"0000-0001-6511-9385"},{"full_name":"Kruger, Antonio","last_name":"Kruger","first_name":"Antonio"}],"title":"Capturing tactile properties of real surfaces for haptic reproduction","project":[{"grant_number":"642841","name":"Distributed 3D Object Design","_id":"2508E324-B435-11E9-9278-68D0E5697425","call_identifier":"H2020"}]},{"file_date_updated":"2021-03-22T08:15:28Z","department":[{"_id":"BeBi"}],"date_updated":"2023-08-07T14:11:57Z","ddc":["000"],"article_type":"original","type":"journal_article","tmp":{"legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","image":"/images/cc_by.png","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)","short":"CC BY 
(4.0)"},"status":"public","_id":"9241","volume":29,"issue":"5","license":"https://creativecommons.org/licenses/by/4.0/","ec_funded":1,"publication_identifier":{"eissn":["1094-4087"]},"publication_status":"published","file":[{"creator":"dernst","date_updated":"2021-03-22T08:15:28Z","file_size":10873700,"date_created":"2021-03-22T08:15:28Z","file_name":"2021_OpticsExpress_Elek.pdf","access_level":"open_access","relation":"main_file","content_type":"application/pdf","file_id":"9269","checksum":"a9697ad83136c19ad87e46aa2db63cfd","success":1}],"language":[{"iso":"eng"}],"scopus_import":"1","month":"03","intvolume":" 29","abstract":[{"text":"Volumetric light transport is a pervasive physical phenomenon, and therefore its accurate simulation is important for a broad array of disciplines. While suitable mathematical models for computing the transport are now available, obtaining the necessary material parameters needed to drive such simulations is a challenging task: direct measurements of these parameters from material samples are seldom possible. Building on the inverse scattering paradigm, we present a novel measurement approach which indirectly infers the transport parameters from extrinsic observations of multiple-scattered radiance. The novelty of the proposed approach lies in replacing structured illumination with a structured reflector bonded to the sample, and a robust fitting procedure that largely compensates for potential systematic errors in the calibration of the setup. We show the feasibility of our approach by validating simulations of complex 3D compositions of the measured materials against physical prints, using photo-polymer resins. As presented in this paper, our technique yields colorspace data suitable for accurate appearance reproduction in the area of 3D printing. 
Beyond that, and without fundamental changes to the basic measurement methodology, it could equally well be used to obtain spectral measurements that are useful for other application areas.","lang":"eng"}],"oa_version":"Published Version","author":[{"first_name":"Oskar","last_name":"Elek","full_name":"Elek, Oskar"},{"id":"4DDBCEB0-F248-11E8-B48F-1D18A9856A87","first_name":"Ran","last_name":"Zhang","orcid":"0000-0002-3808-281X","full_name":"Zhang, Ran"},{"full_name":"Sumin, Denis","last_name":"Sumin","first_name":"Denis"},{"full_name":"Myszkowski, Karol","last_name":"Myszkowski","first_name":"Karol"},{"last_name":"Bickel","full_name":"Bickel, Bernd","orcid":"0000-0001-6511-9385","first_name":"Bernd","id":"49876194-F248-11E8-B48F-1D18A9856A87"},{"last_name":"Wilkie","full_name":"Wilkie, Alexander","first_name":"Alexander"},{"full_name":"Křivánek, Jaroslav","last_name":"Křivánek","first_name":"Jaroslav"},{"full_name":"Weyrich, Tim","last_name":"Weyrich","first_name":"Tim"}],"external_id":{"isi":["000624968100103"]},"article_processing_charge":"No","title":"Robust and practical measurement of volume transport parameters in solid photo-polymer materials for 3D printing","citation":{"chicago":"Elek, Oskar, Ran Zhang, Denis Sumin, Karol Myszkowski, Bernd Bickel, Alexander Wilkie, Jaroslav Křivánek, and Tim Weyrich. “Robust and Practical Measurement of Volume Transport Parameters in Solid Photo-Polymer Materials for 3D Printing.” Optics Express. The Optical Society, 2021. https://doi.org/10.1364/OE.406095.","ista":"Elek O, Zhang R, Sumin D, Myszkowski K, Bickel B, Wilkie A, Křivánek J, Weyrich T. 2021. Robust and practical measurement of volume transport parameters in solid photo-polymer materials for 3D printing. Optics Express. 29(5), 7568–7588.","mla":"Elek, Oskar, et al. “Robust and Practical Measurement of Volume Transport Parameters in Solid Photo-Polymer Materials for 3D Printing.” Optics Express, vol. 29, no. 5, The Optical Society, 2021, pp. 
7568–88, doi:10.1364/OE.406095.","apa":"Elek, O., Zhang, R., Sumin, D., Myszkowski, K., Bickel, B., Wilkie, A., … Weyrich, T. (2021). Robust and practical measurement of volume transport parameters in solid photo-polymer materials for 3D printing. Optics Express. The Optical Society. https://doi.org/10.1364/OE.406095","ama":"Elek O, Zhang R, Sumin D, et al. Robust and practical measurement of volume transport parameters in solid photo-polymer materials for 3D printing. Optics Express. 2021;29(5):7568-7588. doi:10.1364/OE.406095","ieee":"O. Elek et al., “Robust and practical measurement of volume transport parameters in solid photo-polymer materials for 3D printing,” Optics Express, vol. 29, no. 5. The Optical Society, pp. 7568–7588, 2021.","short":"O. Elek, R. Zhang, D. Sumin, K. Myszkowski, B. Bickel, A. Wilkie, J. Křivánek, T. Weyrich, Optics Express 29 (2021) 7568–7588."},"user_id":"4359f0d1-fa6c-11eb-b949-802e58b17ae8","project":[{"call_identifier":"H2020","_id":"2508E324-B435-11E9-9278-68D0E5697425","name":"Distributed 3D Object Design","grant_number":"642841"},{"call_identifier":"H2020","_id":"24F9549A-B435-11E9-9278-68D0E5697425","name":"MATERIALIZABLE: Intelligent fabrication-oriented Computational Design and Modeling","grant_number":"715767"}],"page":"7568-7588","doi":"10.1364/OE.406095","date_published":"2021-03-01T00:00:00Z","date_created":"2021-03-14T23:01:33Z","has_accepted_license":"1","isi":1,"year":"2021","day":"01","publication":"Optics Express","publisher":"The Optical Society","quality_controlled":"1","oa":1,"acknowledgement":"H2020 Marie Skłodowska-Curie Actions (642841); European Research Council (715767); Grantová Agentura České Republiky (16-08111S, 16-18964S); Univerzita Karlova v Praze (SVV-2017-260452); Engineering and Physical Sciences Research Council (EP/K023578/1).\r\nWe are grateful to Stratasys Ltd. 
for access to the voxel-level print interface of the J750\r\nmachine."},{"file":[{"content_type":"application/pdf","access_level":"open_access","relation":"main_file","checksum":"8564b3118457d4c8939a8ef2b1a2f16c","file_id":"9377","date_updated":"2021-05-08T17:36:59Z","file_size":18926557,"creator":"bbickel","date_created":"2021-05-08T17:36:59Z","file_name":"Multistable-authorversion.pdf"},{"creator":"bbickel","file_size":76542901,"date_updated":"2021-05-08T17:38:22Z","file_name":"multistable-video.mp4","date_created":"2021-05-08T17:38:22Z","relation":"main_file","access_level":"open_access","content_type":"video/mp4","success":1,"checksum":"3b6e874e30bfa1bfc3ad3498710145a1","file_id":"9378"},{"date_created":"2021-12-17T08:13:51Z","title":"Supplementary Material for “Computational Design of Planar Multistable Compliant Structures”","file_name":"multistable-supplementary material.pdf","creator":"bbickel","date_updated":"2021-12-17T08:13:51Z","file_size":3367072,"file_id":"10562","checksum":"20dc3bc42e1a912a5b0247c116772098","access_level":"open_access","relation":"supplementary_material","content_type":"application/pdf","description":"This document provides additional results and analyzes the robustness and limitations of our approach."}],"language":[{"iso":"eng"}],"publication_identifier":{"eissn":["1557-7368"],"issn":["0730-0301"]},"publication_status":"published","issue":"5","volume":40,"ec_funded":1,"oa_version":"Published Version","acknowledged_ssus":[{"_id":"M-Shop"}],"abstract":[{"text":"This paper presents a method for designing planar multistable compliant structures. Given a sequence of desired stable states and the corresponding poses of the structure, we identify the topology and geometric realization of a mechanism—consisting of bars and joints—that is able to physically reproduce the desired multistable behavior. 
In order to solve this problem efficiently, we build on insights from minimally rigid graph theory to identify simple but effective topologies for the mechanism. We then optimize its geometric parameters, such as joint positions and bar lengths, to obtain correct transitions between the given poses. Simultaneously, we ensure adequate stability of each pose based on an effective approximate error metric related to the elastic energy Hessian of the bars in the mechanism. As demonstrated by our results, we obtain functional multistable mechanisms of manageable complexity that can be fabricated using 3D printing. Further, we evaluated the effectiveness of our method on a large number of examples in the simulation and fabricated several physical prototypes.","lang":"eng"}],"month":"10","intvolume":" 40","ddc":["000"],"date_updated":"2023-08-08T13:31:38Z","file_date_updated":"2021-12-17T08:13:51Z","department":[{"_id":"BeBi"}],"_id":"9376","status":"public","keyword":["multistability","mechanism","computational design","rigidity"],"type":"journal_article","article_type":"original","tmp":{"legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","image":"/images/cc_by.png","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)","short":"CC BY (4.0)"},"day":"08","publication":"ACM Transactions on Graphics","isi":1,"has_accepted_license":"1","year":"2021","date_published":"2021-10-08T00:00:00Z","doi":"10.1145/3453477","date_created":"2021-05-08T17:37:08Z","acknowledgement":"We would like to thank everyone who contributed to this paper, the authors of artworks for all the examples, including @macrovec-tor_official and Wikimedia for the FLAG semaphore, and @pikisuper-star for the FIGURINE. 
The photos of iconic poses in the teaser were supplied by (from left to right): Mike Hewitt/Olympics Day 8 - Athletics/Getty Images, Oneinchpunch/Basketball player training on a court in New York City/Shutterstock, and Andrew Redington/Tiger Woods/Getty Images. We also want to express our gratitude to Christian Hafner for insightful discussions, the IST Austria machine shop SSU, all proof-readers, and anonymous reviewers. This project has received funding from the European Union’s Horizon 2020 research and innovation programme, under the Marie Skłodowska-Curie grant agreement No 642841 (DISTRO), and under the European Research Council grant agreement No 715767 (MATERIALIZABLE).","publisher":"Association for Computing Machinery","quality_controlled":"1","oa":1,"user_id":"4359f0d1-fa6c-11eb-b949-802e58b17ae8","citation":{"ieee":"R. Zhang, T. Auzinger, and B. Bickel, “Computational design of planar multistable compliant structures,” ACM Transactions on Graphics, vol. 40, no. 5. Association for Computing Machinery, 2021.","short":"R. Zhang, T. Auzinger, B. Bickel, ACM Transactions on Graphics 40 (2021).","ama":"Zhang R, Auzinger T, Bickel B. Computational design of planar multistable compliant structures. ACM Transactions on Graphics. 2021;40(5). doi:10.1145/3453477","apa":"Zhang, R., Auzinger, T., & Bickel, B. (2021). Computational design of planar multistable compliant structures. ACM Transactions on Graphics. Association for Computing Machinery. https://doi.org/10.1145/3453477","mla":"Zhang, Ran, et al. “Computational Design of Planar Multistable Compliant Structures.” ACM Transactions on Graphics, vol. 40, no. 5, 186, Association for Computing Machinery, 2021, doi:10.1145/3453477.","ista":"Zhang R, Auzinger T, Bickel B. 2021. Computational design of planar multistable compliant structures. ACM Transactions on Graphics. 40(5), 186.","chicago":"Zhang, Ran, Thomas Auzinger, and Bernd Bickel. 
“Computational Design of Planar Multistable Compliant Structures.” ACM Transactions on Graphics. Association for Computing Machinery, 2021. https://doi.org/10.1145/3453477."},"title":"Computational design of planar multistable compliant structures","author":[{"id":"4DDBCEB0-F248-11E8-B48F-1D18A9856A87","first_name":"Ran","last_name":"Zhang","orcid":"0000-0002-3808-281X","full_name":"Zhang, Ran"},{"orcid":"0000-0002-1546-3265","full_name":"Auzinger, Thomas","last_name":"Auzinger","first_name":"Thomas","id":"4718F954-F248-11E8-B48F-1D18A9856A87"},{"id":"49876194-F248-11E8-B48F-1D18A9856A87","first_name":"Bernd","last_name":"Bickel","orcid":"0000-0001-6511-9385","full_name":"Bickel, Bernd"}],"external_id":{"isi":["000752079300003"]},"article_processing_charge":"No","article_number":"186","project":[{"name":"Distributed 3D Object Design","grant_number":"642841","_id":"2508E324-B435-11E9-9278-68D0E5697425","call_identifier":"H2020"},{"call_identifier":"H2020","_id":"24F9549A-B435-11E9-9278-68D0E5697425","grant_number":"715767","name":"MATERIALIZABLE: Intelligent fabrication-oriented Computational Design and Modeling"}]},{"acknowledgement":"The authors would like to thank anonymous reviewers for their constructive comments. Weiwei Xu is partially supported by Zhejiang Lab. Yin Yang is partially supported by NSF under Grant Nos. CHS 1845024 and 1717972. Weiwei Xu and Hujun Bao are supported by Fundamental Research Funds for the Central Universities. 
This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant agreement No 715767).","oa":1,"publisher":"IEEE","quality_controlled":"1","publication":"IEEE Transactions on Visualization and Computer Graphics","day":"01","year":"2021","isi":1,"has_accepted_license":"1","date_created":"2021-05-23T22:01:42Z","doi":"10.1109/TVCG.2019.2957218","date_published":"2021-06-01T00:00:00Z","article_number":"2881-2895","project":[{"_id":"24F9549A-B435-11E9-9278-68D0E5697425","call_identifier":"H2020","grant_number":"715767","name":"MATERIALIZABLE: Intelligent fabrication-oriented Computational Design and Modeling"}],"user_id":"4359f0d1-fa6c-11eb-b949-802e58b17ae8","citation":{"chicago":"Feng, Xudong, Jiafeng Liu, Huamin Wang, Yin Yang, Hujun Bao, Bernd Bickel, and Weiwei Xu. “Computational Design of Skinned Quad-Robots.” IEEE Transactions on Visualization and Computer Graphics. IEEE, 2021. https://doi.org/10.1109/TVCG.2019.2957218.","ista":"Feng X, Liu J, Wang H, Yang Y, Bao H, Bickel B, Xu W. 2021. Computational design of skinned Quad-Robots. IEEE Transactions on Visualization and Computer Graphics. 27(6), 2881–2895.","mla":"Feng, Xudong, et al. “Computational Design of Skinned Quad-Robots.” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 6, 2881–2895, IEEE, 2021, doi:10.1109/TVCG.2019.2957218.","ama":"Feng X, Liu J, Wang H, et al. Computational design of skinned Quad-Robots. IEEE Transactions on Visualization and Computer Graphics. 2021;27(6). doi:10.1109/TVCG.2019.2957218","apa":"Feng, X., Liu, J., Wang, H., Yang, Y., Bao, H., Bickel, B., & Xu, W. (2021). Computational design of skinned Quad-Robots. IEEE Transactions on Visualization and Computer Graphics. IEEE. https://doi.org/10.1109/TVCG.2019.2957218","ieee":"X. Feng et al., “Computational design of skinned Quad-Robots,” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 6. 
IEEE, 2021.","short":"X. Feng, J. Liu, H. Wang, Y. Yang, H. Bao, B. Bickel, W. Xu, IEEE Transactions on Visualization and Computer Graphics 27 (2021)."},"title":"Computational design of skinned Quad-Robots","article_processing_charge":"No","external_id":{"isi":["000649620700009"],"pmid":["31804937"]},"author":[{"last_name":"Feng","full_name":"Feng, Xudong","first_name":"Xudong"},{"first_name":"Jiafeng","last_name":"Liu","full_name":"Liu, Jiafeng"},{"last_name":"Wang","full_name":"Wang, Huamin","first_name":"Huamin"},{"first_name":"Yin","last_name":"Yang","full_name":"Yang, Yin"},{"first_name":"Hujun","full_name":"Bao, Hujun","last_name":"Bao"},{"last_name":"Bickel","orcid":"0000-0001-6511-9385","full_name":"Bickel, Bernd","id":"49876194-F248-11E8-B48F-1D18A9856A87","first_name":"Bernd"},{"first_name":"Weiwei","last_name":"Xu","full_name":"Xu, Weiwei"}],"pmid":1,"oa_version":"Published Version","abstract":[{"text":"We present a computational design system that assists users to model, optimize, and fabricate quad-robots with soft skins. Our system addresses the challenging task of predicting their physical behavior by fully integrating the multibody dynamics of the mechanical skeleton and the elastic behavior of the soft skin. The developed motion control strategy uses an alternating optimization scheme to avoid expensive full space time-optimization, interleaving space-time optimization for the skeleton, and frame-by-frame optimization for the full dynamics. The output are motor torques to drive the robot to achieve a user prescribed motion trajectory. We also provide a collection of convenient engineering tools and empirical manufacturing guidance to support the fabrication of the designed quad-robot. 
We validate the feasibility of designs generated with our system through physics simulations and with a physically-fabricated prototype.","lang":"eng"}],"intvolume":" 27","month":"06","scopus_import":"1","language":[{"iso":"eng"}],"file":[{"creator":"kschuh","date_updated":"2021-05-25T15:08:49Z","file_size":6183002,"date_created":"2021-05-25T15:08:49Z","file_name":"2021_TVCG_Feng.pdf","access_level":"open_access","relation":"main_file","content_type":"application/pdf","file_id":"9427","checksum":"a78e6ac94e33ade4ffaea66943d5f7dc","success":1}],"publication_status":"published","publication_identifier":{"eissn":["10772626"],"issn":["19410506"]},"ec_funded":1,"volume":27,"issue":"6","_id":"9408","status":"public","tmp":{"legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","image":"/images/cc_by.png","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)","short":"CC BY (4.0)"},"type":"journal_article","ddc":["000"],"date_updated":"2023-08-08T13:45:46Z","file_date_updated":"2021-05-25T15:08:49Z","department":[{"_id":"BeBi"}]},{"user_id":"4359f0d1-fa6c-11eb-b949-802e58b17ae8","citation":{"apa":"Mallikarjun, B. R., Tewari, A., Dib, A., Weyrich, T., Bickel, B., Seidel, H. P., … Theobalt, C. (2021). PhotoApp: Photorealistic appearance editing of head portraits. ACM Transactions on Graphics. Association for Computing Machinery. https://doi.org/10.1145/3450626.3459765","ama":"Mallikarjun BR, Tewari A, Dib A, et al. PhotoApp: Photorealistic appearance editing of head portraits. ACM Transactions on Graphics. 2021;40(4). doi:10.1145/3450626.3459765","short":"B.R. Mallikarjun, A. Tewari, A. Dib, T. Weyrich, B. Bickel, H.P. Seidel, H. Pfister, W. Matusik, L. Chevallier, M.A. Elgharib, C. Theobalt, ACM Transactions on Graphics 40 (2021).","ieee":"B. R. Mallikarjun et al., “PhotoApp: Photorealistic appearance editing of head portraits,” ACM Transactions on Graphics, vol. 40, no. 4. 
Association for Computing Machinery, 2021.","mla":"Mallikarjun, B. R., et al. “PhotoApp: Photorealistic Appearance Editing of Head Portraits.” ACM Transactions on Graphics, vol. 40, no. 4, 44, Association for Computing Machinery, 2021, doi:10.1145/3450626.3459765.","ista":"Mallikarjun BR, Tewari A, Dib A, Weyrich T, Bickel B, Seidel HP, Pfister H, Matusik W, Chevallier L, Elgharib MA, Theobalt C. 2021. PhotoApp: Photorealistic appearance editing of head portraits. ACM Transactions on Graphics. 40(4), 44.","chicago":"Mallikarjun, B. R., Ayush Tewari, Abdallah Dib, Tim Weyrich, Bernd Bickel, Hans Peter Seidel, Hanspeter Pfister, et al. “PhotoApp: Photorealistic Appearance Editing of Head Portraits.” ACM Transactions on Graphics. Association for Computing Machinery, 2021. https://doi.org/10.1145/3450626.3459765."},"title":"PhotoApp: Photorealistic appearance editing of head portraits","author":[{"last_name":"Mallikarjun","full_name":"Mallikarjun, B. R.","first_name":"B. R."},{"full_name":"Tewari, Ayush","last_name":"Tewari","first_name":"Ayush"},{"last_name":"Dib","full_name":"Dib, Abdallah","first_name":"Abdallah"},{"first_name":"Tim","full_name":"Weyrich, Tim","last_name":"Weyrich"},{"id":"49876194-F248-11E8-B48F-1D18A9856A87","first_name":"Bernd","last_name":"Bickel","full_name":"Bickel, Bernd","orcid":"0000-0001-6511-9385"},{"full_name":"Seidel, Hans Peter","last_name":"Seidel","first_name":"Hans Peter"},{"first_name":"Hanspeter","last_name":"Pfister","full_name":"Pfister, Hanspeter"},{"first_name":"Wojciech","full_name":"Matusik, Wojciech","last_name":"Matusik"},{"last_name":"Chevallier","full_name":"Chevallier, Louis","first_name":"Louis"},{"last_name":"Elgharib","full_name":"Elgharib, Mohamed A.","first_name":"Mohamed A."},{"first_name":"Christian","full_name":"Theobalt, Christian","last_name":"Theobalt"}],"external_id":{"arxiv":["2103.07658"],"isi":["000674930900011"]},"article_processing_charge":"Yes (in subscription 
journal)","article_number":"44","day":"01","publication":"ACM Transactions on Graphics","has_accepted_license":"1","isi":1,"year":"2021","doi":"10.1145/3450626.3459765","date_published":"2021-08-01T00:00:00Z","date_created":"2021-08-08T22:01:27Z","acknowledgement":"This work was supported by the ERC Consolidator Grant 4DReply (770784). We also acknowledge support from Technicolor and InterDigital. We thank Tiancheng Sun for kindly helping us with the comparisons with Sun et al. [2019].","publisher":"Association for Computing Machinery","quality_controlled":"1","oa":1,"ddc":["000"],"date_updated":"2023-08-10T14:25:08Z","department":[{"_id":"BeBi"}],"file_date_updated":"2021-08-09T11:41:50Z","_id":"9819","status":"public","article_type":"original","type":"journal_article","tmp":{"legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","image":"/images/cc_by.png","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)","short":"CC BY (4.0)"},"file":[{"creator":"asandaue","date_updated":"2021-08-09T11:41:50Z","file_size":49840741,"date_created":"2021-08-09T11:41:50Z","file_name":"2021_ACMTransactionsOnGraphics_Mallikarjun.pdf","access_level":"open_access","relation":"main_file","content_type":"application/pdf","checksum":"51b61b7e5c175e2d7ed8fa3b35f7525a","file_id":"9834","success":1}],"language":[{"iso":"eng"}],"publication_identifier":{"eissn":["15577368"],"issn":["07300301"]},"publication_status":"published","volume":40,"issue":"4","oa_version":"Published Version","abstract":[{"text":"Photorealistic editing of head portraits is a challenging task as humans are very sensitive to inconsistencies in faces. We present an approach for high-quality intuitive editing of the camera viewpoint and scene illumination (parameterised with an environment map) in a portrait image. This requires our method to capture and control the full reflectance field of the person in the image. 
Most editing approaches rely on supervised learning using training data captured with setups such as light and camera stages. Such datasets are expensive to acquire, not readily available and do not capture all the rich variations of in-the-wild portrait images. In addition, most supervised approaches only focus on relighting, and do not allow camera viewpoint editing. Thus, they only capture and control a subset of the reflectance field. Recently, portrait editing has been demonstrated by operating in the generative model space of StyleGAN. While such approaches do not require direct supervision, there is a significant loss of quality when compared to the supervised approaches. In this paper, we present a method which learns from limited supervised training data. The training images only include people in a fixed neutral expression with eyes closed, without much hair or background variations. Each person is captured under 150 one-light-at-a-time conditions and under 8 camera poses. Instead of training directly in the image space, we design a supervised problem which learns transformations in the latent space of StyleGAN. This combines the best of supervised learning and generative adversarial modeling. We show that the StyleGAN prior allows for generalisation to different expressions, hairstyles and backgrounds. This produces high-quality photorealistic results for in-the-wild images and significantly outperforms existing methods. Our approach can edit the illumination and pose simultaneously, and runs at interactive rates.","lang":"eng"}],"month":"08","intvolume":" 40","scopus_import":"1"}]