User Acceptance Criteria for Privacy Preserving Machine Learning Techniques
Löbner, S.; Pape, S. and Bracamonte, V.
In Proceedings of the 18th International Conference on Availability, Reliability and Security, ARES 2023, Benevento, Italy, 29 August 2023 - 1 September 2023, pages 149:1-149:8, ACM, 2023, 20th International Workshop on Trust, Privacy and Security in the Digital Society.

Abstract
Users are confronted with a variety of different machine learning applications in many domains. To make this possible, especially for applications relying on private data, companies and developers are implementing Privacy Preserving Machine Learning (PPML) techniques, which is already a challenge in itself. This study provides the first step towards answering the question of how to include the user's preferences for a PPML technique in the privacy by design process when developing a new application. The goal is to support developers and AI service providers when choosing a PPML technique that best reflects the users' preferences. Based on discussions with privacy and PPML experts, we derived a framework that maps the characteristics of PPML techniques to user acceptance criteria to better understand the key attributes.

BibTeX
@InProceedings{LPB23trustbus,
  author    = {Sascha L\"obner and Sebastian Pape and Vanessa Bracamonte},
  title     = {User Acceptance Criteria for Privacy Preserving Machine Learning Techniques},
  booktitle = {Proceedings of the 18th International Conference on Availability, Reliability and Security, {ARES} 2023, Benevento, Italy, 29 August 2023 - 1 September 2023},
  year      = {2023},
  pages     = {149:1--149:8},
  month     = {08},
  publisher = {{ACM}},
  doi       = {10.1145/3600160.3605004},
  keywords  = {machine learning, ai, privacy, PETs, CS4E},
  review    = {Users are confronted with a variety of different machine learning applications in many domains. To make this possible, especially for applications relying on private data, companies and developers are implementing Privacy Preserving Machine Learning (PPML) techniques, which is already a challenge in itself. This study provides the first step towards answering the question of how to include the user's preferences for a PPML technique in the privacy by design process when developing a new application. The goal is to support developers and AI service providers when choosing a PPML technique that best reflects the users' preferences. Based on discussions with privacy and PPML experts, we derived a framework that maps the characteristics of PPML techniques to user acceptance criteria to better understand the key attributes.},
  url       = {https://dl.acm.org/doi/abs/10.1145/3600160.3605004},
}