A Systematic Analysis of User Evaluations in Security Research

Hamm, P.; Harborth, D. and Pape, S.

In Proceedings of the 14th International Conference on Availability, Reliability and Security, ARES 2019, Canterbury, UK, August 26-29, 2019, ACM, 2019.

Abstract

We conducted a literature survey on the reproducibility and replicability of user surveys in security research. For this purpose, we examined all papers published over the last five years at three leading security research conferences and recorded the type of study, whether the authors made the underlying responses available as open data, and whether they published the questionnaire or interview guide they used. We found that user surveys are becoming more widespread in security research and that authors and conferences are increasingly publishing their methodologies, but we found no examples of data being made available. Based on these findings, we recommend that future researchers publish their data in addition to their results to facilitate replication and to ensure a firm basis for user studies in security research.



Bibtex

@InProceedings{HHP19iwsmr,
  author    = {Peter Hamm and David Harborth and Sebastian Pape},
  title     = {A Systematic Analysis of User Evaluations in Security Research},
  booktitle = {Proceedings of the 14th International Conference on Availability, Reliability and Security, {ARES} 2019, Canterbury, UK, August 26-29, 2019},
  year      = {2019},
  month     = {08},
  publisher = {{ACM}},
  doi       = {10.1145/3339252.3340339},
  keywords  = {security, CS4E, methodology},
  url       = {https://doi.org/10.1145/3339252.3340339},
}