Usable security

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 0066cc (talk | contribs) at 19:17, 1 January 2024 (Submitting using AfC-submit-wizard). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.



Usable security is a subfield of computer science, human-computer interaction, and cybersecurity concerned with the user interface design of cybersecurity systems.[1] In particular, usable security focuses on ensuring that the security implications of interacting with computer systems, such as via alert dialog boxes, are accessible and understandable to human users. This differs from the software engineering method of secure by design by placing greater focus on the human rather than the technical aspects of cybersecurity. Usable security also stands in opposition to the idea of security through obscurity, instead working to ensure that users are aware of the security implications of their decisions.[2][3]

History

Usable security was first established by computer scientists Jerry Saltzer and Michael Schroeder in their 1975 work The Protection of Information in Computer Systems,[4] now colloquially referred to as Saltzer and Schroeder's design principles. The principles draw attention to 'psychological acceptability', stating that the design of an interface should match the user's mental model of the system. The authors note that security errors are likely to occur when the user's mental model and the underlying system operation do not match.

Despite Saltzer and Schroeder's work, the widely held view was, and continued to be, that security and usability were inherently in conflict: either security through obscurity was the preferable approach, or user discomfort and confusion were simply the price of good security.[5] One such example is that of user login systems. When the user enters incorrect login details, the system must reply that the username and/or password is incorrect without clarifying which contains the incorrect value. Stating which of the inputs is incorrect (either the username or the password) could be used by an attacker to determine valid users on a system, who could then be targeted by password-guessing attacks or similar exploitation.[6] While this may cause some annoyance to the user, the approach does offer a heightened level of security.
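The generic-error practice described above can be sketched in a few lines. This is a minimal illustration, not code from any cited source: the user store, hash scheme, and function names are hypothetical, chosen only to show that the failure message never reveals which field was wrong.

```python
import hashlib
import hmac

# Hypothetical credential store mapping usernames to SHA-256 password hashes
# (illustration only; real systems would use a salted, slow password hash).
USERS = {
    "alice": hashlib.sha256(b"correct horse").hexdigest(),
}

# Hash compared against when the username is unknown, so the same code path
# runs whichever field is wrong.
_DUMMY_HASH = hashlib.sha256(b"<dummy>").hexdigest()

GENERIC_ERROR = "Incorrect username or password."


def login(username: str, password: str) -> str:
    stored = USERS.get(username, _DUMMY_HASH)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    # hmac.compare_digest avoids leaking, via comparison timing, how much of
    # the hash matched; the extra membership check rejects the dummy entry.
    if hmac.compare_digest(stored, supplied) and username in USERS:
        return f"Welcome, {username}"
    return GENERIC_ERROR
```

A wrong password for a real user and any password for a nonexistent user produce the identical message, which is exactly the usability cost the passage describes: the legitimate user is not told which field to fix.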

It was not until 1996, with the publication of "User-Centered Security"[7] by Mary Ellen Zurko and Richard T. Simon, that what is now called usable security became a distinct field of research and design. This shift largely stems from placing greater focus on usability testing and on ensuring that security aspects are understandable during the design and development process, rather than being added as an afterthought.

Scientific conferences

Research on usable security is widely accepted at many HCI and cybersecurity conferences; however, dedicated venues for such work include:

  • EuroUSEC: European Symposium on Usable Security[8]
  • HAS: International Conference on Human Aspects of Information Security, Privacy, and Trust[9]
  • IFIP World Conference on Information Security Education[10]
  • STAST: International Workshop on Socio-Technical Aspects in Security[11]
  • TrustBus: International Conference on Trust and Privacy in Digital Business[12]
  • USEC: Usable Security and Privacy Symposium[13]

References

  1. ^ Garfinkel, Simson; Lipford, Heather Richter (2014), "Introduction", Usable Security, Cham: Springer International Publishing, pp. 1–11, doi:10.1007/978-3-031-02343-9_1, ISBN 978-3-031-01215-0, retrieved 2022-12-01
  2. ^ Renaud, Karen; Volkamer, Melanie; Renkema-Padmos, Arne (2014), De Cristofaro, Emiliano; Murdoch, Steven J. (eds.), "Why Doesn't Jane Protect Her Privacy?", Privacy Enhancing Technologies, vol. 8555, Cham: Springer International Publishing, pp. 244–262, doi:10.1007/978-3-319-08506-7_13, ISBN 978-3-319-08505-0, S2CID 9509269, retrieved 2022-12-01
  3. ^ Yee, Ka-Ping (2004). "Aligning security and usability". IEEE Security & Privacy. 2 (5): 48–55. doi:10.1109/MSP.2004.64. ISSN 1558-4046.
  4. ^ "A Contemporary Look at Saltzer and Schroeder's 1975 Design Principles". IEEE Security & Privacy. doi:10.1109/msp.2012.85. Retrieved 2023-12-28.
  5. ^ Garfinkel, Simson; Lipford, Heather Richter (2014), "A Brief History of Usable Privacy and Security Research", Usable Security, Cham: Springer International Publishing, pp. 13–21, doi:10.1007/978-3-031-02343-9_2, ISBN 978-3-031-01215-0, retrieved 2023-12-28
  6. ^ Nielsen, Jakob (1993). Usability Engineering. Boston: Academic Press. ISBN 978-0-12-518405-2.
  7. ^ Zurko, Mary Ellen; Simon, Richard T. (1996). "User-centered security". New Security Paradigms. ACM Press: 27–33. doi:10.1145/304851.304859. ISBN 978-0-89791-944-9.
  8. ^ "EUROUSEC Conference - Home". ACM Digital Library. Retrieved 2023-12-28.
  9. ^ "International Conference on Human Aspects of Information Security, Privacy, and Trust". link.springer.com. Retrieved 2023-12-28.
  10. ^ "IFIP World Conference on Information Security Education". link.springer.com. Retrieved 2023-12-28.
  11. ^ "International Workshop on Socio-Technical Aspects in Security". link.springer.com. Retrieved 2023-12-28.
  12. ^ "International Conference on Trust and Privacy in Digital Business". link.springer.com. Retrieved 2023-12-28.
  13. ^ "SOUPS Symposia | USENIX". www.usenix.org. Retrieved 2023-12-28.