Usable security
Usable security is a subfield of computer science, human-computer interaction, and cybersecurity concerned with the user interface design of cybersecurity systems.[1] In particular, usable security focuses on ensuring that the security implications of interacting with computer systems, such as via alert dialog boxes, are accessible and understandable to human users. It differs from the software engineering practice of secure by design in that it emphasizes the human aspects of cybersecurity rather than the technical ones. Usable security also stands in contrast to security through obscurity, in that it works to ensure that users are aware of the security implications of their decisions.[2][3]
History
The foundations of usable security were laid by the computer scientists Jerry Saltzer and Michael Schroeder in their 1975 work The Protection of Information in Computer Systems,[4] whose recommendations are now colloquially referred to as Saltzer and Schroeder's design principles. The principles draw attention to 'psychological acceptability', stating that the design of an interface should match the user's mental model of the system. The authors note that security errors are likely to occur when the user's mental model and the underlying system operation do not match.
Despite Saltzer and Schroeder's work, the widely held view was, and long continued to be, that security and usability are inherently in conflict: either security through obscurity was the preferable approach, or user discomfort and confusion were simply the price of good security.[5] One example is that of user login systems. When a user enters incorrect login details, the system should reply only that the username or password is incorrect, without clarifying which of the two is wrong. If the system stated which input was incorrect, an attacker could use the error messages to determine which usernames are valid on the system and then target those accounts with password-guessing attacks or similar exploitation.[6] While this may cause some annoyance to the user, the approach does offer a heightened level of security.
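As a minimal illustration (not drawn from the cited sources), the following Python sketch shows one way a login check might return the same generic message whether the username or the password is wrong; the USERS store, hashing scheme, and message text are hypothetical placeholders.

```python
# Minimal sketch of a login check that avoids leaking which field was wrong.
# The USERS store, salting scheme, and messages are hypothetical placeholders.
import hashlib
import hmac


def _hash(salt: str, password: str) -> str:
    return hashlib.sha256((salt + password).encode()).hexdigest()


# Hypothetical in-memory store mapping usernames to (salt, password hash).
USERS = {"alice": ("salt123", _hash("salt123", "correct horse"))}


def login(username: str, password: str) -> str:
    record = USERS.get(username)
    if record is not None:
        salt, stored_hash = record
        if hmac.compare_digest(_hash(salt, password), stored_hash):
            return "Login successful."
    # Unknown username and wrong password produce the same reply, so the
    # error message cannot be used to enumerate valid accounts.
    return "Incorrect username or password."


print(login("alice", "wrong guess"))    # Incorrect username or password.
print(login("mallory", "anything"))     # Incorrect username or password.
print(login("alice", "correct horse"))  # Login successful.
```

A production system would additionally use a purpose-built password hash (such as bcrypt or Argon2) and take care that unknown usernames do not respond noticeably faster, but the usability trade-off is the same: the user sees one deliberately vague message.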
It was not until 1996, with the publication of "User-Centered Security"[7] by Mary Ellen Zurko and Richard T. Simon, that what is now called usable security became a distinct field of research and design. The shift largely stemmed from placing greater focus on usability testing and on making security aspects understandable during the design and development process, rather than adding them as an afterthought.
Scientific conferences
While research on usable security is accepted at many HCI and computer security conferences, dedicated venues for such work include:
- EuroUSEC: European Symposium on Usable Security[8]
- HAS: International Conference on Human Aspects of Information Security, Privacy, and Trust[9]
- IFIP World Conference on Information Security Education[10]
- STAST: International Workshop on Socio-Technical Aspects in Security[11]
- TrustBus: International Conference on Trust and Privacy in Digital Business[12]
- USEC: Usable Security and Privacy Symposium[13]
See also
- Information design
- Information architecture
- Secure by default
- Secure by design
- Software Security Assurance
- User-centered design
- User experience design
References
- ^ Garfinkel, Simson; Lipford, Heather Richter (2014), "Introduction", Usable Security, Cham: Springer International Publishing, pp. 1–11, doi:10.1007/978-3-031-02343-9_1, ISBN 978-3-031-01215-0, retrieved 2022-12-01
- ^ Renaud, Karen; Volkamer, Melanie; Renkema-Padmos, Arne (2014), De Cristofaro, Emiliano; Murdoch, Steven J. (eds.), "Why Doesn't Jane Protect Her Privacy?", Privacy Enhancing Technologies, vol. 8555, Cham: Springer International Publishing, pp. 244–262, doi:10.1007/978-3-319-08506-7_13, ISBN 978-3-319-08505-0, S2CID 9509269, retrieved 2022-12-01
- ^ Yee, Ka-Ping (2004). "Aligning security and usability". IEEE Security & Privacy. 2 (5): 48–55. doi:10.1109/MSP.2004.64. ISSN 1558-4046. S2CID 206485281.
- ^ Smith, Richard (2012). "A Contemporary Look at Saltzer and Schroeder's 1975 Design Principles". IEEE Security & Privacy Magazine: 1. doi:10.1109/msp.2012.85. S2CID 13371996. Retrieved 2023-12-28.
- ^ Garfinkel, Simson; Lipford, Heather Richter (2014), "A Brief History of Usable Privacy and Security Research", Usable Security, Cham: Springer International Publishing, pp. 13–21, doi:10.1007/978-3-031-02343-9_2, ISBN 978-3-031-01215-0, retrieved 2023-12-28
- ^ Nielsen, Jakob (1993). Usability engineering. Boston San Diego New York [etc.]: Academic press. ISBN 978-0-12-518405-2.
- ^ Zurko, Mary Ellen; Simon, Richard T. (1996). "User-centered security". Proceedings of the 1996 workshop on New security paradigms - NSPW '96. ACM Press. pp. 27–33. doi:10.1145/304851.304859. ISBN 978-0-89791-944-9.
- ^ "EUROUSEC Conference - Home". ACM Digital Library. Retrieved 2023-12-28.
- ^ "International Conference on Human Aspects of Information Security, Privacy, and Trust". link.springer.com. Retrieved 2023-12-28.
- ^ "IFIP World Conference on Information Security Education". link.springer.com. Retrieved 2023-12-28.
- ^ "International Workshop on Socio-Technical Aspects in Security". link.springer.com. Retrieved 2023-12-28.
- ^ "International Conference on Trust and Privacy in Digital Business". link.springer.com. Retrieved 2023-12-28.
- ^ "SOUPS Symposia | USENIX". www.usenix.org. Retrieved 2023-12-28.