Hardware security

Hardware security as a discipline originated out of cryptographic engineering and involves hardware design, access control, secure multi-party computation, secure key storage, ensuring code authenticity, and measures to ensure that the supply chain that built the product is secure, among other things.[1][2][3][4]

A hardware security module (HSM) is a physical computing device that safeguards and manages digital keys for strong authentication and provides cryptoprocessing. These modules traditionally come in the form of a plug-in card or an external device that attaches directly to a computer or network server.
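
As an illustration, the sketch below drives an HSM-style token through the standard PKCS#11 interface using the third-party python-pkcs11 package, with SoftHSM as a software stand-in; the module path, token label, and PIN are placeholder assumptions rather than values from any particular product. The point of the example is that the key is generated inside the token and used there, without the key material ever being exported to the host.

    import pkcs11

    # Load the vendor's PKCS#11 module; the SoftHSM path below is a
    # placeholder assumption for demonstration purposes.
    lib = pkcs11.lib('/usr/lib/softhsm/libsofthsm2.so')
    token = lib.get_token(token_label='demo-token')

    # Open an authenticated session (the PIN is a placeholder).
    with token.open(user_pin='1234') as session:
        # Generate a 256-bit AES key inside the token; the key material
        # never leaves the device.
        key = session.generate_key(pkcs11.KeyType.AES, 256, label='demo-key')

        # Encrypt on the device (AES-CBC with PKCS padding by default).
        iv = session.generate_random(128)
        ciphertext = key.encrypt(b'sensitive data', mechanism_param=iv)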

Some providers in this discipline consider that the key difference between hardware security and software security is that hardware security is implemented using "non-Turing-machine" logic (raw combinatorial logic or simple state machines). One approach, referred to as "hardsec", uses FPGAs to implement non-Turing-machine security controls as a way of combining the security of hardware with the flexibility of software.[5]
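
A minimal sketch of what "non-Turing-machine" security logic amounts to in principle: a fixed finite-state machine that accepts exactly one rigid message framing and rejects everything else. In a hardsec deployment such a table would be synthesised into FPGA gates rather than executed as software; the framing bytes and payload limit here are illustrative assumptions, and Python is used only for readability.

    # Fixed FSM that accepts frames of the form START, up to 4 payload
    # bytes, END, and nothing else. The framing constants are assumptions.
    START, END = 0x02, 0x03
    MAX_PAYLOAD = 4

    def accepts(frame: bytes) -> bool:
        state = 'IDLE'
        count = 0
        for byte in frame:
            if state == 'IDLE':
                state = 'BODY' if byte == START else 'REJECT'
            elif state == 'BODY':
                if byte == END:
                    state = 'DONE'
                elif count < MAX_PAYLOAD:
                    count += 1
                else:
                    state = 'REJECT'
            else:  # DONE or REJECT: any further input is illegal
                state = 'REJECT'
        return state == 'DONE'

    print(accepts(bytes([0x02, 0x41, 0x42, 0x03])))  # True
    print(accepts(bytes([0x41, 0x42])))              # False

Because the machine has a fixed, enumerable set of states and no general-purpose memory, its behaviour can be exhaustively verified, which is the property such approaches rely on.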

Hardware backdoors are backdoors in hardware. Conceptually related, a hardware Trojan (HT) is a malicious modification of an electronic system, particularly in the context of an integrated circuit.[1][3]

A physical unclonable function (PUF)[6][7] is a physical entity that is embodied in a physical structure and is easy to evaluate but hard to predict. Further, an individual PUF device must be easy to make but practically impossible to duplicate, even given the exact manufacturing process that produced it. In this respect it is the hardware analog of a one-way function. The name "physical unclonable function" can be misleading, as some PUFs are clonable, and most PUFs are noisy and therefore do not strictly satisfy the requirements for a function. Today, PUFs are usually implemented in integrated circuits and are typically used in applications with high security requirements.
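
The sketch below is a toy software model of PUF behaviour, not a real construction: the device-unique response is modelled as a hash of a hypothetical per-device secret, each readout flips bits with a small probability to mimic noise, and per-bit majority voting stabilises the result (real designs use fuzzy extractors and error-correcting codes). All parameters are illustrative assumptions.

    import hashlib
    import random

    RESPONSE_BITS = 64
    NOISE = 0.05  # assumed per-bit flip probability on each readout

    def ideal_response(device_secret: bytes, challenge: bytes) -> int:
        # Device-unique but unpredictable challenge-response mapping,
        # modelled here as a hash; a real PUF derives this from physical
        # manufacturing variation rather than a stored secret.
        digest = hashlib.sha256(device_secret + challenge).digest()
        return int.from_bytes(digest[:RESPONSE_BITS // 8], 'big')

    def noisy_readout(device_secret: bytes, challenge: bytes) -> int:
        # Each physical evaluation is slightly noisy.
        response = ideal_response(device_secret, challenge)
        for bit in range(RESPONSE_BITS):
            if random.random() < NOISE:
                response ^= 1 << bit
        return response

    def stabilised_response(device_secret: bytes, challenge: bytes,
                            votes: int = 9) -> int:
        # Majority vote per bit over repeated readouts.
        readouts = [noisy_readout(device_secret, challenge)
                    for _ in range(votes)]
        result = 0
        for bit in range(RESPONSE_BITS):
            if sum((r >> bit) & 1 for r in readouts) * 2 > votes:
                result |= 1 << bit
        return result

    secret = b'stand-in for device-unique manufacturing variation'
    print(hex(stabilised_response(secret, b'challenge-001')))
    print(hex(ideal_response(secret, b'challenge-001')))  # usually identical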

Many of the attacks on sensitive data and resources reported by organizations originate from within the organization itself.[8]

References

  1. Mukhopadhyay, Debdeep; Chakraborty, Rajat Subhra (2014). Hardware Security: Design, Threats, and Safeguards. CRC Press. ISBN 9781439895849. Retrieved 3 June 2017.
  2. "Hardware security in the IoT - Embedded Computing Design". embedded-computing.com. Retrieved 3 June 2017.
  3. Rostami, M.; Koushanfar, F.; Karri, R. (August 2014). "A Primer on Hardware Security: Models, Methods, and Metrics". Proceedings of the IEEE. 102 (8): 1283–1295. doi:10.1109/jproc.2014.2335155. ISSN 0018-9219.
  4. Rajendran, J.; Sinanoglu, O.; Karri, R. (August 2014). "Regaining Trust in VLSI Design: Design-for-Trust Techniques". Proceedings of the IEEE. 102 (8): 1266–1282. doi:10.1109/jproc.2014.2332154. ISSN 0018-9219.
  5. Cook, James (2019-06-22). "British start-ups race ahead of US rivals to develop new ultra-secure computer chips to defeat hackers". The Telegraph. ISSN 0307-1235. Retrieved 2019-08-27.
  6. Sadeghi, Ahmad-Reza; Naccache, David (2010). Towards Hardware-Intrinsic Security: Foundations and Practice. Springer Science & Business Media. ISBN 9783642144523. Retrieved 3 June 2017.
  7. "Hardware Security - Fraunhofer AISEC". Fraunhofer-Institut für Angewandte und Integrierte Sicherheit (in German). Retrieved 3 June 2017.
  8. "Hardware Security". web.mit.edu. Retrieved 3 June 2017.