Social engineering (security)
In the context of information security, social engineering is the psychological manipulation of people into performing actions or divulging confidential information. This differs from social engineering within the social sciences, which does not concern the divulging of confidential information. A type of confidence trick for the purpose of information gathering, fraud, or system access, it differs from a traditional "con" in that it is often one of many steps in a more complex fraud scheme.[1]
It has also been defined as "an act that influences a person to take an action that may or may not be in their best interests."[2]
An example of social engineering is the use of the "forgot password" function on most websites that require a login. An improperly secured password-recovery system can be used to grant a malicious attacker full access to a user's account, while the original user loses access to the account.
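The usual mitigation for this weakness is a recovery flow built on unguessable, expiring, single-use tokens. The following is a minimal sketch of that idea, not any particular site's implementation; the function names and the in-memory token store are illustrative only:

```python
import hashlib
import secrets
import time

# Illustrative in-memory token store: token_hash -> (user_id, expiry time)
RESET_TOKENS = {}
TOKEN_TTL = 15 * 60  # tokens expire after 15 minutes

def issue_reset_token(user_id: str) -> str:
    """Create a single-use, expiring reset token; store only its hash."""
    token = secrets.token_urlsafe(32)  # cryptographically unguessable
    token_hash = hashlib.sha256(token.encode()).hexdigest()
    RESET_TOKENS[token_hash] = (user_id, time.time() + TOKEN_TTL)
    return token  # delivered only to the user's verified email address

def redeem_reset_token(token: str):
    """Return the user_id if the token is valid; consume it either way."""
    token_hash = hashlib.sha256(token.encode()).hexdigest()
    entry = RESET_TOKENS.pop(token_hash, None)  # single use: always removed
    if entry is None:
        return None
    user_id, expiry = entry
    return user_id if time.time() <= expiry else None
```

Because the server stores only the token's hash and deletes it on first use, an attacker cannot replay an intercepted token or mine the database for usable tokens.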
Information security culture
Employee behavior can have a big impact on information security in organizations. Cultural concepts can help different segments of the organization work effectively toward information security, or can undermine that effectiveness. "Exploring the Relationship between Organizational Culture and Information Security Culture" provides the following definition of information security culture: "ISC is the totality of patterns of behavior in an organization that contribute to the protection of information of all kinds."[3]
Andersson and Reimers (2014) found that employees often do not see themselves as part of their organization's information security "effort" and often take actions that disregard its best interests.[4] Research shows that information security culture needs to be improved continuously. In "Information Security Culture from Analysis to Change," the authors comment that "it's a never ending process, a cycle of evaluation and change or maintenance." They suggest five steps for managing information security culture: pre-evaluation, strategic planning, operative planning, implementation, and post-evaluation.[5]
- Pre-evaluation: to identify employees' awareness of information security and to analyse the current security policy.
- Strategic planning: to come up with a better awareness program, clear targets need to be set; clustering people into groups helps achieve them.
- Operative planning: to create a good security culture based on internal communication, management buy-in, and security awareness and training programs.[5]
- Implementation: four stages should be used to implement the information security culture: commitment of the management, communication with organizational members, courses for all organizational members, and commitment of the employees.[5]
- Post-evaluation: to assess the success of the previous steps and identify areas that still need attention.[5]
Techniques and terms
All social engineering techniques are based on specific attributes of human decision-making known as cognitive biases.[6][7] These biases, sometimes called "bugs in the human hardware," are exploited in various combinations to create attack techniques, some of which are listed below. The attacks used in social engineering can be used to steal employees' confidential information. The most common type of social engineering happens over the phone. Other examples of social engineering attacks are criminals posing as exterminators, fire marshals, and technicians to go unnoticed as they steal company secrets.
One example of social engineering is an individual who walks into a building and posts an official-looking announcement on the company bulletin board saying that the number for the help desk has changed. When employees call for help, the individual asks them for their passwords and IDs, thereby gaining the ability to access the company's private information. Another example of social engineering would be that the hacker contacts the target on a social networking site and starts a conversation with the target. Gradually the hacker gains the target's trust and then uses that trust to get access to sensitive information like passwords or bank account details.[8]
Social engineering relies heavily on the six principles of influence established by Robert Cialdini: reciprocity, commitment and consistency, social proof, authority, liking, and scarcity.
Six key principles
Authority
In social engineering, the attacker may pose as an authority figure to increase the likelihood that the victim will comply.
Intimidation
The attacker (potentially in disguise) informs or implies that there will be negative consequences if certain actions are not performed. Consequences can range from subtle intimidation ("I'll tell your manager") to much worse.
Consensus/Social proof
People will do things that they see other people doing. For example, in one experiment, one or more confederates would look up into the sky; bystanders would then look up into the sky to see what they were missing. At one point this experiment was aborted, as so many people were looking up that they stopped traffic. See conformity and the Asch conformity experiments.
Scarcity
Perceived scarcity will generate demand. The common advertising phrase "while supplies last" capitalizes on a sense of scarcity.
Urgency
Linked to scarcity, attackers use urgency as a time-based psychological principle of social engineering. For example, saying offers are available for a "limited time only" encourages sales through a sense of urgency.
Familiarity / Liking
People are easily persuaded by other people whom they like. Cialdini cites the marketing of Tupperware in what might now be called viral marketing: people were more likely to buy if they liked the person selling it to them. Cialdini also discusses the many biases favoring more attractive people. See physical attractiveness stereotype.
Four social engineering vectors
Vishing
Vishing, otherwise known as "voice phishing", is the criminal practice of using social engineering over a telephone system to gain access to private personal and financial information from the public for the purpose of financial reward.[9] It is also employed by attackers for reconnaissance purposes to gather more detailed intelligence on a target organization.
Phishing
Phishing is a technique of fraudulently obtaining private information. Typically, the phisher sends an e-mail that appears to come from a legitimate business—a bank, or credit card company—requesting "verification" of information and warning of some dire consequence if it is not provided. The e-mail usually contains a link to a fraudulent web page that seems legitimate—with company logos and content—and has a form requesting everything from a home address to an ATM card's PIN or a credit card number. For example, in 2003, there was a phishing scam in which users received emails supposedly from eBay claiming that the user's account was about to be suspended unless a link provided was clicked to update a credit card (information that the genuine eBay already had).[10] By mimicking a legitimate organization's HTML code and logos, it is relatively simple to make a fake Website look authentic. The scam tricked some people into thinking that eBay was requiring them to update their account information by clicking on the link provided. By indiscriminately spamming extremely large groups of people, the "phisher" counted on gaining sensitive financial information from the small percentage (yet large number) of recipients who already have eBay accounts and also fall prey to the scam.
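One reason such emails succeed is that a link's visible text and its actual target can differ. A crude illustrative heuristic of the kind a mail filter might apply (a sketch, not any real filter's logic) flags links whose displayed URL names one domain while the underlying target points somewhere else:

```python
from urllib.parse import urlparse

def domain_of(url: str) -> str:
    """Hostname of a URL, lowercased ('' if the string has no hostname)."""
    return (urlparse(url).hostname or "").lower()

def looks_deceptive(display_text: str, href: str) -> bool:
    """Flag a link whose visible text names one site but whose target
    is a different domain. A simple heuristic, not a complete check."""
    shown = domain_of(display_text)
    target = domain_of(href)
    if not shown or not target:
        return False  # visible text isn't a URL; nothing to compare
    # Allow exact matches and subdomains of the displayed domain.
    return not (target == shown or target.endswith("." + shown))
```

Real filters combine many such signals; attackers can defeat any single heuristic, for example by using lookalike domains that genuinely match the displayed text.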
Smishing
Smishing is the act of using SMS text messaging to lure victims into a specific course of action. As with phishing, this can be clicking a malicious link or divulging information. Examples are text messages that claim to be from a common carrier (like FedEx) stating that a package is in transit, with a link provided.
Impersonation
Impersonation is pretending to be another person with the goal of gaining physical access to a system or building. It is used, for example, in the "SIM swap scam" fraud.
Other concepts
Pretexting
Pretexting (adj. pretextual) is the act of creating and using an invented scenario (the pretext) to engage a targeted victim in a manner that increases the chance the victim will divulge information or perform actions that would be unlikely in ordinary circumstances.[11] An elaborate lie, it most often involves some prior research or setup and the use of this information for impersonation (e.g., date of birth, Social Security number, last bill amount) to establish legitimacy in the mind of the target.[12] As a background, pretexting can be interpreted as the first evolution of social engineering, and continued to develop as social engineering incorporated current-day technologies. Current and past examples of pretexting demonstrate this development.
This technique can be used to fool a business into disclosing customer information as well as by private investigators to obtain telephone records, utility records, banking records and other information directly from company service representatives.[13] The information can then be used to establish even greater legitimacy under tougher questioning with a manager, e.g., to make account changes, get specific balances, etc.
Pretexting can also be used to impersonate co-workers, police, bank, tax authorities, clergy, insurance investigators—or any other individual who could have perceived authority or right-to-know in the mind of the targeted victim. The pretexter must simply prepare answers to questions that might be asked by the victim. In some cases, all that is needed is a voice that sounds authoritative, an earnest tone, and an ability to think on one's feet to create a pretextual scenario.
Vishing
Phone phishing (or "vishing") uses a rogue interactive voice response (IVR) system to recreate a legitimate-sounding copy of a bank or other institution's IVR system. The victim is prompted (typically via a phishing e-mail) to call in to the "bank" via an (ideally toll-free) number provided in order to "verify" information. A typical "vishing" system will reject log-ins continually, ensuring the victim enters PINs or passwords multiple times, often disclosing several different passwords. More advanced systems transfer the victim to the attacker/defrauder, who poses as a customer service agent or security expert for further questioning.
Spear phishing
Although similar to phishing, spear phishing fraudulently obtains private information by sending highly customized emails to a small number of end users. This is the main difference from phishing, whose campaigns send out high volumes of generalized emails in the expectation that only a few people will respond. Spear-phishing emails, by contrast, require the attacker to perform additional research on their targets in order to "trick" end users into performing the requested activities. The success rate of spear-phishing attacks is considerably higher: people open roughly 3% of generic phishing emails but roughly 70% of spear-phishing attempts, and of those who open them, about 5% click the link or attachment in a generic phishing email compared with about 50% for spear phishing.[14]
Spear-phishing success is heavily dependent on the amount and quality of OSINT (open-source intelligence) that the attacker can obtain. Social media account activity is one example of a source of OSINT.
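Taken at face value, the open and click rates quoted above can be combined to estimate expected victims per campaign, assuming (as a simplification) that opening and clicking are independent stages:

```python
def expected_compromises(emails_sent: int, open_rate: float, click_rate: float) -> float:
    """Expected number of victims, modeling the attack as two
    independent stages: the email is opened, then the link is clicked."""
    return emails_sent * open_rate * click_rate

# Rates quoted above: ~3% open / ~5% click for bulk phishing,
# ~70% open / ~50% click for spear phishing.
bulk = expected_compromises(1000, 0.03, 0.05)   # about 1.5 victims per 1,000 emails
spear = expected_compromises(1000, 0.70, 0.50)  # about 350 victims per 1,000 emails
```

On these figures a spear-phishing email is on the order of 200 times more effective per message, which is why attackers accept the extra research cost for high-value targets.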
Water holing
Water holing is a targeted social engineering strategy that capitalizes on the trust users have in websites they regularly visit. The victim feels safe to do things they would not do in a different situation. A wary person might, for example, purposefully avoid clicking a link in an unsolicited email, but the same person would not hesitate to follow a link on a website they often visit. So, the attacker prepares a trap for the unwary prey at a favored watering hole. This strategy has been successfully used to gain access to some (supposedly) very secure systems.[15]
The attacker may set out by identifying a group or individuals to target. The preparation involves gathering information about websites the targets often visit from the secure system. The information gathering confirms that the targets visit the websites and that the system allows such visits. The attacker then tests these websites for vulnerabilities to inject code that may infect a visitor's system with malware. The injected code trap and malware may be tailored to the specific target group and the specific systems they use. In time, one or more members of the target group will get infected and the attacker can gain access to the secure system.
Baiting
Baiting is like the real-world Trojan horse that uses physical media and relies on the curiosity or greed of the victim.[16] In this attack, attackers leave malware-infected floppy disks, CD-ROMs, or USB flash drives in locations people will find them (bathrooms, elevators, sidewalks, parking lots, etc.), give them legitimate and curiosity-piquing labels, and wait for victims.
For example, an attacker may create a disk featuring a corporate logo, available from the target's website, and label it "Executive Salary Summary Q2 2012". The attacker then leaves the disk on the floor of an elevator or somewhere in the lobby of the target company. An unknowing employee may find it and insert the disk into a computer to satisfy their curiosity, or a good Samaritan may find it and return it to the company. In any case, just inserting the disk into a computer installs malware, giving attackers access to the victim's PC and, perhaps, the target company's internal computer network.
Unless computer controls block the infection, insertion compromises PCs that "auto-run" inserted media. Hostile devices can also be used.[17] For instance, a "lucky winner" is sent a free digital audio player that compromises any computer it is plugged into. A "road apple" (the colloquial term for horse manure, suggesting the device's undesirable nature) is any removable media with malicious software left in opportunistic or conspicuous places. It may be a CD, DVD, or USB flash drive, among other media. Curious people take it and plug it into a computer, infecting the host and any attached networks. Again, hackers may give them enticing labels, such as "Employee Salaries" or "Confidential".[18]
One study done in 2016 had researchers drop 297 USB drives around the campus of the University of Illinois. The drives contained files on them that linked to webpages owned by the researchers. The researchers were able to see how many of the drives had files on them opened, but not how many were inserted into a computer without having a file opened. Of the 297 drives that were dropped, 290 (98%) of them were picked up and 135 (45%) of them "called home".[19]
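The percentages reported in the study follow directly from the raw counts:

```python
dropped = 297       # USB drives dropped around campus
picked_up = 290     # drives taken by passers-by
called_home = 135   # drives with at least one file opened

pickup_rate = round(100 * picked_up / dropped)        # rounds to 98 (%)
phone_home_rate = round(100 * called_home / dropped)  # rounds to 45 (%)
```

Note that the 45% figure is a lower bound on insertions, since a drive plugged in without any file being opened left no trace.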
Quid pro quo
Quid pro quo means "something for something":
- An attacker calls random numbers at a company, claiming to be calling back from technical support. Eventually this person will hit someone with a legitimate problem, grateful that someone is calling back to help them. The attacker will "help" solve the problem and, in the process, have the user type commands that give the attacker access or launch malware.
- In a 2003 information security survey, 91% of office workers gave researchers what they claimed was their password in answer to a survey question in exchange for a cheap pen.[20] Similar surveys in later years obtained similar results using chocolates and other cheap lures, although they made no attempt to validate the passwords.[21]
Tailgating
An attacker seeking entry to a restricted area secured by unattended, electronic access control, e.g. by RFID card, simply walks in behind a person who has legitimate access. Following common courtesy, the legitimate person will usually hold the door open for the attacker, or the attacker may ask the employee to hold it open. The legitimate person may fail to ask for identification for any of several reasons, or may accept an assertion that the attacker has forgotten or lost the appropriate identity token. The attacker may also fake the action of presenting an identity token.
Other types
Common confidence tricksters or fraudsters also could be considered "social engineers" in the wider sense, in that they deliberately deceive and manipulate people, exploiting human weaknesses to obtain personal benefit. They may, for example, use social engineering techniques as part of an IT fraud.
As of the early 2000s, another type of social engineering technique includes spoofing or hacking the IDs of people with popular e-mail accounts such as Yahoo!, Gmail, or Hotmail. Additionally, some spoofing attempts included emails from major online service providers, like PayPal.[22] This led to the "proposed standard" Sender Policy Framework (RFC 7208, April 2014), in combination with DMARC, as a means to combat spoofing. Among the many motivations for this deception are:
- Phishing credit-card account numbers and their passwords.
- Cracking private e-mails and chat histories, and manipulating them using common editing techniques before using them to extort money and create distrust among individuals.
- Cracking websites of companies or organizations and destroying their reputation.
- Computer virus hoaxes
- Convincing users to run malicious code within the web browser via self-XSS attack to allow access to their web account
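The SPF and DMARC mechanisms mentioned above work by publishing a domain's sender policy in DNS TXT records, which receiving mail servers check before accepting a message. An illustrative pair of records for a hypothetical domain (all values are examples, not real policies) might look like:

```text
; Illustrative DNS TXT records for example.com (hypothetical values)
example.com.        IN TXT "v=spf1 mx include:_spf.example-mailer.net -all"
_dmarc.example.com. IN TXT "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com"
```

The SPF record lists the hosts allowed to send mail for the domain ("-all" rejects everything else), while the DMARC record tells receivers to reject messages that fail authentication and where to send aggregate reports.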
Another type of social engineering is shoulder surfing: reading sensitive information from unshielded or unprotected displays and input devices.
Countermeasures
Organizations reduce their security risks by:
Training Employees: Training employees in security protocols relevant to their position (e.g., in situations such as tailgating, if a person's identity cannot be verified, then employees must be trained to politely refuse).
Standard Framework: Establishing frameworks of trust on an employee/personnel level (i.e., specify and train personnel when/where/why/how sensitive information should be handled)
Scrutinizing Information: Identifying which information is sensitive and evaluating its exposure to social engineering and breakdowns in security systems (building, computer system, etc.)
Security Protocols: Establishing security protocols, policies, and procedures for handling sensitive information.
Event Test: Performing unannounced, periodic tests of the security framework.
Inoculation: Preventing social engineering and other fraudulent tricks or traps by instilling a resistance to persuasion attempts through exposure to similar or related attempts.[23]
Review: Reviewing the above steps regularly: no solutions to information integrity are perfect.[24]
Waste Management: Using a waste management service that has dumpsters with locks on them, with keys to them limited only to the waste management company and the cleaning staff. Locating the dumpster either in view of employees so that trying to access it carries a risk of being seen or caught, or behind a locked gate or fence where the person must trespass before they can attempt to access the dumpster.[25]
The lifecycle of social engineering
- Information gathering: Information gathering is the first and foremost step of the lifecycle. It requires much patience and keen observation of the victim's habits. This step gathers data about the victim's interests and personal information, and it determines the success rate of the overall attack.
- Engaging with the victim: After gathering the required amount of information, the attacker opens a conversation with the victim smoothly, without the victim finding anything inappropriate.
- Attacking: This step generally occurs after a long period of engaging with the target, during which information is retrieved from the target using social engineering. In this phase, the attacker gets the results from the target.
- Closing the interaction: This is the last step, in which the attacker slowly shuts down communication without arousing any suspicion in the victim. In this way, the motive is fulfilled and the victim rarely even realizes an attack took place.[26]
Notable social engineers
Frank Abagnale Jr.
Frank Abagnale Jr. is an American security consultant known for his background as a former con man, check forger, and impostor while he was between the ages of 15 and 21. He became one of the most notorious impostors,[27] claiming to have assumed no fewer than eight identities, including an airline pilot, a physician, a U.S. Bureau of Prisons agent, and a lawyer. Abagnale escaped from police custody twice (once from a taxiing airliner and once from a U.S. federal penitentiary) before turning 22 years old.[28] The popular Steven Spielberg movie Catch Me If You Can is based on his life.
Kevin Mitnick
Kevin Mitnick is an American computer security consultant, author, and hacker, best known for his high-profile 1995 arrest and subsequent five-year prison sentence for various computer and communications-related crimes.[29]
Susan Headley
Susan Headley was an American hacker active during the late 1970s and early 1980s widely respected for her expertise in social engineering, pretexting, and psychological subversion.[30] She was known for her specialty in breaking into military computer systems, which often involved going to bed with military personnel and going through their clothes for usernames and passwords while they slept.[31] She became heavily involved in phreaking with Kevin Mitnick and Lewis de Payne in Los Angeles, but later framed them for erasing the system files at US Leasing after a falling out, leading to Mitnick's first conviction. She retired to professional poker.[32]
James Linton
James Linton is a British hacker and social engineer who in 2017 used OSINT and spear-phishing techniques to trick a variety of targets over email, including the CEOs of major banks and members of the Trump White House administration. He then went to work in email security, where he socially engineered BEC (business email compromise) threat actors to collect specific threat intelligence.
Badir Brothers
Brothers Ramy, Muzher, and Shadde Badir—all of whom were blind from birth—managed to set up an extensive phone and computer fraud scheme in Israel in the 1990s using social engineering, voice impersonation, and Braille-display computers.[33] [34]
Christopher J. Hadnagy
Christopher J. Hadnagy is an American social engineer and information technology security consultant. He is best known as the author of four books on social engineering and cyber security[35][36][37][38] and founder of the Innocent Lives Foundation, an organization that helps track and identify child trafficking using various security techniques, such as seeking the assistance of information security specialists, utilizing data from open-source intelligence (OSINT), and collaborating with law enforcement.[39][40]
Law
In common law, pretexting is an invasion of privacy tort of appropriation.[41]
Pretexting of telephone records
In December 2006, the United States Congress approved a Senate-sponsored bill making the pretexting of telephone records a federal felony with fines of up to $250,000 and ten years in prison for individuals (or fines of up to $500,000 for companies). It was signed by President George W. Bush on 12 January 2007.[42]
Federal legislation
The Gramm-Leach-Bliley Act of 1999 (GLBA) is a U.S. federal law that specifically addresses pretexting of banking records as an illegal act punishable under federal statutes. When a business entity such as a private investigator, SIU insurance investigator, or an adjuster conducts any type of deception, it falls under the authority of the Federal Trade Commission (FTC). This federal agency has the obligation and authority to ensure that consumers are not subjected to any unfair or deceptive business practices. Section 5 of the Federal Trade Commission Act (FTCA) states, in part: "Whenever the Commission shall have reason to believe that any such person, partnership, or corporation has been or is using any unfair method of competition or unfair or deceptive act or practice in or affecting commerce, and if it shall appear to the Commission that a proceeding by it in respect thereof would be to the interest of the public, it shall issue and serve upon such person, partnership, or corporation a complaint stating its charges in that respect."
The statute states that when someone obtains any personal, non-public information from a financial institution or the consumer, their action is subject to the statute. It relates to the consumer's relationship with the financial institution. For example, a pretexter using false pretenses either to get a consumer's address from the consumer's bank, or to get a consumer to disclose the name of their bank, would be covered. The determining principle is that pretexting only occurs when information is obtained through false pretenses.
While the sale of cell telephone records has gained significant media attention, and telecommunications records are the focus of the two bills currently before the United States Senate, many other types of private records are being bought and sold in the public market. Alongside many advertisements for cell phone records, wireline records and the records associated with calling cards are advertised. As individuals shift to VoIP telephones, it is safe to assume that those records will be offered for sale as well. Currently, it is legal to sell telephone records, but illegal to obtain them.[43]
1st Source Information Specialists
U.S. Rep. Fred Upton (R-Kalamazoo, Michigan), chairman of the Energy and Commerce Subcommittee on Telecommunications and the Internet, expressed concern over the easy access to personal mobile phone records on the Internet during a House Energy & Commerce Committee hearing on "Phone Records For Sale: Why Aren't Phone Records Safe From Pretexting?" Illinois became the first state to sue an online records broker when Attorney General Lisa Madigan sued 1st Source Information Specialists, Inc. The Florida-based company operates several Web sites that sell mobile telephone records, according to a spokeswoman for Madigan's office and a copy of the suit. The attorneys general of Florida and Missouri quickly followed Madigan's lead, filing suits against 1st Source Information Specialists and, in Missouri's case, one other records broker, First Data Solutions, Inc.
Several wireless providers, including T-Mobile, Verizon, and Cingular filed earlier lawsuits against records brokers, with Cingular winning an injunction against First Data Solutions and 1st Source Information Specialists. U.S. Senator Charles Schumer (D-New York) introduced legislation in February 2006 aimed at curbing the practice. The Consumer Telephone Records Protection Act of 2006 would create felony criminal penalties for stealing and selling the records of mobile phone, landline, and Voice over Internet Protocol (VoIP) subscribers.
HP
Patricia Dunn, former chairwoman of Hewlett-Packard, reported that the HP board hired a private investigation company to delve into who was responsible for leaks within the board. Dunn acknowledged that the company used pretexting to solicit the telephone records of board members and journalists. Chairman Dunn later apologized for this act and offered to step down from the board if the board members desired it.[44] Unlike federal law, California law specifically forbids such pretexting. The four felony charges brought against Dunn were dismissed.[45]
Preventive measures
Taking some precautions reduces the risk of falling victim to social engineering frauds. The precautions that can be taken are as follows:
- Be aware of offers that seem "Too good to be true".
- Use multifactor authentication.
- Avoid clicking on attachments from unknown sources.
- Do not give out personal information to anyone via email, phone, or text messages.
- Use spam filter software.
- Avoid befriending people that you do not know in real life.
- Teach kids to contact a trusted adult in case they are being bullied over the internet (cyberbullying) or feel threatened by anything online.[46]
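The multifactor authentication recommended above often relies on time-based one-time passwords (TOTP, RFC 6238), which bind a login to a secret the attacker cannot phish far in advance because each code expires within seconds. A minimal sketch of the algorithm, assuming the shared secret has already been provisioned to the user's authenticator app:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) using HMAC-SHA1."""
    if for_time is None:
        for_time = time.time()
    counter = int(for_time) // step                # 30-second time window
    msg = struct.pack(">Q", counter)               # counter as big-endian 64-bit int
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Using the RFC 6238 test secret `b"12345678901234567890"` at time 59 seconds yields the specification's published value 94287082 for eight digits. Even if a victim divulges a code to a social engineer, it is useless outside its short validity window.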
See also
- Certified Social Engineering Prevention Specialist (CSEPS)
- Code Shikara – Family of malware worms that spreads through instant messaging
- Confidence trick – Attempt to defraud a person or group
- Countermeasure (computer) – Process to reduce a security threat
- Cyber-HUMINT – Set of skills used by cyberspace hackers
- Cyberheist – Attack on a computer system
- Inoculation theory – How people's attitudes can resist change through weak counterargument exposures
- Internet Security Awareness Training – Training given for increasing cybersecurity awareness.
- IT risk – Any risk related to information technology
- Media prank – Type of media events, which often use similar tactics (though usually not for criminal purposes)
- Penetration test – Authorized cyberattack for testing purposes
- Phishing – Form of social engineering
- Physical information security – Common ground of physical and information security
- Piggybacking (security) – Gaining entry by following another person
- SMS phishing – Form of social engineering
- Threat (computer) – Potential negative action or event facilitated by a vulnerability
- Voice phishing – Phishing attack via telephony
- Vulnerability (computing) – Exploitable weakness in a computer system
- Cyber security awareness
References
- ^ Anderson, Ross J. (2008). Security engineering: a guide to building dependable distributed systems (2nd ed.). Indianapolis, IN: Wiley. p. 1040. ISBN 978-0-470-06852-6. Chapter 2, page 17
- ^ "Social Engineering Defined". Security Through Education. Retrieved 3 October 2021.
- ^ Lim, Joo S., et al. "Exploring the Relationship between Organizational Culture and Information Security Culture." Australian Information Security Management Conference.
- ^ Andersson, D., Reimers, K. and Barretto, C. (11 March 2014). Post-Secondary Education Network Security: Results of Addressing the End-User Challenge. INTED2014 (International Technology, Education, and Development Conference).
- ^ a b c Schlienger, Thomas; Teufel, Stephanie (2003). "Information security culture-from analysis to change". South African Computer Journal. 31: 46–52.
- ^ Jaco, K: "CSEPS Course Workbook" (2004), unit 3, Jaco Security Publishing.
- ^ Kirdemir, Baris (2019). "Hostile Influence and Emerging Cognitive Threats in Cyberspace". Centre for Economics and Foreign Policy Studies.
- ^ Hatfield, Joseph M (June 2019). "Virtuous human hacking: The ethics of social engineering in penetration-testing". Computers & Security. 83: 354–366. doi:10.1016/j.cose.2019.02.012. S2CID 86565713.
- ^ Choi, Kwan; Lee, Ju-lak; Chun, Yong-tae (1 May 2017). "Voice phishing fraud and its modus operandi". Security Journal. 30 (2): 454–466. doi:10.1057/sj.2014.49. ISSN 0955-1662. S2CID 154080668.
- ^ Austen, Ian (7 March 2005). "On EBay, E-Mail Phishers Find a Well-Stocked Pond". The New York Times. ISSN 0362-4331. Retrieved 1 May 2021.
- ^ The story of HP pretexting scandal with discussion is available at Davani, Faraz (14 August 2011). "HP Pretexting Scandal by Faraz Davani". Retrieved 15 August 2011 – via Scribd.
- ^ "Pretexting: Your Personal Information Revealed", Federal Trade Commission
- ^ Fagone, Jason (24 November 2015). "The Serial Swatter". The New York Times. Retrieved 25 November 2015.
- ^ "The Real Dangers of Spear-Phishing Attacks". FireEye. 2016. Retrieved 9 October 2016.
- ^ "Chinese Espionage Campaign Compromises Forbes.com to Target US Defense, Financial Services Companies in Watering Hole Style Attack". invincea.com. 10 February 2015. Retrieved 23 February 2017.
- ^ "Social Engineering, the USB Way". Light Reading Inc. 7 June 2006. Archived from the original on 13 July 2006. Retrieved 23 April 2014.
- ^ "Archived copy" (PDF). Archived from the original (PDF) on 11 October 2007. Retrieved 2 March 2012.
- ^ Conklin, Wm. Arthur; White, Greg; Cothren, Chuck; Davis, Roger; Williams, Dwayne (2015). Principles of Computer Security, Fourth Edition (Official Comptia Guide). New York: McGraw-Hill Education. pp. 193–194. ISBN 978-0071835978.
- ^ Raywood, Dan (4 August 2016). "#BHUSA Dropped USB Experiment Detailed". info security. Retrieved 28 July 2017.
- ^ Leyden, John (18 April 2003). "Office workers give away passwords". The Register. Retrieved 11 April 2012.
- ^ "Passwords revealed by sweet deal". BBC News. 20 April 2004. Retrieved 11 April 2012.
- ^ "Email Spoofing – What it Is, How it Works & More - Proofpoint US". www.proofpoint.com. 26 February 2021. Retrieved 11 October 2021.
- ^ Treglia, J., & Delia, M. (2017). Cyber Security Inoculation. Presented at NYS Cyber Security Conference, Empire State Plaza Convention Center, Albany, NY, 3–4 June.
- ^ Mitnick, K., & Simon, W. (2005). "The Art of Intrusion". Indianapolis, IN: Wiley Publishing.
- ^ Allsopp, William. Unauthorised access: Physical penetration testing for it security teams. Hoboken, NJ: Wiley, 2009. 240–241.
- ^ "social engineering – GW Information Security Blog". blogs.gwu.edu. Retrieved 18 February 2020.
- ^ Salinger, Lawrence M. (2005). Encyclopedia of White-Collar & Corporate Crime. SAGE. ISBN 978-0-7619-3004-4.
- ^ "How Frank Abagnale Would Swindle You". U.S. News. 17 December 2019. Archived from the original on 28 April 2013. Retrieved 17 December 2019.
- ^ "Kevin Mitnick sentenced to nearly four years in prison; computer hacker ordered to pay restitution to victim companies whose systems were compromised" (Press release). United States Attorney's Office, Central District of California. 9 August 1999. Archived from the original on 13 June 2013.
- ^ "DEF CON III Archives – Susan Thunder Keynote". DEF CON. Retrieved 12 August 2017.
- ^ "Archived copy". Archived from the original on 17 April 2001. Retrieved 6 January 2007.
- ^ Hafner, Katie (August 1995). "Kevin Mitnick, unplugged". Esquire. 124 (2): 80(9).
- ^ "Wired 12.02: Three Blind Phreaks". Wired. 14 June 1999. Retrieved 11 April 2012.
- ^ "Social Engineering A Young Hacker's Tale" (PDF). 15 February 2013. Retrieved 13 January 2020.
- ^ "43 Best Social Engineering Books of All Time". BookAuthority. Retrieved 22 January 2020.
- ^ "Ben's Book of the Month: Review of Social Engineering: The Science of Human Hacking". RSA Conference. 31 August 2018. Retrieved 22 January 2020.
- ^ "Book Review: Social Engineering: The Science of Human Hacking". The Ethical Hacker Network. 26 July 2018. Retrieved 22 January 2020.
- ^ Hadnagy, Christopher; Fincher, Michele (22 January 2020). "Phishing Dark Waters: The Offensive and Defensive Sides of Malicious E-mails". ISACA. Retrieved 22 January 2020.
- ^ "Protect Your Kids from Online Threats". WTVR.
- ^ Larson, Selena (14 August 2017). "Hacker creates organization to unmask child predators". CNN. Retrieved 14 November 2019.
- ^ Restatement 2d of Torts § 652C.
- ^ "Congress outlaws pretexting". 109th Congress (2005–2006) H.R.4709 – Telephone Records and Privacy Protection Act of 2006. 2007.
- ^ Mitnick, K (2002): "The Art of Deception", p. 103 Wiley Publishing Ltd: Indianapolis, Indiana; United States of America. ISBN 0-471-23712-4
- ^ Shankland, Stephen (8 September 2006). "HP chairman: Use of pretexting 'embarrassing'". CNET News.com.
- ^ "Calif. court drops charges against Dunn". CNET. 14 March 2007. Retrieved 11 April 2012.
- ^ "What is Social Engineering | Attack Techniques & Prevention Methods | Imperva". Learning Center. Retrieved 18 February 2020.
Further reading
- Boyington, Gregory (1990). Baa Baa Black Sheep. ISBN 0-553-26350-1
- Harley, David. 1998 Re-Floating the Titanic: Dealing with Social Engineering Attacks EICAR Conference.
- Laribee, Lena. June 2006 Development of methodical social engineering taxonomy project Master's Thesis, Naval Postgraduate School.
- Leyden, John. 18 April 2003. Office workers give away passwords for a cheap pen. The Register. Retrieved 9 September 2004.
- Long, Johnny. (2008). No Tech Hacking – A Guide to Social Engineering, Dumpster Diving, and Shoulder Surfing Published by Syngress Publishing Inc. ISBN 978-1-59749-215-7
- Mann, Ian. (2008). Hacking the Human: Social Engineering Techniques and Security Countermeasures Published by Gower Publishing Ltd. ISBN 0-566-08773-1 or ISBN 978-0-566-08773-8
- Mitnick, Kevin; Kasperavičius, Alexis. (2004). CSEPS Course Workbook. Mitnick Security Publishing.
- Mitnick, Kevin; Simon, William L.; Wozniak, Steve. (2002). The Art of Deception: Controlling the Human Element of Security. Published by Wiley. ISBN 0-471-23712-4 or ISBN 0-7645-4280-X
- Hadnagy, Christopher. (2011). Social Engineering: The Art of Human Hacking. Published by Wiley. ISBN 0-470-63953-9
- N.J. Evans. (2009). "Information Technology Social Engineering: An Academic Definition and Study of Social Engineering-Analyzing the Human Firewall." Graduate Theses and Dissertations. 10709. https://lib.dr.iastate.edu/etd/10709
- Z. Wang, L. Sun and H. Zhu. (2020) "Defining Social Engineering in Cybersecurity," in IEEE Access, vol. 8, pp. 85094-85115, doi: 10.1109/ACCESS.2020.2992807.
External links
- Social Engineering Fundamentals – Securityfocus.com. Retrieved 3 August 2009.
- "Social Engineering, the USB Way". Light Reading Inc. 7 June 2006. Archived from the original on 13 July 2006. Retrieved 23 April 2014.
- Should Social Engineering be a part of Penetration Testing? – Darknet.org.uk. Retrieved 3 August 2009.
- "Protecting Consumers' Phone Records", Electronic Privacy Information Center, US Committee on Commerce, Science, and Transportation. Retrieved 8 February 2006.
- Plotkin, Hal. Memo to the Press: Pretexting is Already Illegal. Retrieved 9 September 2006.