Counterintuitive Behavior of Social Systems
"Counterintuitive Behavior of Social Systems" is a paper by Jay Forrester.
In it, Forrester gives a concise explanation of how flame wars start.
"The mental model is fuzzy. It is incomplete. It is imprecisely stated. Furthermore, within one individual, a mental model changes with time and even during the flow of a single conversation. The human mind assembles a few relationships to fit the context of a discussion. As the subject shifts so does the model. When only a single topic is being discussed, each participant in a conversation employs a different mental model to interpret the subject. Fundamental assumptions differ but are never brought into the open. Goals are different and are left unstated. It is little wonder that compromise takes so long. And it is not surprising that consensus leads to laws and programs that fail in their objectives or produce new difficulties greater than those that have been relieved."
Forrester's work on complex systems has potential applications to semiotics, artificial intelligence, and the Semantic Web.