Complex systems
Complex systems are systems whose behavior is intrinsically difficult to model due to the dependencies, relationships, or interactions between their parts or between a given system and its environment. Systems that are "complex" have distinct properties that arise from these relationships, such as nonlinearity, emergence, spontaneous order, adaptation, and feedback loops, among others. Because such systems appear in a wide variety of fields, the commonalities among them have become the topic of their own independent area of research.
Overview
The term complex systems often refers to the study of complex systems, which is an approach to science that investigates how relationships between a system's parts give rise to its collective behaviors and how the system interacts and forms relationships with its environment.[1] The study of complex systems regards collective, or system-wide, behaviors as the fundamental object of study; for this reason, complex systems can be understood as an alternative paradigm to reductionism, which attempts to explain systems in terms of their constituent parts and the individual interactions between them.
As an interdisciplinary domain, complex systems draws contributions from many different fields, such as the study of self-organization from physics, that of spontaneous order from the social sciences, chaos from mathematics, adaptation from biology, and many others. Complex systems is therefore often used as a broad term encompassing a research approach to problems in many diverse disciplines, including statistical physics, information theory, nonlinear dynamics, anthropology, computer science, meteorology, sociology, economics, psychology, and biology.
Key concepts
Systems
Complex systems is chiefly concerned with the behaviors and properties of systems. A system, broadly defined, is a set of entities that, through their interactions, relationships, or dependencies, form a unified whole. It is always defined in terms of its boundary, which determines the entities that are or are not part of the system. Entities lying outside the system then become part of the system's environment.
A system can exhibit properties that produce behaviors which are distinct from the properties and behaviors of its parts; these system-wide or global properties and behaviors are characteristics of how the system interacts with or appears to its environment, or of how its parts behave (say, in response to external stimuli) by virtue of being within the system. The notion of behavior implies that the study of systems is also concerned with processes that take place over time (or, in mathematics, some other phase space parameterization). Because of their broad, interdisciplinary applicability, systems concepts play a central role in complex systems.
As a field of study, complex systems is a subset of systems theory. General systems theory focuses similarly on the collective behaviors of interacting entities, but it studies a much broader class of systems, including non-complex systems where traditional reductionist approaches may remain viable. Indeed, systems theory seeks to explore and describe all classes of systems, and the invention of categories that are useful to researchers across widely varying fields is one of systems theory's main objectives.
As it relates to complex systems, systems theory contributes an emphasis on the way relationships and dependencies between a system's parts can determine system-wide properties. It also contributes the interdisciplinary perspective of the study of complex systems: the notion that shared properties link systems across disciplines, justifying the pursuit of modeling approaches applicable to complex systems wherever they appear. Specific concepts important to complex systems, such as emergence, feedback loops, and adaptation, also originate in systems theory.
Complexity
Systems exhibit complexity when difficulties with modeling them are endemic. This means their behaviors cannot be understood apart from the very properties that make them difficult to model, and they are governed entirely, or almost entirely, by the behaviors those properties produce. Any modeling approach that ignores such difficulties or characterizes them as noise, then, will necessarily produce models that are neither accurate nor useful. As yet no fully general theory of complex systems has emerged for addressing these problems, so researchers must solve them in domain-specific contexts. Researchers in complex systems address these problems by viewing the chief task of modeling as capturing, rather than reducing, the complexity of their respective systems of interest.
While no generally accepted exact definition of complexity exists yet, there are many archetypal examples of complexity. Systems can be complex if, for instance, they have chaotic behavior (behavior that exhibits extreme sensitivity to initial conditions), or if they have emergent properties (properties that are not apparent from their components in isolation but which result from the relationships and dependencies they form when placed together in a system), or if they are computationally intractable to model (if they depend on a number of parameters that grows too rapidly with respect to the size of the system).
Networks
The interacting components of a complex system form a network, which is a collection of discrete objects and relationships between them, usually depicted as a graph of vertices connected by edges. Networks can describe the relationships between individuals within an organization, between logic gates in a circuit, between genes in gene regulatory networks, or between any other set of related entities.
Networks often describe the sources of complexity in complex systems. Studying complex systems as networks therefore enables many useful applications of graph theory and network science. Some complex systems, for example, are also complex networks, which have properties such as power-law degree distributions that readily lend themselves to emergent or chaotic behavior. The fact that the number of edges in a complete graph grows quadratically in the number of vertices sheds additional light on the source of complexity in large networks: as a network grows, the number of relationships between entities quickly dwarfs the number of entities in the network.
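To make the quadratic growth concrete, the short sketch below (an illustrative addition, not part of the cited material) counts the edges of a complete graph using the formula n(n−1)/2 and shows how quickly pairwise relationships come to outnumber the entities themselves.

```python
# Illustrative sketch: edges of a complete graph grow quadratically with the
# number of vertices, so relationships quickly dwarf the entities themselves.
def complete_graph_edges(n: int) -> int:
    """Number of edges in a complete (fully connected) graph on n vertices."""
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    e = complete_graph_edges(n)
    print(f"{n:>6} vertices -> {e:>12,} possible pairwise relationships "
          f"({e / n:.1f} per vertex)")
```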
Nonlinearity
Complex systems often have nonlinear behavior, meaning they may respond in different ways to the same input depending on their state or context. In mathematics and physics, nonlinearity describes systems in which a change in the size of the input does not produce a proportional change in the size of the output. For a given change in input, such systems may yield significantly greater than or less than proportional changes in output, or even no output at all, depending on the current state of the system or its parameter values.
Of particular interest to complex systems are nonlinear dynamical systems, which are systems of differential equations that have one or more nonlinear terms. Some nonlinear dynamical systems, such as the Lorenz system, can produce a mathematical phenomenon known as chaos. Chaos as it applies to complex systems refers to the sensitive dependence on initial conditions, or "butterfly effect," that a complex system can exhibit. In such a system, small changes to initial conditions can lead to dramatically different outcomes. Chaotic behavior can therefore be extremely hard to model numerically, because small rounding errors at an intermediate stage of computation can cause the model to generate completely inaccurate output. Furthermore, if a complex system returns to a state similar to one it held previously, it may behave completely differently in response to exactly the same stimuli, so chaos also poses challenges for extrapolating from past experience.
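A minimal numerical sketch of this sensitivity, assuming the standard Lorenz parameters (sigma = 10, rho = 28, beta = 8/3) and a small hand-rolled Runge-Kutta integrator rather than any particular library:

```python
# Minimal sketch (not from the article): the Lorenz system integrated with a
# fourth-order Runge-Kutta step, showing sensitive dependence on initial conditions.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt):
    def add(a, b, s):
        return tuple(ai + s * bi for ai, bi in zip(a, b))
    k1 = lorenz(state)
    k2 = lorenz(add(state, k1, dt / 2))
    k3 = lorenz(add(state, k2, dt / 2))
    k4 = lorenz(add(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-8)   # perturb one coordinate by a tiny amount
dt, steps = 0.01, 4000
for i in range(1, steps + 1):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if i % 1000 == 0:
        dist = sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
        print(f"t = {i * dt:5.1f}  separation = {dist:.3e}")
```

Two trajectories that begin within one part in a hundred million of each other end up macroscopically far apart after a few thousand steps, which is the numerical face of the "butterfly effect" described above.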
Emergence
Another common feature of complex systems is the presence of emergent behaviors and properties: these are traits of a system which are not apparent from its components in isolation but which result from the interactions, dependencies, or relationships they form when placed together in a system. Emergence broadly describes the appearance of such behaviors and properties, and has applications to systems studied in both the social and physical sciences. While emergence is often used to refer only to the appearance of unplanned organized behavior in a complex system, emergence can also refer to the breakdown of organization; it describes any phenomena which are difficult or even impossible to predict from the smaller entities that make up the system.
One example of a complex system whose emergent properties have been studied extensively is cellular automata. In a cellular automaton, a grid of cells, each having one of finitely many states, evolves over time according to a simple set of rules. These rules guide the "interactions" of each cell with its neighbors. Although the rules are only defined locally, they have been shown capable of producing globally interesting behavior, for example in Conway's Game of Life.
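The toy implementation below is an illustrative sketch (not taken from the article or any cited source) of Conway's Game of Life on a small wrap-around grid; each cell's update rule consults only its eight neighbors, yet a "glider" pattern emerges that travels across the whole grid.

```python
# Hedged sketch: a tiny Conway's Game of Life, illustrating how purely local
# rules (each cell looks only at its eight neighbours) produce global behaviour
# such as a "glider" that moves across the grid.
from collections import Counter

def step(live, width, height):
    """Advance one generation; `live` is a set of (x, y) cells that are alive."""
    counts = Counter(((x + dx) % width, (y + dy) % height)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell lives if it has 3 live neighbours, or 2 and it is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

width = height = 8
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}   # classic glider pattern
for gen in range(5):
    rows = ["".join("#" if (x, y) in cells else "." for x in range(width))
            for y in range(height)]
    print(f"generation {gen}:\n" + "\n".join(rows) + "\n")
    cells = step(cells, width, height)
```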
Spontaneous order and self-organization
When emergence describes the appearance of unplanned order, it is spontaneous order (in the social sciences) or self-organization (in the physical sciences). Spontaneous order can be seen in herd behavior, whereby a group of individuals coordinates their actions without centralized planning. Self-organization can be seen in the global symmetry of certain crystals, for instance the apparent radial symmetry of snowflakes, which arises from purely local attractive and repulsive forces both between water molecules and between water molecules and their surrounding environment.
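As a loose, purely illustrative toy model (an assumption of this sketch, not a claim from the article), the code below has agents on a ring repeatedly adopt the majority state of their immediate neighborhood; ordered domains appear without any central coordinator, in the spirit of spontaneous order arising from local interactions.

```python
# Toy illustration (not from the article): agents on a ring repeatedly adopt
# the majority opinion of their three-cell neighbourhood. Global order (long
# uniform runs) emerges from purely local interactions, with no coordinator.
import random

random.seed(1)
n, steps = 60, 30
state = [random.choice("AB") for _ in range(n)]

def majority_step(s):
    out = []
    for i in range(len(s)):
        window = [s[(i - 1) % len(s)], s[i], s[(i + 1) % len(s)]]
        out.append(max(set(window), key=window.count))
    return out

print("".join(state))            # disordered initial condition
for _ in range(steps):
    state = majority_step(state)
print("".join(state))            # locally driven, globally visible domains
```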
Adaptation
Complex adaptive systems are special cases of complex systems that are adaptive in that they have the capacity to change and learn from experience. Examples of complex adaptive systems include the stock market, social insect and ant colonies, the biosphere and the ecosystem, the brain and the immune system, the cell and the developing embryo, manufacturing businesses and any human social group-based endeavor in a cultural and social system such as political parties or communities.
History
The earliest precursor to modern complex systems theory can be found in the classical political economy of the Scottish Enlightenment, later developed by the Austrian school of economics, which argues that order in market systems is spontaneous (or emergent) in that it is the result of human action, but not the execution of any human design.[3][4]
Building on this, the Austrian school developed from the 19th to the early 20th century the economic calculation problem, along with the concept of dispersed knowledge, which were to fuel debates against the then-dominant Keynesian economics. This debate would notably lead economists, politicians, and other parties to explore the question of computational complexity.[citation needed]
A pioneer in the field, and inspired by Karl Popper's and Warren Weaver's works, Nobel Prize-winning economist and philosopher Friedrich Hayek dedicated much of his work, from the early to the late 20th century, to the study of complex phenomena,[5] not constraining his work to human economies but venturing into other fields such as psychology,[6] biology, and cybernetics. Gregory Bateson played a key role in establishing the connection between anthropology and systems theory; he recognized that the interactive parts of cultures function much like ecosystems.
In mathematics, arguably the largest contribution to the study of complex systems was the discovery of chaos in deterministic systems, a feature of certain dynamical systems that is strongly related to nonlinearity.[7]
The notion of self-organizing systems is tied to work in nonequilibrium thermodynamics, including that pioneered by chemist and Nobel laureate Ilya Prigogine in his study of dissipative structures. Even older is the work of Douglas Hartree, Vladimir Fock, and their co-workers on the quantum-chemistry equations, and the later calculations of molecular structure, which can be regarded as among the earliest examples of emergence and emergent wholes in science.
The first research institute focused on complex systems, the Santa Fe Institute, was founded in 1984.[8] Early Santa Fe Institute participants included physics Nobel laureates Murray Gell-Mann and Philip Anderson, economics Nobel laureate Kenneth Arrow, and Manhattan Project scientists George Cowan and Herb Anderson.[9] Today, there are over 50 institutes and research centers focusing on complex systems.
Applications of complex systems
Complexity in practice
The traditional approach to dealing with complexity is to reduce or constrain it. Typically, this involves compartmentalisation: dividing a large system into separate parts. Organizations, for instance, divide their work into departments that each deal with separate issues. Engineering systems are often designed using modular components. However, modular designs become susceptible to failure when issues arise that bridge the divisions.
Complexity management
As projects and acquisitions become increasingly complex, companies and governments are challenged to find effective ways to manage mega-acquisitions such as the Army Future Combat Systems. Acquisitions such as the FCS rely on a web of interrelated parts which interact unpredictably. As acquisitions become more network-centric and complex, businesses will be forced to find ways to manage complexity while governments will be challenged to provide effective governance to ensure flexibility and resiliency.[10]
Complexity economics
Over the last decades, new predictive tools have been developed within the emerging field of complexity economics to explain economic growth. Such is the case with the models built by the Santa Fe Institute in 1989 and the more recent economic complexity index (ECI), introduced by the MIT physicist Cesar A. Hidalgo and the Harvard economist Ricardo Hausmann. Based on the ECI, Hausmann, Hidalgo, and their team at The Observatory of Economic Complexity have produced GDP forecasts for the year 2020.[citation needed]
Complexity and education
Focusing on issues of student persistence with their studies, Forsman, Moll and Linder explore the "viability of using complexity science as a frame to extend methodological applications for physics education research," finding that "framing a social network analysis within a complexity science perspective offers a new and powerful applicability across a broad range of PER topics."[11]
Complexity and modeling
One of Friedrich Hayek's main contributions to early complexity theory is his distinction between the human capacity to predict the behaviour of simple systems and the capacity to predict the behaviour of complex systems through modeling. He believed that economics and the sciences of complex phenomena in general, which in his view included biology, psychology, and so on, could not be modeled after the sciences that deal with essentially simple phenomena like physics.[12] Hayek would notably explain that complex phenomena, through modeling, can only allow pattern predictions, compared with the precise predictions that can be made about non-complex phenomena.[13]
Complexity and chaos theory
Complexity theory is rooted in chaos theory, which in turn has its origins more than a century ago in the work of the French mathematician Henri Poincaré. Chaos is sometimes viewed as extremely complicated information, rather than as an absence of order.[14] Chaotic systems remain deterministic, though their long-term behavior can be difficult to predict with any accuracy. With perfect knowledge of the initial conditions and of the relevant equations describing the chaotic system's behavior, one can theoretically make perfectly accurate predictions about the future of the system, though in practice this is impossible to do with arbitrary accuracy. Ilya Prigogine argued[15] that complexity is non-deterministic, and gives no way whatsoever to precisely predict the future.[16]
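A minimal sketch of this point, using the logistic map (a standard textbook example of deterministic chaos, chosen here for brevity rather than taken from the cited sources): the rule is fully deterministic, yet two trajectories whose initial conditions differ by one part in a billion become uncorrelated after a few dozen iterations.

```python
# Sketch: the logistic map x -> r*x*(1-x) is deterministic, yet trajectories
# started a billionth apart diverge after a few dozen iterations (r = 4 lies
# in the chaotic regime).
r = 4.0
x, y = 0.2, 0.2 + 1e-9
for i in range(1, 61):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if i % 10 == 0:
        print(f"step {i:2d}: x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.2e}")
```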
The emergence of complexity theory shows a domain between deterministic order and randomness which is complex.[17] This is referred to as the "edge of chaos".[18]
When one analyzes complex systems, sensitivity to initial conditions, for example, is not an issue as important as it is within chaos theory, in which it prevails. As stated by Colander,[19] the study of complexity is the opposite of the study of chaos. Complexity is about how a huge number of extremely complicated and dynamic sets of relationships can generate some simple behavioral patterns, whereas chaotic behavior, in the sense of deterministic chaos, is the result of a relatively small number of non-linear interactions.[17]
Therefore, the main difference between chaotic systems and complex systems is their history.[20] Chaotic systems do not rely on their history as complex ones do. Chaotic behaviour pushes a system in equilibrium into chaotic order, which means, in other words, out of what we traditionally define as 'order'.[clarification needed] On the other hand, complex systems evolve far from equilibrium at the edge of chaos. They evolve at a critical state built up by a history of irreversible and unexpected events, which physicist Murray Gell-Mann called "an accumulation of frozen accidents."[21] In a sense chaotic systems can be regarded as a subset of complex systems distinguished precisely by this absence of historical dependence. Many real complex systems are, in practice and over long but finite time periods, robust. However, they do possess the potential for radical qualitative change of kind whilst retaining systemic integrity. Metamorphosis serves as perhaps more than a metaphor for such transformations.
Complexity and network science
A complex system is usually composed of many components and their interactions. Such a system can be represented by a network, where nodes represent the components and links represent their interactions.[22][23][24] For example, the Internet can be represented as a network composed of nodes (computers) and links (direct connections between computers); its resilience to failures has been studied using percolation theory.[25] Other examples are social networks, airline networks,[26] biological networks, and climate networks.[27] Networks can also fail and recover spontaneously; models of this phenomenon have been proposed.[28] Interacting complex systems can be modeled as networks of networks, whose breakdown and recovery properties have also been studied.[29][30]
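A hedged sketch of the percolation idea (illustrative only; the cited studies use far more detailed models): generate an Erdős–Rényi random graph, remove increasing fractions of nodes at random, and measure how the largest connected component shrinks.

```python
# Hedged sketch (not the cited studies): random node failures in an
# Erdos-Renyi random graph, tracking the size of the largest connected component.
import random
from collections import defaultdict, deque

random.seed(0)
n, p = 2000, 0.004                       # 2000 nodes, average degree ~ 8
adj = defaultdict(set)
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p:
            adj[i].add(j)
            adj[j].add(i)

def largest_component(alive):
    """Size of the largest connected component among the surviving nodes."""
    seen, best = set(), 0
    for s in alive:
        if s in seen:
            continue
        queue, size = deque([s]), 0
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

nodes = list(range(n))
random.shuffle(nodes)
for frac in (0.0, 0.3, 0.6, 0.8, 0.9):
    alive = set(nodes[int(frac * n):])   # remove a fraction of nodes at random
    print(f"removed {frac:.0%} of nodes -> largest component covers "
          f"{largest_component(alive) / n:.1%} of the original network")
```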
General form of complexity computation
The computational law of reachable optimality[31] is established as a general form of computation for ordered systems. It reveals that complexity computation is a compound computation of optimal choice and an optimality-driven reaching pattern over time, underlying any specific experience path of an ordered system within the general limitation of system integrity.
The computational law of reachable optimality has four key components as described below.
1. Reachability of Optimality: Any intended optimality shall be reachable. Unreachable optimality has no meaning either for a member of the ordered system or for the ordered system itself.
2. Prevailing and Consistency: Maximizing reachability to explore the best available optimality is the prevailing computation logic for all members in the ordered system and is accommodated by the ordered system.
3. Conditionality: A realizable tradeoff between reachability and optimality depends primarily upon the initial bet capacity and how the bet capacity evolves along with the payoff table update path triggered by bet behavior and empowered by the underlying law of reward and punishment. Precisely, it is a sequence of conditional events in which the next event happens upon the status quo reached along the experience path.
4. Robustness: The more challenges a reachable optimality can accommodate, the more robust it is in terms of path integrity.
There are also four computation features in the law of reachable optimality.
1. Optimal Choice: Computation in realizing Optimal Choice can be very simple or very complex. A simple rule in Optimal Choice is to accept whatever is reached, Reward As You Go (RAYG). A Reachable Optimality computation reduces to optimizing reachability when RAYG is adopted. The Optimal Choice computation can be more complex when multiple Nash equilibrium (NE) strategies are present in a reached game.
2. Initial Status: Computation is assumed to start at a beginning of interest, even though the absolute beginning of an ordered system in nature may not, and need not, be present. An assumed neutral Initial Status facilitates an artificial or simulated computation and is not expected to change the prevalence of any findings.
3. Territory: An ordered system shall have a territory where the universal computation sponsored by the system will produce an optimal solution still within the territory.
4. Reaching Pattern: The forms of Reaching Pattern in the computation space, or the Optimality-Driven Reaching Pattern in the computation space, depend primarily upon the nature and dimensions of the measure space underlying the computation space and the law of punishment and reward underlying the realized experience path of reaching. There are five basic forms of experience path of interest: the persistently positive reinforcement, persistently negative reinforcement, mixed persistent pattern, decaying scale, and selection experience paths.
The compound computation in the selection experience path includes current and lagging interaction and dynamic topological transformation, and implies both invariance and variance characteristics in an ordered system's experience path.
In addition, the computational law of reachable optimality delineates the boundary between the complexity model, the chaotic model, and the determination model. When RAYG is the Optimal Choice computation and the reaching pattern is a persistently positive, persistently negative, or mixed persistent pattern experience path, the underlying computation is a simple system computation adopting determination rules. If no persistent pattern is experienced in the RAYG regime, the underlying computation hints at a chaotic system. When the Optimal Choice computation involves non-RAYG computation, it is a complexity computation driving the compound effect.
Notable figures
- Christopher Alexander
- Gregory Bateson
- Ludwig von Bertalanffy
- Samuel Bowles
- Paul Cilliers
- Murray Gell-Mann
- Arthur Iberall
- Stuart Kauffman
- Cris Moore
- Bill McKelvey
- Jerry Sabloff
- Geoffrey West
- Yaneer Bar-Yam
- Walter Clemens, Jr.
See also
References
- ^ Bar-Yam, Yaneer (2002). "General Features of Complex Systems" (PDF). Encyclopedia of Life Support Systems. EOLSS UNESCO Publishers, Oxford, UK. Retrieved 16 September 2014.
- ^ Daniel Dennett (1995), Darwin's Dangerous Idea, Penguin Books, London, ISBN 978-0-14-016734-4, ISBN 0-14-016734-X
- ^ Ferguson, Adam (1767). An Essay on the History of Civil Society. London: T. Cadell. Part the Third, Section II, p. 205.
- ^ Friedrich Hayek, "The Results of Human Action but Not of Human Design" in New Studies in Philosophy, Politics, Economics, Chicago: University of Chicago Press, 1978, pp. 96–105.
- ^ Bruce J. Caldwell, Popper and Hayek: Who influenced whom?, Karl Popper 2002 Centenary Congress, 2002.
- ^ Friedrich von Hayek, The Sensory Order: An Inquiry into the Foundations of Theoretical Psychology, The University of Chicago Press, 1952.
- ^ History of Complex Systems
- ^ Ledford, H (2015). "How to solve the world's biggest problems". Nature. 525 (7569): 308–311. doi:10.1038/525308a.
- ^ Waldrop, M. M. (1993). Complexity: The emerging science at the edge of order and chaos. Simon and Schuster.
- ^ CSIS paper: "Organizing for a Complex World: The Way Ahead".
- ^ Forsman, Jonas; Moll, Rachel; Linder, Cedric (2014). "Extending the theoretical framing for physics education research: An illustrative application of complexity science". Physical Review Special Topics - Physics Education Research. 10 (2). doi:10.1103/PhysRevSTPER.10.020122. http://hdl.handle.net/10613/2583.
- ^ Reason Magazine - The Road from Serfdom
- ^ Friedrich August von Hayek - Prize Lecture
- ^ Hayles, N. K. (1991). Chaos Bound: Orderly Disorder in Contemporary Literature and Science. Cornell University Press, Ithaca, NY.
- ^ Prigogine, I. (1997). The End of Certainty, The Free Press, New York.
- ^ See also D. Carfì (2008). "Superpositions in Prigogine approach to irreversibility". AAPP: Physical, Mathematical, and Natural Sciences. 86 (1): 1–13.
- ^ a b Cilliers, P. (1998). Complexity and Postmodernism: Understanding Complex Systems, Routledge, London.
- ^ Per Bak (1996). How Nature Works: The Science of Self-Organized Criticality, Copernicus, New York, U.S.
- ^ Colander, D. (2000). The Complexity Vision and the Teaching of Economics, E. Elgar, Northampton, Massachusetts.
- ^ Buchanan, M. (2000). Ubiquity : Why catastrophes happen, three river press, New-York.
- ^ Gell-Mann, M. (1995). What is Complexity? Complexity 1/1, 16-19
- ^ Dorogovtsev, S.N.; Mendes, J.F.F. (2003). "Evolution of Networks". doi:10.1093/acprof:oso/9780198515906.001.0001.
- ^ Fortunato, Santo (2011). "Reuven Cohen and Shlomo Havlin: Complex Networks". Journal of Statistical Physics. 142 (3): 640–641. doi:10.1007/s10955-011-0129-7. ISSN 0022-4715.
- ^ Newman, Mark (2010). "Networks". doi:10.1093/acprof:oso/9780199206650.001.0001.
- ^ Cohen, Reuven; Erez, Keren; ben-Avraham, Daniel; Havlin, Shlomo (2001). "Cohen, Erez, ben-Avraham, and Havlin Reply:". Physical Review Letters. 87 (21). Bibcode:2001PhRvL..87u9802C. doi:10.1103/PhysRevLett.87.219802. ISSN 0031-9007.
- ^ Barrat, A.; Barthelemy, M.; Pastor-Satorras, R.; Vespignani, A. (2004). "The architecture of complex weighted networks". Proceedings of the National Academy of Sciences. 101 (11): 3747–3752. doi:10.1073/pnas.0400087101. ISSN 0027-8424. PMC 374315. PMID 15007165.
- ^ Yamasaki, K.; Gozolchiani, A.; Havlin, S. (2008). "Climate Networks around the Globe are Significantly Affected by El Niño". Physical Review Letters. 100 (22): 228501. doi:10.1103/PhysRevLett.100.228501. ISSN 0031-9007. PMID 18643467.
- ^ Majdandzic, Antonio; Podobnik, Boris; Buldyrev, Sergey V.; Kenett, Dror Y.; Havlin, Shlomo; Eugene Stanley, H. (2013). "Spontaneous recovery in dynamical networks". Nature Physics. 10 (1): 34–38. doi:10.1038/nphys2819. ISSN 1745-2473.
- ^ Gao, Jianxi; Buldyrev, Sergey V.; Stanley, H. Eugene; Havlin, Shlomo (2011). "Networks formed from interdependent networks". Nature Physics. 8 (1): 40–48. Bibcode:2012NatPh...8...40G. doi:10.1038/nphys2180. ISSN 1745-2473.
- ^ Majdandzic, Antonio; Braunstein, Lidia A.; Curme, Chester; Vodenska, Irena; Levy-Carciente, Sary; Eugene Stanley, H.; Havlin, Shlomo (2016). "Multiple tipping points and optimal repairing in interacting networks". Nature Communications. 7: 10850. doi:10.1038/ncomms10850. ISSN 2041-1723.
- ^ Wenliang Wang (2015). Pooling Game Theory and Public Pension Plan. ISBN 978-1507658246. Chapter 4.
Further reading
- Bazin, A. (2014). Defeating ISIS and Their Complex Way of War. Small Wars Journal.
- Syed M. Mehmud (2011), A Healthcare Exchange Complexity Model
- Chu, D.; Strand, R.; Fjelland, R. (2003). "Theories of complexity". Complexity. 8 (3): 19–30. doi:10.1002/cplx.10059.
- L.A.N. Amaral and J.M. Ottino, Complex networks — augmenting the framework for the study of complex system, 2004.
- Gell-Mann, Murray (1995). "Let's Call It Plectics" (PDF). Complexity. 1 (5).
- Nigel Goldenfeld and Leo P. Kadanoff, Simple Lessons from Complexity, 1999
- A. Gogolin, A. Nersesyan and A. Tsvelik, Theory of strongly correlated systems, Cambridge University Press, 1999.
- Kelly, K. (1995). Out of Control, Perseus Books Group.
- Donald Snooks, Graeme (2008). "A general theory of complex living systems: Exploring the demand side of dynamics". Complexity. 13 (6): 12–20. doi:10.1002/cplx.20225.
- Sorin Solomon and Eran Shir, Complexity; a science at 30, 2003.
- Preiser-Kapeller, Johannes, "Calculating Byzantium. Social Network Analysis and Complexity Sciences as tools for the exploration of medieval social dynamics". August 2010
- Walter Clemens, Jr., Complexity Science and World Affairs, SUNY Press, 2013.
External links
- "The Open Agent-Based Modeling Consortium".
- "Complexity Science Focus".
- "Santa Fe Institute".
- "The Center for the Study of Complex Systems, Univ. of Michigan Ann Arbor".
- "INDECS". (Interdisciplinary Description of Complex Systems)
- "Center for Complex Systems Research, Univ. of Illinois".
- "Introduction to complex systems - Short course by Shlomo Havlin".
- Jessie Henshaw (October 24, 2013). "Complex Systems". Encyclopedia of Earth.