Computational sociology
Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex statistical methods, and analytic approaches like social network analysis, computational sociology develops and tests theories of complex social processes through bottom-up modeling of social interactions.[1]
It involves the understanding of social agents, the interaction among these agents, and the effect of these interactions on the social aggregate.[2] Although the subject matter and methodologies in social science differ from those in natural science or computer science, several of the approaches used in contemporary social simulation originated from fields such as physics and artificial intelligence.[3][4] Some of the approaches that originated in this field have been imported into the natural sciences, such as measures of network centrality from the fields of social network analysis and network science.
In relevant literature, computational sociology is often related to the study of social complexity.[5] Social complexity concepts such as complex systems, non-linear interconnection among macro and micro processes, and emergence have entered the vocabulary of computational sociology.[6] A practical and well-known example is the construction of a computational model in the form of an "artificial society", by which researchers can analyze the structure of a social system.[2][7]
History
Background
In the past four decades, computational sociology has been introduced and has gained popularity.[according to whom?] It has been used primarily for modeling or building explanations of social processes, relying on the emergence of complex behavior from simple activities.[8] The idea behind emergence is that the properties of a larger system do not always have to be properties of the components that the system is made of.[9] Alexander, Morgan, and Broad, classical emergentists, introduced the idea of emergence in the early 20th century. The aim of this approach was to find a good enough accommodation between two different and extreme ontologies, reductionist materialism and dualism.[8]
While emergence has played a valuable and important role in the foundation of computational sociology, not all scholars agree with its use. Joshua Epstein, a major figure in the field, doubted its usefulness because some aspects remain unexplainable. Epstein argued against emergentism, claiming that it "is precisely the generative sufficiency of the parts that constitutes the whole's explanation".[8]
Agent-based models have had a historical influence on computational sociology. These models first appeared in the 1960s and were used to simulate control and feedback processes in organizations, cities, and similar systems. During the 1970s, such applications introduced individuals as the main units of analysis and used bottom-up strategies to model behavior. A further wave occurred in the 1980s, when models remained bottom-up but the agents interacted interdependently.[8]
Systems theory and structural functionalism
In the post-war era, Vannevar Bush's differential analyser, John von Neumann's cellular automata, Norbert Wiener's cybernetics, and Claude Shannon's information theory became influential paradigms for modeling and understanding complexity in technical systems. In response, scientists in disciplines such as physics, biology, electronics, and economics began to articulate a general theory of systems in which all natural and physical phenomena are manifestations of interrelated elements in a system that has common patterns and properties. Following Émile Durkheim's call to analyze complex modern society sui generis,[10] post-war structural functionalist sociologists such as Talcott Parsons seized upon these theories of systematic and hierarchical interaction among constituent components to attempt to generate grand unified sociological theories, such as the AGIL paradigm.[11] Sociologists such as George Homans argued that sociological theories should be formalized into hierarchical structures of propositions and precise terminology from which other propositions and hypotheses could be derived and operationalized into empirical studies.[12] Because computer algorithms and programs had been used as early as 1956 to test and validate mathematical theorems, such as the four color theorem,[13] some scholars anticipated that similar computational approaches could "solve" and "prove" analogously formalized problems and theorems of social structures and dynamics.
Macrosimulation and microsimulation
By the late 1960s and early 1970s, social scientists used increasingly available computing technology to perform macro-simulations of control and feedback processes in organizations, industries, cities, and global populations. These models used differential equations to predict population distributions as holistic functions of other systematic factors such as inventory control, urban traffic, migration, and disease transmission.[14][15] Although simulations of social systems received substantial attention in the mid-1970s after the Club of Rome published reports predicting that policies promoting exponential economic growth would eventually bring global environmental catastrophe,[16] the inconvenient conclusions led many authors to seek to discredit the models, attempting to make the researchers themselves appear unscientific.[2][17] Hoping to avoid the same fate, many social scientists turned their attention toward micro-simulation models to make forecasts and study policy effects by modeling aggregate changes in state of individual-level entities rather than the changes in distribution at the population level.[18] However, these micro-simulation models did not permit individuals to interact or adapt and were not intended for basic theoretical research.[1]
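The aggregate, equation-driven character of macro-simulation can be illustrated with a minimal sketch. The fragment below is not drawn from any of the cited models; the growth rate and carrying capacity are invented for the example. It integrates a single logistic differential equation with Euler steps, forecasting one population-level variable without representing any individuals.

```python
# Minimal illustrative sketch of a macro-level simulation: one aggregate
# state variable (population) evolving by a differential equation,
# integrated with simple Euler steps. All parameter values are invented.

def simulate_population(p0=1.0, growth=0.03, capacity=10.0, dt=1.0, years=100):
    """Logistic growth: dp/dt = growth * p * (1 - p / capacity)."""
    p = p0
    trajectory = [p]
    for _ in range(int(years / dt)):
        dp_dt = growth * p * (1.0 - p / capacity)
        p += dp_dt * dt          # Euler update of the aggregate state
        trajectory.append(p)
    return trajectory

if __name__ == "__main__":
    traj = simulate_population()
    print(f"population after 100 years: {traj[-1]:.2f}")
```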
Cellular automata and agent-based modeling
The 1970s and 1980s were also a time when physicists and mathematicians were attempting to model and analyze how simple component units, such as atoms, give rise to global properties, such as complex material properties at low temperatures, in magnetic materials, and within turbulent flows.[19] Using cellular automata, scientists were able to specify systems consisting of a grid of cells in which each cell only occupied some finite states and changes between states were solely governed by the states of immediate neighbors. Along with advances in artificial intelligence and microcomputer power, these methods contributed to the development of "chaos theory" and "complexity theory" which, in turn, renewed interest in understanding complex physical and social systems across disciplinary boundaries.[2] Research organizations explicitly dedicated to the interdisciplinary study of complexity were also founded in this era: the Santa Fe Institute was established in 1984 by scientists based at Los Alamos National Laboratory and the BACH group at the University of Michigan likewise started in the mid-1980s.
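The cellular-automaton idea described above can be sketched in a few lines. The toy example below (not any specific published model) updates a one-dimensional ring of binary cells, with each cell's next state determined solely by a majority rule over its own state and its two immediate neighbors.

```python
# Toy one-dimensional cellular automaton on a ring of binary cells.
# Each cell's next state depends only on its own state and its two
# immediate neighbors (majority rule). Illustrative only.
import random

def step(cells):
    n = len(cells)
    nxt = []
    for i in range(n):
        left, right = cells[(i - 1) % n], cells[(i + 1) % n]
        # Majority of the three-cell neighborhood decides the next state.
        nxt.append(1 if (left + cells[i] + right) >= 2 else 0)
    return nxt

if __name__ == "__main__":
    random.seed(0)
    cells = [random.randint(0, 1) for _ in range(20)]
    for t in range(5):
        print("t=%d %s" % (t, "".join(map(str, cells))))
        cells = step(cells)
```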
This cellular automata paradigm gave rise to a third wave of social simulation emphasizing agent-based modeling. Like micro-simulations, these models emphasized bottom-up designs but adopted four key assumptions that diverged from microsimulation: autonomy, interdependency, simple rules, and adaptive behavior.[1] Agent-based models are less concerned with predictive accuracy and instead emphasize theoretical development.[20] In 1981, mathematician and political scientist Robert Axelrod and evolutionary biologist W.D. Hamilton published a major paper in Science titled "The Evolution of Cooperation" which used an agent-based modeling approach to demonstrate how social cooperation based upon reciprocity can be established and stabilized in a prisoner's dilemma game when agents followed simple rules of self-interest.[21] Axelrod and Hamilton demonstrated that individual agents following a simple rule set of (1) cooperate on the first turn and (2) thereafter replicate the partner's previous action were able to develop "norms" of cooperation and sanctioning in the absence of canonical sociological constructs such as demographics, values, religion, and culture as preconditions or mediators of cooperation.[4] Throughout the 1990s, scholars like William Sims Bainbridge, Kathleen Carley, Michael Macy, and John Skvoretz developed multi-agent-based models of generalized reciprocity, prejudice, social influence, and organizational information processing. In 1999, Nigel Gilbert published the first textbook on social simulation, Simulation for the Social Scientist, and established its most relevant journal, the Journal of Artificial Societies and Social Simulation.
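The rule set studied by Axelrod and Hamilton, cooperate on the first move and then mirror the partner's previous move (tit-for-tat), can be sketched as follows. The payoff values are the conventional illustrative prisoner's-dilemma payoffs, not figures taken from the 1981 paper itself.

```python
# Tit-for-tat in an iterated prisoner's dilemma: cooperate on the first
# turn, then repeat the partner's previous move. Payoffs are conventional
# illustrative choices (temptation 5, reward 3, punishment 1, sucker 0).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(partner_history):
    return "C" if not partner_history else partner_history[-1]

def always_defect(partner_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)   # each agent sees only the partner's past moves
        move_b = strategy_b(hist_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    print(play(tit_for_tat, tit_for_tat))     # mutual cooperation
    print(play(tit_for_tat, always_defect))   # exploited once, then retaliates
```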
Data mining and social network analysis
Independent from developments in computational models of social systems, social network analysis emerged in the 1970s and 1980s from advances in graph theory, statistics, and studies of social structure as a distinct analytical method and was articulated and employed by sociologists like James S. Coleman, Harrison White, Linton Freeman, J. Clyde Mitchell, Mark Granovetter, Ronald Burt, and Barry Wellman.[22] The increasing pervasiveness of computing and telecommunication technologies throughout the 1980s and 1990s demanded analytical techniques, such as network analysis and multilevel modeling, that could scale to increasingly complex and large data sets. The most recent wave of computational sociology, rather than employing simulations, uses network analysis and advanced statistical techniques to analyze large-scale computer databases of electronic proxies for behavioral data. Electronic records such as email and instant message records, hyperlinks on the World Wide Web, mobile phone usage, and discussion on Usenet allow social scientists to directly observe and analyze social behavior at multiple points in time and multiple levels of analysis without the constraints of traditional empirical methods such as interviews, participant observation, or survey instruments.[23] Continued improvements in machine learning algorithms likewise have permitted social scientists and entrepreneurs to use novel techniques to identify latent and meaningful patterns of social interaction and evolution in large electronic datasets.[24][25]
The automatic parsing of textual corpora has enabled the extraction of actors and their relational networks on a vast scale, turning textual data into network data. The resulting networks, which can contain thousands of nodes, are then analysed using tools from network theory to identify the key actors, the key communities or parties, and general properties such as robustness or structural stability of the overall network, or the centrality of certain nodes.[27] This automates the approach introduced by quantitative narrative analysis,[28] whereby subject-verb-object triplets are identified, with pairs of actors linked by an action or pairs formed by actor and object.[26]
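The pipeline described above, turning subject-verb-object triplets into an actor network and ranking actors by centrality, can be sketched as follows. The triplets and actor names are invented, and the simple degree centrality is only one of many measures; the sketch does not reproduce the cited systems.

```python
# Sketch: turn subject-verb-object triplets into an actor network and
# rank actors by normalized degree centrality. Triplets are invented.
from collections import defaultdict

triplets = [
    ("candidate_a", "criticizes", "candidate_b"),
    ("candidate_b", "attacks", "candidate_a"),
    ("newspaper_x", "endorses", "candidate_a"),
    ("candidate_a", "addresses", "voters"),
]

# Build an undirected actor network: one edge per subject-object pair.
neighbors = defaultdict(set)
for subj, verb, obj in triplets:
    neighbors[subj].add(obj)
    neighbors[obj].add(subj)

# Degree centrality: share of the other actors each actor is tied to.
n = len(neighbors)
centrality = {node: len(links) / (n - 1) for node, links in neighbors.items()}

for node, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {c:.2f}")
```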
Computational content analysis
Content analysis has long been a traditional part of the social sciences and media studies. The automation of content analysis has allowed a "big data" revolution to take place in that field, with studies of social media and newspaper content that include millions of news items. Gender bias, readability, content similarity, reader preferences, and even mood have been analyzed based on text mining methods over millions of documents.[29][30][31][32][33] The analysis of readability, gender bias, and topic bias was demonstrated in Flaounas et al.,[34] showing how different topics have different gender biases and levels of readability; the possibility of detecting mood shifts in a vast population by analysing Twitter content was demonstrated as well.[35]
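As a very small illustration of the kind of per-document measure involved (and not the methods used in the cited studies), the sketch below computes two crude statistics: average sentence length as a readability proxy and a ratio of female to male pronouns as a rough indicator of gender coverage.

```python
# Crude per-document text statistics of the kind used, in far more
# sophisticated form, in large-scale content analysis: average sentence
# length as a readability proxy and a simple gendered-pronoun ratio.
import re

FEMALE = {"she", "her", "hers"}
MALE = {"he", "him", "his"}

def document_stats(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    avg_sentence_len = len(words) / max(len(sentences), 1)
    female = sum(w in FEMALE for w in words)
    male = sum(w in MALE for w in words)
    return {"avg_sentence_length": avg_sentence_len,
            "female_to_male_pronouns": female / max(male, 1)}

if __name__ == "__main__":
    sample = "She spoke first. He answered her briefly. She disagreed."
    print(document_stats(sample))
```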
The analysis of vast quantities of historical newspaper content was pioneered by Dzogang et al.,[36] who showed how periodic structures can be automatically discovered in historical newspapers. A similar analysis was performed on social media, again revealing strongly periodic structures.[37]
Challenges
Computational sociology, like any field of study, faces a set of challenges.[38] These challenges need to be handled meaningfully so as to have the maximum impact on society.
Levels and their interactions
Any society that forms tends to exist at one level or another, and there are tendencies of interaction between and across these levels. Levels need not be only micro or macro in nature; there can be intermediate levels at which a society exists, such as groups, networks, and communities.[38]
The question, however, arises as to how to identify these levels and how they come into existence, and, once they exist, how they interact within themselves and with other levels.
If we view entities (agents) as nodes and the connections between them as edges, we see the formation of networks. The connections in these networks do not arise from objective relationships between the entities alone; rather, they are decided by factors chosen by the participating entities.[39] The challenge with this process is that it is difficult to identify when a set of entities will form a network. These networks may be trust networks, cooperation networks, dependence networks, and so on. There have been cases in which heterogeneous sets of entities have been shown to form strong and meaningful networks among themselves.[40][41]
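One way to make this concrete is a toy model in which heterogeneous agents decide whether to connect based on a factor they care about, for example similarity of a single attribute. The attributes and the tolerance threshold below are invented for illustration and do not correspond to any of the cited models.

```python
# Toy network-formation sketch: agents (nodes) form a tie (edge) only
# when a chosen factor, here similarity of one attribute, is close
# enough. Attributes and the threshold are invented for illustration.
import random

random.seed(1)
agents = {i: {"trait": random.random()} for i in range(8)}
THRESHOLD = 0.2   # hypothetical tolerance for forming a tie

edges = []
for i in agents:
    for j in agents:
        if i < j and abs(agents[i]["trait"] - agents[j]["trait"]) < THRESHOLD:
            edges.append((i, j))

print("ties formed:", edges)
```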
As discussed previously, societies fall into levels, and at one such level, the individual level, a micro-macro link[42] refers to the interactions which create higher levels. There is a set of questions that needs to be answered regarding these micro-macro links: How are they formed? When do they converge? What feedback is pushed to the lower levels, and how is it pushed?
Another major challenge in this category concerns the validity of information and its sources. In recent years there has been a boom in information gathering and processing. However, little attention has been paid to the spread of false information between societies. Tracing the sources of such information and establishing its ownership is difficult.
Culture modeling
The evolution of networks and levels in society brings about cultural diversity.[43] A question that arises, however, is: if people tend to interact and become more accepting of other cultures and beliefs, how is it that diversity still persists? Why is there no convergence? A major challenge is how to model these diversities. Are there external factors, such as mass media or the locality of societies, which influence the evolution or persistence of cultural diversity?[citation needed]
Experimentation and evaluation
Any study or modelling, when combined with experimentation, needs to be able to address the questions being asked. Computational social science deals with large-scale data, and the challenge becomes much more evident as the scale grows. How would one design informative simulations on a large scale? And even if a large-scale simulation is built, how is its evaluation supposed to be performed?
Model choice and model complexities
Another challenge is identifying the models that would best fit the data, and the complexities of these models. These models would help us predict how societies might evolve over time and provide possible explanations of how things work.[44]
Generative models
Generative models help us to perform extensive qualitative analysis in a controlled fashion. A model proposed by Epstein is agent-based simulation, which involves identifying an initial set of heterogeneous entities (agents) and observing their evolution and growth based on simple local rules.[45]
But what are these local rules? How does one identify them for a set of heterogeneous agents? Evaluating these rules and assessing their impact introduce a whole new set of difficulties.
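A minimal sketch of the generative, agent-based approach follows: heterogeneous agents on a ring repeatedly apply one simple local rule, copying a random neighbor's behavior with a probability set by the agent's own trait. The rule, traits, and parameters are invented for illustration and do not reproduce Epstein's models.

```python
# Minimal generative agent-based sketch: heterogeneous agents on a ring,
# each with a behavior (0/1) and a personal "openness" trait, repeatedly
# apply one simple local rule: copy a random neighbor's behavior with
# probability equal to their own openness. Illustrative only.
import random

random.seed(2)
N = 12
behavior = [random.randint(0, 1) for _ in range(N)]
openness = [random.random() for _ in range(N)]   # heterogeneity across agents

def tick():
    for i in range(N):
        neighbor = random.choice([(i - 1) % N, (i + 1) % N])
        if random.random() < openness[i]:
            behavior[i] = behavior[neighbor]      # simple local imitation rule

for t in range(20):
    tick()

print("final behaviors:", behavior)
```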
Heterogeneous or ensemble models
Integrating simple models that perform well on individual tasks to form a hybrid model is an approach that can be explored.[46] Such models can offer better performance and understanding of the data. However, the trade-off of identifying, and having a deep understanding of, the interactions between these simple models arises when one needs to come up with a single combined, well-performing model. In addition, developing tools and applications to help analyse and visualize the data based on these hybrid models is a further challenge.
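A hedged sketch of the combination step only: two deliberately simple predictors are merged into a hybrid by weighted averaging. The component models, weights, and data are invented placeholders, not any model from the cited work.

```python
# Sketch of combining two simple models into a hybrid by weighted
# averaging of their predictions. Components and data are placeholders.
def model_trend(history):
    """Predict the next value by extrapolating the last observed change."""
    return history[-1] + (history[-1] - history[-2])

def model_mean(history):
    """Predict the next value as the mean of recent observations."""
    recent = history[-3:]
    return sum(recent) / len(recent)

def hybrid(history, weight_trend=0.5):
    return (weight_trend * model_trend(history)
            + (1 - weight_trend) * model_mean(history))

if __name__ == "__main__":
    series = [10, 12, 13, 15]
    print(hybrid(series))   # combined forecast for the next point
```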
Impact
Computational sociology can have impacts on science, technology, and society.[38]
Impact on science
In order for the study of computational sociology to be effective, there have to be valuable innovations. These innovations can take the form of new data analytics tools, better models, and algorithms. The advent of such innovations will be a boon for the scientific community at large.[citation needed]
Impact on society
One of the major challenges of computational sociology is the modelling of social processes.[citation needed] Various law- and policy-makers would be able to see efficient and effective paths to issue new guidelines, and the public in general would be able to evaluate and gain a fair understanding of the options presented to them, enabling an open and well-balanced decision process.[citation needed]
See also
- Journal of Artificial Societies and Social Simulation
- Artificial society
- Simulated reality
- Social simulation
- Agent-based social simulation
- Social complexity
- Computational economics
- Computational epidemiology
- Cliodynamics
- Predictive analytics
References
- ^ a b c Macy, Michael W.; Willer, Robert (2002). "From Factors to Actors: Computational Sociology and Agent-Based Modeling". Annual Review of Sociology. 28: 143–166. doi:10.1146/annurev.soc.28.110601.141117. JSTOR 3069238. S2CID 1368768.
- ^ a b c d Gilbert, Nigel; Troitzsch, Klaus (2005). "Simulation and social science". Simulation for Social Scientists (2 ed.). Open University Press.
- ^ Epstein, Joshua M.; Axtell, Robert (1996). Growing Artificial Societies: Social Science from the Bottom Up. Washington DC: Brookings Institution Press. ISBN 978-0262050531.
- ^ a b Axelrod, Robert (1997). The Complexity of Cooperation: Agent-Based Models of Competition and Collaboration. Princeton, NJ: Princeton University Press. ISBN 0691015678.
- ^ Casti, J (1999). "The Computer as Laboratory: Toward a Theory of Complex Adaptive Systems". Complexity. 4 (5): 12–14. doi:10.1002/(SICI)1099-0526(199905/06)4:5<12::AID-CPLX3>3.0.CO;2-4.
- ^ Goldspink, C (2002). "Methodological Implications of Complex Systems Approaches to Sociality: Simulation as a Foundation for Knowledge". Journal of Artificial Societies and Social Simulation. 5 (1).
- ^ Epstein, Joshua (2007). Generative Social Science: Studies in Agent-Based Computational Modeling. Princeton, NJ: Princeton University Press.
- ^ a b c d Salgado, Mauricio, and Nigel Gilbert. "Emergence and communication in computational sociology." Journal for the Theory of Social Behaviour 43.1 (2013): 87-110.
- ^ Macy, Michael W., and Robert Willer. "From factors to actors: computational sociology and agent-based modeling Archived 2014-07-13 at the Wayback Machine." Annual review of sociology 28.1 (2002): 143-166.
- ^ Durkheim, Émile. The Division of Labor in Society. New York, NY: Macmillan.
- ^ Bailey, Kenneth D. (2006). "Systems Theory". In Jonathan H. Turner (ed.). Handbook of Sociological Theory. New York, NY: Springer Science. pp. 379–404. ISBN 978-0-387-32458-6.
- ^ Bainbridge, William Sims (2007). "Computational Sociology". In Ritzer, George (ed.). Blackwell Encyclopedia of Sociology. Blackwell Reference Online. doi:10.1111/b.9781405124331.2007.x. hdl:10138/224218. ISBN 978-1-4051-2433-1.
- ^ Crevier, D. (1993). AI: The Tumultuous History of the Search for Artificial Intelligence. New York, NY: Basic Books. ISBN 9780465001040.
- ^ Forrester, Jay (1971). World Dynamics. Cambridge, MA: MIT Press.
- ^ Ignall, Edward J.; Kolesar, Peter; Walker, Warren E. (1978). "Using Simulation to Develop and Validate Analytic Models: Some Case Studies". Operations Research. 26 (2): 237–253. doi:10.1287/opre.26.2.237.
- ^ Meadows, DL; Behrens, WW; Meadows, DH; Naill, RF; Randers, J; Zahn, EK (1974). The Dynamics of Growth in a Finite World. Cambridge, MA: MIT Press.
- ^ "Computer View of Disaster Is Rebutted". The New York Times. October 18, 1974.
- ^ Orcutt, Guy H. (1990). "From engineering to microsimulation : An autobiographical reflection". Journal of Economic Behavior & Organization. 14 (1): 5–27. doi:10.1016/0167-2681(90)90038-F.
- ^ Toffoli, Tommaso; Margolus, Norman (1987). Cellular automata machines: a new environment for modeling. Cambridge, MA: MIT Press. ISBN 9780262200608.
- ^ Gilbert, Nigel (1997). "A simulation of the structure of academic science". Sociological Research Online. 2 (2): 1–15. doi:10.5153/sro.85. S2CID 5077349. Archived from the original on 1998-05-24. Retrieved 2009-12-16.
- ^ Axelrod, Robert; Hamilton, William D. (March 27, 1981). "The Evolution of Cooperation". Science. 211 (4489): 1390–1396. Bibcode:1981Sci...211.1390A. doi:10.1126/science.7466396. PMID 7466396.
- ^ Freeman, Linton C. (2004). The Development of Social Network Analysis: A Study in the Sociology of Science. Vancouver, BC: Empirical Press.
- ^ Lazer, David; Pentland, Alex; Adamic, L; Aral, S; Barabasi, AL; Brewer, D; Christakis, N; Contractor, N; et al. (February 6, 2009). "Life in the network: the coming age of computational social science". Science. 323 (5915): 721–723. doi:10.1126/science.1167742. PMC 2745217. PMID 19197046.
- ^ Srivastava, Jaideep; Cooley, Robert; Deshpande, Mukund; Tan, Pang-Ning (2000). "Web usage mining: Discovery and applications of usage patterns from Web data". ACM SIGKDD Explorations Newsletter. 1 (2): 12–23. doi:10.1145/846183.846188. S2CID 967595.
- ^ Brin, Sergey; Page, Lawrence (April 1998). "The anatomy of a large-scale hypertextual Web search engine". Computer Networks and ISDN Systems. 30 (1–7): 107–117. CiteSeerX 10.1.1.115.5930. doi:10.1016/S0169-7552(98)00110-X. S2CID 7587743.
- ^ a b S Sudhahar; GA Veltri; N Cristianini (2015). "Automated analysis of the US presidential elections using Big Data and network analysis". Big Data & Society. 2 (1): 1–28. doi:10.1177/2053951715572916. hdl:2381/31767.
- ^ S Sudhahar; G De Fazio; R Franzosi; N Cristianini (2013). "Network analysis of narrative content in large corpora" (PDF). Natural Language Engineering. 21 (1): 1–32. doi:10.1017/S1351324913000247. hdl:1983/dfb87140-42e2-486a-91d5-55f9007042df. S2CID 3385681.
- ^ Franzosi, Roberto (2010). Quantitative Narrative Analysis. Emory University.
- ^ I. Flaounas; M. Turchi; O. Ali; N. Fyson; T. De Bie; N. Mosdell; J. Lewis; N. Cristianini (2010). "The Structure of EU Mediasphere" (PDF). PLOS ONE. 5 (12): e14243. Bibcode:2010PLoSO...514243F. doi:10.1371/journal.pone.0014243. PMC 2999531. PMID 21170383.
- ^ V Lampos; N Cristianini (2012). "Nowcasting Events from the Social Web with Statistical Learning" (PDF). ACM Transactions on Intelligent Systems and Technology. 3 (4): 72. doi:10.1145/2337542.2337557. S2CID 8297993.
- ^ I. Flaounas; O. Ali; M. Turchi; T Snowsill; F Nicart; T De Bie; N Cristianini (2011). NOAM: news outlets analysis and monitoring system (PDF). Proc. of the 2011 ACM SIGMOD international conference on Management of data. doi:10.1145/1989323.1989474.
- ^ N Cristianini (2011). "Automatic Discovery of Patterns in Media Content". Combinatorial Pattern Matching. Lecture Notes in Computer Science. Vol. 6661. pp. 2–13. CiteSeerX 10.1.1.653.9525. doi:10.1007/978-3-642-21458-5_2. ISBN 978-3-642-21457-8.
- ^ Lansdall-Welfare, Thomas; Sudhahar, Saatviga; Thompson, James; Lewis, Justin; Team, FindMyPast Newspaper; Cristianini, Nello (2017-01-09). "Content analysis of 150 years of British periodicals". Proceedings of the National Academy of Sciences. 114 (4): E457–E465. Bibcode:2017PNAS..114E.457L. doi:10.1073/pnas.1606380114. ISSN 0027-8424. PMC 5278459. PMID 28069962.
- ^ I. Flaounas; O. Ali; M. Turchi; T. Lansdall-Welfare; T. De Bie; N. Mosdell; J. Lewis; N. Cristianini (2012). "Research methods in the age of digital journalism". Digital Journalism. 1: 102–116. doi:10.1080/21670811.2012.714928. S2CID 61080552.
- ^ T Lansdall-Welfare; V Lampos; N Cristianini. Effects of the Recession on Public Mood in the UK (PDF). Proceedings of the 21st International Conference on World Wide Web. Mining Social Network Dynamics (MSND) session on Social Media Applications. New York, NY, USA. pp. 1221–1226. doi:10.1145/2187980.2188264.
- ^ Dzogang, Fabon; Lansdall-Welfare, Thomas; Team, FindMyPast Newspaper; Cristianini, Nello (2016-11-08). "Discovering Periodic Patterns in Historical News". PLOS ONE. 11 (11): e0165736. Bibcode:2016PLoSO..1165736D. doi:10.1371/journal.pone.0165736. ISSN 1932-6203. PMC 5100883. PMID 27824911.
- ^ Dzogang, F.; Lansdall-Welfare, T.; Cristianini, N. (2016). "Seasonal Fluctuations in Collective Mood Revealed by Wikipedia Searches and Twitter Posts". 2016 IEEE International Conference on Data Mining, Workshop on Data Mining in Human Activity Analysis.
- ^ a b c Conte, Rosaria, et al. "Manifesto of computational social science Archived 2022-01-22 at the Wayback Machine." The European Physical Journal Special Topics 214.1 (2012): 325-346.
- ^ Eguíluz, V. M.; Zimmermann, M. G.; Cela-Conde, C. J.; San Miguel, M. (2005). American Journal of Sociology. 110: 977.
- ^ Sichman, J. S.; Conte, R. (2002). Computational & Mathematical Organization Theory. 8 (2).
- ^ Ehrhardt, G.; Marsili, M.; Vega-Redondo, F. (2006). Physical Review E. 74 (3).
- ^ Billari, Francesco C. Agent-based computational modelling: applications in demography, social, economic and environmental sciences. Taylor & Francis, 2006.
- ^ Centola, D.; González-Avella, J. C.; Eguíluz, V. M.; San Miguel, M. (2007). Journal of Conflict Resolution. 51.
- ^ Weisberg, Michael. When less is more: Tradeoffs and idealization in model building[dead link]. Diss. Stanford University, 2003.
- ^ Epstein, Joshua M. Generative social science: Studies in agent-based computational modeling. Princeton University Press, 2006.
- ^ Yuan, Yuan; Alabdulkareem, Ahmad; Pentland, Alex 'Sandy' (2018). "An interpretable approach for social network formation among heterogeneous agents". Nature Communications. 9 (1): 4704. Bibcode:2018NatCo...9.4704Y. doi:10.1038/s41467-018-07089-x. PMC 6224571. PMID 30410019.
External links
- On-line book "Simulation for the Social Scientist" by Nigel Gilbert and Klaus G. Troitzsch, 1999, second edition 2005
- Journal of Artificial Societies and Social Simulation
- Agent based models for social networks, interactive java applets
- Sociology and Complexity Science Website
Journals and academic publications
- Complexity Research Journal List, from UIUC, IL
- Related Research Groups, from UIUC, IL
Associations, conferences and workshops
- North American Association for Computational Social and Organization Sciences
- ESSA: European Social Simulation Association
Academic programs, departments and degrees
- University of Bristol "Mediapatterns" project Archived 2012-12-01 at the Wayback Machine
- Carnegie Mellon University Archived 2014-09-02 at the Wayback Machine, PhD program in Computation, Organizations and Society (COS)
- University of Chicago
- George Mason University
- PhD program in CSS (Computational Social Sciences)
- MA program in Master's of Interdisciplinary Studies, CSS emphasis
- Portland State, PhD program in Systems Science
- Portland State, MS program in Systems Science
- University College Dublin,
- PhD Program in Complex Systems and Computational Social Science
- MSc in Social Data Analytics
- BSc in Computational Social Science
- UCLA, Minor in Human Complex Systems
- UCLA, Major in Computational & Systems Biology (including behavioral sciences)
- Univ. of Michigan, Minor in Complex Systems
- Systems Sciences Programs List, Portland State. List of other worldwide related programs.
Centers and institutes
North America
- Center for Complex Networks and Systems Research, Indiana University, Bloomington, IN, USA.
- Center for Complex Systems Research, University of Illinois at Urbana-Champaign, IL, USA.
- Center for Social Complexity, George Mason University, Fairfax, VA, USA.
- Center for Social Dynamics and Complexity, Arizona State University, Tempe, AZ, USA.
- Center of the Study of Complex Systems, University of Michigan, Ann Arbor, MI, USA.
- Human Complex Systems, University of California Los Angeles, Los Angeles, CA, USA.
- Institute for Quantitative Social Science, Harvard University, Boston, MA, USA.
- Northwestern Institute on Complex Systems (NICO), Northwestern University, Evanston, IL USA.
- Santa Fe Institute, Santa Fe, NM, USA.
- Duke Network Analysis Center, Duke University, Durham, NC, USA
South America
- Modelagem de Sistemas Complexos, University of São Paulo - EACH, São Paulo, SP, Brazil
- Instituto Nacional de Ciência e Tecnologia de Sistemas Complexos, Centro Brasileiro de Pesquisas Físicas, Rio de Janeiro, RJ, Brazil
Asia
- Bandung Fe Institute, Centre for Complexity in Surya University, Bandung, Indonesia.
Europe
- Centre for Policy Modelling, Manchester, UK.
- Centre for Research in Social Simulation, University of Surrey, UK.
- UCD Dynamics Lab- Centre for Computational Social Science, Geary Institute for Public Policy, University College Dublin, Ireland.
- Groningen Center for Social Complexity Studies (GCSCS), Groningen, NL.
- Chair of Sociology, in particular of Modeling and Simulation (SOMS), Zürich, Switzerland.
- Research Group on Experimental and Computational Sociology (GECS), Brescia, Italy