Learning analytics

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.[1] A related field is educational data mining.

Definition

The definition and aims of learning analytics are contested. One earlier definition discussed by the community suggested that "Learning analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections for predicting and advising people's learning."[2] But this definition has been criticised by George Siemens[3][non-primary source needed] and Mike Sharkey.[4][non-primary source needed]

A more holistic view than a mere definition is provided by the framework of learning analytics by Greller and Drachsler (2012).[5] It uses a general morphological analysis (GMA) to divide the domain into six "critical dimensions".

A systematic overview of learning analytics and its key concepts is provided by Chatti et al. (2012)[6] and Chatti et al. (2014)[7] through a reference model for learning analytics based on four dimensions: data, environments, and context (what?); stakeholders (who?); objectives (why?); and methods (how?).
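As an illustration only (not drawn from the cited papers), an analytics scenario can be described as a simple record along these four dimensions; a minimal Python sketch with made-up values:

  # Minimal sketch (illustrative only): an analytics scenario described
  # along the four dimensions of the Chatti et al. reference model.
  from dataclasses import dataclass

  @dataclass
  class LearningAnalyticsScenario:
      what: list  # data, environments, context
      who: list   # stakeholders
      why: list   # objectives
      how: list   # methods

  scenario = LearningAnalyticsScenario(
      what=["VLE clickstream", "forum posts"],
      who=["teachers", "learners"],
      why=["predict at-risk students", "support reflection"],
      how=["classification", "social network analysis"],
  )
  print(scenario)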

Powell and MacNeill point out that there is broad awareness of analytics across educational institutions for various stakeholders, but that the way learning analytics is defined and implemented may vary, including:

  1. for individual learners to reflect on their achievements and patterns of behaviour in relation to others;
  2. as predictors of students requiring extra support and attention;
  3. to help teachers and support staff plan supporting interventions with individuals and groups;
  4. for functional groups such as course teams seeking to improve current courses or develop new curriculum offerings; and
  5. for institutional administrators taking decisions on matters such as marketing and recruitment or efficiency and effectiveness measures.[8]

In that briefing paper, Powell and MacNeill go on to point out that some motivations and implementations of analytics may come into conflict with others, for example highlighting potential conflict between analytics for individual learners and organisational stakeholders.[8]

Gašević, Dawson, and Siemens argue that the computational aspects of learning analytics need to be linked with existing educational research if the field is to deliver on its promise to understand and optimize learning.[9]

Differentiating learning analytics and educational data mining

Differentiating the fields of educational data mining (EDM) and learning analytics (LA) has been a concern of several researchers. George Siemens takes the position that educational data mining encompasses both learning analytics and academic analytics,[10] the latter of which is aimed at governments, funding agencies, and administrators rather than learners and faculty. Baepler and Murdoch define academic analytics as an area that "...combines select institutional data, statistical analysis, and predictive modeling to create intelligence upon which learners, instructors, or administrators can change academic behavior".[11] They go on to attempt to disambiguate educational data mining from academic analytics based on whether the process is hypothesis-driven or not, though Brooks[12] questions whether this distinction exists in the literature. Brooks instead proposes that a better distinction between the EDM and LA communities lies in the roots from which each community originated: authorship in the EDM community is dominated by researchers coming from intelligent tutoring paradigms, while learning analytics researchers are more focused on enterprise learning systems (e.g. learning content management systems).

Regardless of the differences between the LA and EDM communities, the two areas have significant overlap both in the objectives of investigators and in the methods and techniques used in their investigations. In the MS program in learning analytics at Teachers College, Columbia University, students are taught both EDM and LA methods.[13]

History

The context of learning analytics

In "The State of Learning Analytics in 2012: A Review and Future Challenges" Rebecca Ferguson[14] tracks the progress of analytics for learning as a development through:[a]

  1. The increasing interest in big data for business intelligence
  2. The rise of online education focused on virtual learning environments (VLEs), content management systems (CMSs), and management information systems (MISs) for education, which increased the digital data on student background (often held in the MIS) and learning logs (from VLEs). This development afforded the opportunity to apply business intelligence techniques to educational data
  3. Questions regarding the optimisation of systems to support learning, particularly given the question of how we can know whether a student is engaged or understanding if we cannot see them
  4. Increasing focus on evidencing progress and professional standards for accountability systems
  5. This focus gave teachers a stake in the analytics, given that they are associated with accountability systems
  6. Thus an increasing emphasis was placed on the pedagogic affordances of learning analytics
  7. This pressure is increased by the economic desire to improve engagement in online education for the delivery of high-quality, affordable education

History of techniques and methods of learning analytics

In a discussion of the history of analytics, Cooper[15] highlights a number of communities from which learning analytics draws techniques, including:

  1. Statistics, a well-established means of addressing hypothesis testing.
  2. Business intelligence, which has similarities with learning analytics, although it has historically been targeted at making the production of reports more efficient through enabling data access and summarising performance indicators.
  3. Web analytics: tools such as Google Analytics report on web page visits and references to websites, brands and other key terms across the internet. The finer-grained of these techniques can be adopted in learning analytics to explore student trajectories through learning resources (courses, materials, etc.).
  4. Operational research, which aims at design optimisation for maximising objectives through the use of mathematical models and statistical methods. Such techniques are implicated in learning analytics that seeks to create models of real-world behaviour for practical application.
  5. Artificial intelligence and data mining: machine learning techniques built on data mining and AI methods are capable of detecting patterns in data. In learning analytics such techniques can be used for intelligent tutoring systems, classification of students in more dynamic ways than simple demographic factors, and resources such as "suggested course" systems modelled on collaborative filtering techniques.
  6. Social network analysis (SNA), which analyses relationships between people by exploring implicit (e.g. interactions on forums) and explicit (e.g. "friends" or "followers") ties online and offline (see the sketch after this list). SNA developed from the work of sociologists like Wellman and Watts, and mathematicians like Barabasi and Strogatz. The work of these individuals has provided a good sense of the patterns that networks exhibit (small world, power laws), the attributes of connections (in the early 1970s, Granovetter explored connections in terms of tie strength and impact on new information), and the social dimensions of networks (for example, geography still matters in a digital networked world). It is particularly used to explore clusters of networks, influence networks, and engagement and disengagement, and has been deployed for these purposes in learning analytics contexts.
  7. Information visualization, which is an important step in many analytics for sensemaking around the data provided, and is used across most techniques (including those above).[15]
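As a concrete illustration of the social network analysis technique above, the following is a minimal sketch, not a production tool: it assumes the Python networkx library and hypothetical forum-reply data, treating replies as implicit ties and degree centrality as a rough engagement signal.

  # Minimal SNA sketch over hypothetical forum-reply data (requires networkx).
  import networkx as nx

  # Each tuple is an implicit tie: (student who replied, student replied to).
  replies = [
      ("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
      ("dave", "alice"), ("carol", "bob"), ("erin", "carol"),
  ]

  G = nx.DiGraph()
  G.add_edges_from(replies)

  # Degree centrality as a rough proxy for engagement in the discussion.
  for student, score in sorted(nx.degree_centrality(G).items(),
                               key=lambda kv: -kv[1]):
      print(f"{student}: {score:.2f}")

  # Weakly connected components can reveal isolated clusters of learners.
  print([sorted(c) for c in nx.weakly_connected_components(G)])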

History of learning analytics in higher education

The first graduate program focused specifically on learning analytics was created by Ryan S. Baker and launched in the Fall 2015 semester at Teachers College, Columbia University. The program description states that

data about learning and learners are being generated today on an unprecedented scale. The fields of learning analytics (LA) and educational data mining (EDM) have emerged with the aim of transforming this data into new insights that can benefit students, teachers, and administrators. As one of the world's leading teaching and research institutions in education, psychology, and health, we are proud to offer an innovative graduate curriculum dedicated to improving education through technology and data analysis.[16]

Analytic methods

Methods for learning analytics include:

  • Content analysis, particularly of resources which students create (such as essays).
  • Discourse analytics, which aims to capture meaningful data on student interactions and (unlike social network analytics) to explore the properties of the language used, as opposed to just the network of interactions or forum-post counts.
  • Social learning analytics, which is aimed at exploring the role of social interaction in learning, the importance of learning networks, the discourse used for sensemaking, etc.[17]
  • Disposition analytics, which seeks to capture data regarding students' dispositions to their own learning, and the relationship of these to their learning.[18][19] For example, "curious" learners may be more inclined to ask questions, and this data can be captured and analysed for learning analytics (see the sketch after this list).
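As a toy illustration of disposition analytics, the sketch below (hypothetical data, plain Python) uses question-asking in forum posts as a crude proxy for curiosity; a real system would need far richer signals.

  # Minimal sketch: question-asking as a crude proxy for a "curious" disposition.
  from collections import Counter

  posts = [  # (author, text) pairs; hypothetical forum data
      ("alice", "Why does the proof require induction?"),
      ("bob",   "Here is my summary of lecture 3."),
      ("alice", "Could this method generalise to graphs?"),
      ("carol", "What is the deadline for assignment 2?"),
  ]

  questions = Counter(author for author, text in posts if "?" in text)
  total = Counter(author for author, _ in posts)

  for student, n in total.items():
      print(f"{student}: {questions[student]}/{n} posts are questions")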

Analytic outcomes

Analytics have been used for:

  • Prediction purposes, for example identifying students at risk of dropping out or failing a course (see the sketch after this list)
  • Personalization & adaptation, to provide students with tailored learning pathways, or assessment materials
  • Intervention purposes, providing educators with information to intervene to support students
  • Information visualization, typically in the form of so-called learning dashboards, which provide overviews of learning data through data visualisation tools
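To make the prediction use case concrete, here is a minimal sketch, not any particular product's method: it assumes the scikit-learn library and invented engagement features, fitting a logistic regression on past cohorts to flag current students.

  # Minimal sketch: flagging "at risk" students with logistic regression.
  # Hypothetical features and labels; assumes scikit-learn is installed.
  from sklearn.linear_model import LogisticRegression

  # Per student: [logins per week, assignments submitted, forum posts]
  X = [
      [12, 8, 15], [10, 7, 9], [1, 2, 0], [9, 8, 4],
      [0, 1, 1], [11, 6, 12], [2, 3, 1], [8, 7, 6],
  ]
  y = [0, 0, 1, 0, 1, 0, 1, 0]  # 1 = failed or dropped out in a past cohort

  model = LogisticRegression().fit(X, y)

  # Score current students and flag those above a chosen risk threshold.
  current = [[3, 2, 1], [10, 8, 11]]
  for features, risk in zip(current, model.predict_proba(current)[:, 1]):
      print(features, f"risk={risk:.2f}", "AT RISK" if risk > 0.5 else "on track")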

Software

Much of the software that is currently used for learning analytics duplicates functionality of web analytics software, but applies it to learner interactions with content. Social network analysis tools are commonly used to map social connections and discussions. Some examples of learning analytics software tools include:

  • BEESTAR INSIGHT: a real-time system that automatically collects student engagement and attendance, and provides analytics tools and dashboards for students, teachers and management[20][non-primary source needed]
  • LOCO-Analyst: a context-aware learning tool for analytics of learning processes taking place in a web-based learning environment[21][22]
  • SAM: a Student Activity Monitor intended for personal learning environments[23][non-primary source needed]
  • SNAPP: a learning analytics tool that visualizes the network of interactions resulting from discussion forum posts and replies[24][non-primary source needed]
  • Solutionpath StREAM: a UK-based real-time system that leverages predictive models to determine all facets of student engagement, using structured and unstructured sources, for all institutional roles[25][non-primary source needed]
  • Student Success System: a predictive learning analytics tool that predicts student performance and plots learners into risk quadrants based upon engagement and performance predictions. It provides indicators to develop understanding of why a learner is not on track, through visualizations such as the network of interactions resulting from social engagement (e.g. discussion posts and replies), performance on assessments, engagement with content, and other indicators[26][non-primary source needed]

Ethics and privacy

Ethical questions about data collection, analytics, reporting and accountability have been raised as a potential concern for learning analytics,[5][27][28] particularly regarding:

  • Data ownership[29]
  • Communications around the scope and role of learning analytics
  • The necessary role of human feedback and error-correction in learning analytics systems
  • Data sharing between systems, organisations, and stakeholders
  • Trust in data clients

As Kay, Korn and Oppenheim point out, the range of data is wide, potentially derived from the following sources (a linking sketch follows the list):[30]

  • Recorded activity: student records, attendance, assignments, researcher information (CRIS)
  • Systems interactions: VLE, library / repository search, card transactions
  • Feedback mechanisms: surveys, customer care
  • External systems that offer reliable identification such as sector and shared services and social networks
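The breadth matters because such sources are easily linked. The sketch below (hypothetical field names, plain Python) joins records from several systems on a shared student identifier; this ease of linkage is much of what drives the privacy concerns discussed here.

  # Minimal sketch: linking learner data across systems (hypothetical fields).
  student_records = {"s42": {"programme": "Biology", "attendance": 0.91}}
  vle_logs = {"s42": {"logins": 58, "forum_posts": 7}}
  library_use = {"s42": {"loans": 12, "searches": 140}}
  survey_feedback = {"s42": {"satisfaction": 4}}

  profile = {}
  for source in (student_records, vle_logs, library_use, survey_feedback):
      for student_id, fields in source.items():
          profile.setdefault(student_id, {}).update(fields)

  print(profile["s42"])  # one linked record spanning four separate systems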

Thus the legal and ethical situation is challenging and differs from country to country, raising implications for:[30]

  • Variety of data: principles for collection, retention and exploitation
  • Education mission: underlying issues of learning management, including social and performance engineering
  • Motivation for development of analytics: mutuality, a combination of corporate, individual and general good
  • Customer expectation: effective business practice, social data expectations, cultural considerations of a global customer base
  • Obligation to act: duty of care arising from knowledge and the consequent challenges of student and employee performance management

In some prominent cases, such as the inBloom disaster,[31] even fully functional systems have been shut down due to a lack of trust in their data collection by governments, stakeholders and civil rights groups. Since then, the learning analytics community has extensively studied the legal conditions in a series of expert workshops on "Ethics & Privacy 4 Learning Analytics" aimed at establishing trusted learning analytics.[32][non-primary source needed] Drachsler and Greller released an eight-point checklist named DELICATE, based on intensive study in this area, to demystify the ethics and privacy discussions around learning analytics.[33]

  1. D-etermination: Decide on the purpose of learning analytics for your institution.
  2. E-xplain: Define the scope of data collection and usage.
  3. L-egitimate: Explain how you operate within the legal frameworks, referring to the essential legislation.
  4. I-nvolve: Talk to stakeholders and give assurances about the data distribution and use.
  5. C-onsent: Seek consent through clear consent questions.
  6. A-nonymise: De-identify individuals as much as possible (a sketch of one approach follows this list).
  7. T-echnical aspects: Monitor who has access to data, especially in areas with high staff turnover.
  8. E-xternal partners: Make sure external partners provide the highest data security standards.
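One common approach to the "Anonymise" step is pseudonymisation. The sketch below (plain Python, hypothetical identifiers) derives stable pseudonyms with a keyed hash; note that this is pseudonymisation rather than full anonymisation, and the secret key must itself be protected.

  # Minimal sketch: pseudonymising student identifiers before analysis.
  # Keyed hashing yields stable pseudonyms so records stay linkable
  # without exposing raw IDs; the key must be stored securely.
  import hashlib
  import hmac

  SECRET_KEY = b"rotate-and-store-securely"  # hypothetical; keep out of source control

  def pseudonymise(student_id: str) -> str:
      return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

  records = [{"id": "s1234567", "grade": 72}, {"id": "s7654321", "grade": 58}]
  safe = [{**r, "id": pseudonymise(r["id"])} for r in records]
  print(safe)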

The checklist shows ways to design and provide privacy-conformant learning analytics that can benefit all stakeholders. The full DELICATE checklist is publicly available.[34]

Open learning analytics

Chatti, Muslim and Schroeder[35] note that the aim of open learning analytics (OLA) is to improve learning effectiveness in lifelong learning environments. The authors refer to OLA as an ongoing analytics process that encompasses diversity at all four dimensions of the learning analytics reference model.[6]

Further reading

For general audience introductions, see:

  • The Educause learning initiative briefing (2011)[36]
  • The Educause review on learning analytics (2011)[37]
  • The UNESCO learning analytics policy brief (2012)[38]

Notes

  1. ^ This section is adapted from the EdFutures.net wiki (CC-By-SA)

References

  1. ^ "Call for Papers of the 1st International Conference on Learning Analytics & Knowledge (LAK 2011)". Retrieved 12 February 2014.
  2. ^ Siemens, George. "What Are Learning Analytics?" Elearnspace, August 25, 2010. http://www.elearnspace.org/blog/2010/08/25/what-are-learning-analytics/
  3. ^ "I somewhat disagree with this definition—it serves well as an introductory concept if we use analytics as a support structure for existing education models. I think learning analytics—at an advanced and integrated implementation—can do away with pre-fab curriculum models." George Siemens in the Learning Analytics Google Group discussion, August 2010
  4. ^ "In the descriptions of learning analytics we talk about using data to "predict success". I've struggled with that as I pore over our databases. I've come to realize there are different views/levels of success." Mike Sharkey, Director of Academic Analytics, University of Phoenix, in the Learning Analytics Google Group discussion, August 2010
  5. ^ a b Greller, Wolfgang; Drachsler, Hendrik (2012). "Translating Learning into Numbers: Toward a Generic Framework for Learning Analytics" (pdf). Educational Technology and Society. 15 (3): 42–57.
  6. ^ a b Mohamed Amine Chatti, Anna Lea Dyckhoff, Ulrik Schroeder and Hendrik Thüs (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning (IJTEL), 4(5/6), pp. 318-331.
  7. ^ Chatti, M. A., Lukarov, V., Thüs, H., Muslim, A., Yousef, A. M. F., Wahid, U., Greven, C., Chakrabarti, A., Schroeder, U. (2014). Learning Analytics: Challenges and Future Research Directions. eleed, Iss. 10. http://eleed.campussource.de/archive/10/4035
  8. ^ a b Powell, Stephen, and Sheila MacNeill. Institutional Readiness for Analytics: A Briefing Paper. CETIS Analytics Series. JISC CETIS, December 2012. Archived from the original (PDF) on 2013-05-02. Retrieved 2016-04-29.
  9. ^ Gašević, D.; Dawson, S.; Siemens, G. (2015). "Let's not forget: Learning analytics are about learning" (PDF). TechTrends. 59 (1): 64–71. doi:10.1007/s11528-014-0822-x.
  10. ^ G. Siemens, D. Gasevic, C. Haythornthwaite, S. Dawson, S. B. Shum, R. Ferguson, E. Duval, K. Verbert, and R. S. J. D. Baker. Open Learning Analytics: an integrated & modularized platform. 2011.
  11. ^ Baepler, P.; Murdoch, C. J. (2010). "Academic Analytics and Data Mining in Higher Education". International Journal for the Scholarship of Teaching and Learning. 4 (2).
  12. ^ a b C. Brooks. A Data-Assisted Approach to Supporting Instructional Interventions in Technology Enhanced Learning Environments. PhD Dissertation. University of Saskatchewan, Saskatoon, Canada 2012.
  13. ^ "Learning Analytics | Teachers College Columbia University". www.tc.columbia.edu. Retrieved 2015-10-13.
  14. ^ Ferguson, Rebecca. The State of Learning Analytics in 2012: A Review and Future Challenges. Technical Report. Knowledge Media Institute: The Open University, UK, 2012. http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf
  15. ^ a b Cooper, Adam. A Brief History of Analytics: A Briefing Paper. CETIS Analytics Series. JISC CETIS, November 2012. http://publications.cetis.ac.uk/wp-content/uploads/2012/12/Analytics-Brief-History-Vol-1-No9.pdf
  16. ^ "Learning Analytics". www.tc.columbia.edu. Retrieved 2015-11-03.
  17. ^ Buckingham Shum, S. and Ferguson, R., Social Learning Analytics. Educational Technology & Society (Special Issue on Learning & Knowledge Analytics, Eds. G. Siemens & D. Gašević), 15, 3, (2012), 3-26. http://www.ifets.info Open Access Eprint: http://oro.open.ac.uk/34092
  18. ^ Brown, M., Learning Analytics: Moving from Concept to Practice. EDUCAUSE Learning Initiative Briefing, 2012. http://www.educause.edu/library/resources/learning-analytics-moving-concept-practice
  19. ^ Buckingham Shum, S. and Deakin Crick, R., Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics. In: Proc. 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr-2 May 2012). ACM: New York. pp.92-101. doi:10.1145/2330601.2330629 Eprint: http://oro.open.ac.uk/32823
  20. ^ http://www.beestar.eu/insight/
  21. ^ Ali, L.; Hatala, M.; Gaševic, D.; Jovanovic, J. (2012). "A qualitative evaluation of evolution of a learning analytics tool" (PDF). Computers & Education. 58 (1): 470–489. doi:10.1016/j.compedu.2011.08.030.
  22. ^ Ali, L.; Asadi, M.; Gaševic, D.; Jovanovic, J.; Hatala, M. (2013). "Factors influencing beliefs for adoption of a learning analytics tool: An empirical study" (PDF). Computers & Education. 62: 130–148. doi:10.1016/j.compedu.2012.10.023.
  23. ^ http://www.role-showcase.eu/role-tool/student-activity-monitor
  24. ^ http://research.uow.edu.au/learningnetworks/seeing/snapp/
  25. ^ http://www.solutionpath.co.uk
  26. ^ http://www.brightspace.com/solutions/higher-education/advanced-analytics/
  27. ^ Slade, Sharon and Prinsloo, Paul "Learning analytics: ethical issues and dilemmas" in American Behavioral Scientist (2013), 57(10), pp. 1509-1528. http://oro.open.ac.uk/36594
  28. ^ Siemens, G. "Learning Analytics: Envisioning a Research Discipline and a Domain of Practice." In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 4–8, 2012. http://dl.acm.org/citation.cfm?id=2330605.
  29. ^ Kristy Kitto, Towards a Manifesto for Data Ownership http://www.laceproject.eu/blog/towards-a-manifesto-for-data-ownership/
  30. ^ Kay, David, Naomi Korn, and Charles Oppenheim. Legal, Risk and Ethical Aspects of Analytics in Higher Education. CETIS Analytics Series. Accessed January 3, 2013. Archived from the original (PDF) on 2013-05-02: https://web.archive.org/web/20130502234313/http://publications.cetis.ac.uk/wp-content/uploads/2012/11/Legal-Risk-and-Ethical-Aspects-of-Analytics-in-Higher-Education-Vol1-No6.pdf
  31. ^ https://www.bloomberg.com/bw/articles/2014-05-01/inbloom-shuts-down-amid-privacy-fears-over-student-data-tracking
  32. ^ http://www.laceproject.eu/ethics-privacy-learning-analytics/
  33. ^ Drachsler, H. & Greller, W. (2016). Privacy and Analytics – it's a DELICATE issue. A Checklist to establish trusted Learning Analytics. 6th Learning Analytics and Knowledge Conference 2016, April 25–29, 2016, Edinburgh, UK.
  34. ^ http://de.slideshare.net/Drachsler/delicate-checklist-to-establish-trusted-learning-analytics
  35. ^ Mohamed Amine Chatti, Arham Muslim, and Ulrik Schroeder (2017). Toward an Open Learning Analytics Ecosystem. In Big Data and Learning Analytics in Higher Education (pp. 195-219). Springer International Publishing.
  36. ^ ELI (2011). "Seven Things You Should Know About First Generation Learning Analytics". EDUCAUSE Learning Initiative Briefing.
  37. ^ Long, P.; Siemens, G. (2011). "Penetrating the fog: analytics in learning and education". Educause Review Online. 46 (5): 31–40.
  38. ^ Buckingham Shum, Simon (2012). Learning Analytics Policy Brief (PDF). UNESCO.