Methodological Issues in Learning Analytics: Critical Insights and Reflections

Professor Mark Brown, National Institute for Digital Learning, Dublin City University
This brief opinion paper raises a number of conceptual and methodological issues associated with attempts to evaluate institutional initiatives in the area of learning analytics. It frames the discussion around three recent works that invite a more critical reading of learning analytics research, and of the potential for interventions and data-driven decisions to have a successful, sustainable and scalable impact on an institution-wide basis.
Firstly, the emerging field of learning analytics would benefit from more critical engagement with some of the points raised by Paul Kirschner (2016) in his keynote at the 6th International Conference on Learning Analytics and Knowledge (LAK16). More specifically, Kirschner warns that naïve understandings of learning and narrow conceptions of learning analytics may potentially do a lot of harm. More recently, Kirschner and Neelen (2017) argue that many so-called learning analytics initiatives: (i) view education as a simple process that is easily modelled; (ii) base decisions and interventions on rich data but weak theory; (iii) inform decisions and interventions using wrong or even invalid variables; (iv) make interpretations and arrive at conclusions that confuse correlation with causation; and (v) produce unintended and unwanted effects that pigeonhole and stereotype learners, which may be counterproductive to enhancing student engagement and learner success. Arguably, to date there has not been a serious or comprehensive response to these justifiable concerns.
Secondly, in many cases the value of learning analytics research would be enhanced by interpreting the findings of institutional pilot studies and interventions with greater consideration and acknowledgement of potential Hawthorne effects, that is, the impact arising from the novelty of new interventions. A related point is that the positive expectations of both participants and researchers need to be given greater attention in the reporting of institutional studies. In a similar vein, the burgeoning learning analytics literature can learn a number of lessons from a recent study by Grimes et al. (2017), which alerts us to the problem of impact bias. This line of methodological critique raises questions about the trustworthiness of learning analytics research, given a tendency towards reporting positive findings and overly optimistic evaluations from key stakeholders. The lesson is that failures rarely get reported in the literature, when in fact they may offer valuable insights for institutional leaders.
Thirdly, many learning analytics interventions would benefit from being more informed by the wider literature on what is already known about what works in supporting student success. In this respect, major findings from the recent What Works Report 2 (Thomas et al., 2017) help to illustrate that the whole may be greater than the sum of its parts. This point serves to highlight the danger of narrow, reductionist interventions that focus on single courses and overlook the importance of helping students to develop a greater sense of belonging, especially during the early stages of the study lifecycle. Indeed, such interventions may be counter-productive, as they focus our attention and guide our search for love in all the wrong places.
Moreover, as Zepke and Leach (2010) argue, a number of important “soft factors”, often beyond the control of institutions, play a significant role in shaping the pre-conditions for student engagement and success. Accordingly, more inter-disciplinary work, and a greater level of border crossing across bodies of literature, is required if the emerging field of learning analytics is to heed the lesson often attributed to Einstein that ‘Not everything that counts can be counted, and not everything that can be counted counts’.
Finally, the significance and practical implications of the above points are demonstrated in the case of early efforts at Dublin City University (DCU) to explore the potential of learning analytics (Corrigan et al., 2015). When unpicked, and with the benefit of hindsight, this case example illustrates both the opportunities and the potential risks of (mis-)using learning analytics to advance the wider goal of harnessing institutional data to transform the student experience and develop a greater sense of belonging.
References
Corrigan, O., Smeaton, A., Glynn, M., & Smyth, S. (2015). Using educational analytics to improve test performance. In G. Conole, T. Klobucar, C. Rensing, J. Konert, & E. Lavoue (Eds.), Design for teaching and learning in a networked world: Proceedings of the 10th European Conference on Technology-Enhanced Learning (EC-TEL 2015), Toledo, Spain, 15-18 September (pp. 42-55).
Grimes, A., Medway, D., Foos, A., & Goatman, A. (2017). Impact bias in student evaluations of higher education. Studies in Higher Education, 42(6), 945-962.
Kirschner, P. (2016). Learning analytics: Utopia or dystopia. Keynote presentation at 6th International Conference on Learning Analytics and Knowledge (LAK16). Edinburgh, 28th April. Available from http://lak16.solaresearch.org/?page_id=14
Kirschner, P., & Neelen, M. (2017). Predicting tech trends in education is hard, especially about the future. 3-Star Learning Experiences [blog]. Available from https://3starlearningexperiences.wordpress.com/2017/05/02/predicting-tech-trends-in-education-is-hard-especially-about-the-future
Thomas, L., Hill, M., Mahony, J., & Yorke, M. (2017). Supporting student success: Strategies for institutional change – What works? Student retention & success programme. Higher Education Academy. Available from https://www.heacademy.ac.uk/system/files/downloads/full_report_final_draft.pdf
Zepke, N., & Leach, L. (2010). Beyond hard outcomes: 'soft' outcomes and engagement as student success. Teaching in Higher Education, 15(6), 661-673.
More information about Mark Brown can be found on the National Institute for Digital Learning website.