Conference contribution
Posted on 2022-09-12, authored by Colin Beer, David Jones, Celeste Lawson
Despite sector-wide interest in learning analytics, there are currently few institution-wide deployments at scale (Dawson et al., 2018; Ferguson et al., 2014). This deficit of whole-of-institution implementations continues to deny the sector a comprehensive understanding of the complex issues that mediate systemic uptake of learning analytics across an enterprise (Dawson, Mirriahi, & Gasevic, 2015). The same deficit applies to the theories and methodological approaches required to implement learning analytics in real-world environments. Knowing what works or does not, and why, provides potentially valuable generalisations that can inform future learning analytics implementations. A team at a regional Australian university has been researching and experimenting with learning analytics for over 10 years and has developed an institution-wide learning analytics system. For de-identification purposes, this system is referred to as System X throughout this study. System X was developed by the team during 2014, has been used in 63% of the university's offerings, and has facilitated communications with almost 90% of the university's higher education students. While System X is a rare example of an institution-wide learning analytics implementation, its life beyond implementation has been beset by organisational challenges. Reflecting on the design, development, and operation phases of a learning analytics implementation like System X can yield valuable insights that contribute to a theory of implementation (Marabelli & Galliers, 2017; Sanders & George, 2017). This is especially important for learning analytics, where successful institution-wide implementations remain rare.