Purpose: Online content developers use the term “stickiness” to refer to the ability of their online service or game to attract and hold the attention of users and give them a compelling reason to return repeatedly (examples include virtual pets and social media). In business circles, the same term connotes the level of consumer loyalty to a particular brand. This paper aims to extend the concept of “stickiness” to describe not only repeat visits and commitment to the learning “product”, but also the extent to which students are engaged in online learning opportunities.

Design/methodology/approach: This paper explores the efficacy of several approaches to the monitoring and measuring of online learning environments, and proposes a framework for assessing the extent to which these environments are compelling, engaging and “sticky”.

Findings: In particular, the exploration so far has highlighted the difference between how lecturers monitor student engagement in face-to-face settings and how they do so in online teaching environments.

Practical implications: In a higher education environment where students are increasingly asked to access learning in the online space, it is vital for teachers to be in a position to monitor and guide students in their engagement with online materials.

Originality/value: The mere presence of learning materials online is not sufficient evidence of engagement. This paper offers options for testing specific attention to online materials, allowing greater assurance around engagement with relevant and effective online learning activities.


Keywords: online learning, student engagement, monitoring, analytics, stickiness, time to read
