Gamification for Online Engagement in Higher Education:
A Randomized Controlled Trial
David Leach, Brandon Laur, David Broome, Tina Bebbington, Jillianne Code, University of Victoria
Abstract
A randomized controlled trial was used to test whether gamification tools can increase engagement and improve learning outcomes in a blended (online and in-class) second-year university course. Students in control and experimental groups accessed separate course management systems (CMS). On the gamified site, students earned badges and points for online activity and, compared with the control group, showed increases in the personalization of online avatars, a doubling of visits to the CMS, and completion of weekly blog assignments on average 1.3 days earlier. Female students used the gamified site more than males. In a post-class survey, 82% of students believed gamification was an effective motivation tool. However, there was no evidence of improved learning outcomes on graded assignments. This trial provides evidence that gamification can offer incentives for online activity and socializing but, on its own, may have little impact on quantifiable learning outcomes.
Introduction
Both “gamification” (game mechanics applied to non-game settings) and “blended learning” (the mix of online and in-class learning environments) have emerged as major trends in educational technologies (Baker, 2012). Blended learning, via online course management systems (CMS) such as Blackboard and Moodle, holds “the potential, human and technological, of accommodating students with distinct learning needs” (Dias, 2014). Likewise, gamification has grown rapidly in commercial and other non-educational spheres as a technique for engaging and retaining customers and clients on enterprise websites (Zichermann, 2011). In Google Scholar, “gamification” generated 6,830 results (as of May 26, 2014). Advocates have demonstrated anecdotally the power of game-based technologies and gamified pedagogies to motivate a broad range of participants, from students in institutional classrooms (Sheldon, 2011; Kapp, 2012), to crowd-sourced scientific research (Eiben, 2012), to “alternate reality games” aimed at non-academic general audiences (McGonigal, 2011).
Gamification has generated debate over its definition, its benefits, and its pitfalls, “broadly opposing marketing professionals vs. designers and scholars of serious games” (Rughinis, 2013). The basic definition of gamification entails “the use of game design elements in non-game contexts” (Groh, 2012), although a refined definition focuses on educational contexts: “simple gameplay to support productive interaction for expected types of learners and instructors” (Rughinis, 2013).
There are many examples and case studies of educational games, play-based classrooms and gamified online educational environments. However, there is little experimental evidence of the efficacy of a gamified educational space when compared to a control group under similar conditions using a randomized trial. As Groh cautions: “for a good academic summary the hype has to cool down first and proper scientific studies about the benefits as well as the side-effects of gamification are needed” (2012). Our experiment attempts to bridge that gap in the scientific literature by testing whether basic gamification tools can increase online activity in a blended classroom and, if so, whether improved learning outcomes will be reflected in the extra communication and time-on-task.
Previous Research
A literature review of experimental studies done on educational and “serious games” found that only 10 per cent (12 out of 121) used a randomized controlled trial approach (Connolly, 2012). The review’s authors called for “more well-designed studies of games in developing higher order thinking and soft skills” (Connolly, 2012). One control-variable experiment compared groups of students using a gamified course management system (Domínguez, 2013). The authors discovered that students in the gamified online environment scored better than the control group on practical assignments and total scores, but they performed more poorly on written assignments and participated less in class activities. Technical hurdles of the gamified online interface coloured this study’s findings, so the authors divided results between students in a control group, students in the experimental group who used the gamification tools, and students in the experimental group who eschewed the gamification tools (Domínguez, 2013).
Another series of experiments added game-based learning simulations to three large-intake university courses (first-year business, third-year economics, third-year management) and compared grades in gamified sections with non-gamified control sections of the same courses. The authors found increases in final grades in courses that used games; the positive effect occurred equally for students of different genders and ethnicities, but was reversed for students 41 years and older (Blunt, 2007). The experiment offers more evidence of the potential benefits of gamified curriculum; however, the results depend on motivated instructors who can integrate higher-order simulation-based game designs into their curriculum. Other research emphasizes the importance of “meaningful framing”—embedding activities in a narrative that supports goals—to improve the results of point-based gamification systems in general (Mekler, 2013).
The literature review highlighted an opportunity to create an online component for a university course that would integrate the most basic level of game design elements in a transparent and mostly automatized format, thereby removing the influence and bias of the instructor from the experiment as much as possible. In this way, we hoped to test whether rudimentary gamification—so-called “game interface design patterns,” such as badges and leaderboards (Groh, 2012)—can enable students to duplicate the positive learning outcomes demonstrated in courses that used deeper levels of game design, or at least perform better than a non-gamified control group.
System Design
For the experiment, our research team constructed a course site using open-source WordPress software, with the BuddyPress plug-in, which added social-media functionality. The CMS site included an Assignments page that outlined requirements and deadlines for the various assignments (weekly blog posts, mid-term exam, in-class group pitch, final essay); a Syllabus page, with weekly readings from a textbook and hyperlinked stories and videos; a Forum page, where students posted 10 weekly blog assignments; and a General Discussion forum, where students could start threads, post links and engage in online discussions.
The research team duplicated the control CMS under a separate domain URL and created a gamified experimental CMS by installing two plug-ins, BadgeOS and myCRED. A research assistant designed and uploaded 20 virtual badges via BadgeOS. Student users of the experimental website could earn the badges by completing online activities. The BadgeOS plug-in awarded most badges automatically for online activities, including the following (a schematic sketch of these automatic rules appears after the badge lists):
- Signing into the website for the first time (badge title = Welcome)
- Changing the default avatar (Facelift)
- Sending Friendship requests (Networking)
- Creating a Forum topic (Contributor)
- Replying to a Forum topic (Commenter), 10 replies (Communicator Basic), 20 replies (Silver), 30 replies (Gold)
- Joining a Group (Members Only)
- Logging into the website 50 times (Information Overlord)
- Posting an innovation pitch idea to the group project forum (Thinker) and getting three other students to support the idea (Innovator)
- Submitting a draft of the major essay (Rough Draft)
The research assistant awarded certain badges for meeting course benchmarks:
- Getting perfect scores on reading quizzes over one month (Jeopardy Jedi)
- Attending the essay workshop class (Workshop)
- Completing the group project (Presentation)
- Completing the major essay (Major Essay)
- Completing all the course work (Complete)
- Completing the course with exceptional participation (Wizard)
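Because the badge logic was meant to run with minimal intervention, most of the automatic rules reduce to simple thresholds on activity counters. The following Python sketch is purely illustrative of that structure; BadgeOS itself is configured through the WordPress admin interface, and all identifiers below are ours, not the plug-in’s:

```python
# Illustrative only: BadgeOS rules are configured through the WordPress
# admin UI, not in code. This sketch represents the automatic badge
# criteria listed above; the "Communicator" tier names follow the
# shorthand in the list.

AUTOMATIC_BADGES = [
    # (badge title, activity counter, threshold)
    ("Welcome", "logins", 1),
    ("Facelift", "avatar_changes", 1),
    ("Networking", "friend_requests", 1),
    ("Contributor", "forum_topics", 1),
    ("Commenter", "forum_replies", 1),
    ("Communicator Basic", "forum_replies", 10),
    ("Communicator Silver", "forum_replies", 20),
    ("Communicator Gold", "forum_replies", 30),
    ("Members Only", "groups_joined", 1),
    ("Information Overlord", "logins", 50),
]

def badges_earned(activity_counts):
    """Return the badge titles earned for a dict of activity counts."""
    return [title for title, counter, threshold in AUTOMATIC_BADGES
            if activity_counts.get(counter, 0) >= threshold]

# A student who has logged in 12 times and posted 11 forum replies:
print(badges_earned({"logins": 12, "forum_replies": 11}))
# -> ['Welcome', 'Commenter', 'Communicator Basic']
```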
When a badge was earned, the student received an email message, and the badge appeared beside the student’s user profile on the CMS (see Figure 1). Students in the experimental group could see each other’s badge collections as they progressed, but only after logging into the CMS with their student IDs and passwords. Outsiders and students in the control group could not see the badge collections. Students were told that badges would have no direct impact on their marks or final grades for the course.
Figure 1: Facelift badge via BadgeOS (left), leaderboard via myCRED (right)
Using the myCRED plug-in, the research assistant assigned “Points to Award” for online activities. The myCRED plug-in automatically tabulated points and associated them with a student’s username. On the CMS, myCRED generated a Top Ten leaderboard listing the students, by username, who had accumulated the most points (see Figure 1). The number of points per task remained unchanged throughout the term, but students in the gamified group did not know which tasks earned points or how many points each task was worth. Again, they were told that points had no bearing on final course grades. The myCRED software issued points as follows (a simplified tabulation sketch appears after the list):
- Becoming a member of the site (one-time): 10
- Logging into the site (maximum three times daily): 1
- Creating a new post: 5
- Replying to a post (maximum three per post): 1
- Clicking on a hyperlinked reading (once per URL): 3
- Clicking on a hyperlinked video: 1
- Creating a new Forum topic: 1
- Replying to a Forum topic (maximum three per day): 2
- Creating a new BuddyPress Group: 10
- Joining, posting to or commenting in a Group: 1
- Leaving a group: -5
- Updating Profile (maximum twice daily): 1
- New avatar: 1
- New friendship: 1
- Earning a badge: 50
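As a minimal sketch of how such a schedule tabulates a point total, assuming a simplified activity log and our own identifiers (myCRED’s real bookkeeping runs through WordPress hooks, and the per-post and per-URL caps are omitted here for brevity):

```python
from collections import defaultdict

# Our simplification of the myCRED schedule above; all names are ours.
POINTS = {
    "site_membership": 10, "login": 1, "new_post": 5, "post_reply": 1,
    "reading_click": 3, "video_click": 1, "forum_topic": 1,
    "forum_reply": 2, "new_group": 10, "group_activity": 1,
    "leave_group": -5, "profile_update": 1, "new_avatar": 1,
    "new_friendship": 1, "badge_earned": 50,
}
DAILY_CAPS = {"login": 3, "forum_reply": 3, "profile_update": 2}

def total_points(events):
    """events: list of (day, action) tuples. Returns the capped total."""
    per_day = defaultdict(int)  # (day, action) -> occurrences so far
    total = 0
    for day, action in events:
        per_day[(day, action)] += 1
        cap = DAILY_CAPS.get(action)
        if cap is None or per_day[(day, action)] <= cap:
            total += POINTS[action]
    return total

# Four logins on one day score only the first three, plus one badge: 53.
print(total_points([("2014-01-06", "login")] * 4
                   + [("2014-01-06", "badge_earned")]))  # 53
```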
We also added a list of “quests” to the gamified site, which outlined the stages to complete the three major assignments. The quests, however, were descriptive add-ons rather than integrated into the curriculum, in contrast with courses that instructors have designed and overseen as narrative-inflected, quest-based “games” (Sheldon, 2011).
Data-Gathering Tools
On both the gamified experimental website and the control site, the research team installed the SlimStat plug-in to track users’ online activities, including the date, time and number of log-ins, and the amount of time spent per visit. Students in both groups completed pre-experiment and post-experiment surveys to gauge general attitudes toward gaming, interest in the course content, experience with online course management systems, levels of procrastination, and other factors.
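As an illustration of the kind of aggregation this tracking enables, the sketch below assumes a simplified CSV export with our own column names (SlimStat’s actual database schema differs) and computes per-student visit counts and durations:

```python
import csv
from collections import defaultdict

def per_student_activity(log_path):
    """Aggregate a simplified visit log into per-student visit counts
    and visit durations. Assumes columns: username, timestamp,
    duration_sec (our naming; SlimStat's real schema differs)."""
    visit_counts = defaultdict(int)
    visit_durations = defaultdict(list)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            visit_counts[row["username"]] += 1
            visit_durations[row["username"]].append(float(row["duration_sec"]))
    return visit_counts, visit_durations

# Example usage with a hypothetical export file:
# counts, durations = per_student_activity("slimstat_export.csv")
# mean_visits = sum(counts.values()) / len(counts)
```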
Experimental Design
After several revisions, the research team received Human Research Ethics Board approval for a randomized controlled trial in a second-year interdisciplinary elective course of 50 students that studied technology and society. During class, the course instructor (also the principal investigator) explained the purpose of the gamification project. A research assistant distributed and collected consent forms so that students’ decisions remained anonymous to the instructor. The research assistant excluded from the trial students who declined consent, failed to sign the form, or registered in the class after the experiment began. (These students could access the control website, but their data were not collected.) The research assistant randomly assigned the students who had signed consent forms to control and experimental groups of 21 and 20 students respectively, then emailed the website addresses for the control and experimental CMS to the appropriate students, with instructions to log in via student IDs and passwords.
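The paper does not specify the randomization mechanism; a shuffle-and-split, as in the minimal sketch below (all names ours), is one standard way to produce groups of 21 and 20 from 41 consenting students:

```python
import random

def randomize(consenting_students, n_control=21, seed=None):
    """Shuffle the consenting students and split them into control and
    experimental groups (21 and 20 in this trial)."""
    rng = random.Random(seed)
    students = list(consenting_students)
    rng.shuffle(students)
    return students[:n_control], students[n_control:]

# Hypothetical roster of 41 consenting students:
control, experimental = randomize([f"student_{i:02d}" for i in range(41)])
assert len(control) == 21 and len(experimental) == 20
```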
No further explanation was given to students about the gamified site. Throughout the term, the instructor avoided mentioning the experiment. Students were simply told to “visit the course website” to access readings and complete blog assignments. If students had questions about the gamified site, they were told to contact the research assistant via email and not ask the instructor. The gamified website was not designed to embody best practices of gameful design or to turn the course into a game, as outlined by proponents of game-based learning (Sheldon, 2011). The system of badges and points was meant to act as a simplified metric and feedback loop of course activity for students. It was set up to operate as easily and transparently as possible, with a user-friendly interface and few interventions from the research assistant or instructor. The research team wanted to set up a rudimentary gamification system, common to commercial enterprises, in a controlled environment, and measure how users interacted with the course website. By isolating two randomized groups of students online, the research team could compare interactions with a similar cohort using an identical website, minus the gamification plug-ins, to analyze as objectively as possible the tools’ effects.
Results
The course concluded after 85 days, when students handed in final assignments. Initial analysis has focused on identifying and quantifying differences between the online activity of students in the experimental and control groups. Future analysis will probe deeper, using the qualitative survey data, into explanations for the correlations identified.
1. Avatar Adoption
The most conspicuous difference between the two groups was in the students’ adoption of new avatars within the CMS. On both WordPress sites, new users who logged into the site began with a default avatar: an anonymous gray-and-white silhouette. As with other social networking sites, students could click on their profile and upload a photo or image to personalize the avatar that appeared beside their Forum posts and replies. In the experiment, students were neither instructed nor required to alter their online avatar. In the gamified site, however, students could earn a “Facelift” badge (as well as 50 points) by replacing the default avatar with a new image.
In the gamified group, 16 of 20 students (80%) replaced the default avatar with a personalized image, either a photo or found art. In the control group, none of the 21 students who had signed consent forms replaced the default image. The research assistant and the instructor had personalized their own avatars, so students in both groups were aware of the option. Given the evidence of the importance of avatars in building trust in online gaming communities (Smith, 2010), the difference between the two groups in the personalization of their online presence (80% vs. 0%) is a reminder of the value (as many commercial social networks recognize) of simple feedback incentives, such as badges or completion bars, in encouraging users to humanize their digital presence.
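Although the paper reports no formal test for this contrast, the published counts are enough to check it; a minimal sketch using Fisher’s exact test (our addition, for illustration only) follows:

```python
from scipy.stats import fisher_exact

# 2x2 table from the published counts: rows = group,
# columns = (personalized avatar, kept default).
table = [[16, 4],   # experimental: 16 of 20 personalized their avatar
         [0, 21]]   # control: 0 of 21 did
odds_ratio, p_value = fisher_exact(table)
print(f"p = {p_value:.2e}")  # far below 0.05
```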
2. Assignment Deadlines
The main required use of the online CMS was for students to write short (150-250 words) blog posts on specific questions or readings and upload each post before a weekly deadline. Students received one mark for finishing each blog on time (up to 10% of the course grade), but no marks for failing to post or posting after the deadline. There was little difference in the number of missed blogging assignments between the two groups: the experimental group missed 1.4 out of 10 assignments on average versus 1.3 for the control group. However, based on SlimStat tracking, the experimental group posted their assignments on average 1.3 days earlier in the weekly deadline cycle than the control group.
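The 1.3-day figure is a mean of each post’s lead time relative to its weekly deadline; a minimal sketch of that computation, with hypothetical timestamps, follows:

```python
from datetime import datetime

def mean_days_before_deadline(posts, deadlines):
    """posts: {week: datetime posted}; deadlines: {week: datetime due}.
    Returns the mean lead time in days over the weeks posted on time."""
    leads = [(deadlines[w] - t).total_seconds() / 86400.0
             for w, t in posts.items() if t <= deadlines[w]]
    return sum(leads) / len(leads) if leads else 0.0

# Hypothetical example: one blog posted a day and a half before deadline.
deadlines = {1: datetime(2014, 1, 10, 23, 59)}
posts = {1: datetime(2014, 1, 9, 12, 0)}
print(f"{mean_days_before_deadline(posts, deadlines):.2f} days")  # 1.50 days
```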
3. Repeat Visits
Another correlation between the gamified CMS and online activity was in the total number of visits to the course website. Measured by the SlimStat plug-in, the class average for logging into the site was 79.1 visits over the term, a little less than once per day. However, students in the experimental group logged into the gamified site more than twice as often as students in the control group, by both mean (112.6 vs. 47.1 visits) and median (95.0 vs. 45.0 visits). The median corrects for one outlier in the gamified group: a student who logged in 244 times and whose online communications revealed a conscious effort to “game” the system for maximum points. The doubling of online activity held up when analyzed by the gender of students in both groups (see Table 1).
| | Combined | Experimental | Control | P value |
| --- | --- | --- | --- | --- |
| Total: (N) mean / median | (41) 79.1 / 63.0 | (20) 112.6 / 95.0 | (21) 47.1 / 45.0 | <0.0001 |
| Male: (N) mean / median | (21) 63.0 / 50.0 | (7) 96.1 / 91.0 | (14) 46.4 / 45.0 | 0.003 |
| Female: (N) mean / median | (20) 96.0 / 70.5 | (13) 121.5 / 99.0 | (7) 48.6 / 48.0 | 0.005 |

Table 1: Total CMS visits per term
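The test behind the P values in Table 1 is not named in the text. A minimal sketch, assuming a two-sample Welch’s t-test on per-student visit counts and using stand-in numbers (the raw per-student data are not published), would look like this:

```python
from scipy import stats

# Stand-in data only: the per-student visit counts are not published.
# In the actual analysis each list would hold one SlimStat visit total
# per student in that group.
experimental_visits = [95, 130, 112, 88, 101, 244, 77, 140, 99, 120]
control_visits = [45, 52, 38, 61, 47, 30, 55, 44, 41, 58]

# Welch's t-test (unequal variances); the choice of test is our
# assumption, since the paper does not name the one used.
t_stat, p_value = stats.ttest_ind(experimental_visits, control_visits,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```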
The SlimStat plug-in also tracked the length of each visit to the CMS. Although the experimental group logged in twice as often as the control group, these students spent roughly the same amount of time per visit on the site (see Table 2). This finding suggests students were not simply logging in to accumulate points or badges; and because the number of visits doubled while time per visit held steady, students in the experimental group roughly doubled their total time in the online space.
| | 0-60 sec. | 1-5 min. | 5-10 min. | 10-20 min. | 20+ min. |
| --- | --- | --- | --- | --- | --- |
| Experimental (N: 20) % | 48.2 | 18.6 | 8.5 | 12.3 | 12.5 |
| Control (N: 21) % | 45.8 | 19.3 | 8.2 | 12.0 | 14.8 |

Table 2: Time per visit (% of visits in each duration range)
4. Discussion Forum
Students had to use the CMS to check the syllabus for deadlines, to access online readings, to review assignment requirements, and to post weekly blog entries. Other activity was optional. Students were allowed (although not explicitly encouraged) to post to a non-mandatory “General Discussion” forum that had been set up on both course sites alongside the mandatory blogging forums. In the experimental group, students started 47 separate topic threads, which generated 220 discrete replies; topics ranged from links to course-related articles or videos, to discussions of videos shown or topics mentioned in class, to questions about assignments and requests for research help. (On a “News & FAQ” forum, two students started a thread to discuss how the site awarded points and badges.) By contrast, on the non-gamified control site, only one student started a topic on the General Discussion forum; it received no replies. The social-networking potential of the control site remained almost entirely dormant.
5. Gender
While video games have traditionally been associated with young men, the data from this experiment demonstrated that women can also be motivated by gamification techniques. On the experimental group’s points leaderboard, the top five users (and eight of the top 10) were female students. The female student with the most points (1,234) earned nearly twice as many as the top male student (660). Female students also recorded a higher mean (705 vs. 507) and median (627 vs. 572) point total than male students.
6. Learning Outcomes
By adding new pedagogical tools, instructors hope to see a positive impact on that most quantifiable measure of learning outcomes: grades. (And so do students.) While our experiment demonstrated a correlation between gamification tools and increased use of the CMS, the extra online activity did not translate into a rise in grades for the gamified group, either as a whole or when examined by gender. The marginal differences in the average grades for the midterm, major essay and final course were not statistically significant (see Tables 3 to 5). On the midterm, where the P value was lowest, the experimental group actually scored lower than the control group: 77.2 vs. 81.1 (see Table 3).
| | Experimental | Control | P value |
| --- | --- | --- | --- |
| Class: mean % | 77.2 | 81.1 | 0.21 |
| Male: mean % | 75.3 | 79.7 | 0.27 |
| Female: mean % | 78.5 | 83.9 | 0.31 |

Table 3: Midterm marks
| | Experimental | Control | P value |
| --- | --- | --- | --- |
| Class: mean % | 78.1 | 76.1 | 0.38 |
| Male: mean % | 74.5 | 74.0 | 0.89 |
| Female: mean % | 81.1 | 80.1 | 0.81 |

Table 4: Final assignment marks
| | Experimental | Control | P value |
| --- | --- | --- | --- |
| Class: mean % | 81.5 | 79.4 | 0.37 |
| Male: mean % | 78.2 | 77.8 | 0.91 |
| Female: mean % | 83.6 | 82.4 | 0.71 |

Table 5: Final marks
7. Surveys
We have only begun our analysis of the pre-experiment and post-experiment survey data. Eighteen of the 20 students in the experimental group completed all or part of the exit survey. The following results stand out: 10 students (56%) felt the badges had a “mildly positive” effect that “encouraged [the student] to spend more time on the course site,” six (33%) indicated the badges “made no difference,” while one student felt they were a “mildly negative distraction” and one felt a “positive” motivation to “collect the most badges possible.” As for the leaderboard and point system, one student said it had a “negative” demotivating effect, two felt it was a “mildly negative” distraction, seven (39%) said it “made no difference,” while five (28%) cited a “mildly positive” and two a “positive” effect. When asked, “Do you feel that gamification tools can be effective for motivating students?” the gamified group replied as follows: No (0%), Probably not (6%), No opinion (12%), Probably (53%), and Definitely (29%). In the control group, 35% of students said they were “very interested” in what was going on in the gamified site (which they could not access), 40% were “mildly curious,” 10% “rarely thought of it” and 10% were “relieved not to be in that group.”
Discussion
Numerical grades are an imperfect measurement of learning, especially in a course based on evaluating students’ ability to think and write critically about abstract concepts related to technology and society. The diverse student body of this interdisciplinary elective course (which included undergraduate students in every year of study, from almost every faculty on campus) perhaps confounded the sample at the outset with a wide range of student knowledge and abilities. At the same time, the fact that the basic gamification tools of badges and points were associated with such statistically significant increases in the personalization of avatars, inter-student social networking and visits to the course site in general suggests that badges and leaderboards cannot be dismissed as motivation tools in an online educational setting. The depth of this “engagement” remains to be explored given the absence of any increase in quantifiable learning outcomes. To bend an aphorism: You can game a class to your course website, but you can’t (necessarily) make them think.
Some of the extensive data from this experiment, including in-class quiz and attendance results and qualitative surveys, remains to be analyzed in the light of our central findings about the effect of gamification on online engagement (a doubling of activity) and assignment grades (no impact). In this trial, the gamified experimental group used a website that intermingled both badges and a points-and-leaderboard system. A future iteration may separate the class into two groups, with one site gamified via badges and the other via points, to compare the impact of these two tools. A version of the course may be constructed that connects online resources and discussions more directly to the factual knowledge tested on the midterm and the critical and literary skills evaluated on the take-home assignments. This version might foreground assignment-based “quests” to create “meaningful framing” that links the raw gamification metrics with the course curriculum. Questions remain: Is the increased online engagement, enabled via gamification in this experiment, merely a superficial “checking in” by students to collect badges and points and to socialize online? Or is it an untapped resource in the attention economy of their academic lives that could be focused, within a blended course, on more purposeful activities directly connected to final learning outcomes?
References
Baker, P., Bujak, K. R., & DeMillo, R. (2012). The evolving university: Disruptive change and institutional innovation. Procedia Computer Science, 14, 330-335.
Blunt, R. (2007, January). Does game-based learning work? Results from three recent studies. In The Interservice/Industry Training, Simulation & Education Conference (I/ITSEC) (Vol. 2007, No. 1). National Training Systems Association.
Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education, 59(2), 661-686.
Dias, S. B., & Diniz, J. A. (2014). Towards an enhanced learning management system for blended learning in higher education incorporating distinct learning profiles. Educational Technology & Society, 17(x), 307-319.
Domínguez, A., Saenz-de-Navarrete, J., de-Marcos, L., Fernández-Sanz, L., Pagés, C., & Martínez-Herráiz, J.-J. (2013). Gamifying learning experiences: Practical implications and outcomes. Computers & Education, 63, 380-392.
Eiben, C. B., Siegel, J. B., Bale, J. B., Cooper, S., Khatib, F., Shen, B. W., … & Baker, D. (2012). Increased Diels-Alderase activity through backbone remodeling guided by Foldit players. Nature Biotechnology, 30(2), 190-192.
Groh, F. (2012). Gamification: State of the art definition and utilization. Institute of Media Informatics Ulm University, 39-47.
Kapp, K.M. (2012). The gamification of learning and instruction: game-based methods and strategies for training and education. Wiley.
McGonigal, J. (2011). Reality Is Broken: Why games make us better and how they can change the world. Penguin.
Mekler, E. D., Brühlmann, F., Opwis, K., & Tuch, A. N. (2013, April). Disassembling gamification: the effects of points and meaning on user motivation and performance. In CHI’13 Extended Abstracts on Human Factors in Computing Systems (pp. 1137-1142). ACM.
Rughinis, R. (2013). Gamification for productive interaction: Reading and working with the gamification debate in education. 8th Iberian Conference on Information Systems and Technologies (CISTI).
Sheldon, L. (2011). The Multiplayer Classroom: Designing coursework as a game. Course Technology Press.
Smith, J.H. (2010). Trusting the Avatar. Games and Culture. 5, 298-313.
Zichermann, G., & Cunningham, C. (2011). Gamification by Design: Implementing game mechanics in web and mobile apps. O’Reilly Media, Inc.