Chapter 3 Theoretical framework and research questions
3.1 Research perspectives
In this chapter I present the theoretical framework that connects the two complementary strands of research that form the core of this thesis. Before detailing the framework in Section 3.2, I begin by briefly describing these two strands. The first involves evaluating the PeerWise and CodeWrite tools from an educational perspective; in other words, understanding whether the activities that students engage in when using the tools have a measurable impact on their learning. This evaluation takes place in realistic course settings, where the tools are used unsupervised and over extended periods of time. The second strand focuses on evaluating the tools from a user interface perspective. Specifically, this part of the research explores whether the inclusion of game elements within the user interface of the tools has an impact on how students engage with them. The two strands are complementary in the sense that identifying student actions that have a positive impact on learning provides the justification for a user interface design that effectively encourages those actions. A detailed description of the design of the two tools, including the implemented game elements, follows in Chapter 4.
An educational perspective
PeerWise and CodeWrite both support a pedagogy of question generation and self-testing. Students use the tools to create practice questions, along with associated solutions, with the intention that these will be used by their peers for study purposes. This approach shifts the responsibility for creating learning resources, which traditionally lies with the teacher or expert, to the students. Within both tools, authoring and answering questions are the two primary activities with which students engage. The key difference between the tools is simply the format of the questions they support, as the sketch below illustrates.
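To make the difference in format concrete, the following minimal sketch contrasts the two kinds of student-authored artefacts: a PeerWise question is a multiple-choice item with alternatives, an author-nominated answer and an explanation, while a CodeWrite exercise pairs a problem statement with a model solution and test cases. The class and field names are illustrative only and do not reflect the tools' actual data models.

    from dataclasses import dataclass

    @dataclass
    class PeerWiseQuestion:
        """Illustrative multiple-choice question format."""
        stem: str           # the question text
        alternatives: list  # answer options shown to peers
        correct_index: int  # author-nominated correct alternative
        explanation: str    # author's explanation of the solution

    @dataclass
    class CodeWriteExercise:
        """Illustrative code-writing exercise format."""
        statement: str       # description of the required code
        model_solution: str  # author's reference implementation
        test_cases: list     # (input, expected output) pairs for checking submissions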
From an educational perspective, the key question of interest is whether student engagement with the tools is positively related to learning outcomes. There are several reasons to suggest that such a relationship may exist. When creating questions, students must identify relevant course concepts and generate solutions or explain ideas in ways that may help to reinforce their own understanding. Reviewing questions created by a cohort of peers can help students assess their own level of knowledge relative to others. Robust effects reported in the educational psychology literature indicate that both generating and answering questions can improve students' performance when they are subsequently examined on related items.
Despite the theoretical benefits that may be offered by student-generated question activities, it is necessary to evaluate PeerWise and CodeWrite under realistic conditions as they are typically used in practice. For example, some earlier work on the efficacy of question authoring tasks involved experiments where conditions were carefully controlled and activity occurred over short time frames. In contrast, when students interact with PeerWise and CodeWrite they are typically unsupervised and are using them over extended periods of time as part of authentic courses. The fact that content is published without expert oversight may impact the quality of the questions, and hence the effectiveness of the tools. For example, if a large number of low quality or incorrect questions were published to a repository then its value as a learning resource may be compromised. To uncover whether such concerns are valid, and to understand whether, in practice, students do learn by engaging with PeerWise and CodeWrite, the tools must be evaluated in realistic settings. However, evaluation can be complicated by the fact that a student's performance in a course is determined by many factors. Even if a particular learning activity is known to be effective, it may be just one of many activities across which students are distributing their time and effort. The size of any effect on overall course or examination performance that can be attributed to one particular activity may therefore be limited.
A user interface perspective
If the activities supported by the tools are shown to be useful for learning, then exploring ways of increasing student engagement with those activities may be worthwhile. For example, if answering a greater number of student-generated practice questions on PeerWise prior to an examination is associated with better examination performance, then it may be useful to motivate additional question answering activity. The use of gamification is one approach for achieving this. For example, students could earn points or be rewarded with badges for answering a set number of questions, or for answering questions regularly.
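As a concrete illustration of this kind of mechanism, the following is a minimal sketch of badge-award logic. It is not the actual implementation used in PeerWise or CodeWrite; the badge names and thresholds are invented for the example.

    from dataclasses import dataclass, field
    from datetime import date

    # Invented thresholds and badge names, for illustration only.
    ANSWER_COUNT_BADGES = {10: "Answered 10", 50: "Answered 50", 100: "Answered 100"}
    STREAK_DAYS = 5  # consecutive days of answering needed for the regularity badge

    @dataclass
    class StudentActivity:
        answer_dates: list = field(default_factory=list)
        badges: set = field(default_factory=set)

        def record_answer(self, day: date) -> None:
            """Record one answered question and award any newly earned badges."""
            self.answer_dates.append(day)

            # Badges for answering a set number of questions.
            total = len(self.answer_dates)
            for threshold, badge in ANSWER_COUNT_BADGES.items():
                if total >= threshold:
                    self.badges.add(badge)

            # Badge for answering regularly: look for a run of consecutive
            # days on which at least one question was answered.
            days = sorted(set(self.answer_dates))
            streak = 1
            for previous, current in zip(days, days[1:]):
                streak = streak + 1 if (current - previous).days == 1 else 1
                if streak >= STREAK_DAYS:
                    self.badges.add("Regular Answerer")

A call to record_answer would be made each time a student answers a question; a points mechanism could be implemented similarly with a simple accumulating counter.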
If the implemented game elements do motivate additional activity, and if that activity has been shown to be beneficial to learning, then a gamified user interface could be a more effective design than one without the game elements. Of course, not all students may be motivated by gamification. For those who are, an interesting question is whether any additional activity caused by the presence of the game elements actually leads to improved learning outcomes. When evaluating gamification, it is also important to verify that no harmful effects are observed. It would not be desirable if certain game elements, or particular implementations of those elements, caused a reduction in student behaviours that are known to be useful.
3.2 Gamification, student-generated questions and learning
A simplified version of the theoretical framework guiding this thesis was originally described in Section 1.2 and is shown again in Figure 3.1. This simplified framework highlights the two primary relationships that are investigated, with the arrows indicating hypothesised positive causal links. Put simply, the framework indicates that student engagement with the PeerWise and CodeWrite tools (consisting primarily of question authoring and answering activity) has a positive effect on their learning (as measured by test and examination performance). The relationship between engagement and learning is the focus of Chapters 6 and 7.
Establishing that this relationship is positive then justifies the implementation of gamification as a way of increasing student engagement. The framework illustrates this hypothesised link by indicating that the presence of gamification in the tools (consisting of points, badges and leaderboard elements) has a positive effect on student engagement with the question authoring and answering activities. The relationship between gamification and engagement is the focus of Chapters 8 and 9.
The complete theoretical framework is shown in Figure 3.2. The difference between this complete framework and the simplified version shown in Figure 3.1 is that the two activities that represent engagement (i.e. authoring and answering) are studied separately. The relationship between authoring questions and subsequent learning is investigated independently of the relationship between question answering and learning.
The solid arrows in the framework shown in Figure 3.2 indicate a hypothesised positive relationship between two items, with the arrows indicating the direction of causality. For example, according to this model, increased engagement with the activity of authoring questions is hypothesised to lead to better learning outcomes. There are, of course, various ways in which engagement and learning outcomes can be measured. The specific methods used in this thesis are outlined in Section 3.2.1, which lists the research questions.
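To make the kind of measurement and analysis implied by this framework concrete, the following is a minimal, purely illustrative sketch rather than the analysis actually performed in later chapters. It assumes a hypothetical dataset with one row per student, where engagement is measured as the number of questions answered, learning is measured by examination score, and a prior ability measure is included as a covariate to reflect the concern, noted earlier, that course performance is determined by many factors. All column names and values are invented for illustration.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical per-student records: one row per student, values invented.
    data = pd.DataFrame({
        "questions_answered": [12, 40, 5, 63, 27, 88, 3, 51],            # engagement measure
        "prior_gpa":          [2.9, 3.4, 2.1, 3.8, 3.0, 3.6, 2.4, 3.2],  # prior ability covariate
        "exam_score":         [58, 71, 49, 84, 66, 90, 44, 75],          # learning outcome measure
    })

    # Ordinary least squares: the coefficient on questions_answered estimates
    # the engagement-learning association while holding prior ability constant.
    model = smf.ols("exam_score ~ questions_answered + prior_gpa", data=data).fit()
    print(model.summary())

A positive and statistically significant coefficient on questions_answered would be consistent with the framework's solid arrow from answering activity to learning outcomes, although a regression of this kind establishes only an association, not the causal direction that the arrows assert.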