Article

Does Gamification Make a Difference in Programming Education? Evaluating FGPE-Supported Learning Outcomes

Department of Information Technology in Management, University of Szczecin, 71-004 Szczecin, Poland
*
Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(10), 984; https://doi.org/10.3390/educsci13100984
Submission received: 21 July 2023 / Revised: 14 September 2023 / Accepted: 20 September 2023 / Published: 26 September 2023
(This article belongs to the Special Issue Game-Based Learning and Gamification for Education—Series 2)

Abstract

While gamification has been paired with computer programming education on numerous occasions, most of the scientific reports covering the evaluation of its learning outcomes pertain to single-case specialized gamification applications with little or no chance of reuse in other institutions and courses; thus, they offer only limited replicability and comparability of results. In this work, we aim to address this gap by investigating the learning outcomes of a programming course based on the FGPE (Framework for Gamified Programming Education) platform, an open-source, fully configurable gamification platform developed specifically to support teaching and learning computer programming, which can be used by any institution to support any programming-related course. This is, to the best of our knowledge, the very first study evaluating the learning outcomes of FGPE-supported programming education. Moreover, we address the question of whether students learning with gamified platforms limited to programming practice can benefit from additionally using non-gamified MOOCs, by comparing the results attained in three groups differing in the choice of learning support tools (FGPE and MOOC vs. FGPE only vs. MOOC only).

1. Introduction

According to a widely agreed-upon definition of gamification, it consists of “using game-based mechanics, aesthetics and game thinking to engage people, motivate action, promote learning, and solve problems” [1]. The virtues of gamification fueled its development in various areas of education, with the largest share of interest coming from computer science [2]. Within this area, gamification was found to be particularly relevant for computer programming education [3], which can be attributed to several reasons: the (real or perceived) difficulty of this subject (see [4]), the almost inherent use of computers in teaching programming (making it easy to add an additional layer of gamification software), and the importance and wide use of automatic assessment in the process (see [5]), which provides a convenient point for embedding the gamification rules and the feedback they generate.
As a manifestation of this interest, a number of interactive platforms for learning programming have embraced gamification to a greater or lesser extent. This is best exemplified by industry-leading platforms such as Codecademy [6], Code School [7], and CheckiO [8]. Note that we exclude here educational games teaching programming, such as Leek Wars [9] or Code Combat [10], as these are instances of game-based learning (GBL) rather than gamification. Although both approaches share the same inspiration (games), according to Landers' theory of gamified learning, gamification has the purpose of affecting learning-related behaviors or attitudes that, in turn, influence learning, either through a moderating process of strengthening the relationship between instructional design quality and outcomes and/or through a mediating process of influencing learning directly, whereas the purpose of GBL is to affect learning without such a behavioral mediator or moderator [11].
Apart from the gamified programming learning platforms for the general audience, there is also a plethora of proprietary platforms developed by individual institutions to support specific programming courses they offer to their students (see, e.g., [12,13,14,15,16,17,18]).
Quite a unique addition to this scene is the FGPE (Framework for Gamified Programming Education) platform, developed in an effort to satisfy both students' and teachers' needs, including:
  • learning various programming languages, including single courses covering various languages and/or letting students use a programming language of their choice to solve any exercise,
  • learning in various languages of instruction, including providing the multi-lingual students with the ability to switch between different languages of specification of individual exercises,
  • letting the teacher decide on the scope of learned contents, not only by selecting one of the provided ready-to-use courses but also by composing their own courses, reusing exercises from the provided courses, or creating new exercises of their own invention,
  • letting the teacher decide on the selection of applied gamification techniques, not only by selecting one of the provided ready-to-use gamified courses but also by composing their own courses, reusing gamification layer specifications from the provided courses, or developing their own courses featuring gamification rules of their own concept,
  • learning in various contexts, including one when the FGPE-served course is but a part of a larger Massive Open Online Course (MOOC) with an automatic bidirectional student identity, activity, and achievement data transfer via an interface compliant with the Learning Tools Interoperability (LTI) standard.
The FGPE ecosystem consists of several components, primarily including [19]:
  • FGPE AuthorKit, a tool to prepare and manage both programming exercises and gamification rules,
  • a GitHub-hosted open repository of gamified programming exercises,
  • FGPE Gamification Service, the back-end that processes gamification rules and keeps the game state,
  • Mooshak, the sandbox that executes programs submitted by students and automatically assesses them,
  • FGPE PLE, an interactive programming learning environment, implemented as a Progressive Web App so that it can be conveniently used on both desktop computers and mobile devices, that allows students to access the gamified programming exercises, solve them, and get them graded, and allows teachers to arrange exercise sets, grant their students access to them, and follow the students' learning progress.
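The division of labor among the components listed above (the sandbox grading submissions, the gamification service updating the game state) can be sketched in a few lines of Python. This is a purely illustrative model: the function and class names, the 10-point reward rule, and the string-comparison "assessment" are our own assumptions, not the actual FGPE API.

```python
# Illustrative sketch of the FGPE submission flow: the sandbox assesses
# a submitted program, then the gamification service updates the game
# state. All names and rules here are hypothetical, not the FGPE API.

def assess(program_output: str, expected_output: str) -> bool:
    """Stand-in for the Mooshak sandbox: in reality it compiles and runs
    the submitted program; here we merely compare outputs."""
    return program_output.strip() == expected_output.strip()

class GamificationService:
    """Keeps the game state; illustrative rule: 10 points per accepted solution."""
    def __init__(self):
        self.points: dict[str, int] = {}

    def on_accepted(self, student: str) -> int:
        self.points[student] = self.points.get(student, 0) + 10
        return self.points[student]

service = GamificationService()
total = 0
if assess("hello\n", "hello"):          # sandbox accepts the submission
    total = service.on_accepted("alice")  # gamification state is updated
```

In the actual platform, this interaction happens between separate services (Mooshak and the FGPE Gamification Service) communicating over the network, and the rules are loaded from exercise-set specifications rather than hard-coded.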
As the positive effects of gamification cannot be taken for granted [20], it is important to evaluate newly introduced gamified learning platforms and courses. Applications of gamification can be evaluated in many respects (the recently proposed GATUGU framework covers 23 dimensions belonging to six perspectives [21]); in the case of educational gamification, the most interesting is its effect on students' learning outcomes. While less direct in character than the gamification effect on, e.g., user experience or motivation, it is the improvement of learning outcomes that provides the most convincing reason to apply gamification to education.
So far, this research gap has been scarcely addressed. Using Google Scholar and snowball sampling, we were able to identify merely 17 such publications (note that our search excluded reports on programming-themed educational games, which are seemingly more popular). After excluding 2 papers that referred to offline gamification of programming education (i.e., without using computers), 8 papers that referred to quiz-like gamification of teaching (i.e., asking students programming-related questions rather than providing them with a gamified interactive learning environment to verify their coding abilities), and 1 paper that reported statistically insignificant results, we were left with six works, listed below in chronological order:
  • Ibanez et al. reported a significant increase in the comprehension of the C programming language by 22 Spanish students who used the Q-Learning-G gamified platform [12];
  • Moreno and Pineda reported a significant increase in the academic performance in all three main thematic parts (Conditionals, Iteration, and Arrays) of the course on introductory programming in Java of 24 Colombian students who used the CPP gamified platform vs. 21 who did not use it [15];
  • Marín et al. reported a significant improvement in marks (in 3 out of 4 analyzed comparisons) received by 267 Chilean students whose learning of programming in the C language was supported with UDPiler, vs. two groups of 143 and 407 students, respectively, who did not use it at all [16];
  • de Pontes et al. reported a significant increase in the number of assignments completed in a 90-min experiment for 30 Brazilian students who learned with a dedicated gamified platform vs. 30 students who did not use it [17];
  • Garcia and Revano reported a significant increase in the skills performance score of 50 Filipino students who learned programming with the help of the CheckiO gamified platform vs. 50 students who did not use it [22];
  • Tasadduq et al. reported a significant increase in the assignment score of 21 Pakistani students who learned the fundamentals of computer programming with the help of the CYourWay gamified platform vs. 25 students who did not use it (though no significant difference was observed for the final exam score) [18].
Note that 5 of the 6 reports listed above pertain to gamification applications proprietary to individual educational institutions and developed for specific courses, thus having little to no chance of reuse in other institutions and courses.
In this paper, we would like to contribute to this vein of research by investigating the learning outcomes of an open-sourced introductory Python programming course based on FGPE, an open-source, fully configurable gamification platform that can be used by any institution to support any programming-related course. This is, to the best of our knowledge, the very first study evaluating the learning outcomes of FGPE-supported programming education.
In contrast to all-inclusive gamified programming learning platforms, which encompass theoretical learning, programming practice, and collaboration (see [23] for a blueprint of such a platform), FGPE PLE aims only at supporting the central element (programming practice). For this reason, in typical educational scenarios, it is combined with other platforms, such as MOOCs, that address the remaining areas. Although the concept of gamifying MOOCs has garnered a lot of interest in recent years [24], it is not a prerequisite to match a MOOC with FGPE PLE as it can be used with MOOCs both non-gamified (in this case, the gamification is limited to the FGPE PLE and solving the programming exercises) and gamified (in this case, the achievements within the FGPE PLE form but a part of the MOOC gamification with relevant data transferred between the two platforms via LTI).
For this reason, we would also like to investigate whether there is a point in combining MOOCs and gamified programming learning platforms such as FGPE PLE: in particular, whether using FGPE PLE is beneficial for students learning with a MOOC, and whether using a MOOC is beneficial for students learning with FGPE PLE.
Therefore, three research hypotheses have been stated:
Hypothesis 1.
Using the FGPE PLE as the only learning support tool is sufficient to achieve the expected minimum learning outcomes.
Hypothesis 2.
Using the FGPE PLE in combination with a MOOC results in better learning outcomes compared with using only the MOOC as a learning support tool.
Hypothesis 3.
Using the FGPE PLE in combination with a MOOC results in better learning outcomes compared with using only the FGPE PLE as a learning support tool.
The following section presents the course under investigation, its participants, and the applied research methods. Then, the obtained results are presented and analyzed. The final section discusses the main findings and concludes the study.

2. Materials and Methods

The presented study, following the example of prior work on this topic (see, e.g., [12,15,16]), takes a quasi-experimental approach: the results measured for the experimental groups are compared with the results measured for the respective control groups, though the assignment of participants to the groups followed an administrative division based on the alphabetic ordering of students' names. Participation was voluntary; the study participants could opt out at will throughout the whole period of the experiment. The participants were informed about the goals of the study and asked not to use other learning support tools until it was finished. There were no additional awards promised or granted to the study participants.
The study involved Business Informatics students at the University of Szczecin attending the Computer Programming course in the academic year 2022/2023, summer semester. The experiment started at the beginning of March 2023 and lasted until sufficient material relevant for the pretest or posttest was covered by all investigated groups, which, due to differences in their laboratory schedules, happened in late April or early May 2023.
Three groups took part in the study. Two groups (FGPE and MOOC+FGPE) comprised 10 students each, whereas the third group (MOOC) initially comprised 8 students, of whom 4 resigned before the posttest, leaving only 4 students whose responses could be included in the analysis. All students attended the same lectures in traditional form; however, each group had their laboratory classes taught by the same instructor but with different learning support tools, as depicted in Table 1. All participants were beginners in Python programming, with no previous experience other than some basic knowledge of programming concepts and terms learned during the prerequisite course on algorithms and data structures.
The groups supported by FGPE PLE learned from the “Introduction to Python 3 programming” exercise set, which is available under an open license and can be downloaded from GitHub (https://github.com/jcpaiva-fgpe/b110a4ad-3596-409e-b17c-90fe013804ed, accessed on 19 September 2023). It consists of 94 bilingual (Polish and English) gamified programming exercises grouped in 12 consecutive lessons, listed in the second column of Table 2.
The “Introduction to Python 3 programming” exercise set features five classic gamification elements: points (received for solving an exercise), quests (represented as subsets of exercises forming lessons), badges (earned for notable accomplishments), progress bars (showing the progress in completion of lessons), and leaderboards (showing how a student fares in the course compared with its other participants).
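The interplay of the five gamification elements named above can be modeled compactly. The sketch below is illustrative only: the lesson structure, the 10-point reward, and the badge rule (earned for completing a whole lesson) are our own assumptions, not the actual rules of the "Introduction to Python 3 programming" exercise set.

```python
# Illustrative model of the five gamification elements: points, quests
# (lessons as exercise subsets), badges, progress bars, and leaderboards.
# Rules and thresholds are assumptions, not the actual exercise-set spec.
from collections import defaultdict

LESSONS = {"Lesson 1": ["ex1", "ex2"], "Lesson 2": ["ex3", "ex4"]}

solved = defaultdict(set)   # student -> set of solved exercise ids
points = defaultdict(int)   # student -> accumulated points

def solve(student: str, exercise: str) -> None:
    """Award points the first time a student solves an exercise."""
    if exercise not in solved[student]:
        solved[student].add(exercise)
        points[student] += 10

def progress(student: str, lesson: str) -> float:
    """Progress bar: fraction of a lesson's exercises completed."""
    exercises = set(LESSONS[lesson])
    return len(solved[student] & exercises) / len(exercises)

def badges(student: str) -> list[str]:
    """Badge for each fully completed lesson (a 'quest')."""
    return [name for name, exs in LESSONS.items()
            if set(exs) <= solved[student]]

def leaderboard() -> list[tuple[str, int]]:
    """Students ranked by points, highest first."""
    return sorted(points.items(), key=lambda kv: -kv[1])

solve("alice", "ex1"); solve("alice", "ex2"); solve("bob", "ex1")
```

In FGPE, equivalent rules are specified declaratively in the gamification layer prepared with FGPE AuthorKit and evaluated by the FGPE Gamification Service, rather than hard-coded as here.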
The groups supported by the MOOC learned from the “Introduction to programming in Python 3” course, consisting of 14 modules (plus Introduction and Conclusion with no programming-relevant contents) whose contents go beyond the scope covered by the FGPE-supported exercise set. The module titles are listed in the third column of Table 2.
The MOOC course is published under an open license and hosted at the Navoica (https://navoica.pl/courses/course-v1:Uniwersytet_Szczecinski+PP1+2022_2/course/, accessed on 19 September 2023) platform, powered by the Open edX (https://openedx.org/) system. For more information on the development of the course, see [25]. The MOOC course does not include any gamification elements.
The choice of FGPE PLE as the programming learning platform, the “Introduction to Python 3 programming” exercise set, and the “Introduction to programming in Python 3” course was a natural consequence of the involvement of the University of Szczecin and its personnel in the design and development of this platform, exercise set, and course.
Following the recommendation of the GATUGU framework [21], which indicates a selection of measures and measurement tools for the evaluation of gamification applications in an effort to ensure the comparability of reported results, the learning outcomes were measured as the difference between the arithmetic average of the percentage of correct answers on the posttest and that on the pretest. The questions in both the pretest and posttest were asked in the language of instruction (Polish) and are available from the authors on request. The questions covered topics from the first ten lessons of the FGPE course, mainly focusing on the use of variables, basic data structures, processing data in loops, and using functions.
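The learning-outcome measure described above reduces to simple arithmetic, sketched below. The score lists are made-up illustrative data, not the study's data.

```python
# Learning outcome = mean posttest percentage - mean pretest percentage.
# The sample scores below are illustrative, not the study's data.

def mean_pct(scores: list[float]) -> float:
    """Arithmetic average of percentage scores."""
    return sum(scores) / len(scores)

def learning_outcome(pretest_pct: list[float],
                     posttest_pct: list[float]) -> float:
    """Group-level gain in percentage points."""
    return mean_pct(posttest_pct) - mean_pct(pretest_pct)

gain = learning_outcome([30, 40, 35], [50, 60, 55])  # mean 35 -> mean 55
```

Note that this yields a gain expressed in percentage points; a relative gain would instead divide the difference by the pretest mean.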
For the analysis of data and verification of hypotheses, considering the small number of study participants, we decided to use visual analysis. Visual analysis is one of the oldest forms of data analysis, yet despite the current prevalence of statistical methods, it has not fallen into obsolescence; in fact, the opposite is true, as the availability of new data visualization technology has made it much easier to generate even complex charts, whereas its key advantage of allowing conclusions and hypotheses to be drawn quickly remains valid [26] (pp. 15–16).

3. Results

3.1. Main Results

The mean pretest and posttest scores for each test group (with minimum and maximum scores also indicated) are presented in Figure 1. The chart clearly shows that the initial level of programming knowledge was similar among all three groups; the mean score for each group from the pretest was between 34% and 38%.
Looking at the mean posttest scores and considering 50% as the threshold indicating the achievement of the expected minimum learning outcomes, only the MOOC+FGPE group managed to pass it, with an average result of 54%, whereas the other two groups fell slightly short: the FGPE group scored an average of 48%, and the MOOC group, 44%. These results indicate a rejection of hypothesis H1: using the FGPE PLE as the only learning support tool was not sufficient to achieve the expected minimum learning outcomes.
Comparing the results of the MOOC+FGPE group with those of the MOOC group, we can see a gain of 36% for the former and a gain of 10% for the latter. These results indicate a positive verification of hypothesis H2: using the FGPE PLE in combination with a MOOC resulted in clearly improved learning outcomes compared with using only the MOOC as a learning support tool.
The same gains (36% vs. 10%) can be observed when the results of the MOOC+FGPE group are compared with those of the FGPE group. This indicates a positive verification of hypothesis H3: using the FGPE PLE in combination with a MOOC resulted in clearly improved learning outcomes compared with using only the FGPE PLE as a learning support tool.

3.2. Gender-Specific Results

The next two charts (Figure 2 and Figure 3) show the average scores for female and male participants, respectively. The biggest progress was visibly made by the female participants in the MOOC+FGPE group, who scored almost 58% on the posttest, 28 percentage points above their pretest average of 30%. The male participants in this group scored higher on the pretest (38%) but made considerably smaller progress, achieving an average of 53% on the posttest. The second-highest posttest score, 48%, was achieved by both the male and female participants of the FGPE group, although the females made somewhat larger progress, starting from a pretest score of 33% (vs. 39% for the males). The lowest posttest scores were obtained by the MOOC group, with the male participants averaging 47% and the females 35%. Both genders in this group made the least progress, with the males improving by only around 1 percentage point, whereas the females' scores improved by 15 percentage points. Summing up these results, the female participants attained larger progress in learning outcomes than their male counterparts in all groups under investigation.

3.3. Individual Results

Figure 4 shows the change in test scores in percentage points for each participant (the groups are marked with different colors). The biggest rise in posttest outcome was achieved by three participants from the MOOC+FGPE group, who gained 30 percentage points (pp) compared with their pretest scores. The second-biggest progress, 25 pp, was made by another three participants: two from the MOOC+FGPE group and one from the FGPE group. An increase of 20 pp was attained by two participants: one from the FGPE group and the other from the MOOC group. The observation that the best-performing students came from the group using the combination of FGPE PLE and MOOC implies its superiority over the other approaches in supporting learning.
Looking at the bottom of the chart, we can see five students who made no learning progress at all (in fact, one of them even obtained a posttest score 10 pp lower than their pretest score). These students hail from all groups, which indicates that no learning support tool, gamified or not, can guarantee student engagement. Note that the students were aware that neither their learning engagement during the experiment nor their posttest score would affect their course assessment, and the final course assessment was scheduled for the end of the semester; these results could thus be interpreted as simply revealing students who had other learning priorities or who were leaning toward procrastination.

3.4. Post-Experiment Results

After the end of the experiment period, the students were no longer asked to avoid other learning support tools, but they were still asked to use the learning support tools they had used until then. It is thus quite probable that many of them did not look for other learning support tools, and their final course outcomes could therefore be affected by the initial assignment. With this in mind, Figure 5 presents the final course outcomes for the students assigned to the respective groups. Interestingly, despite the course ending more than 10 weeks after the end of the experiment period, these are somewhat similar to the posttest results, although more positive, with at least 50% of the students in each group attaining the passing score (note that the final test consisted of solving programming exercises instead of answering closed-ended questions as in the posttest). All MOOC+FGPE-group students passed the final test (the majority of them on their first attempt), compared with 8 FGPE-group students (80%, half of those on their first attempt) and 2 MOOC-group students (50%, half of those on their first attempt as well). These results imply that the initial choice of learning support tools may have a large impact on the learning outcomes at the end of the programming course.

4. Discussion and Conclusions

The literature on the application of gamification in programming education links it to a number of benefits [16]. In the presented study, we focused on the most tangible of these benefits, which is the improvement in learning outcomes.
The object of our study was FGPE PLE, which is important because, compared with most other gamified platforms supporting the learning of programming, it provides a unique combination of advantages: it is open-source and free to use; it is fully customizable (in terms of the taught programming language, language of instruction, scope and selection of covered material, and applied gamification techniques); and it comes with a rich set of open-licensed gamified programming exercise sets.
The basic logic of FGPE PLE is to teach programming by practice, providing students with exercises of increasing complexity that require increasing programming language knowledge to solve. It combines the virtues of classic online interactive programming learning environments (the ability to solve exercises in a web browser without the need to install and deal with code editors, compilers, etc.; the ability to get instant automatic assessment of a submitted solution; and the ability to store progress on the server and continue at any time and place) with the virtues of gamified learning (being awarded and appreciated for learning progress, regularity, and dealing with especially challenging problems; being able to compare oneself with other students in terms of current progress and past accomplishments via leaderboards and player profiles displaying badges and other achievements). In this aspect, it is similar to challenge-oriented gamified learning support tools such as UDPiler [16], and different from platforms combining challenges with theory lessons such as Javala [13]. The idea behind the design of FGPE PLE was that it should be a playground for practicing programming skills only, whereas students would acquire the knowledge needed to face the exercises of increasing difficulty from other sources, either provided by their teachers or found by themselves in generally available workbooks or on the Web with the help of search engines.
The presented results (see Section 3.1) show that this is not enough: only the students who were instructed to make use of a comprehensive, well-structured, and easily accessible source of knowledge in the form of a programming MOOC managed to attain the expected minimum learning outcomes during the study period. The same results also indicate that using the MOOC alone is not enough, thus extending the pool of reports indicating the positive effect of gamification on programming learning outcomes [12,15,16,17,22,27]. It is worth noting that even before this study was concluded, the importance of combining a gamified programming learning environment with a MOOC had been foreseen by the FGPE developers, who recently extended the platform with support for version 1.3 of the IMS LTI standard, allowing for seamless integration of the gamified exercises provided via FGPE PLE with any LTI-compliant MOOC platform such as Moodle or Open edX [19].
Looking at the other obtained results:
  • we were not able to identify differences in the effects of using gamification between male and female students (see Section 3.2), which is in line with previous research on this aspect [28] (note must be taken, though, of the very low number of female participants in the study);
  • we have observed (see Section 3.3) that the combined use of FGPE PLE and the MOOC created an environment that brought out the best-performing students, which supports the idea that gamified platforms are suitable for mastery learning of programming [17]; on the other hand, according to our results, gamification does not seem to help with the issue of students unwilling to solve facultative assignments: although this is seemingly in contrast with the reported successful application of gamification in the treatment of procrastination (see, e.g., [29] and the works cited therein), it is in line with the reports indicating that the provision of facultative assignments to students may fuel their procrastination (see, e.g., [30] (p. 59) and the works cited therein);
  • we have observed (see Section 3.4) that the positive effects of gamification lasted until the end of the course, with the group instructed to use FGPE PLE and the MOOC having attained the best final-test passing ratio (this was not, however, a proper longitudinal study, as the students could use any learning support tools they wanted in their preparation for the final test); moreover, we have observed no drop-outs in the group combining the MOOC with the gamified platform, compared with half of the MOOC-only-supported group dropping out (or even 75% if the students who left early are considered), which is an important observation in the context of using gamification to address the MOOC attrition problem (see, e.g., [31]), weakened, though, by the small size of the MOOC group.
Although the above-mentioned small size of the studied sample (24) is comparable to those reported by Ibanez et al. [12] (22), Moreno and Pineda [15] (45), or Tasadduq et al. [18] (46), it is the first main limitation of our study. Its second main limitation is its confinement to just one instance of one programming course conducted at one university by one instructor. These indicate the obvious next step of our work, which will be to repeat the study described here in the next academic year on a larger group of students, possibly attending more than one course at different universities in different countries.
Despite the mentioned limitations, we still consider the results presented in this work a valuable contribution to the research on supporting programming education with gamified online platforms, as they provide the first learning-outcomes-focused results regarding FGPE PLE and strongly stress the role of combining gamified platforms of this kind with MOOCs. They also provide a base for further research comparing the learning performance of students using the combination of FGPE PLE and a non-gamified MOOC with that of students using a gamified MOOC and/or an all-inclusive gamified programming learning platform.

Author Contributions

Conceptualization, J.S. (Jakub Swacha) and J.S. (Justyna Szydłowska); methodology, J.S. (Jakub Swacha) and J.S. (Justyna Szydłowska); software, J.S. (Jakub Swacha) and J.S. (Justyna Szydłowska); validation, J.S. (Justyna Szydłowska); formal analysis, J.S. (Jakub Swacha); investigation, J.S. (Jakub Swacha) and J.S. (Justyna Szydłowska); resources, J.S. (Justyna Szydłowska); data curation, J.S. (Justyna Szydłowska); writing—original draft preparation, J.S. (Jakub Swacha) and J.S. (Justyna Szydłowska); writing—review and editing, J.S. (Jakub Swacha) and J.S. (Justyna Szydłowska); visualization, J.S. (Justyna Szydłowska); supervision, J.S. (Jakub Swacha); project administration, J.S. (Jakub Swacha). All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study due to its scope in accordance with the regulations in force at the Faculty of Economics, Finance and Management of University of Szczecin.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available from the authors on request.

Acknowledgments

The authors would like to thank all students who participated in the study.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Test score (%) for all participants.

Figure 2. Test score (%) for female participants.

Figure 3. Test score (%) for male participants.

Figure 4. Change in test score.

Figure 5. The final course outcomes for students assigned to the respective groups.
Table 1. Study participant groups.

Group  Label      Lectures     Support Tools      n   Female  Male
1      MOOC+FGPE  Traditional  MOOC and FGPE PLE  10  2       8
2      FGPE       Traditional  FGPE PLE           10  2       8
3      MOOC       Traditional  MOOC                4  1       3
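The group composition reported in Table 1 can be cross-checked with a few lines of Python; the dictionary layout below is purely illustrative (it is not part of the study materials or the FGPE platform), but it verifies that the per-group gender counts sum to the group sizes and yields the overall participant totals:

```python
# Study groups as reported in Table 1 (illustrative data structure).
groups = [
    {"label": "MOOC+FGPE", "tools": "MOOC and FGPE PLE", "n": 10, "female": 2, "male": 8},
    {"label": "FGPE",      "tools": "FGPE PLE",          "n": 10, "female": 2, "male": 8},
    {"label": "MOOC",      "tools": "MOOC",              "n": 4,  "female": 1, "male": 3},
]

# Each group's female and male counts must add up to its size n.
assert all(g["female"] + g["male"] == g["n"] for g in groups)

total = sum(g["n"] for g in groups)        # 24 participants overall
female = sum(g["female"] for g in groups)  # 5 female participants
male = sum(g["male"] for g in groups)      # 19 male participants
print(total, female, male)
```

Such a check also makes the pronounced gender imbalance of the sample explicit, which is relevant when interpreting the per-gender results in Figures 2 and 3.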
Table 2. FGPE lessons and MOOC modules.

Lesson No.  FGPE Lesson                  MOOC Module
1           Basics; Variables            First contact with the Python language
2           Strings                      Character strings
3           Variables                    Programs
4           Conditionals                 Sequences
5           Loops                        Loops
6           Sets                         Sets and dictionaries
7           Lists                        Functions
8           String processing            Object-oriented programming
9           Dictionaries                 Python standard modules: overview
10          Functions                    Data processing
11          Object-oriented programming  Algorithms in Python
12          Classic algorithms           Storage of data
13          -                            Use of PyPI modules
14          -                            Python in practical applications

Share and Cite

MDPI and ACS Style

Swacha, J.; Szydłowska, J. Does Gamification Make a Difference in Programming Education? Evaluating FGPE-Supported Learning Outcomes. Educ. Sci. 2023, 13, 984. https://doi.org/10.3390/educsci13100984

