Proceeding Paper

A Comparison of Online and Offline Digital Gameplay Activities in Promoting Computational Thinking in K-12 Education †

1 School of Big Data, Fuzhou University of International Studies and Trade, Fuzhou 350202, China
2 Department of Computer Science, National Yang-Ming Chiao Tung University, Hsinchu 30010, Taiwan
3 Institute of Network Engineering, National Yang-Ming Chiao Tung University, Hsinchu 30010, Taiwan
* Author to whom correspondence should be addressed.
Presented at the IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability, Tainan, Taiwan, 2–4 June 2023.
Eng. Proc. 2023, 55(1), 31; https://doi.org/10.3390/engproc2023055031
Published: 29 November 2023

Abstract

Due to the COVID-19 pandemic, courses for all ages in many countries moved online, and the ability to use information technology fluently for learning has become a vital issue. This study investigates whether students can develop computational thinking (CT) literacy in online and offline learning environments. Two web games, Rummikub and Robozzle, were used to teach computational thinking through Google Meet, Google Classroom, and offline live teaching. The results indicate that offline training activities are more appropriate for middle school students.

1. Introduction

With the rapid changes brought on by the COVID-19 pandemic, both students’ learning methods and teachers’ teaching methods faced unprecedented challenges and changes [1]. In response to the various policies put in place during the pandemic, hybrid, online, and physical teaching methods have been implemented alternately. These conditions test whether teachers are flexible enough to change what and how they teach and whether students can still learn the content. This process of change and adjustment requires using information technology to solve problems, conduct tests, and enhance interaction. The competencies required in this process are known as computational thinking and digital literacy, skills that must be acquired through training [2]. With the increasing reliance on computer technology in the age of AI and big data, students must develop CT skills. Despite the growing literature on CT, there is still considerable discussion and development regarding the cultivation of CT in education [3].
Digital games can support the development of a variety of computational thinking skills. Game-based learning environments enable students to understand complex systems independently or collaboratively, reflect critically on the solutions they find, and use analytical thinking to identify logical problem-solving strategies [4]. Gee [5] indicated that the characteristics of games, such as providing immediate feedback, analyzing the course of one’s own activity, involving students in trial and error, motivating the implementation of reasoning strategies, and providing algorithmic structures, are essential for developing students’ problem-solving skills. For example, students who regularly play “information/logic games” demonstrate higher levels of algorithmic thinking, cooperation, and problem-solving than those who do not [4]. Pattern recognition is one of the four cornerstones of computational thinking: it involves finding similarities or patterns among smaller, decomposed sub-problems, which can help in solving complex problems more effectively. The goal of the game Rummikub is to place tiles in sets of either a group (i.e., at least three tiles of the same number, each in a different color) or a run (i.e., at least three tiles with consecutive numbers in the same color) [6]. The combinations in each round differ depending on how the tiles were placed in the previous round. Players must figure out the release pattern and constantly think about how to place tiles that conform to the set rules (Figure 1).
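To make the pattern-recognition task concrete, the two set rules described above can be sketched in a few lines of Python. This is a hypothetical illustration; the tile representation and function names are our own and are not part of the game's software.

```python
# Rummikub set rules as described in the text: a "group" has at least three
# tiles of the same number, each in a different color; a "run" has at least
# three tiles with consecutive numbers in the same color.
# A tile is modeled as a (number, color) pair.

def is_group(tiles):
    """At least three tiles, all the same number, all colors distinct."""
    numbers = {n for n, _ in tiles}
    colors = [c for _, c in tiles]
    return len(tiles) >= 3 and len(numbers) == 1 and len(set(colors)) == len(colors)

def is_run(tiles):
    """At least three tiles, one color, strictly consecutive numbers."""
    colors = {c for _, c in tiles}
    numbers = sorted(n for n, _ in tiles)
    consecutive = all(b - a == 1 for a, b in zip(numbers, numbers[1:]))
    return len(tiles) >= 3 and len(colors) == 1 and consecutive

def is_valid_set(tiles):
    """A placement is legal if it forms either a group or a run."""
    return is_group(tiles) or is_run(tiles)
```

For instance, `is_valid_set([(7, 'red'), (7, 'blue'), (7, 'black')])` accepts a group, while `[(4, 'blue'), (6, 'blue'), (7, 'blue')]` is rejected because the numbers are not consecutive; spotting which of these patterns the tiles on the table can form is exactly the recognition task players face each round.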
Algorithmic thinking is another derivative of computer science and coding. Its core concept is automating the problem-solving process by creating a series of systematic, logical steps. For example, Google applies the PageRank algorithm, which assigns a webpage’s importance based on the number of sites linking to it, to bring important search results to the top of the page [7]. Another example is long division, which follows the standard division algorithm to divide multi-digit integers and calculate the quotient. Robozzle is a puzzle game that requires the player to program a robot through a maze to collect all the stars on the board (Figure 2). During gameplay, students have to think about how to use the existing instructions and a limited number of moves to design a path so that the robot can collect all the stars [8]. This process requires algorithmic thinking to design a good solution.
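The PageRank idea mentioned above can be sketched as a short power-iteration loop. This is a simplified sketch that assumes every page has at least one outgoing link; the damping factor 0.85 follows the classic formulation, and the three-page link graph in the usage example is invented for illustration.

```python
# Simplified PageRank by power iteration: each page's rank is repeatedly
# redistributed along its outgoing links, with a damping factor mixing in
# a uniform "random jump" term.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Assumes every page has at least one outgoing link."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start from a uniform rank
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outgoing in links.items():
            share = damping * rank[p] / len(outgoing)
            for q in outgoing:                   # give each target its share
                new[q] += share
        rank = new
    return rank
```

On the toy graph `{'A': ['B', 'C'], 'B': ['C'], 'C': ['A']}`, page C ends up ranked highest because it receives links from both A and B, which is precisely the "importance from incoming links" intuition described in the text.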
Due to the epidemic’s impact, the mode of teaching has changed from offline to online. Without the regular classroom environment and teachers’ direct regulation, the effect on students’ learning has been a significant concern [9]. This study aimed to explore and compare different teaching modes (online and offline) to train students’ computational thinking ability by asking them to solve problems logically and go step by step when they encounter problems.

2. Literature Review

2.1. Online and Offline Learning

In 2020, teaching globally switched to emergency online teaching [9]. Students need basic digital literacy to take advantage of online learning opportunities. Digital literacy, which includes using digital technologies to search for, process, manipulate, and create data, as well as online communication and collaboration skills, is also perceived as a form of computational thinking [10]. Before the COVID-19 pandemic, many researchers employed offline activities (e.g., unplugged activities and board games) to investigate computational thinking concepts and found that they could serve as a platform with which to promote students’ CT learning [11]. Others used online resources, such as programming or game-based learning, to cultivate CT abilities [12]. The method of course delivery (i.e., online or offline) affects student performance, satisfaction, and understanding [13]. However, the benefits of online instruction are expected to be highly heterogeneous, with learning boundaries and occasionally even no learning outcomes during online learning [14]. There is a lack of investigations that compare the effectiveness of CT gameplay activities between online and offline learning approaches.

2.2. Digital Game-Based Learning and CT

Computational thinking enables people to understand and process complex problems and formulate possible solutions that both computers and humans can understand [15,16]. Computational thinking has four cornerstones: decomposition, pattern recognition, abstraction, and algorithmic thinking [16]. For example, in computer science courses, imitating code with similar functions allows students to discover reusable pieces of code, and choosing a suitable algorithm can speed up problem-solving and reduce the computing power needed. However, many Asian students are accustomed to accepting and following teachers’ instructions and lack the ability and opportunity to manage their own learning time independently [17]. It is difficult for teachers to adequately convey the search for rules and conceptualization in the classroom. Nowadays, teachers and researchers are applying digital games as a potential tool for teaching computational thinking in the classroom [18]. Students can be guided to find solutions, cultivate their CT abilities, and enhance their learning performance in a digital game-based learning (DGBL) environment [19]. For example, students in the UK use a free online learning resource, a gaming platform, that explains all aspects of CT skills [15]. In addition, adopting digital games in CT training activities can promote students’ creativity and stimulate their motivation [17]. Learners with different CT abilities develop different strategies to complete the game levels. Digital games facilitate students’ ability to apply logical thinking, memory, visualization, and problem-solving skills in real life [18]. Therefore, we used two games, Rummikub and Robozzle, to compare the effectiveness of online and offline activities in developing CT.
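The claim that a suitable algorithm reduces the computing power needed can be demonstrated with a classic contrast, counting how many items each approach examines. This sketch is our own illustration, not an example from the paper.

```python
# Two ways to find a value in sorted data, instrumented to count how many
# positions each one probes.

def linear_search(items, target):
    """Scan left to right; probes every item in the worst case."""
    probes = 0
    for i, item in enumerate(items):
        probes += 1
        if item == target:
            return i, probes
    return -1, probes

def binary_search(items, target):
    """Halve the sorted range each step; probes only ~log2(n) items."""
    probes = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if items[mid] == target:
            return mid, probes
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes
```

Searching for 999 in `list(range(1000))` takes linear search 1000 probes but binary search at most 10, the kind of algorithmic choice the cornerstone of algorithmic thinking is meant to cultivate.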

3. Methods

3.1. Study Design and Participants

Our primary study goal was to determine the influence of online and offline activities on the CT abilities of a group of seventh-grade students playing two web-based games. Study participants volunteered for a two-day computational thinking training camp in July 2021. Students in the online camp joined the activities via Google Meet and Google Classroom, while the offline group attended a face-to-face camp at the school. The camp activities were divided into three parts. The first part included a Bebras pre-test and two hours of game teaching, in which students were guided on how to play Robozzle and Rummikub. Students then had one day of gameplay time with an accompanying teacher and instructions. A game competition was held after the self-gameplay; the goals were to solve the most Robozzle puzzles and achieve the highest Rummikub scores. The last part was the Bebras post-test of students’ skills (Figure 3).
We used the Bebras Challenge [20] to test students’ computational thinking abilities in this study. The participant sample comprised 66 seventh-grade students from secondary schools in Northern Taiwan: 26 students participated in the online camp, and 40 participated in the offline camp. Half of the students in each camp played Rummikub, and half played Robozzle. Seven questions of easy, medium, and hard levels were selected from the seventh-grade question datasets, so the score range for the pre- and post-tests was 0–7.

3.2. Exploration and Competition Strategy

Camp activities were held during holidays, so students participated in the computational thinking training activities with their parents’ consent or through voluntary registration. Self-exploration and competition strategies were applied to enhance learning motivation and encourage engagement in the activities. During the self-exploration stage, students could use the taught game skills to explore the gaming environments and discover game tips. Students in the online group could use Google Meet to discuss solutions with other participants, while students in the offline group could only complete the paperwork specified by the activity. In addition, a competitive mode was applied to prompt the middle school students to devote more time and energy to mastering a single task [21]. Students played against the computer in single-player mode for the two games. Their in-game rankings were recorded, and the highest-ranked student was rewarded.

4. Results and Discussion

Students’ CT ability improved after the game-based learning camp activities (t = 2.96, p = 0.032): the post-test scores (M = 3.273, S.D. = 1.750) were significantly higher than the pre-test scores (M = 2.773, S.D. = 1.537) (Table 1 and Table 2). To determine whether the online teaching method differed from the offline teaching method, we examined descriptive statistics for both groups and found that the mean of both groups improved (Table 3). After analyzing students’ progress (the difference between pre- and post-test scores) with a two-sample t-test assuming unequal variances, we found no significant difference between the online and offline groups (t = 0.14, p = 0.886) (Table 4).
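For readers who want to reproduce this kind of analysis, the paired-sample t statistic for a pre/post comparison can be computed with the Python standard library alone. The scores below are invented for illustration and are not the study's raw data.

```python
# Paired-sample t-test: test whether the mean of the per-student
# (post - pre) differences is zero.
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Return the paired t statistic and degrees of freedom (n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Illustrative scores for six students on a 0-7 test (not real data).
pre = [2, 3, 1, 4, 2, 3]
post = [3, 4, 2, 4, 3, 5]
t, df = paired_t(pre, post)
```

The resulting t value is then compared against the t distribution with n - 1 degrees of freedom to obtain a p-value, as reported in Tables 2 and 6.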
Next, we attempted to determine whether any differences existed between the two games. We divided students into groups according to the type of game played and the method of teaching. Table 5 shows all students’ progress and regression in performance, computed by subtracting the pre-test scores from the post-test scores.
A paired-sample t-test was applied to detect pre- to post-test differences for each game under each teaching method (Table 6), and the results show a significant improvement when Rummikub was played in the offline mode. The progress in offline activity performance (M = 1.15, S.D. = 2.300) was greater than that in online activity performance (M = 0.69, S.D. = 1.702). However, we found no significant difference for students playing Robozzle in either mode. Comparing the two games, Rummikub involves group interactions and emphasizes peer interaction and communication in the learning process. We can therefore infer that if learning is carried out in groups or classes, the offline mode will be more effective than the online mode.

5. Conclusions

Comparing students’ CT performance in the online and offline digital game camps, we found that students in both groups played the games on their computers, so the overall results were not significantly different. However, when Rummikub and Robozzle were analyzed separately, the CT training outcomes of the online and offline activities differed.
In the Rummikub multiplayer game, players must take turns playing tiles and figuring out how to win. In the offline activity, the opponents are nearby, so students can further judge how to play by observing their opponents’ voices, expressions, and even body language. In the online group, the cameras were set above the students’ necks and they could only look at the computer screen, so students in this group had far less information than those in the offline group. Furthermore, in the offline activities, students could hear the voices of competitors from other groups, while in the online games, students could only hear the people in their own group, so the offline activities provided a stronger sense of immersion. In contrast, in the Robozzle single-player game, players do not need to interact with other players, and the winner of the competition is determined by the students’ rankings. Individual gameplay did not affect CT ability regardless of whether students participated in online or offline activities.
The study results show that different teaching methods and gameplay types cause differences in students’ CT performance. Therefore, when planning online game-based learning or developing an online learning platform, the gameplay type should be considered so that the activities can produce learning outcomes and promote computational thinking abilities.

Author Contributions

Conceptualization, Y.-Y.C. and S.-M.Y.; methodology, L.-X.C. and Y.-Y.C.; validation, L.-X.C., S.-W.S. and Y.-Y.C.; formal analysis, Y.-Y.C. and C.-H.L.; data curation, L.-X.C. and Y.-Y.C.; writing—original draft preparation, L.-X.C. and S.-W.S.; writing—review and editing, L.-X.C. and C.-H.L.; visualization, L.-X.C. and Y.-Y.C.; supervision, S.-M.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially funded by National Science and Technology Council Taiwan (grant number: 108-2511-H-009-009-MY3) and the High-level Talent Research Project at Fuzhou University of International Studies and Trade (grant no. FWKQJ201909).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors wish to thank the blind reviewers for their insightful and constructive comments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hofer, S.I.; Nistor, N.; Scheibenzuber, C. Online teaching and learning in higher education: Lessons learned in crisis situations. Comput. Hum. Behav. 2021, 121, 106789. [Google Scholar] [CrossRef] [PubMed]
  2. Su, S.W.; Jung, S.Y.; Yu, X.; Yuan, S.M.; Sun, C.T. Modify, Decompose and Reassemble: Learner-Centered Constructive Teaching Strategy for Introductory Programming Course in College. In Proceedings of the 2022 IEEE 5th Eurasian Conference on Educational Innovation (ECEI), Taipei, Taiwan, 10–12 February 2022; pp. 197–200. [Google Scholar] [CrossRef]
  3. Saxena, A.; Lo, C.K.; Hew, K.F.; Wong, G.K.W. Designing unplugged and plugged activities to cultivate computational thinking: An exploratory study in early childhood education. Asia-Pac. Educ. Res. 2020, 29, 55–66. [Google Scholar] [CrossRef]
  4. Durak, H.Y.; Yilmaz, F.G.K.; Yilmaz, R. Examining the Relationship between Digital Game Preferences and Computational Thinking Skills. Contemp. Educ. Technol. 2017, 8, 359–369. [Google Scholar]
  5. Gee, J.P. Learning by design: Games as learning machines. Digit. Educ. Rev. 2004, 8, 15–23. [Google Scholar]
  6. van Rijn, J.N.; Takes, F.W.; Vis, J.K. The complexity of Rummikub problems. arXiv 2016, arXiv:1604.07553. [Google Scholar]
  7. O’Hara, I. Feedback Loops: Algorithmic Authority, Emergent Biases, and Implications for Information Literacy. Pa. Libr. Res. Pract. 2021, 9, 8–15. [Google Scholar] [CrossRef]
  8. Lindberg, R.S.; Laine, T.H.; Haaranen, L. Gamifying programming education in K-12: A review of programming curricula in seven countries and programming games. Br. J. Educ. Technol. 2019, 50, 1979–1995. [Google Scholar] [CrossRef]
  9. Li, D. The Shift to Online Classes during the COVID-19 Pandemic: Benefits, Challenges, and Required Improvements from the Students’ Perspective. Electr. J. E-Learn. 2022, 20, 1–18. [Google Scholar] [CrossRef]
  10. Carretero, S.; Vuorikari, R.; Punie, Y. DigComp 2.1: The Digital Competence Framework for Citizens with Eight Proficiency Levels and Examples of Use EUR; Scientific and Technical Research Series; Publications Office of the European Union: Seville, Spain, 2017; Volume 28558. [Google Scholar]
  11. Looi, C.-K.; How, M.-L.; Longkai, W.; Seow, P.; Liu, L. Analysis of linkages between an unplugged activity and the development of computational thinking. Comput. Sci. Educ. 2018, 28, 255–279. [Google Scholar] [CrossRef]
  12. Lye, S.Y.; Koh, J.H.L. Review on teaching and learning of computational thinking through programming: What is next for K-12? Comput. Hum. Behav. 2014, 41, 51–61. [Google Scholar] [CrossRef]
  13. Singh, S.; Rylander, D.H.; Mims, T.C. Efficiency of online vs. offline learning: A comparison of inputs and outcomes. Int. J. Bus. Humanit. Technol. 2012, 2, 93–98. [Google Scholar]
  14. Hofer, S.I.; Holzberger, D.; Reiss, K. Evaluating school inspection effectiveness: A systematic research synthesis on 30 years of international research. Stud. Educ. Eval. 2020, 65, 100864. [Google Scholar] [CrossRef]
  15. BBC. Introduction to Computational Thinking. Available online: https://www.bbc.co.uk/education/guides/zp92mp3/revision/1 (accessed on 1 December 2022).
  16. Wing, J.M. Computational thinking. Commun. ACM 2006, 49, 33–35. [Google Scholar] [CrossRef]
  17. Chen, C.; Stevenson, H.W. Motivation and mathematics achievement: A comparative study of Asian-American, Caucasian-American, and East Asian high school students. Child Dev. 1995, 66, 1215–1234. [Google Scholar] [CrossRef]
  18. Ch’ng, S.I.; Low, Y.C.; Lee, Y.L.; Chia, W.C.; Yeong, L.S. Video games: A potential vehicle for teaching computational thinking. In Computational Thinking Education; Springer: Singapore, 2019; pp. 247–260. [Google Scholar]
  19. Sun, C.-T.; Chen, L.-X.; Chu, H.-M. Associations among scaffold presentation, reward mechanisms and problem-solving behaviors in game play. Comput. Educ. 2018, 119, 95–111. [Google Scholar] [CrossRef]
  20. Hsu, T.-C.; Chang, S.-C.; Hung, Y.-T. How to learn and how to teach computational thinking: Suggestions based on a review of the literature. Comput. Educ. 2018, 126, 296–310. [Google Scholar] [CrossRef]
  21. Kim, B.; Park, H.; Baek, Y. Not just fun, but serious strategies: Using meta-cognitive strategies in game-based learning. Comput. Educ. 2009, 52, 800–810. [Google Scholar] [CrossRef]
Figure 1. Pattern recognition and group interaction in Rummikub.
Figure 2. Planning algorithm in Robozzle.
Figure 3. Online and offline training camp process.
Table 1. Descriptive statistics of pre-test and post-test CT scores of all participants (N = 66).

             Mean     S.D.
Pre-test     2.773    1.537
Post-test    3.273    1.750
Table 2. Paired-sample t-test of pre-test and post-test scores of all participants (n = 66).

             df     t        p-Value
Overall      65     2.96 *   0.032

* p < 0.05.
Table 3. Descriptive statistics of pre-test and post-test scores of online and offline groups.

                     Min    Max    Mean     S.D.
Online (N = 26)
  Pre-test           0      7      3.077    1.787
  Post-test          0      7      3.538    1.985
Offline (N = 40)
  Pre-test           0      5      2.575    1.338
  Post-test          0      6      3.100    1.582
Table 4. t-test of the pre- and post-test score differences between the online and offline groups (n = 66).

             df     t       p-Value
Overall      62     0.14    0.886
Table 5. Descriptive statistics of progress in CT performance in playing Rummikub and Robozzle.

             N      Min    Max    Mean     S.D.
Rummikub
  Online     13     −5     4      0.69     1.702
  Offline    20     −4     5      1.15     2.300
Robozzle
  Online     13     −2     2      0.23     1.363
  Offline    20     −3     3      −0.01    1.586
Table 6. Paired-sample t-test of pre- and post-test scores for different game-based learning environments.

             N      df     t        p-Value
Rummikub
  Online     13     12     −0.61    0.553
  Offline    20     19     −2.24    0.037 *
Robozzle
  Online     13     12     −1.47    0.168
  Offline    20     19     0.28     0.781

* p < 0.05.
