Article

Usefulness of Digital Serious Games in Engineering for Diverse Undergraduate Students

by Kimberly Cook-Chennault 1,2,3,*, Idalis Villanueva Alarcón 4 and Gabrielle Jacob 5
1 Mechanical and Aerospace Engineering Department, Rutgers, The State University of New Jersey, Piscataway, NJ 08854, USA
2 Biomedical Engineering Department, Rutgers, The State University of New Jersey, Piscataway, NJ 08854, USA
3 Department of Educational Psychology, Rutgers, The State University of New Jersey, Piscataway, NJ 08854, USA
4 Department of Engineering Education, University of Florida, Gainesville, FL 32611, USA
5 School of Public Health, Rutgers, The State University of New Jersey, Piscataway, NJ 08854, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(1), 27; https://doi.org/10.3390/educsci12010027
Submission received: 11 November 2021 / Revised: 23 December 2021 / Accepted: 24 December 2021 / Published: 4 January 2022
(This article belongs to the Topic Advances in Online and Distance Learning)

Abstract:
The use of educational digital games as supplemental tools to course instruction materials has increased over the last several decades, and especially since the COVID-19 pandemic. Though these types of instructional games have been employed in the majority of STEM disciplines, less is known about how diverse populations of students interpret and define the value of these games towards achieving academic and professional pursuits. A mixed-method sequential exploratory research design, framed on the Technology Acceptance Model, Game-Based Learning Theory, and Expectancy Value Theory, was used to examine how 201 students perceived the usefulness of an intuitive educational game designed to teach the engineering mechanics used in designing civil structures. We found that students had different expectations of educational digital games than of games designed for entertainment and played outside of classroom environments. Several students thought that the ability to design their own structures and observe structural failure in real time was a valuable asset in understanding how truss structures respond to physical loading conditions. However, few students thought the educational game would be useful for exam (14/26) or job interview (19/26) preparation. Students associated more value with engineering games that illustrate course content and the mathematical calculations used in STEM courses than with those that do not include these elements.

1. Introduction and Project Motivation

The use of online digital technologies has increased during the COVID-19 pandemic as educators attempt to rapidly address the need for emergency remote instruction [1,2,3,4,5,6]. With this need to transition courses to online formats comes a parallel need for digital tools that can supplement course instruction. Digital learning and grading tools have served as a way for teachers to cover the burgeoning number of required topics related to science [7,8], engineering [9,10], technology [11,12], and math [13,14].
Serious games (a term coined by Clark Abt in the 1970s [15]) are games designed for an explicit and carefully crafted educational purpose that extends beyond amusement or entertainment [16]. Serious games can be either digital or tactile; this project considers digital serious games. Digital serious games can be housed in a plethora of online applications and software programs that can be accessed via the internet or downloaded onto a smart mobile device or computer. These types of games are premised on the principles of Constructivist Learning Theory [17,18], which postulates that learners actively cultivate and construct knowledge as they experience the world around them, reflect on their experiences, and subsequently incorporate new information into their pre-existing knowledge base [19]. The mechanisms by which people operate and interact with serious games are described by game-based learning theory.
Game-based learning describes how individuals connect with a game that has defined learning outcomes [20]. Game-based learning also explains how a balance is struck between the necessity of covering subject matter and the entertainment of gameplay [21]. Of equal importance to game-based learning is the assessment of student outcomes such as academic performance [22,23,24], motivation [25,26], and engagement [27], though these outcomes may be personal, i.e., premised on an individual's preferences or perceptions [28,29].
Much of the scholarly work on serious games in higher education environments has focused on comparisons between groups of students who have or have not been exposed to a digital game intervention. This aggregation into groups can be problematic, as the goal of serious games is to help individuals meet milestones within a stated goal or timeframe. At the same time, without understanding how individuals attribute usefulness to a game, a true measure of student outcomes becomes convoluted.
The purpose of this exploratory study was to better understand the perceived usefulness of an engineering education serious game in higher education. This study was bounded by a specific game, Civil-Build (a pseudonym for the online engineering game used in the study), and was contextualized to an engineering course at a Northeast institution in the United States. We believe our findings have the potential for transferability to other contexts. One unique element of this study is the focus on exploring usefulness from a disciplinary context (e.g., engineering) and with a racially and gender diverse population of students (e.g., women, Black/African American (AA), LatinX, Asian). We anticipate that the usefulness of engineering education serious games used as part of a university-level course can result in two potential outcomes. First, students will experience enhanced course performance when they connect the game tool to the course content. Second, students will assign value (usefulness) to a game that connects to real-world applications and will conclude that the game is useful in their preparation for professional and future academic settings.

2. Theoretical Framework and Models

2.1. Technology Acceptance Model (TAM): A Model to Explore Serious Game Usefulness

The Technology Acceptance Model [30,31] has been extensively used to examine individual learner differences and perceptions as mediated by technology (e.g., information technology, email, software, etc.). The original TAM, introduced in 1989, posits that five behaviors are important for understanding how learners interact with technology: Perceived Usefulness, Perceived Ease of Use, Attitude Towards Using, Behavioral Intention, and Actual Use. These concepts were further explained by Davis [32] in 1993, where he described Perceived Usefulness as the degree to which a person believes that using a particular system will enhance their job performance. Perceived Ease of Use is the degree to which a person believes that using a particular system would be free of effort. These two distinct constructs center on people's subjective appraisals of performance and effort, and function as behavioral determinants that shape an individual's Attitude Towards Using a technological system. Venkatesh and Davis [33] argued that the degree to which one believes they will use a system in the future (Behavioral Intention) and Actual Use of a technological system are connected to beliefs about performance that can disagree with objective reality. Davis surmised that user acceptance is undesirable in cases where systems fail to provide true performance gains. Gains in performance are uniquely connected to both the perceived and the actual usefulness of a given technology. Many scholars have adapted and extended Davis's TAM to include effective teaching, intrinsic motivation [34], incentive and reward strategies [35], and trustworthiness [36] to better understand how other factors may influence the effectiveness of engineering games within a classroom environment.
The handful of researchers who have used TAM to assess engineering education serious games have detailed the meaning that participants attributed to the utility of these games as a supplemental course resource. This was evidenced recently [37] when researchers sought to extend the TAM to online learning environments (OLEs), such as virtual labs, simulators, videos, and interactive learning activities. These authors concluded that the original TAM did not address salient assessment issues, such as users' perceived efficiency, playfulness, and satisfaction. The same authors clarified that perceived usefulness may be interpreted as students' perceived advantage in using an educational system for self-study and hands-on exercises, which may be linked to students' attitude toward using the OLE and their intention to use OLEs [37]. Similarly, other scholars [35] extended the TAM to include considerations around user satisfaction and attitude toward game design elements, while defining perceived usefulness as a student's perception that playing the game, GamiCRS, would enhance their performance and help them achieve their academic goals.

2.2. Motivational Learning—Expectancy Value Theory

Many researchers have noted that the usefulness and perceived value of learning tools are linked to motivational theories such as Expectancy Value Theory [38], which postulates that a student's achievement-related choices, performance, and persistence are predicted and motivated by their expectations for success on those tasks. Furthermore, these theories posit that the subjective value attached to a task is related to a student's perception of the task's utility (usefulness). According to these theories, students' expectancies and values are shaped by other achievement-related beliefs such as achievement goals, self-schemata, and task-specific beliefs [38,39]. Expectancy Value Theory (EVT) identifies four major components of subjective value: (1) attainment value (the importance of doing well on a task); (2) intrinsic value (enjoyment derived from the task); (3) utility value, or usefulness of the task (how the task fits within an individual's future plans); and (4) cost. EVT was extended by [40] to understand the motivational perspectives of African American adolescents in academic domains. It was concluded that African American students' ability self-concepts and academic performance were not as highly linked to either their academic achievement or their self-esteem as those of European-American students. This model was also used to examine differences in self- and task perceptions of first-, second-, and fourth-grade children in the domains of math, reading, sports, and instrumental music [41]. It was found that students differed in competence beliefs and values as a function of gender and age, where boys were more positive about sports and math, while girls were more positive about reading and music. Collectively, it is possible that the perceived usefulness of an educational game is tied to how individuals dissociate performance indicators from the values they attribute to a game based upon their unique contexts and experiences.

2.3. Game-Based Learning

Game-based learning can be rooted in behaviorist and/or constructivist learning theories, where the importance of play in cognitive development and learning has been established since Piaget [42] in 1962. Piaget postulated that play could become more abstract, symbolic, and social as the player develops with time. As serious games should (by definition) have an educational purpose, game-based learning emphasizes the balance between game play and coverage of content matter [43,44].
Though not a requirement, serious games that employ incentive systems (points, stars, badges, etc.) to motivate individuals to engage in learning content that they might otherwise find unattractive [45] are often digital. These games typically have a structure that includes three key elements: a challenge, a response, and feedback. To assess and quantify the impact of serious games, identification of genre is needed, as games are not only used for learning in different fields but also cater to a variety of gaming preferences, e.g., casual games, first-person shooter environments, role-playing, drill-and-practice, massively multiplayer online (MMO) games, and intuitive games. Casual games can be played on most portable electronics, such as smartphones and tablets, for easy access and are designed to help students learn quickly with little to no previous gaming skill, expertise, or regular time commitment [46,47]. Serious games classified as having first-person shooter environments provide a first-person perspective of gameplay, where images are seen from the perspective of the player's eyes [48]. Role-playing games allow players to act out the structured decisions of characters or avatars in the game [49], while drill-and-practice game strategies repeatedly quiz players on concepts and practice problems, allowing for immediate feedback much like digital flashcards. Massively multiplayer online (MMO) games engage large numbers of players in a virtual world, which enhances social interaction and competition. Intuitive learning games emphasize experiential learning theory practices [50], where gaming environments allow players to consider a range of possibilities and ideas to solve problems, explore practical outcomes, and foster abstraction, imagination, and prediction [51]. Because of the number of serious game genres, conclusions from specific studies may not readily translate across all genres [45].
Game design elements allow for the realization of different genres, which can facilitate engagement on an affective, behavioral, cognitive, and sociocultural level; it has been noted that elements of challenge, curiosity, and fantasy are intrinsically motivating for players [52]. According to [21], the transfer of knowledge content is optimized with practice and reinforcement of existing knowledge and skill, which can be facilitated through game scaffolding. Although scaffolding is relatively successful and commercially used in entertainment games, strategies for scaffolding in learning games are less well understood, even though these elements are connected to user perceptions of game usefulness.
How users perceive the ease of use and usefulness of a technology is also related to their prior experiences with other forms of digital technology. User experience with technology (UX) is part of the human–computer interaction framework that describes a "person's perceptions and responses resulting from the use and/or anticipated use of a product, system, or service" [53]. Few authors have related aspects of the TAM and UX models [54], with the exception of [54,55], who connected user experiences while engaging with a technology, system, and/or device to the prediction of the player's tendency towards playing the game in the future. Recently, some researchers have linked users' prior experiences with technology using the TAM [56,57]. Our findings help to bridge this gap in the available literature and begin to explore emerging trends in student device usage and expected task value in educational engineering software.
There have been a number of studies in engineering education pertaining to game-based learning interventions in undergraduate and graduate classrooms in the majority of the engineering disciplines, e.g., mechanical engineering [58,59,60,61,62,63,64,65], civil engineering [66,67,68,69,70,71,72], computer science and software engineering [73,74,75,76], chemical engineering [75,76,77,78,79,80], power engineering [81], industrial engineering [82,83,84,85], environmental engineering [86,87], biomedical engineering [88], and aerospace engineering [89]. Of the aforementioned studies, fewer than a handful explore how gender influences the perceived value of the game intervention. Since engineering mechanics is a subject taught by mechanical or civil engineering departments, articles relating to these specific engineering disciplines are provided in Table 1. Few studies have examined the role that gender and engineering role identity may play in user acceptance, value, or perceived usefulness in mechanical engineering, which is among the four engineering disciplines that graduate the lowest proportions of women with BS degrees (15.7% of mechanical engineering BS degrees go to women, compared with 51.7% in environmental engineering and 48.1% in biomedical engineering), despite being the largest discipline among all engineering disciplines according to the American Society for Engineering Education (mechanical engineering = 24% of BS degrees, followed by computer science engineering = 14%) [90]. Hence, according to some scholars, the introduction of digital learning games could in theory present an opportunity to enhance the population of women in the profession [91]. Examples of game-based learning studies in mechanical and civil engineering are provided in Table 1.
The studies detailed in Table 1 illustrate digital game-based learning research that emphasizes findings from mechanical or civil engineering participants and studies that aimed to examine how participants related usefulness, engagement, or value to the digital learning tool. For example, ref. [58] studied the use of several different types of learning games in a second-year mechanical engineering course to understand students' satisfaction, usefulness, and enjoyment. The games incorporated in this graphical engineering course included Championship of Limits and Fits, Tournament of Video Search of Fabrication Processes, Forum of Doubts, Proposals of Trivial Questions, and Contest of 3D Modeling. Each game was evaluated by asking participants whether they thought the game was "more useful and enjoyable" as a teaching element in the course than if the class were taught using conventional approaches. Participants responded to each question using a Likert scale rating from 1 to 5, where 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree nor Disagree, 4 = Agree, and 5 = Strongly Agree. It is difficult to ascertain how students defined the usefulness of the games since the question also included enjoyment; coupling both descriptive words ("useful" and "enjoyable") in the questionnaire makes interpretation of the results difficult, as these two sentiments could be mutually exclusive. These researchers concluded that the main advantage of using the games within a class environment was an increase in attendance and enhanced interest in the course topics, in addition to bringing joy to the classroom environment. Interestingly, in this study, 9 of the 11 women studied did not play video games, yet the women ranked the games 3.3 on average compared with 3.6 for their male counterparts. While the researchers concluded that the differences in opinion between the men and women were not significant, no statistical analysis was reported in the study, i.e., no standard deviations or p-values were included in the analysis.
Other researchers such as [62] conducted a study across all disciplines at a university in Germany to understand how students perceived gamification and mixed reality learning tools, and what types of study modules students would consider useful. They found that of all the disciplines studied, mechanical engineering students saw the most potential in gamification in academic settings, and students from all disciplines expected games to teach in a practical, realistic, and fun way. In a graduate maintenance engineering course, ref. [63] examined how students engage, actively cooperate within a group, and perceive real-time feedback when playing an asset management serious engineering game. They concluded that the game allowed the participants to apply different strategies to problems, forced students to make and learn from mistakes, and encouraged students to learn the reasons behind their mistakes. Student engagement was examined in the engineering game NIU-Torcs, designed by [27] for inclusion in a dynamic systems and control course. In this study of 51 undergraduate students, the authors concluded that students experienced higher intellectual intensity, intrinsic motivation, and overall engagement in comparison to traditional approaches to homework and coursework. Similar games focused on mechanical engineering dynamics using race car scenarios have been studied with emphasis on game design, student engagement, and perceived effectiveness [23,27,65]. Researchers have also compared the effectiveness of different games aimed at enhancing content from the same course, such as [92].
While the vast majority of studies of game-based learning in undergraduate and graduate engineering courses exclude differences and similarities as a function of gender, several studies of middle and high school students in STEM fields have explored the role gender plays in the acceptance and effectiveness of game-based learning interventions [93,94,95], with varying conclusions regarding differentiation of perceptions and effectiveness of the game-based learning tool. Several researchers have noted the importance of recognizing gender differences in game design, appeal, and efficacy [96,97,98,99], but studies that reflect this important consideration in engineering disciplines with lower concentrations of women, such as mechanical engineering, lag behind.
It is important to note that this study does not negate or diminish the relevance or importance of work by scholars and game designers who have not explored the effectiveness of their games as a function of student role identity in terms of gender. Instead, this research fills a gap in the literature towards understanding how one's unique perspective and experiences influence how one associates usefulness and value with digital educational tools that are used to supplement traditional textbook and lecture course materials. This study also addresses the space left by studies that do not report demographics in order to assure participant anonymity [65] or that were statistically inconclusive due to small percentages of women and under-represented minorities.

3. Research Design and Research Questions

The goal of this study is to explore the usefulness of an engineering educational serious game from the perspectives of a diverse population of undergraduate students in a course at a Northeast institution in the United States. The research questions for this study are:
  • How is usefulness described by engineering student users of the game?
  • How do perceptions of game elements vary/stay the same as a function of prior gaming experience?
The study was conducted at a Research-1 institution in the Northeastern region of the United States, in general statics and dynamics courses, honors-level engineering statics courses, and engineering student organizations. Qualitative and quantitative data were collected via pre- and post-questionnaires, interviews, and focus groups. Data were analyzed in terms of ethnicity and gender to understand how this game connected with diverse populations of students. This work will help researchers understand what aspects of serious games can be leveraged to enable meaningful improvements in engineering education.

4. Methods

A mixed-method sequential exploratory research design [100] was proposed and approved by the primary Institutional Review Board (IRB) of the first author, with a ceded IRB agreement from the second author's institution at the time of the study. The study took place at a Research-1 [101], research-intensive institution in the Northeastern region of the United States. The data described herein represent phases of a multi-year study. Preliminary results from this study, which included responses from thirty-three participants, were reported in conference papers [102,103,104]. In addition, the correlation of post-survey responses with physiological eye-tracking measurements of eight participants was described in [105]. In this work, responses from 201 participants are described and discussed. This work differs from the previous work by including a larger data sample (33 participants previously versus 201 here) and by triangulating coded textbox and focus group responses with pre- and post-questionnaire responses. Questions included in the pre- and post-questionnaires and the focus group discussion are provided in Table 2. All 201 participants in the study were recruited from engineering classrooms and STEM and engineering student organizations. Students provided demographic information such as age range, gender, race/ethnicity, undergraduate major, and experience with online learning tools.
The research design is premised on the authors' positionalities as intersectional women in engineering and engineering education who have experienced or witnessed first-hand the role that educational materials in engineering can have in a woman's overall sense of belonging and formation as an engineer.

4.1. Data Collection Protocol

An overview of the data collection process is provided in Figure 1. The students first completed a pre-game questionnaire, played the engineering game for 20 min, completed a post-game questionnaire (questions provided in Table 2), and then participated in a focus group discussion for approximately 30–40 min. The works of [106,107,108] were used to support the use of a 20-min exposure to game play; in those studies, participants were exposed to the game-based learning tool for 20 min followed by a post-test. This timeline was also selected because researchers such as [109,110] indicated that students can have meaningful knowledge retention after 15–20 min of game play, and many students' attention spans are usually less than 20 min [111].
The questionnaire included 7-point Likert-scale questions pertaining to students' experiences with the game, demographic information, and previous experiences with playing video games. Students were also asked to provide text responses giving additional information pertaining to their Likert-scale answers. The learning theory or model associated with each questionnaire question is provided in Table 2, where questions were slightly modified to consider the context of the serious game selected for this study, Civil-Build. In addition, prior learning experience questions were modified per recommendations from other scholars [29] to understand the nature and environment in which students engage with video and entertainment games in their everyday lives. The Likert-scale options were: Strongly Agree (1), Agree (2), Somewhat Agree (3), Neither Agree nor Disagree (4), Somewhat Disagree (5), Disagree (6), and Strongly Disagree (7), where Strongly Agree and Strongly Disagree were coded 1 and 7, respectively. The students played the game in a quiet computer laboratory with partitions around each player to limit interaction among participants while playing the game. Students wore noise-cancelling headsets attached to their computers that allowed them to hear the sounds of the game. The focus group was conducted in a conference room in a separate location from the computer room.
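As an illustration of how such Likert responses can be prepared for the descriptive and inferential analyses reported later, the minimal Python sketch below (written for this article, not taken from the study's instrument or analysis code) maps the 7-point labels to the 1–7 numeric codes described above; the item names and sample responses are hypothetical.

```python
# Minimal sketch (illustrative, not the study's instrument code): convert 7-point
# Likert labels to the numeric codes used in this study
# (Strongly Agree = 1 ... Strongly Disagree = 7) and compute descriptive statistics.
import pandas as pd

LIKERT_CODES = {
    "Strongly Agree": 1,
    "Agree": 2,
    "Somewhat Agree": 3,
    "Neither Agree nor Disagree": 4,
    "Somewhat Disagree": 5,
    "Disagree": 6,
    "Strongly Disagree": 7,
}

# Hypothetical responses to two post-game items (column names are invented).
responses = pd.DataFrame({
    "Q2_easy_to_play": ["Agree", "Somewhat Agree", "Strongly Agree"],
    "Q8_enjoyed_game": ["Strongly Agree", "Agree", "Agree"],
})

coded = responses.apply(lambda col: col.map(LIKERT_CODES))  # numeric 1-7 codes
print(coded.agg(["mean", "std"]))                           # per-item mean and SD
```

Because lower codes correspond to stronger agreement under this coding, lower mean ratings in the results that follow indicate more favorable responses.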
During the focus group, participants discussed their perceptions of the game as an engineering educational learning and motivational tool. Selected questionnaire questions were repeated during the focus group along with several additional questions provided in Figure 1 and Table 2.
The focus group questions enabled a more in-depth discussion of the topics described in the Technology Acceptance Model (TAM) [30,32,112], i.e., the perceived usefulness and ease of use of the game. Several additional questions were included along with the questionnaire questions during the focus group discussion to facilitate the exploration of students' opinions regarding their prior experiences with video games and their enjoyment of playing the Civil-Build game. Additional questions were asked to understand how participants defined "usefulness". Focus groups consisted of 3 to 6 participants. The data were collected during three semesters: Spring 2018, Spring 2019, and Fall 2019. A quantitative analysis of pre- and post-questionnaire responses was conducted, in addition to analysis of written responses to open-ended questions in the post-questionnaire. Twenty-six of the 201 participants in the study participated in the focus group discussions.

4.2. Student Population Demographics

Two hundred and one students participated in this study, which introduced the online engineering educational game Civil-Build (a pseudonym used to protect the identities of the students and instructors who participated in the study). Students were recruited using flyers and email advertisements in engineering classrooms and engineering student organizations. Students whose schedule availability fit the time frame of the study were selected to participate. Students who were enrolled in statics courses received extra credit for participation in the study, while those who were not engaged in a statics course at the time of the study received compensation for their participation.
The self-identified demographics of the students who participated in the study are provided in Table 3, where the percentages of men, women, non-binary, and other were 51%, 46%, 1%, and 1%, respectively. Eight of the 209 students declined to answer the question pertaining to gender. Students were also asked to indicate their race and/or ethnicity; participants self-identified as African American/Black (7%), Caucasian/White (32%), LatinX (8%), Asian (46%), Mixed Race (6%), and "other" (1%). The populations of students who reported their gender and race are provided in Table 3, where the data from the 8 students who did not disclose demographic information are excluded.

4.3. Online Engineering Game Description

The online engineering educational game used for this study was selected on the recommendation of instructors who have used it to supplement course lecture materials on the structural stability of truss structures, a topic covered in traditional undergraduate engineering mechanics (statics) courses. The game, Civil-Build, was also selected for this study because it has been used as an educational tool in an existing engineering statics course at the university when course curriculum and schedule allowed. This digital tool is also considered a gold standard among online games for engineering mechanics because it was designed by a professional engineering educator to help engineering students build intuition about how truss structures behave in real life and how they fail, through physics-based simulations. Engineering instructors who opted to use this tool in the classroom believe that it supports student learning of engineering statics and have used it to supplement course materials such as the textbook and in-class lectures. The software was used by instructors who have taught the course for at least 5 years and who have taught both honors and general population classes; these instructors encouraged and suggested its use for this study, having used the software for three years in an honors engineering mechanics class.
The software focuses on truss structural stability, which is covered in all undergraduate engineering mechanics courses, and is used not only by instructors at this university in the eastern region of the United States but also by instructors at institutions in the middle and western coastal states and at several universities in Europe. (The specific names of universities where this tool has been used have been intentionally excluded to protect the identity of the software creator and the university professors and students who presently use the software.) Unlike other civil engineering apps that illustrate structural failure using computerized animations from artists' renderings, this app uses finite strain theory in computational models to produce scientifically accurate dynamic visual representations of how structures physically respond to applied mechanical loads in real time. This software has been praised in ASEE's Prism magazine because it builds upon the visuospatial capabilities of students in real time, can be downloaded onto a smartphone, and is freely accessible for use in classrooms. Software like this addresses the limitations of static drawings in textbooks, as noted by other scholars [24,27,113]. The engineering software learning tool also enables the player to visualize material and geometric nonlinearities in addition to the dynamic movement of failed or compromised structures.
The goal of the game Civil-Build is to assist students in developing engineering intuition about truss structure behavior when subjected to loads. A representative rendering of the game interface is provided in Figure 2; the depiction was modified to keep the tool and its designer anonymous. The software tool is based on finite strain theory, which enables the user to visualize material and geometric nonlinearities and the dynamic movement of failed/compromised structures. Users play the game by positioning bars and joints on the screen to construct a truss structure that can support an external mass as well as the structure's own weight. The structure the player builds must consist of bars connected at joints. Players are rewarded with nuts and points based on their ability to create a structure of minimal weight and optimal structural stability, i.e., one that supports the given load while minimizing the support structure's overall weight. Participants move the bars and joints on the screen of the game interface while manipulating the weight of the truss and adjusting the thickness of the bars. Participants visualize their structure's success or failure in real time, as the structures visibly collapse or maintain their position once the truss structure is completed. The collapse of a structure is punctuated with clanging sounds associated with its destruction. The bars subjected to loading change color (shades of blue and red) to illustrate compression and tension, respectively. The tool is designed to teach students intuition about the relationship between truss structural design, material and geometric nonlinearities, and dynamic failure. The player progresses from Challenge 1 to Challenge 24, where each level increases in difficulty, and players cannot move to the next challenge without successfully completing the previous one. No gameplay instructions or supplemental resources are furnished in the game interface, although supplemental resources are available for download on the software website and via YouTube demonstrations. No supplemental supplies or instruction were provided as part of this study, in order to maintain the game designer's intent to teach engineering design intuition, i.e., apprehension or direct knowledge about a subject without instruction pertaining to the science or engineering governing the game.
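The game itself is a physics-based simulator built on finite strain theory, and its engine is not reproduced here. As a far simpler, purely illustrative sketch of the kind of member-force reasoning the game is meant to make intuitive, the example below solves static joint equilibrium for a symmetric two-bar truss carrying a hanging weight and shows how shallower bar angles drive up the axial force in each member; all names and values are hypothetical.

```python
# Minimal sketch (illustrative only; not the game's simulation engine): static
# equilibrium at the loaded joint of a symmetric two-bar truss. Both bars rise at
# the same angle above horizontal from the joint to fixed supports, so vertical
# equilibrium gives 2 * F * sin(theta) = W, and each bar carries tension F.
import math

def two_bar_member_force(weight_n: float, angle_deg: float) -> float:
    """Axial (tensile) force in each bar, in newtons, for a downward load weight_n."""
    theta = math.radians(angle_deg)
    return weight_n / (2.0 * math.sin(theta))

for angle in (60.0, 30.0, 10.0):
    force = two_bar_member_force(weight_n=1000.0, angle_deg=angle)
    print(f"bar angle {angle:4.1f} deg -> member force {force:7.1f} N")
# Shallower bars must carry much larger forces for the same hanging load, which is
# the kind of intuition the game's real-time visual feedback is intended to build.
```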

4.4. Statistical Analysis for Quantitative Data

Descriptive statistics (mean, standard deviation, frequency, and percentage) were computed for the entire group of participants as a whole. In addition, these variables were recorded as a function of self-identified identity so that they could be further correlated with participants' perceptions of the engineering learning game in terms of EVT, TAM, and GBL. Questions pertaining to usefulness were added to understand the relationship between EVT, TAM, GBL, and usefulness for engineering serious games that are designed for intuitive learning.
One-way analyses of variance (ANOVA) were performed to ascertain whether there were statistical differences between students according to demographics, experience with playing games (on the phone or computer), perceptions of the game being easy to use or useful, and frustration while playing the game. Two-way multivariate analyses of variance (MANOVA) were conducted to understand whether there were statistically significant interactions between subgroupings. The quantitative data from the questionnaire responses were triangulated with students' open-ended questionnaire responses pertaining to frustration and with the focus group questions.
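For readers unfamiliar with these procedures, the sketch below (hypothetical data and column names, not the study's dataset or analysis scripts) shows how a one-way ANOVA and a two-way MANOVA of the kind described above can be run with SciPy and statsmodels.

```python
# Minimal sketch (hypothetical data; not the study's analysis code): a one-way
# ANOVA on a single post-game item by gender, and a two-way MANOVA testing whether
# several items jointly differ by gender, prior phone gaming, and their interaction.
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

# Hypothetical coded responses (1 = Strongly Agree ... 7 = Strongly Disagree).
df = pd.DataFrame({
    "gender":      ["man", "woman", "man", "woman", "man", "woman", "man", "woman"],
    "plays_phone": ["yes", "yes", "no", "no", "yes", "no", "yes", "no"],
    "q2_easy":     [2, 3, 3, 4, 2, 5, 3, 4],
    "q8_enjoyed":  [1, 2, 2, 3, 2, 3, 1, 4],
})

# One-way ANOVA: does the "easy to play" rating differ by gender?
groups = [grp["q2_easy"].to_numpy() for _, grp in df.groupby("gender")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

# Two-way MANOVA: do the items jointly differ by gender, phone gaming, and interaction?
manova = MANOVA.from_formula("q2_easy + q8_enjoyed ~ gender * plays_phone", data=df)
print(manova.mv_test())
```

In the study itself, the dependent variables were the coded post-questionnaire items and the reported challenge level, examined against the demographic and prior-experience groupings described above.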

4.5. First Cycle Coding for Qualitative Data—Elemental Coding Approach (Textbox and Focus Group Responses)

An open categorical first cycle elemental coding approach [114] was employed in this study to identify explicit words, phrases, opinions, and experiences discussed in the selected text portions of the post-questionnaire and selected questions in the focus group. Two elemental coding approaches, structural and in vivo coding, were used to analyze the focus group transcriptions and the responses to Questions 12 and 13. For this work, six focus group discussions were transcribed and coded, in which a total of 28 students participated. The students who participated in the focus groups also participated in the first phase of the study, i.e., they completed both a pre- and post-questionnaire and were asked to join a focus group after they completed the post-questionnaire. Focus group size ranged from three to six participants for ease of recruitment and hosting and for participant comfort [115].

4.5.1. Elemental—Structural Coding for Qualitative Data (Textbox Responses)

The structural coding approach was applied in this exploratory investigation to ascertain major themes or sentiments of the participants. This process facilitated the categorization of the data to identify comparable commonalities, differences, and possible relationships [114]. For this element of the qualitative analysis, data from all 201 respondents were analyzed. It is important to note, however, that providing text responses to Questions 12 and 13 was optional for the respondents; not all of the 201 participants provided text responses to both questions, and 28 students participated in the focus group discussions. The number of responses provided to each question is reported in the Results section.

4.5.2. Elemental—In Vivo Coding for Qualitative Data (Focus Group Responses)

An in vivo coding approach (a form of elemental coding), described as "literal coding" or "verbatim coding", was selected for this first cycle of qualitative analysis of the transcribed focus group discussions because it prioritizes and honors the participant's voice [116]. The in vivo coding approach is also rooted in the Initial Coding employed in grounded theory [117] and is useful in educational ethnographies.

5. Results and Discussion

Two hundred and one students participated in the study. However, only 174 of the 201 respondents provided information pertaining to the challenge level that they achieved when playing the game. The 27 responses that did not report a challenge level were treated as outliers for that item, because this group of participants answered all of the pre- and post-game questions except the challenge level question.
The descriptive frequency counts and percentages for the pre-game questions are provided in Table 4. The results indicate that the majority of the students who participated in this study play video games on their phone and were currently taking engineering statics or dynamics, 70.1% and 71.6%, respectively. Slightly fewer students played video games on their computer (62.7%), and the majority of the students had never played the game Civil-Build prior to participating in this study (86.6%).

5.1. Descriptive Statistics for the Entire Population of Students

The statistical means and standard deviations for the post-game questionnaire for the entire population of student respondents are presented in Table 5. The responses for Questions 2 through 11 were based on a 7-point Likert scale. There were 24 challenge levels that participants could achieve, where a player could not proceed to a new challenge level without first successfully completing the prior one. On average, participants reached the 7th of the 24 challenge levels while playing for 20 min. Respondents "agreed" (mean rating between 2.0 and 2.5) with Q8, i.e., that they enjoyed playing the game. Respondents also indicated that they "somewhat agreed" (mean rating between 2.6 and 3.5) with Questions 2, 3, and 11, i.e., that the game was easy to play, that it was easy to understand, and that they got frustrated playing the game. It has been found that frustration can foster motivation in cognitive and emotional states while learning complex materials, and that one's cognitive and emotional states are related to the duration and frequency with which participants experience frustration [118,119]. Game-based learning recognizes the balance between gamer interaction with the learning tool and an adequate amount of failure that fosters learning. On the other hand, players neither agreed nor disagreed with Questions 6, 9, and 10, which means that they were not convinced that playing the game increased their confidence in their engineering skills, motivated them to advance to higher challenge levels, or that the learning lessons or goals were defined in enough detail to successfully play the game. These results are compelling because they illustrate the difficulties in establishing a balance between experiential intuitive-based learning and other schema for creating an interactive environment that fosters learning in engineering games.
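The interpretation bands used in this discussion (a mean between 2.0 and 2.5 read as "agreed", between 2.6 and 3.5 as "somewhat agreed", and so on) can be expressed as a simple lookup. The helper below is hypothetical and written only for illustration; band edges beyond those stated in the text are assumed to continue in one-point steps.

```python
# Minimal sketch (hypothetical helper, not the study's analysis code): translate a
# mean 7-point rating (1 = Strongly Agree ... 7 = Strongly Disagree) into the
# interpretive label used in the discussion above. Band edges outside the text's
# stated ranges (2.0-2.5 "agreed", 2.6-3.5 "somewhat agreed") are assumptions.
def interpret_mean_rating(mean_rating: float) -> str:
    bands = [
        (1.5, "strongly agreed"),
        (2.5, "agreed"),
        (3.5, "somewhat agreed"),
        (4.5, "neither agreed nor disagreed"),
        (5.5, "somewhat disagreed"),
        (6.5, "disagreed"),
    ]
    for upper_edge, label in bands:
        if mean_rating <= upper_edge:
            return label
    return "strongly disagreed"

print(interpret_mean_rating(2.3))  # -> "agreed" (e.g., Q8: enjoyed playing the game)
print(interpret_mean_rating(3.1))  # -> "somewhat agreed" (e.g., Q2: easy to play)
```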
The last two questions in the post-game questionnaire were added to explore how game players defined usefulness in educational games. The aggregate responses are provided in Table 6. From this table, it can be seen that the majority of the game players thought that the serious game should have more explanations about the challenge levels (78.1%), and nearly half of the students indicated that there should be opportunities for them to play against other players interactively. These findings suggest that students may have different expectations of engineering education serious games compared to games designed solely for entertainment purposes. Students also indicated that these serious games may have more use if they had more realistic-looking images, suggesting a need for these types of games to connect to the real world. Participants were also asked to explain what made the app a good learning tool; the results for this question are provided in Table 7. Of the choices provided, players indicated that the game was a fun alternative to traditional learning (lectures). Although students indicated that elements from the game could be applied to their class and taught them about truss stability, additional studies should include inquiry into the specifics that they learned and how these elements of the game were related to their course work. The results from Table 7 are further elaborated in the responses to the post-game questionnaire. Many students indicated that there was value in being able to see how trusses behaved physically in a dynamic game environment. For example, one student indicated that "it's not a complete alternative, but it helps create a visual understanding of trusses." Another student wrote, "With some more explanation in the game about the mathematical statics counterparts to the game experience, the game would provide knowledge, which could be applied to classes." This sentiment is supported by another comment from a student who wrote, "The simulations could be played in slow motion," which may indicate that there is value in being able to visualize the physical reaction of structures to forces in real time.

5.2. Analysis of Variance (ANOVA)

A one-way ANOVA was performed to examine how the responses to the post-game questions (dependent variables) varied as a function of gender (independent variable), as shown in Table 8. The confidence interval used was 95%. Significant differences between genders were found for post-game Questions 1, 10, and 11, with p equal to 0.00, 0.00, and 0.001, respectively. Men had higher scores on the learning game than women students (8.47 ± 2.9 and 6.30 ± 2.72, respectively). The observations for non-binary students and students who identify as "other" are statistically inconclusive due to the small number of students representing these populations in this study. Men found the game easier to play (2.70 ± 1.26) than women (3.36 ± 1.48). Men also agreed more with the statement that the learning lessons or goals of each challenge were defined in enough detail to play the game, with scores of 3.30 ± 1.44 and 4.48 ± 1.67 for men and women, respectively.
A one-way ANOVA was also performed to examine how participants' experiences with the serious game varied as a function of gaming experience on the phone, where 70.1% of the participants played games on their phone, as shown in Table 9. Students indicated that the games they played on their phones were entertainment games like Candy Crush, which require minimal transfer of learning and high hand-eye coordination. How students experienced elements of the serious game varied depending on whether they played entertainment games on their phone for post-game Questions 5, 6, 8, 9, and 10, with p values of 0.001, 0.021, 0.00, 0.006, and 0.011, respectively. In particular, students who indicated that they played games on their phones thought that the educational game helped them understand engineering truss structures more than those who did not play entertainment video games on their phones, with means of 2.58 ± 1.13 and 3.20 ± 1.30, respectively. Similarly, students who played games on their phones indicated more strongly that playing the game increased their confidence in their engineering skills, with means of 3.46 ± 1.40 and 3.98 ± 1.50 for those who did and did not play games on their phones, respectively.
In a similar way, a one-way ANOVA was performed to examine how participants experienced the serious game as a function of prior gaming experience on their computer. The questions that showed statistically significant differences are provided in Table 10. Notably, the responses that were significant for phone-app gameplay were not the same as those for computer gameplay. Students indicated that the games they played on their phone (Candy Crush, Solitaire, etc.) were entertainment and single-person games, whereas students who played video games on their computer (Fortnite, Minecraft, Valorant, and Call of Duty) typically engaged with games that required more strategy and "gamer skills". While some of the games that the participants listed as computer-based games are available as phone apps, the phone versions do not usually offer all of the features of the computer-based versions, such as third-party mods, connection to third-party servers, etc.
In addition, many players of computer-based games indicated that a different, more advanced set of skills was needed to excel at computer-based games as opposed to phone-based games, e.g., good communication skills (for multi-player platforms), reflex skills (hand-eye coordination), strategy, cooperation between teammates, etc. Those who played games on their computer rated the serious game as easier to play (2.86 ± 1.30) than those who did not (3.41 ± 1.50, p = 0.007). Those who played games on their computer also reached higher and more complex game challenges than their peers who did not, i.e., 7.96 ± 3.41 and 6.43 ± 2.80, respectively. In addition, those who played computer games had a more positive response to the game's usefulness than those who did not in terms of the game helping them understand engineering truss structures (2.60 ± 1.20 and 3.04 ± 1.30, p = 0.013), confidence in engineering skills (3.30 ± 1.40 and 4.00 ± 1.40, p = 0.002), and motivation to advance to higher challenge levels (3.60 ± 1.60 and 4.00, p = 0.047). Experienced computer gamers also indicated more strongly than students who did not play games on their computer that the game goals were defined in enough detail to excel in the game (3.6 ± 1.6 and 4.4 ± 1.5, p = 0.001). Finally, students who did not play games on their computer indicated that they were frustrated while playing to a higher degree than those who did (3.7 ± 1.7 and 3.0 ± 1.5, p = 0.013).
The findings pertaining to prior use of games on smart phones or computers are similar to those of Abramson et al. [120], who tested specific factors of behavioral intention to use m-learning in a community college using an extended TAM and prior experiences with mobile phones. They concluded that there is a relationship between prior use of e-learning and behavioral intention to use m-learning. Our results also resemble those of McFarland and Hamilton [57], who examined the influence of contextual specificity on technology acceptance and concluded that one's proclivity to use a computer system is strongly influenced by computer anxiety, prior experience, organizational support, and perceived usefulness. Our work further supports the conclusions of Venkatesh [56], who underscores the need for an increased focus on "individual difference variables" to foster acceptance and usage of information technologies, highlighting that this focus is needed rather than "over-emphasizing system-related perceptions and design characteristics as has been done".

5.3. Multivariate Analysis of Variance (MANOVA)

A multivariate analysis of variance was performed to understand the interrelationship between gender and playing games on one's phone with regard to the post-questionnaire responses. The results from this analysis are provided in Table 11. Though some researchers have concluded that those who have prior experience with playing video games accept and adapt to the use of games in the classroom environment, our results indicate that, amongst students who play phone apps, men achieve higher challenge levels than women, and even men who do not play video games on their phone achieve higher scores than women who do. In addition, men who do and do not play games on their phone found the engineering education serious game easier to play than women who either play or do not play games on their phone. Both women and men who played games on their phone agreed more with the statement that the engineering game helped them to understand engineering structures. This trend was the same for motivation and for the clarity of the learning lessons of the game for those who play games on their phone.
The sample sizes of students who identify as non-binary (N = 3) or other (N = 1) are small in comparison to the groups that identify as men or women. However, both of these groups achieved lower challenge levels in the game within the group of students who play video games on their phones. On the other hand, non-binary students who do not play video games on their phone achieved higher levels than women in that group. Also, non-binary students indicated that the game helped them learn about truss structures more than any of the other gender groups, as shown in Table 11. These findings are interesting, but larger sample sizes are needed to better understand the sentiments and perceptions of these students.
A multivariate analysis of variance was also performed to understand the interrelationship between gender and playing games on one's computer. The results from this analysis are provided in Table 12. Similar to the results shown in Table 11, men who did and did not play video games achieved higher challenge levels on the serious engineering game than women and non-binary students who did not play video games on their computer. Women who did not play video games disagreed more with the statement that the game was easy to play than those who did play video games. Interestingly, however, women who had experience playing video games agreed more than men who played video games with the statement that the game helped them to learn about truss structures. Hence, this population of students seemed to value their exposure to the game to a higher degree than their men counterparts, which may indicate differences in expectations of serious engineering learning games. Similarly, this population of women agreed more with the statement that playing the game increased their confidence in their engineering skills than women who did not play video games and men who played video games. This may indicate that women who enjoy playing video games for entertainment value the learning experience of engaging with a serious engineering game more than women who do not find such games entertaining in their spare time. Also, women participants who did and did not play video games disagreed more than their male counterparts with the statement that the goals of the engineering game were provided in enough detail to play the game. In a similar way, women indicated that the game was frustrating to a higher degree than their men counterparts. Understanding the role of frustration and student perceptions of value is important, as it has been found that the negative path between learning anxiety and self-efficacy is stronger for females than for males; in the case of a male and a female student with equally high levels of learning anxiety, the woman's self-efficacy would be lower [121].

5.4. Elemental—Structural and In Vivo Coding of Text Box Responses

5.4.1. Question 12: I Would Improve the Game by: __________

Text box responses for Questions 12 and 13 (Table 2, Table 6 and Table 7) were recorded and analyzed using a structural coding approach, where responses were examined as a function of gender (men, women, and non-binary). Forty-five participants provided text responses to Question 12, which asked how they would improve the serious educational game.
Of the 45 participants who responded to Question 12, nineteen were women. The majority of the women (53%, 10/19) indicated that they would recommend including an instructional tutorial to explain how the game worked. This theme was linked to the sentiment of five participants (5/19) who stated that they would have liked the game designer to provide hints explaining why a structure failed and where it failed. The next two primary themes focused on providing an explanation of why the structure failed (4/19) and how the game controls were operated (4/19). Two women indicated that they would have liked to understand how the game scoring worked and how they could increase the number of points for a better score. Quotes from textbox responses from women and men are provided in Table 13. In general, women's comments primarily focused on the provision of instruction and hints to learn how to use and play the game.
Slightly more men than women responded to Question 12, i.e., 22 out of the 45 who responded in the text box. Though several men indicated that they would prefer to have an explanation about how the game worked, they did not directly state whether this should be included within the game interface, as the women did; this is evidenced by how the men articulated the request for game instruction, with examples provided in Table 13. Only 2 of the 22 men asked for a "tutorial" (9%), and 3 asked for an explanation of the game and the game goals. While women primarily asked for a tutorial and better hints, men asked for a better user experience (7/22 men and 0/19 women), i.e., a "better UI (user interface)", easier ways to place components on the screen, etc. Men also asked for more technical content, i.e., numerical values for calculation, etc. (3/22), and more realistic structures (3/22 men and 0/19 women). These findings suggest that men and women appreciate and expect different things from a learning game.
All of the students that self-identified as non-binary provided a response to Question 12. One of the three respondents asked for a tutorial within the game interface, while another indicated that the game experience (screen appearance) could be improved (see Table 13 and Table 14).
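For illustration, gender-disaggregated theme counts such as those reported above (e.g., 10/19 women requesting a tutorial) could be tabulated from hand-coded responses roughly as follows. This is only a sketch with invented example rows; the actual structural codes were assigned manually during the qualitative analysis:

```python
# Sketch of tallying structural codes by gender (invented example rows).
import pandas as pd

coded = pd.DataFrame(
    [
        (1, "woman", "tutorial"), (1, "woman", "hints"),
        (2, "man", "better user interface"), (3, "man", "technical content"),
        (4, "non-binary", "tutorial"),
    ],
    columns=["participant", "gender", "code"],
)

# Number of distinct participants of each gender who mentioned each code.
theme_counts = (
    coded.drop_duplicates(["participant", "code"])
         .groupby(["gender", "code"])["participant"]
         .nunique()
         .unstack(fill_value=0)
)
print(theme_counts)
```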

5.4.2. Question 13: This Game Is a Good Learning Tool Because: __________

Eighteen of the participants provided textbox responses to Question 13, which asked the game player to explain why they thought the serious engineering game was (or was not) a good learning tool. Four of the eighteen respondents were women. All of the responses to Question 13 were "positive", where the respondent indicated a value associated with playing the game. In particular, women indicated that the tool was good because it extended their knowledge beyond equations and numbers and allowed them to view the physical responses of structures to loading conditions in real time, i.e., they could slow down or speed up the visualization. In other words, all four women indicated that there is value in being able to visualize the response of structures, which supported content covered in their textbook and supplemented static drawings with dynamic representations of truss responses to various loading conditions and truss designs. Examples of this feedback are provided in Table 14.
Six men and no non-binary students provided text feedback to Question 13. Only one of the men indicated that he liked the visualization (simulation) of the dynamic events pertaining to the truss structure (see Table 14). Two other participants indicated that the game was not a learning resource because it did not provide meaningful explanations to support the simulated failures/successes of the structures. In particular, these students indicated that software incorporated as a supplemental learning tool should provide knowledge that can be applied to content provided in the class materials. On the other hand, two men indicated that they liked the fact that the game encouraged them to try multiple design options, as "there was no punishment for failure," and it was "less stressful". Another man indicated that the game was reminiscent of games he played when he was younger. None of the women indicated in the textbox responses that they had played similar games in their childhood or linked the game to a positive prior learning/entertainment experience. These responses support the post-questionnaire feedback, where men who played video games on their phone did not agree as strongly as women who played video games with the statement that the game helped them to learn engineering concepts.

5.5. Focus Group Responses

Twenty-six participants engaged in the focus group discussion described in Table 2. The two focus group questions were: Would you use this game to prepare for a job interview? Would you use this game to prepare for an exam? The responses to these questions can be grouped into four primary themes that help describe how students interpret and define usefulness in terms of engineering learning games.
Nineteen of the twenty-six participants indicated that they would not use the engineering software to prepare for an interview, while the remaining participants stated "maybe" or "not sure". The seven who were unsure said that they would consider using the tool if they were interviewing with a company that designed bridge or truss structures. Hence, games that address content-specific areas of expertise are deemed useful if they allow the student to obtain concrete skills that can be applied to an engineering career path. Tools that rely on the user to indirectly develop intuitive skills, without eventually leading to a direct explanation of results, are deemed less beneficial because students' time is valuable.
Fourteen of the participants indicated that they would not use the game to prepare for an exam. The remaining students did not directly state that they would not use the game, but instead provided explanations of when it would be appropriate to use the game or how it could be used within a classroom setting. This feedback is summarized in the four themes provided below.

5.5.1. Theme 1: Serious Engineering Games Should Strengthen Knowledge of Core Course Content

The majority of the statics coursework consisted of using equations of equilibrium and algebra to solve for forces and moments acting on structures in two or three dimensions. Hence, five students indicated that this tool was not helpful because it did not relate to or support the methods (numerical calculations) used in the course to solve problems; for example, one student stated that in class "we are given predefined things". Another student stated, "I would not use it to study just because, like, tests are geared towards problems and these problems, like, the challenges in the game aren't really relative to the problems that we see on tests." Also, because the game did not include explanations for failed structures or hints at how to improve structures, eight students indicated that tools that do not explain why a structure fails are not as effective at teaching complex topics. Those who stated that they would not use the tool for a job interview indicated that this was because it did not help them understand why their structures failed, and they would need to understand the science and engineering behind failed structures to perform their jobs. One woman from this group indicated that she would not play the game for an interview, even if the instructor recommended it as an extra resource, because she did not like video games. These sentiments pair with and support the post-questionnaire textbox responses.
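For context, the hand calculations the students refer to rest on the planar equilibrium conditions of statics; the following is a generic illustration, not drawn from the course materials:

```latex
% Planar equilibrium conditions used in hand calculations (generic form):
\[ \sum F_x = 0, \qquad \sum F_y = 0, \qquad \sum M_O = 0 \]
% For example, a pin joint carrying a downward load P supported by two symmetric
% members inclined at angle \theta to the horizontal gives member forces
\[ F_1 = F_2 = \frac{P}{2\sin\theta} \]
```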

5.5.2. Theme 2: Serious Engineering Games That Do Not Pair with Math and Science Fundamentals Are for Novice Learners

Many students indicated that while the game did not advance their knowledge of truss structures enough to complement their learning, people new to statics (without prior engineering experiences/courses) may value learning the intuitive aspects of truss design. Several (9/26 participants) suggested that the game may be better incorporated into the statics course at the beginning of the term, prior to when students learn about truss structures.

5.5.3. Theme 3: Homework and Supplemental Materials Should Reflect Real-World Problems

Students also indicated that homework assignments (4/26 participants), and by extension serious engineering games, should reflect real-world problems. Several noted that while this software allowed for dynamic analysis of designs, it was not detailed enough to reflect the actual types of software used by industry that would relate to the hand calculations they perform in class (3/26) or allow for more detailed input of parameters and output of values (5/26). Hence, while the software the students used did include calculations based on real engineering principles, students did not interpret it as firmly rooted in theory because it did not allow them to input specific numerical values. In fact, some students indicated that they would rather use other bridge-building software packages which (unbeknownst to them) did not adhere to all theoretical laws of physics or finite strain theory but included better graphics and 3-dimensional images that looked more representative of actual bridge and truss structures. These other software packages provided numerical feedback, though it was not always rooted in theoretically sound calculations, along with visual images of failed structures. Instead of weights used as loads, actual vehicles and material selections were available, though the material selections did not result in actual changes in the structural outcome.
Of the 26 focus group members, only one man affirmatively indicated that he would use the tool because it helped him quickly assess aspects of a bridge or truss structure without having to do calculations during the interview.

5.5.4. Theme 4: Software Presented a New Way of Learning the Material

Four students stated that while the engineering software did not add to their knowledge of statics, playing the game revealed a gap in their coursework: the ability to design one's own structure and examine its outcome when subjected to a mechanical load. These students stated that their present homework did not reflect real-world problems as well as they had expected. Their homework also primarily focused on examining pre-designed structures and rarely asked them to attempt designs and engineering of their own. Hence, for these students, there was value associated with the process of designing structures on their own and testing them.

6. Conclusions and Future Work

A mixed-method sequential exploratory research design was used to examine how a diverse population of students perceived the usefulness of a casual, intuitive education game designed to teach players about the engineering mechanics and stability needed to build truss structures. The study also focused on elucidating how students with varying experience with gameplay on their phones and computers interpret the value of the serious intuitive engineering game. Elements from the technology acceptance model, game-based learning theory, and expectancy value theory were incorporated into the pre- and post-questionnaires for this study, in addition to several questions posed to garner student perceptions of game usefulness, which we assert relates to the enhancement of knowledge towards mastery of course content and preparation for a professional engineering job interview.
Our results indicate that students have higher expectations of casual intuitive engineering learning games than of games played for entertainment. In particular, students expect game interfaces and design schema to include explanations of failed or unsuccessful attempts that incorporate technical learning concepts. Game players who played games on their computer achieved higher game challenge levels than students who did not play games on their computers, and these students also indicated higher levels of pleasure while playing the engineering game than those who did not play games on their computer. There were also differences in how students experienced the game in terms of gender. Men who did and did not play games on their computer indicated that the game was easy to play to a higher degree than women. Women indicated higher frustration while playing the game than men, but women who played games on their phone stated that they learned more about truss structures than men. These findings were supported by focus group and textbox coding analyses, where more women indicated a desire for game explanations than men.
Though the sample sizes of students who did not identify as men or women were small, some preliminary trends were observed. Non-binary students thought that the engineering game helped them understand truss structures more than men and women did. These students also indicated that they liked playing the game, Civil-Build, more than their men and women counterparts did.
Few students thought the game would be useful for exam preparation because the game did not directly align with course homework problems, lecture materials, or educational philosophy. Hence, students expect direct ties between supplemental learning tools and course content and desire educational tools that reflect aspects of real-world engineering problems. The majority of students did not believe the tool to be useful in preparing for a job interview because it did not include tangible numbers and calculations, which were regularly needed for homework assignments. Also, because the game interface did not render real-world visualizations of engineering problems, most did not consider it a good tool for job preparation. Several students thought that the ability to design their own structures and observe their failure in real time was a valuable asset in understanding how truss structures failed.
There are several limitations to the findings of this research. For example, the number of students who identified as "other" and "non-binary" was very small. Hence, no statistically significant conclusions regarding these groups can be made. The findings of this study would be enhanced with additional participants of diverse backgrounds. Also, the reliability of the method for objectively predicting students' perceived ease of use and usefulness would be strengthened with a more comprehensive assessment of other engineering mechanics games. However, very few engineering mechanics games present physical structural responses that are rooted in authentic engineering principles and theory, and as such, this work serves as one of the early studies to explore these experiences.
The role of prior experience playing engineering games, and the types of games played, should be studied further to understand their relationship to student affinity for or appreciation of engineering mechanics games. In addition, analysis of the relationship between students' perceived ease of use and usefulness and their academic performance in the course is needed.
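Such a follow-up analysis could, for example, relate the perceived ease-of-use and usefulness items to course grades with a simple correlation. The following is a minimal sketch with hypothetical column names, describing a proposed analysis rather than one performed in this study:

```python
# Sketch of the proposed follow-up analysis: correlating perceived ease of use /
# usefulness with course performance (hypothetical columns; not performed here).
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("responses_with_grades.csv")  # hypothetical merged data set

for item in ["ease_of_use", "usefulness"]:
    r, p = pearsonr(df[item], df["course_grade"])
    print(f"{item}: r = {r:.2f}, p = {p:.3f}")
```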

Author Contributions

Conceptualization, K.C.-C.; methodology, K.C.-C. and I.V.A.; software; formal analysis, K.C.-C. and G.J.; investigation, K.C.-C.; resources, K.C.-C. and I.V.A.; data curation, K.C.-C.; writing—original draft preparation, K.C.-C.; writing—review and editing, K.C.-C. and I.V.A.; visualization; supervision, K.C.-C. and I.V.A.; project administration, K.C.-C.; funding acquisition, K.C.-C. and I.V.A. All authors have read and agreed to the published version of the manuscript.

Funding

This material is based upon work supported by the National Science Foundation under grant no. EEC-1830812 and EEC-1830788/2113739.

Institutional Review Board Statement

The study was conducted in accordance with and approved by the Institutional Review Board of Rutgers, the State University of New Jersey, New Brunswick (protocol code: Pro2018000499, date approved: 8/9/2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available on request, but demographic data will be redacted in some instances due to restrictions needed to maintain participant privacy and anonymity.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Favis, E. With Coronavirus Closing Schools, Here’s How Video Games Are Helping Teachers. The Washington Post. 15 April 2020. Available online: https://www.washingtonpost.com/video-games/2020/04/15/teachers-video-games-coronavirus-education-remote-learning/ (accessed on 4 December 2021).
  2. Gamage, K.A.A.; Wijesuriya, D.I.; Ekanayake, S.Y.; Rennie, A.E.W.; Lambert, C.G.; Gunawardhana, N. Online Delivery of Teaching and Laboratory Practices: Continuity of University Programmes during COVID-19 Pandemic. Educ. Sci. 2020, 10, 291. [Google Scholar] [CrossRef]
  3. Al Mulhem, A.; Almaiah, M.A. A Conceptual Model to Investigate the Role of Mobile Game Applications in Education during the COVID-19 Pandemic. Electronics 2021, 10, 2106. [Google Scholar] [CrossRef]
  4. Fontana, M.T. Gamification of ChemDraw during the COVID-19 Pandemic: Investigating How a Serious, Educational-Game Tournament (Molecule Madness) Impacts Student Wellness and Organic Chemistry Skills while Distance Learning. J. Chem. Educ. 2020, 97, 3358–3368. [Google Scholar] [CrossRef]
  5. Krouska, A.; Troussas, C.; Sgouropoulou, C. Mobile game-based learning as a solution in COVID-19 era: Modeling the pedagogical affordance and student interactions. Educ. Inf. Technol. 2021, 1–13. [Google Scholar] [CrossRef]
  6. Yang, X.; Zhang, M.; Kong, L.; Wang, Q.; Hong, J.-C. The Effects of Scientific Self-efficacy and Cognitive Anxiety on Science Engagement with the “Question-Observation-Doing-Explanation” Model during School Disruption in COVID-19 Pandemic. J. Sci. Educ. Technol. 2020, 30, 380–393. [Google Scholar] [CrossRef] [PubMed]
  7. Bertram, L. Digital Learning Games for Mathematics and Computer Science Education: The Need for Preregistered RCTs, Standardized Methodology, and Advanced Technology. Front. Psychol. 2020, 11, 2127. [Google Scholar] [CrossRef]
  8. Hosseini, H.; Hartt, M.; Mostafapour, M. Learning Is Child’s Play: Game-Based Learning in Computer Science Education. ACM Trans. Comput. Educ. 2019, 19, 1–18. [Google Scholar] [CrossRef]
  9. Alkis, N.; Ozkan, S. Work in Progress—A Modified Technology Acceptance Model for E-Assessment: Intentions of Engineering Students to Use Web-Based Assessment Tools. In Proceedings of the 2010 IEEE Frontiers in Education Conference, Arlington, VA, USA, 27–30 October 2010. [Google Scholar]
  10. Adams, D.M.; Pilegard, C.; Mayer, R.E. Evaluating the Cognitive Consequences of Playing Portal for a Short Duration. J. Educ. Comput. Res. 2015, 54, 173–195. [Google Scholar] [CrossRef]
  11. Barreto, F.; Benitti, V.; Sommariva, L. Evaluation of a Game Used to Teach Usability to Undergraduate Students in Computer Science. J. Usability Stud. 2015, 11, 21–39. [Google Scholar]
  12. Sivak, S.; Sivak, M.; Isaacs, J.; Laird, J.; McDonald, A. Managing the Tradeoffs in the Digital Transformation of an Educational Board Game to a Computer-based Simulation. In Proceedings of the Sandbox Symposium 2007: Acm Siggraph Video Game Symposium, San Diego, CA, USA, 4–5 August 2007; pp. 97–102. [Google Scholar]
  13. Lee, Y.-H.; Dunbar, N.; Kornelson, K.; Wilson, S.; Ralston, R.; Savic, M.; Stewart, S.; Lennox, E.; Thompson, W.; Elizondo, J. Digital Game based Learning for Undergraduate Calculus Education: Immersion, Calculation, and Conceptual Understanding. Int. J. Gaming Comput.-Mediat. Simul. 2016, 8, 13–27. [Google Scholar] [CrossRef] [Green Version]
  14. Bian, T.; Zhao, K.; Meng, Q.; Jiao, H.; Tang, Y.; Luo, J. Preparation and properties of calcium phosphate cement/small intestinal submucosa composite scaffold mimicking bone components and Haversian microstructure. Mater. Lett. 2018, 212, 73–77. [Google Scholar] [CrossRef]
  15. Abt, C.C. Serious Games; University Press of America: Lanham, MD, USA, 1987. [Google Scholar]
  16. Djaouti, D.; Alvarez, J.; Jessel, J.-P.; Rampnoux, O. Origins of Serious Games. In Serious Games and Edutainment Applications; Ma, M., Oikonomou, A., Jain, L.C., Eds.; Springer: London, UK, 2011. [Google Scholar]
  17. Gampell, A.; Gaillard, J.C.; Parsons, M.; Le Dé, L. ‘Serious’ Disaster Video Games: An Innovative Approach to Teaching and Learning about Disasters and Disaster Risk Reduction. J. Geogr. 2020, 119, 159–170. [Google Scholar] [CrossRef]
  18. Gampell, A.V.; Gaillard, J.C.; Parsons, M.; Le Dé, L. Exploring the use of the Quake Safe House video game to foster disaster and disaster risk reduction awareness in museum visitors. Int. J. Disaster Risk Reduct. 2020, 49, 101670. [Google Scholar] [CrossRef]
  19. Hof, B. The turtle and the mouse: How constructivist learning theory shaped artificial intelligence and educational technology in the 1960s. Hist. Educ. 2020, 50, 93–111. [Google Scholar] [CrossRef]
  20. Shaffer, D.W.; Squire, K.R.; Halverson, R.; Gee, J.P. Video Games and the Future of Learning. Phi Delta Kappan 2005, 87, 105–111. [Google Scholar] [CrossRef]
  21. Plass, J.L.; Perlin, K.; Nordlinger, J. The games for learning institute: Research on design patterns for effective educational games. In Proceedings of the Game Developers Conference, San Francisco, CA, USA, 9 March 2010. [Google Scholar]
  22. Chang, Y.; Aziz, E.-S.; Zhang, Z.; Zhang, M.; Esche, S.K. Evaluation of a video game adaptation for mechanical engineering educational laboratories. In Proceedings of the 2016 IEEE Frontiers in Education Conference, Erie, PA, USA, 12–15 October 2016. [Google Scholar]
  23. Coller, B.; Scott, M. Effectiveness of using a video game to teach a course in mechanical engineering. Comput. Educ. 2009, 53, 900–912. [Google Scholar] [CrossRef]
  24. Coller, B.D. A Video Game for Teaching Dynamic Systems & Control to Mechanical Engineering Undergraduates. In Proceedings of the 2010 American Control Conference, Baltimore, MD, USA, 30 June–2 July 2010; pp. 390–395. [Google Scholar]
  25. Jiau, H.C.; Chen, J.C.; Ssu, K.-F. Enhancing Self-Motivation in Learning Programming Using Game-Based Simulation and Metrics. IEEE Trans. Educ. 2009, 52, 555–562. [Google Scholar] [CrossRef] [Green Version]
  26. Jong, B.-S.; Lai, C.-H.; Hsia, Y.-T.; Lin, T.-W.; Lu, C.-Y. Using Game-Based Cooperative Learning to Improve Learning Motivation: A Study of Online Game Use in an Operating Systems Course. IEEE Trans. Educ. 2012, 56, 183–190. [Google Scholar] [CrossRef]
  27. Coller, B.D.; Shernoff, D.J. Video Game-Based Education in Mechanical Engineering: A Look at Student Engagement. Int. J. Eng. Educ. 2009, 25, 308–317. [Google Scholar]
  28. Landers, R.N.; Callan, R.C. Casual Social Games as Serious Games: The Psychology of Gamification in Undergraduate Education and Employee Training. In Serious Games and Edutainment Applications; Ma, M., Oikonomou, A., Jain, L.C., Eds.; Springer: London, UK, 2011; pp. 399–423. [Google Scholar]
  29. Orvis, K.A.; Horn, D.B.; Belanich, J. The roles of task difficulty and prior videogame experience on performance and motivation in instructional videogames. Comput. Hum. Behav. 2008, 24, 2415–2433. [Google Scholar] [CrossRef]
  30. Davis, F.D. Perceived Usefulness, Perceived Ease of Use, And User Acceptance of Information Technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef] [Green Version]
  31. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer-technology—A comparison of 2 theoretical-models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef] [Green Version]
  32. Davis, F.D. User acceptance of information technology: System characteristics, user perceptions and behavioral impacts. Int. J. Man-Mach. Stud. 1993, 38, 475–487. [Google Scholar] [CrossRef] [Green Version]
  33. Venkatesh, V.; Davis, F.D. A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies. Manag. Sci. 2000, 46, 186–204. [Google Scholar] [CrossRef] [Green Version]
  34. Aydin, G. Effect of Demographics on Use Intention of Gamified Systems. Int. J. Technol. Hum. Interact. 2018, 14, 1–21. [Google Scholar] [CrossRef]
  35. Mi, Q.; Keung, J.; Mei, X.; Xiao, Y.; Chan, W.K. A Gamification Technique for Motivating Students to Learn Code Readability in Software Engineering. In Proceedings of the 2018 International Symposium on Educational Technology, Osaka, Japan, 31 July–2 August 2018; pp. 250–254. [Google Scholar]
  36. Rönnby, S.; Lundberg, O.; Fagher, K.; Jacobsson, J.; Tillander, B.; Gauffin, H.; Hansson, P.-O.; Dahlström, O.; Timpka, T.; Bolling, C.; et al. mHealth Self-Report Monitoring in Competitive Middle- and Long-Distance Runners: Qualitative Study of Long-Term Use Intentions Using the Technology Acceptance Model. JMIR mHealth uHealth 2018, 6, e10270. [Google Scholar] [CrossRef]
  37. Estriegana, R.; Medina-Merodio, J.-A.; Barchino, R. Student acceptance of virtual laboratory and practical work: An extension of the technology acceptance model. Comput. Educ. 2019, 135, 1–14. [Google Scholar] [CrossRef]
  38. Eccles, J. Achievement and Achievement Motives; W. H. Freeman: San Francisco, CA, USA, 1983. [Google Scholar]
  39. Wigfield, A. Expectancy-value theory of achievement motivation: A developmental perspective. Educ. Psychol. Rev. 1994, 6, 49–78. [Google Scholar] [CrossRef]
  40. Winston, C. The utility of expectancy/value and disidentification models for understanding ethnic group differences in academic performance and self-esteem. Z. Fur Padag. Psychol. 1997, 11, 177–186. [Google Scholar]
  41. Eccles, J.; Wigfield, A.; Harold, R.D.; Blumenfeld, P. Age and Gender Differences in Children’s Self and Task Perceptions During Elementary-School. Child Dev. 1993, 64, 830–847. [Google Scholar] [CrossRef]
  42. Piaget, J. Play, Dreams and Imitation in Childhood; Norton: New York, NY, USA, 1962. [Google Scholar]
  43. Shaffer, D.W. Epistemic frames for epistemic games. Comput. Educ. 2006, 46, 223–234. [Google Scholar] [CrossRef]
  44. Plass, J.L.; Moreno, R.; Brünken, R. Cognitive Load Theory; Cambridge University Press: Cambridge, UK, 2010. [Google Scholar]
  45. Plass, J.L.; Homer, B.D.; Kinzer, C.K. Foundations of Game-Based Learning. Educ. Psychol. 2015, 50, 258–283. [Google Scholar] [CrossRef]
  46. Handriyantini, E.; Subari, D. Development of a Casual Game for Mobile Learning With the Kiili Experiential Gaming Model. In Proceedings of the 11th European Conference on Games Based Learning, Graz, Austria, 5–6 October 2017; pp. 213–218. [Google Scholar]
  47. Cheng, K.W. Casual Gaming; Vrije Universiteit: Amsterdam, The Netherlands, 2011. [Google Scholar]
  48. Elias, H. First Person Shooter—The Subjective Cyberspace; LabCom: Covilha, Portugal, 2009. [Google Scholar]
  49. Wardrip-Fruin, N.; Harrigan, P. Second Person: Role-Playing and Story in Games and Playable Media; MIT Press: Cambridge, MA, USA, 2007. [Google Scholar]
  50. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development 1984; Prentice-Hal: Englewood Cliffs, NJ, USA, 1984. [Google Scholar]
  51. Dunwell, I.; Petridis, P.; Hendrix, M.; Arnab, S.; Al-Smadi, M.; Guetl, C. Guiding Intuitive Learning in Serious Games: An Achievement-Based Approach to Externalized Feedback and Assessment. In Proceedings of the 2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems, Palermo, Italy, 4–6 July 2012; pp. 911–916. [Google Scholar] [CrossRef]
  52. Dondlinger, M. Educational Video Game Design: A Review of the Literature. J. Appl. Educ. Technol. 2007, 4, 21–31. [Google Scholar]
  53. ISO 9241-210:2019(en). Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems; International Organization for Standardization: Geneva, Switzerland, 2010; Available online: https://www.iso.org/standard/77520.html (accessed on 1 December 2021).
  54. Hornbæk, K.; Hertzum, M. Technology Acceptance and User Experience. ACM Trans. Comput.-Hum. Interact. 2017, 24, 1–30. [Google Scholar] [CrossRef]
  55. Aranyi, G.; van Schaik, P. Modeling user experience with news websites. J. Assoc. Inf. Sci. Technol. 2015, 66, 2471–2493. [Google Scholar] [CrossRef] [Green Version]
  56. Venkatesh, V. Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation, and Emotion into the Technology Acceptance Model. Inf. Syst. Res. 2000, 11, 342–365. [Google Scholar] [CrossRef] [Green Version]
  57. McFarland, D.J.; Hamilton, D. Adding contextual specificity to the technology acceptance model. Comput. Hum. Behav. 2006, 22, 427–447. [Google Scholar] [CrossRef]
  58. Gomez-Jauregui, V.; Manchado, C.; Otero, C. Gamification in a Graphical Engineering course—Learning by playing. In Proceedings of the International Joint Conference on Mechanics, Design Engineering and Advanced Manufacturing (JCM), Catania, Italy, 14–16 September 2016. [Google Scholar]
  59. Ismail, N.; Ayub, A.F.M.; Yunus, A.S.M.; Ab Jalil, H. Utilising CIDOS LMS in Technical Higher Education: The Influence of Compatibility Roles on Consistency of Use. In Proceedings of the 2nd International Research Conference on Business and Economics (IRCBE), Semarang, Indonesia, 3–4 August 2016. [Google Scholar]
  60. Janssen, D.; Schilberg, D.; Richert, A.; Jeschke, S. Pump it up!—An Online Game in the Lecture “Computer Science in Mechanical Engineering”. In Proceedings of the 8th European Conference on Games Based Learning (ECGBL), Berlin, Germany, 9–10 October 2014. [Google Scholar]
  61. Tummel, C.; Richert, A.; Schilberg, D.; Jeschke, S. Pump It up!—Conception of a Serious Game Applying in Computer Science. In Proceedings of the 2nd International Conference on Learning and Collaboration Technologies/17th International Conference on Human-Computer Interaction, Los Angeles, CA, USA, 2–7 August 2015. [Google Scholar]
  62. Lenz, L.; Stehling, V.; Richert, A.; Isenhardt, I.; Jeschke, S. Of Abstraction and Imagination: An Inventory-Taking on Gamification in Higher Education. In Proceedings of the 11th European Conference on Game-Based Learning (ECGBL), Graz, Austria, 5–6 October 2017. [Google Scholar]
  63. Martinetti, A.; Puig, J.E.P.; Alink, C.O.; Thalen, J.; Van Dongen, L.A. Gamification in teaching Maintenance Engineering: A Dutch experience in the rolling stock management learning. In Proceedings of the 3rd International Conference on Higher Education Advances (HEAd), Valencia, Spain, 21–23 June 2017. [Google Scholar]
  64. Menandro, F.C.M.; Arnab, S. Game-Based Mechanical Engineering Teaching and Learning—A Review. Smart Sustain. Manuf. Syst. 2020, 5, 45–59. [Google Scholar] [CrossRef]
  65. Shernoff, D.; Ryu, J.-C.; Ruzek, E.; Coller, B.; Prantil, V. The Transportability of a Game-Based Learning Approach to Undergraduate Mechanical Engineering Education: Effects on Student Conceptual Understanding, Engagement, and Experience. Sustainability 2020, 12, 6986. [Google Scholar] [CrossRef]
  66. Alanne, K. An overview of game-based learning in building services engineering education. Eur. J. Eng. Educ. 2015, 41, 204–219. [Google Scholar] [CrossRef]
  67. Dinis, F.M.; Guimaraes, A.S.; Carvalho, B.R.; Martins, J.P.P. Development of Virtual Reality Game-Based interfaces for Civil Engineering Education. In Proceedings of the 8th IEEE Global Engineering Education Conference (EDUCON), Athens, Greece, 25–28 April 2017. [Google Scholar]
  68. Ebner, M.; Holzinger, A. Successful implementation of user-centered game based learning in higher education: An example from civil engineering. Comput. Educ. 2007, 49, 873–890. [Google Scholar] [CrossRef]
  69. Herrera, R.F.; Sanz, M.A.; Montalbán-Domingo, L.; García-Segura, T.; Pellicer, E. Impact of Game-Based Learning on Understanding Lean Construction Principles. Sustainability 2019, 11, 5294. [Google Scholar] [CrossRef] [Green Version]
  70. Taillandier, F.; Micolier, A.; Sauce, G.; Chaplain, M. DOMEGO: A Board Game for Learning How to Manage a Construction Project. Int. J. Game-Based Learn. 2021, 11, 20–37. [Google Scholar] [CrossRef]
  71. Tsai, M.-H.; Wen, M.-C.; Chang, Y.-L.; Kang, S.-C. Game-based education for disaster prevention. AI Soc. 2014, 30, 463–475. [Google Scholar] [CrossRef]
  72. Zechner, J.; Ebner, M. Playing a Game in Civil Engineering The Internal Force Master for Structural Analysis. In Proceedings of the 14th International Conference on Interactive Collaborative Learning (ICL)/11th International Conference on Virtual-University (VU), Piestany, Slovakia, 21–23 September 2011. [Google Scholar]
  73. Carranza, D.B.; Negron, A.P.P.; Contreras, M. Teaching Approach for the Development of Virtual Reality Videogames. In Proceedings of the International Conference on Software Process Improvement (CIMPS), Leon, Mexico, 23–25 October 2019; pp. 276–288. [Google Scholar]
  74. Carranza, D.B.; Negrón, A.P.P.; Contreras, M. Videogame development training approach: A Virtual Reality and open-source perspective. JUCS-J. Univers. Comput. Sci. 2021, 27, 152–169. [Google Scholar] [CrossRef]
  75. Cengiz, M.; Birant, K.U.; Yildirim, P.; Birant, D. Development of an Interactive Game-Based Learning Environment to Teach Data Mining. Int. J. Eng. Educ. 2017, 33, 1598–1617. [Google Scholar]
  76. Sarkar, M.S.; William, J.H. Digital Democracy: Creating an Online Democracy Education Simulation in a Software Engineering Class. In Proceedings of the 48th ACM Annual Southeast Regional Conference (SE), New York, NY, USA, 15–17 April 2010; pp. 283–286. [Google Scholar]
  77. Kodappully, M.; Srinivasan, B.; Srinivasan, R. Cognitive Engineering for Process Safety: Effective Training for Process Operators Using Eye Gaze Patterns. In Proceedings of the 26th European Symposium on Computer Aided Process Engineering (ESCAPE), Portoroz, Slovenia, 12–15 June 2016; pp. 2043–2048. [Google Scholar]
  78. Llanos, J.; Fernández-Marchante, C.M.; García-Vargas, J.M.; Lacasa, E.; de la Osa, A.R.; Sanchez-Silva, M.L.; De Lucas-Consuegra, A.; Garcia, M.T.; Borreguero, A.M. Game-Based Learning and Just-in-Time Teaching to Address Misconceptions and Improve Safety and Learning in Laboratory Activities. J. Chem. Educ. 2021, 98, 3118–3130. [Google Scholar] [CrossRef]
  79. Tarres, J.A.; Oliver, H.; Delgado-Aguilar, M.; Perez, I.; Alcala, M. Apps as games to help students in their learning process. In Proceedings of the 9th International Conference on Education and New Learning Technologies (EDULEARN), Barcelona, Spain, 3–5 July 2017; pp. 203–209. [Google Scholar]
  80. Wun, K.P.; Harun, J. The Effects of Scenario-Epistemic Game Approach on Professional Skills and Knowledge among Chemical Engineering Students. In Proceedings of the 6th IEEE Conference on Engineering Education (ICEED), Kuala Lumpur, Malaysia, 9–10 December 2014; pp. 90–94. [Google Scholar]
  81. Cohen, M.A.; Niemeyer, G.O.; Callaway, D.S. Griddle: Video Gaming for Power System Education. IEEE Trans. Power Syst. 2016, 32, 3069–3077. [Google Scholar] [CrossRef]
  82. Bilge, P.; Severengiz, M. Analysis of industrial engineering qualification for the job market. In Proceedings of the 16th Global Conference on Sustainable Manufacturing (GCSM), Lexington, KY, USA, 2–4 October 2018; pp. 725–731. [Google Scholar]
  83. Despeisse, M.; IEEE. Games and simulations in industrial engineering education: A review of the cognitive and affective learning outcomes. In Proceedings of the Winter Simulation Conference (WSC), Gothenburg, Sweden, 9–12 December 2018; pp. 4046–4057. [Google Scholar]
  84. Galleguillos, L.; Santelices, I.; Bustos, R. Designing a board game for industrial engineering students a collaborative work experience of freshmen. In Proceedings of the 13th International Technology, Education and Development Conference (INTED), Valencia, Spain, 11–13 March 2019; pp. 138–144. [Google Scholar]
  85. Surinova, Y.; Jakabova, M.; Kosnacova, P.; Jurik, L.; Kasnikova, K. Game-based learning of workplace standardizations basics. In Proceedings of the 8th International Technology, Education and Development Conference (INTED), Valencia, Spain, 10–12 March 2014; pp. 3941–3948. [Google Scholar]
  86. Pierce, T.; Madani, K. Online gaming for sustainable common pool resource management and tragedy of the commons prevention. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (SMC), Manchester, UK, 13–16 October 2013; pp. 1765–1770. [Google Scholar]
  87. Sobke, H.; Hauge, J.B.; Stefan, I.A.; Stefan, A. Using a Location-Based AR Game in Environmental Engineering. In Proceedings of the 1st IFIP TC 14 Joint International Conference on 18th IFIP International Conference on Entertainment Computing (IFIP-ICEC)/5th International Joint Conference on Serious Games (JCSG), Arequipa, Peru, 11–15 November 2015; pp. 466–469. [Google Scholar]
  88. Frøland, T.H.; Heldal, I.; Sjøholt, G.; Ersvær, E. Games on Mobiles via Web or Virtual Reality Technologies: How to Support Learning for Biomedical Laboratory Science Education. Information 2020, 11, 195. [Google Scholar] [CrossRef] [Green Version]
  89. Butler, W.M. An Experiment in Live Simulation-Based Learning in Aircraft Design and its Impact on Student Preparedness for Engineering Practice. In Proceedings of the ASEE Annual Conference, Atlanta, GA, USA, 23–26 June 2013. [Google Scholar]
  90. Roy, J. Engineering by the Numbers: ASEE Retention and Time-to-Graduation Benchmarks for Undergraduate Schools, Departments and Program; American Society for Engineering Education: Washington, DC, USA, 2019. [Google Scholar]
  91. Buffum, P.S.; Frankosky, M.; Boyer, K.E.; Wiebe, E.; Mott, B.; Lester, J. Leveraging Collaboration to Improve Gender Equity in a Game-based Learning Environment for Middle School Computer Science. In Proceedings of the Research on Equity and Sustained Participation in Engineering Computing and Technology, Charlotte, NC, USA, 13–14 August 2015. [Google Scholar]
  92. Coller, B.D. Work in progress—A video game for teaching dynamics. In Proceedings of the 2011 Frontiers in Education Conference (FIE), Rapid City, SD, USA, 12–15 October 2011. [Google Scholar]
  93. Buffum, P.S.; Frankosky, M.; Boyer, K.E.; Wiebe, E.; Mott, B.W.; Lester, J.C. Collaboration and Gender Equity in Game-Based Learning for Middle School Computer Science. Comput. Sci. Eng. 2016, 18, 18–28. [Google Scholar] [CrossRef]
  94. Cheng, Y.M.; Chen, P.F. Building an online game-based learning system for elementary school. In Proceedings of the 4th International Conference on Intelligent Information Hiding and Multimedia Signal Processing, Harbin, China, 15–17 August 2008. [Google Scholar]
  95. Lester, J.C.; Spires, H.; Nietfeld, J.L.; Minogue, J.; Mott, B.W.; Lobene, E.V. Designing game-based learning environments for elementary science education: A narrative-centered learning perspective. Inf. Sci. 2014, 264, 4–18. [Google Scholar] [CrossRef]
  96. Shabihi, N.; Taghiyareh, F.; Faridi, M.H. The Relationship Between Gender and Game Dynamics in e-Learning Environment: An Empirical Investigation. In Proceedings of the 12th European Conference on Games Based Learning (ECGBL), Lille, France, 4–5 October 2018; pp. 574–582. [Google Scholar]
  97. Pezzullo, L.G.; Wiggins, J.B.; Frankosky, M.H.; Min, W.; Boyer, K.E.; Mott, B.W.; Wiebe, E.N.; Lester, J.C. “Thanks Alisha, Keep in Touch”: Gender Effects and Engagement with Virtual Learning Companions. In Proceedings of the 18th International Conference on Artificial Intelligence in Education (AIED), Wuhan, China, 28 June–1 July 2017. [Google Scholar]
  98. Robertson, J. Making games in the classroom: Benefits and gender concerns. Comput. Educ. 2012, 59, 385–398. [Google Scholar] [CrossRef]
  99. Alserri, S.A.; Zin, N.A.M.; Wook, T. Gender-based Engagement Model for Designing Serious Games. In Proceedings of the 6th International Conference on Electrical Engineering and Informatics (ICEEI)—Sustainable Society Through Digital Innovation, Langkawi, Malaysia, 25–27 November 2017. [Google Scholar]
  100. Creswell, J.W.; Plano Clark, V.L. Designing and Conducting Mixed Methods Research; Sage Publications Inc.: Thousand Oaks, CA, USA, 2018; p. 492. [Google Scholar]
  101. The Carnegie Classification of Institutions of Higher Education. 2018 Update—Facts & Figures; Center for Postsecondary Research, Ed.; Indiana University, School of Education: Bloomington, IN, USA, 2019. [Google Scholar]
  102. Cook-Chennault, K.; Villanueva, I. Preliminary Findings: RIEF—Understanding Pedagogically Motivating Factors for Underrepresented and Nontraditional Students in Online Engineering Learning Modules. In Proceedings of the 2019 ASEE Annual Conference & Exposition, Tampa, FL, USA, 16–19 June 2019. [Google Scholar]
  103. Cook-Chennault, K.; Villanueva, I. An initial exploration of the perspectives and experiences of diverse learners’ acceptance of online educational engineering games as learning tools in the classroom. In Proceedings of the 2019 IEEE Frontiers in Education Conference, Cincinnati Marriott at RiverCenter, Cincinnati, OH, USA, 16–19 October 2019. [Google Scholar]
  104. Cook-Chennault, K.; Villanueva, I. Exploring perspectives and experiences of diverse learners’ acceptance of online educational engineering games as learning tools in the classroom. In Proceedings of the 2020 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 21–24 October 2020. [Google Scholar]
  105. Shojaee, A.; Kim, H.W.; Cook-Chennault, K.; Villanueva Alarcon, I. What you see is what you get?—Relating eye tracking metrics to students’ attention to game elements. In Proceedings of the IEEE Frontiers in Education (FIE)—Education for a Sustainable Future, Lincoln, NE, USA, 2021. [Google Scholar]
  106. Gauthier, A.; Jenkinson, J. Game Design for Transforming and Assessing Undergraduates’ Understanding of Molecular Emergence (Pilot). In Proceedings of the 9th European Conference on Games-Based Learning (ECGBL), Nord Trondelag Univ Coll, Steinkjer, Norway, 8–9 October 2015. [Google Scholar]
  107. Johnson, E.; Giroux, A.; Merritt, D.; Vitanova, G.; Sousa, S. Assessing the Impact of Game Modalities in Second Language Acquisition: ELLE the EndLess LEarner. JUCS-J. Univers. Comput. Sci. 2020, 26, 880–903. [Google Scholar] [CrossRef]
  108. Palaigeorgiou, G.; Politou, F.; Tsirika, F.; Kotabasis, G. FingerDetectives: Affordable Augmented Interactive Miniatures for Embodied Vocabulary Acquisition in Second Language Learning. In Proceedings of the 11th European Conference on Game-Based Learning (ECGBL), Graz, Austria, 5–6 October 2017. [Google Scholar]
  109. Potter, H.; Schots, M.; Duboc, L.; Werneck, V. InspectorX: A Game for Software Inspection Training and Learning. In Proceedings of the IEEE 27th Conference on Software Engineering Education and Training (CSEE&T), Klagenfurt, Austria, 23–25 April 2014. [Google Scholar]
  110. Owen, H.E.; Licorish, S.A. Game-based student response system: The effectiveness of kahoot! On junior and senior information science students’ learning. J. Inf. Technol. Educ. Res. 2020, 19, 511–553. [Google Scholar] [CrossRef]
  111. Lucas, M.J.V.; Daviu, E.A.; Martinez, D.D.; Garcia, J.C.R.; Chust, A.P.; Domenech, C.G.; Cerdan, A.P. Engaging students in out-of-class activities through game-based learning and gamification. In Proceedings of the 9th Annual International Conference of Education, Research and Innovation (iCERi), Seville, Spain, 14–16 November 2016. [Google Scholar]
  112. Davis, F.D.; Venkatesh, V. A critical assessment of potential measurement biases in the technology acceptance model: Three experiments. Int. J. Hum.-Comput. Stud. 1996, 45, 19–45. [Google Scholar] [CrossRef] [Green Version]
  113. Arnold, S.R.; Kruatong, T.; Dahsah, C.; Suwanjinda, D. The classroom-friendly ABO blood types kit: Blood agglutination simulation. J. Biol. Educ. 2012, 46, 45–51. [Google Scholar] [CrossRef]
  114. Saldana, J. The Coding Manual for Qualitative Researchers; Sage Publications Inc.: Thousand Oaks, CA, USA, 2015. [Google Scholar]
  115. Krueger, R.A.; Casey, M.A. Focus Groups: A Practical Guide for Applied Research, 5th ed.; SAGE: Thousand Oaks, CA, USA, 2015. [Google Scholar]
  116. Corbin, J.; Strauss, A. Basics of Qualitative Research—Techniques and Procedures for Developing Grounded Theory, 4th ed.; Sage Publications, Inc.: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  117. Charmaz, K. Constructing Grounded Theory, 2nd ed.; Introducing Qualitative Methods Series; Seaman, J., Ed.; Sage: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  118. Grafsgaard, J.F.; Wiggins, J.B.; Boyer, K.E.; Wiebe, E.N.; Lester, J.C. Automatically Recognizing Facial Indicators of Frustration: A Learning-centric Analysis. In Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, Geneva, Switzerland, 2–5 September 2013. [Google Scholar]
  119. Craig, S.; Graesser, A.; Sullins, J.; Gholson, B. Affect and learning: An exploratory look into the role of affect in learning with AutoTutor. J. Educ. Media 2004, 29, 241–250. [Google Scholar] [CrossRef] [Green Version]
  120. Abramson, J.; Dawson, M.; Stevens, J. An Examination of the Prior Use of E-Learning Within an Extended Technology Acceptance Model and the Factors That Influence the Behavioral Intention of Users to Use M-Learning. SAGE Open 2015, 5. [Google Scholar] [CrossRef] [Green Version]
  121. Graesser, A.C. Emotions are the experiential glue of learning environments in the 21st century. Learn. Instr. 2019, 70, 101212. [Google Scholar] [CrossRef]
Figure 1. Participants provided information pertaining to their prior experiences with online learning tools and video game usage, followed by 20 min of engineering game play. After playing the engineering game, students answered questions about their perceptions of the game.
Figure 2. Representation of the serious game interface, Civil-Build. Screen features were modified in order to anonymize the game.
Table 1. Overview of digital game-based learning studies in mechanical and civil engineering disciplines that emphasize student engagement, perceived usefulness, and value.
Discipline | Topic(s) Discussed | Population Studied (Men, Women, Race/Ethnicity) | Authors
Computer science, electrical engineering, liberal arts and social sciences, material science engineering, mathematics, mechanical engineering, physics, chemistry (review article)—undergraduate course | Gamification approach, educational objectives, application to curriculum and cognitive learning outcomes | -- | Menandro, F.; Arnab, S. [64]
Mechanical engineering (subject: dynamics; Game: Spumone)—undergraduate course | Students’ conceptual understanding (Dynamics Conceptual Inventory), emotional engagement, and experience with video game | N = 243; predominately male and ethnically diverse [65] (Location: USA) | Shernoff, D., Ryu, J.-C., et al. [65]
Mechanical engineering (course subject: graphical engineering)—undergraduate course | Game enjoyment, usefulness, motivation; multiple games: Championship of Limits and Fits, Tournament of Video Search of Fabrication Processes, Forum of Doubts, Proposals of Trivial Questions, Contest of 3D modeling | N = 51; 11 women (22%); 30 gamers (59%) (Location: Spain) | Gomez-Jauregui, V., et al. [58]
All disciplines—undergraduate and graduate students: 30% architecture, 26% computer science, 22% engineering sciences, 10% social sciences, 10% economics | Sought to understand how students expect or not usefulness in gamification | N = 83; 31 women (37%); ages 18–55 years (Location: Germany) | Lenz, L., et al. [62]
Mechanical engineering—graduate course | Examine student engagement, active cooperation within a group, and give real-time feedback to students | N = 36 (Location: Netherlands) | Martinetti, A. [63]
Mechanical Engineering (Game—Pump It Up) | Focus on how the game was designed and how it is used by students who use the game to write computer code to program robots that manufacture pump adapter pipes | No participants—game development | Janssen, D., et al. [61]
Mechanical Engineering—dynamic systems and control (Game—NIU-Torcs) | Student engagement and perceived effectiveness of race car game | N = 51; N = 38 (Location: USA) | Coller, B., et al. [23,27]
Civil Engineering—graduate engineering course (Game—Internal Force Master) | Examined if an online learning tool could make complex materials more approachable | N = 121 (Location: Austria) | Ebner, M. and Holzinger, A. [68]
Civil Engineering—virtual reality game for use in civil engineering education (various ages, academic backgrounds) | Examined participant acceptance of a virtual reality game in construction engineering | N = 72 (Location: Portugal) | Dinis, F. M., et al. [67]
Table 2. Questions given to students prior to and after playing the serious game.
Pre-Game Questions | Learning Model/Theory Reference
Q1.pre.   Do you play video games on your phone? | Expectancy Value Theory [29,38]
Q2.pre.   Are you currently taking a Dynamics or Statics class?
Q3.pre.   Have you played the learning game before?
Q4.pre.   Do you play video games on your computer?
Post-Game Questions | Learning Model/Theory Reference
Q1.post.   The Challenge Level I ended the game at was (1–24 Challenge Levels):
Q2.post.   The game is easy to play. {ease of use} | Technology Acceptance Model [30,32,112]
Q3.post.   The ways to advance to higher levels in the game are easy to understand. {ease of use}
Q4.post.   I understood the engineering topics each level of the game was teaching me. {ease of use}
Q5.post.   This game helped me to understand engineering truss structures. {useful—help to do my job better}
Q6.post.   Playing the game increased my confidence in my engineering skills. | Added to assess usefulness.
Q7.post.   The engineering concepts presented in the game were intuitive to me.
Q8.post.   Did you enjoy playing the game? | Expectancy Value Theory [29,38]
Q9.post.   I got frustrated playing this game. | Game-Based Learning [42,52] and Expectancy Value Theory [29,38]
Q10.post.   The hints motivated me to want to advance to higher challenge levels.
Q11.post.   The learning lessons or goals of each challenge are defined in enough detail to play the game.
Q12.post.   I would improve this game by (complete the sentence). Select a choice:
  • Adding a story line. (Game-Based Learning)
  • Adding avatars/characters. (Game-Based Learning)
  • Making the images look more like real life. (Game-Based Learning)
  • Adding opportunities to compete against other players while playing.
  • The game is fine the way it is.
  • Adding more explanation to the challenges. (EVT)
  • Changing the rewards from nuts to something else.
  • Give any other feedback.—Text
Q13.post.   This game is a good learning tool because (select appropriate responses): | Added to assess usefulness.
  • The game is easy to figure out.
  • I can apply what I learn in my classes.
  • The game taught me about stability of truss structures.
  • It is a fun alternative to traditional learning.
Focus Group Questions
F1.   Would you use this game to prepare for a job interview? Explain your answer. | Added to assess usefulness.
F2.   Would you use this game to prepare for an exam for a class?
Table 3. Demographics of the participants according to race and gender, where count (number of participants) and percentage of the total participants is presented *.
Self-Identification | AA/Black Count (%) | Caucasian Count (%) | LatinX Count (%) | Asian Count (%) | Mixed-Race Count (%) | Other Count (%) | Total Count (%)
Male | 3 (2%) | 38 (19%) | 7 (4%) | 47 (23%) | 9 (5%) | -- | 104 (51%)
Female | 10 (5%) | 26 (13%) | 10 (5%) | -- | 2 (1%) | 1 (1%) | 93 (46%)
Non-binary | -- | 1 (0%) | -- | -- | -- | -- | 3 (2%)
Other | -- | -- | -- | -- | -- | 1 (1%) | 1 (1%)
Total Count (%) | (13) 7% | (65) 32% | (17) 8% | (47) 46% | (11) 6% | (2) 1% | (201) 100%
* Values have been rounded up to the nearest whole number. The numbers in the parentheses in the lower row are the numerical count and the percentages are of the total count in the column or row.
Table 4. Frequency count and percentages for the pre-game questions.
Pre-Game Question | Yes Count | Yes (%) | No Count | No (%)
Q1.pre.   Do you play video games on your phone? | 141 | 70.1% | 60 | 29.9%
Q2.pre.   Are you currently taking a Dynamics or Statics class? | 144 | 71.6% | 57 | 28.4%
Q3.pre.   Have you played the learning game before? | 27 | 13.4% | 174 | 86.6%
Q4.pre.   Do you play video games on your computer? | 126 | 62.7% | 75 | 37.3%
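The yes/no counts and percentages in Table 4 can be tabulated directly from the pre-game responses. The following is a minimal sketch with a hypothetical file and column names, not the authors' actual scripts:

```python
# Sketch of tabulating yes/no counts and percentages for the pre-game questions
# (hypothetical column names; mirrors the layout of Table 4).
import pandas as pd

df = pd.read_csv("pre_game_responses.csv")  # columns q1_pre ... q4_pre with "Yes"/"No"

for col in ["q1_pre", "q2_pre", "q3_pre", "q4_pre"]:
    counts = df[col].value_counts()
    pct = df[col].value_counts(normalize=True) * 100
    print(col, dict(counts), {k: f"{v:.1f}%" for k, v in pct.items()})
```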
Table 5. Descriptive statistics, i.e., mean, and standard deviations for the post-game questionnaire questions. Question 1 was answered by 174 participants and questions 2–11 were answered by 201 participants.
Question (Number of Responses) | Mean ± s
Q1.   The Challenge Level I ended the game at was: (N = 174) | 7.36 ± 3.01
Q2.   The game is easy to play. | 3.06 ± 1.43
Q3.   The ways to advance to higher levels in the game are easy to understand. | 2.98 ± 1.52
Q4.   I understood the engineering topics each level of the game was teaching me. | 3.08 ± 1.52
Q5.   This game helped me to understand engineering truss structures. | 2.77 ± 1.22
Q6.   Playing the game increased my confidence in my engineering skills. | 3.62 ± 1.47
Q7.   The engineering concepts presented in the game were intuitive to me. | 2.61 ± 1.13
Q8.   Did you enjoy playing the engineering serious game? | 2.49 ± 1.40
Q9.   The hints (in the game) motivated me to want to advance to higher challenge levels. | 3.80 ± 1.64
Q10.   The learning lessons or goals of each challenge are defined in enough detail to play the game. | 3.90 ± 1.64
Q11.   I got frustrated playing this game. | 3.47 ± 1.65
Table 6. Results from Question 12: I would improve this game by __ (select the choice).
Choice | Count (%)
Adding a story line. | 46/201 (22.9%)
Adding avatars/characters. | 35/201 (17.4%)
Making the images look more like real life. | 40/201 (19.9%)
Adding opportunities to compete against other players while playing. | 91/201 (45.3%)
The game is fine the way that it is. | 15/201 (7.5%)
Adding more explanation to the challenges. | 157/201 (78.1%)
Changing the rewards from nuts to something else. | 37/201 (18.4%)
Table 7. Results from Question 13: This game is a good learning tool because: __ (select the choice).
Choice | Count (%)
The game is easy to figure out. | 88/201 (43.8%)
I can apply what I learned to my classes. | 109/201 (54.2%)
The game taught me about stability of truss structures. | 130/201 (64.7%)
It is a fun alternative to traditional learning. | 154/201 (76.6%)
Table 8. One-Way ANOVA statistically significant results of post-game perceptions in terms of gender.
Post-Game Question | Gender | N | Mean | Std. Dev. | F | Sig.
Q1: The Challenge Level I ended the game at was: | Male | 87 | 8.47 | 2.90 | 9.12 | 0.000
| Female | 83 | 6.30 | 2.72
| Non-binary | 3 | 5.33 | 2.52
| Other | 1 | 5.00 | --
| Total | 174 | 7.36 | 3.01
Q10: The learning lessons or goals of each challenge are defined in enough detail to play the game. | Male | 104 | 3.37 | 1.46 | 8.88 | 0.000
| Female | 93 | 4.44 | 1.66
| Non-binary | 3 | 5.33 | 0.58
| Other | 1 | 5.00 | --
| Total | 201 | 3.90 | 1.64
Q11: I got frustrated playing this game. | Male | 104 | 3.84 | 1.63 | 4.04 | 0.008
| Female | 93 | 3.06 | 1.58
| Non-binary | 3 | 3.67 | 2.08
| Other | 1 | 2.00 | --
| Total | 201 | 3.47 | 1.65
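The one-way ANOVAs summarized in Table 8 correspond to a standard gender-as-factor model. A minimal sketch, again with assumed column names rather than the authors' code, is:

```python
# Sketch of a one-way ANOVA of a post-game item on gender (assumed columns).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("post_game_responses.csv")   # columns include gender, q11

model = ols("q11 ~ C(gender)", data=df).fit()  # Q11: frustration item
print(sm.stats.anova_lm(model, typ=2))         # F statistic and p-value
```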
Table 9. One-way ANOVA statistically significant post-game questions in terms of playing video games on one’s phone.
Post-Game Question | Do You Play Video Games on Your Phone? | Mean | Std. Dev. | N | F | Sig.
Q5. This game helped me to understand engineering truss structures. | Yes | 2.58 | 1.13 | 141 | 11.44 | 0.001
| No | 3.20 | 1.31 | 60
| Total | 2.77 | 1.22 | 201
Q6. Playing the game increased my confidence in my engineering skills. | Yes | 3.46 | 1.44 | 141 | 5.42 | 0.021
| No | 3.98 | 1.49 | 60
| Total | 3.62 | 1.47 | 201
Q8. Did you enjoy playing the serious game? | Yes | 2.26 | 1.20 | 141 | 12.94 | 0.000
| No | 3.02 | 1.68 | 60
| Total | 2.49 | 1.40 | 201
Q9. The hints motivated me to want to advance to higher challenge levels. | Yes | 3.59 | 1.65 | 141 | 7.80 | 0.006
| No | 4.28 | 1.53 | 60
| Total | 3.80 | 1.64 | 201
Q10. The learning lessons or goals of each challenge are defined in enough detail to play the game. | Yes | 3.71 | 1.63 | 141 | 6.58 | 0.011
| No | 4.35 | 1.60 | 60
| Total | 3.90 | 1.64 | 201
Table 10. One-way ANOVA statistically significant results for post-game questions in terms of playing video games on one’s computer.
Post-Game Question | Do You Play Video Games on Your Computer? | N | Mean | Std. Dev. | F | Sig.
Q2. This game is easy to play. | Yes | 126 | 2.86 | 1.337 | 7.353 | 0.007
| No | 75 | 3.41 | 1.517
| Total | 201 | 3.06 | 1.429
Q1. The Challenge Level I ended the game at was: | Yes | 106 | 7.96 | 3.014 | 11.460 | 0.001
| No | 68 | 6.43 | 2.766
| Total | 174 | 7.36 | 3.007
Q4. I understood the engineering topics each level of the game was teaching me. | Yes | 126 | 2.90 | 1.425 | 4.973 | 0.027
| No | 75 | 3.39 | 1.635
| Total | 201 | 3.08 | 1.521
Q5. This game helped me to understand engineering truss structures. | Yes | 126 | 2.60 | 1.125 | 6.220 | 0.013
| No | 75 | 3.04 | 1.320
| Total | 201 | 2.77 | 1.217
Q6. Playing the game increased my confidence in my engineering skills. | Yes | 126 | 3.37 | 1.457 | 10.342 | 0.002
| No | 75 | 4.04 | 1.409
| Total | 201 | 3.62 | 1.472
Q9. The hints motivated me to want to advance to higher challenge levels. | Yes | 126 | 3.62 | 1.649 | 3.986 | 0.047
| No | 75 | 4.09 | 1.595
| Total | 201 | 3.80 | 1.641
Q10. The learning lessons or goals of each challenge are defined in enough detail to play the game. | Yes | 126 | 3.60 | 1.655 | 12.313 | 0.001
| No | 75 | 4.41 | 1.499
| Total | 201 | 3.90 | 1.643
Q11. I got frustrated playing this game. | Yes | 126 | 3.69 | 1.671 | 6.327 | 0.013
| No | 75 | 3.09 | 1.552
| Total | 201 | 3.47 | 1.649
Table 11. MANOVA illustrating the post-game questions that have statistically significant differences in terms of means as a function of both video game usage on one’s phone and gender.
Post-Game Question | Do You Play Video Games on Your Phone? | Gender | Mean | Std. Dev. | N | F | Sig.
Q1: The Challenge Level I ended the game at was: (1–24 Challenge Levels): ________. | Yes | Male | 8.54 | 2.750 | 65 | 5.022 | 0.000
| Yes | Female | 6.55 | 2.812 | 53
| Yes | Non-binary | 4.00 | 1.414 | 2
| Yes | Other | 5.00 | -- | 1
| Yes | Total | 7.56 | 2.952 | 121
| No | Male | 8.27 | 3.383 | 22
| No | Female | 5.87 | 2.543 | 30
| No | Non-binary | 8.00 | -- | 1
| No | Total | 6.91 | 3.109 | 53
| Total | Male | 8.47 | 2.905 | 87
| Total | Female | 6.30 | 2.722 | 83
| Total | Non-binary | 5.33 | 2.517 | 3
| Total | Other | 5.00 | -- | 1
| Total | Total | 7.36 | 3.007 | 174
Q2. This game is easy to play. | Yes | Male | 2.68 | 1.312 | 65 | 2.043 | 0.063
| Yes | Female | 3.32 | 1.478 | 53
| Yes | Non-binary | 3.50 | 0.707 | 2
| Yes | Other | 3.00 | -- | 1
| Yes | Total | 2.98 | 1.405 | 121
| No | Male | 2.77 | 1.152 | 22
| No | Female | 3.43 | 1.524 | 30
| No | Non-binary | 1.00 | -- | 1
| No | Total | 3.11 | 1.423 | 53
| Total | Male | 2.70 | 1.268 | 87
| Total | Female | 3.36 | 1.486 | 83
| Total | Non-binary | 2.67 | 1.528 | 3
| Total | Other | 3.00 | -- | 1
| Total | Total | 3.02 | 1.408 | 174
Q5. This game helped me to understand engineering truss structures. | Yes | Male | 2.60 | 1.235 | 65 | 2.432 | 0.028
| Yes | Female | 2.55 | 1.011 | 53
| Yes | Non-binary | 2.00 | 0.000 | 2
| Yes | Other | 5.00 | -- | 1
| Yes | Total | 2.59 | 1.145 | 121
| No | Male | 3.36 | 1.293 | 22
| No | Female | 3.07 | 1.388 | 30
| No | Non-binary | 3.00 | -- | 1
| No | Total | 3.19 | 1.331 | 53
| Total | Male | 2.79 | 1.286 | 87
| Total | Female | 2.73 | 1.180 | 83
| Total | Non-binary | 2.33 | 0.577 | 3
| Total | Other | 5.00 | -- | 1
| Total | Total | 2.77 | 1.233 | 174
Q9. The hints motivated me to want me to advance to higher challenge levels. | Yes | Male | 3.54 | 1.768 | 65 | 2.392 | 0.030
| Yes | Female | 3.45 | 1.539 | 53
| Yes | Non-binary | 4.00 | 0.000 | 2
| Yes | Other | 7.00 | -- | 1
| Yes | Total | 3.54 | 1.674 | 121
| No | Male | 4.41 | 1.709 | 22
| No | Female | 4.37 | 1.542 | 30
| No | Non-binary | 4.00 | -- | 1
| No | Total | 4.38 | 1.584 | 53
| Total | Male | 3.76 | 1.785 | 87
| Total | Female | 3.78 | 1.593 | 83
| Total | Non-binary | 4.00 | 0.000 | 3
| Total | Other | 7.00 | -- | 1
| Total | Total | 3.79 | 1.687 | 174
Q10. The learning lessons or goals of each challenge are defined in enough detail to play the game. | Yes | Male | 3.26 | 1.482 | 65 | 5.392 | 0.000
| Yes | Female | 4.23 | 1.739 | 53
| Yes | Non-binary | 5.50 | 0.707 | 2
| Yes | Other | 5.00 | -- | 1
| Yes | Total | 3.74 | 1.667 | 121
| No | Male | 3.41 | 1.333 | 22
| No | Female | 4.93 | 1.461 | 30
| No | Non-binary | 5.00 | -- | 1
| No | Total | 4.30 | 1.576 | 53
| Total | Male | 3.30 | 1.440 | 87
| Total | Female | 4.48 | 1.670 | 83
| Total | Non-binary | 5.33 | 0.577 | 3
| Total | Other | 5.00 | -- | 1
| Total | Total | 3.91 | 1.656 | 174
Table 12. MANOVA illustrating the post-game questions that have statistically significant differences in terms of means as a function of both video game usage on one’s computer and gender.
Post-Game Question | Gender | Do You Play Video Games on Your Computer? | Mean | Std. Dev. | N | F | Sig.
Q1. The Challenge Level I ended the game at was: | Male | Yes | 8.70 | 2.86 | 66 | 6.587 | 0.000
| Male | No | 7.76 | 3.02 | 21
| Male | Total | 8.47 | 2.90 | 87
| Female | Yes | 6.92 | 2.96 | 36
| Female | No | 5.83 | 2.45 | 47
| Female | Total | 6.30 | 2.72 | 83
| Non-binary | Yes | 5.33 | 2.52 | 3
| Non-binary | Total | 5.33 | 2.52 | 3
| Other | Yes | 5.00 | -- | 1
| Other | Total | 5.00 | -- | 1
| Total | Yes | 7.96 | 3.01 | 106
| Total | No | 6.43 | 2.77 | 68
| Total | Total | 7.36 | 3.01 | 174
Q2. This game is easy to play. | Male | Yes | 2.68 | 1.31 | 66 | 3.601 | 0.004
| Male | No | 2.76 | 1.14 | 21
| Male | Total | 2.70 | 1.27 | 87
| Female | Yes | 2.89 | 1.26 | 36
| Female | No | 3.72 | 1.56 | 47
| Female | Total | 3.36 | 1.49 | 83
| Non-binary | Yes | 2.67 | 1.53 | 3
| Non-binary | Total | 2.67 | 1.53 | 3
| Other | Yes | 3.00 | -- | 1
| Other | Total | 3.00 | -- | 1
| Total | Yes | 2.75 | 1.29 | 106
| Total | No | 3.43 | 1.50 | 68
| Total | Total | 3.02 | 1.41 | 174
Q5. This game helped me to understand engineering truss structures. | Male | Yes | 2.73 | 1.22 | 66 | 2.144 | 0.063
| Male | No | 3.00 | 1.48 | 21
| Male | Total | 2.79 | 1.29 | 87
| Female | Yes | 2.36 | 0.96 | 36
| Female | No | 3.02 | 1.26 | 47
| Female | Total | 2.73 | 1.18 | 83
| Non-binary | Yes | 2.33 | 0.58 | 3
| Non-binary | Total | 2.33 | 0.58 | 3
| Other | Yes | 5.00 | -- | 1
| Other | Total | 5.00 | -- | 1
| Total | Yes | 2.61 | 1.15 | 106
| Total | No | 3.01 | 1.32 | 68
| Total | Total | 2.77 | 1.23 | 174
Q6. Playing the game increased my confidence in my engineering skills. | Male | Yes | 3.48 | 1.50 | 66 | 2.118 | 0.066
| Male | No | 3.76 | 1.26 | 21
| Male | Total | 3.55 | 1.44 | 87
| Female | Yes | 3.06 | 1.31 | 36
| Female | No | 4.04 | 1.49 | 47
| Female | Total | 3.61 | 1.49 | 83
| Non-binary | Yes | 3.33 | 1.15 | 3
| Non-binary | Total | 3.33 | 1.15 | 3
| Other | Yes | 4.00 | -- | 1
| Other | Total | 4.00 | -- | 1
| Total | Yes | 3.34 | 1.43 | 106
| Total | No | 3.96 | 1.42 | 68
| Total | Total | 3.58 | 1.45 | 174
Q8. Did you enjoy playing the Civil-Build game? | Male | Yes | 2.36 | 1.20 | 66 | 2.376 | 0.041
| Male | No | 2.38 | 0.86 | 21
| Male | Total | 2.37 | 1.12 | 87
| Female | Yes | 2.28 | 1.11 | 36
| Female | No | 2.96 | 1.61 | 47
| Female | Total | 2.66 | 1.45 | 83
| Non-binary | Yes | 2.33 | 1.15 | 3
| Non-binary | Total | 2.33 | 1.15 | 3
| Other | Yes | 5.00 | -- | 1
| Other | Total | 5.00 | -- | 1
| Total | Yes | 2.36 | 1.18 | 106
| Total | No | 2.78 | 1.44 | 68
| Total | Total | 2.52 | 1.30 | 174
Q10. The learning lessons or goals of each challenge are defined in enough detail to play the game. | Male | Yes | 3.30 | 1.51 | 66 | 6.986 | 0.000
| Male | No | 3.29 | 1.23 | 21
| Male | Total | 3.30 | 1.44 | 87
| Female | Yes | 4.00 | 1.90 | 36
| Female | No | 4.85 | 1.38 | 47
| Female | Total | 4.48 | 1.67 | 83
| Non-binary | Yes | 5.33 | 0.58 | 3
| Non-binary | Total | 5.33 | 0.58 | 3
| Other | Yes | 5.00 | -- | 1
| Other | Total | 5.00 | -- | 1
| Total | Yes | 3.61 | 1.68 | 106
| Total | No | 4.37 | 1.52 | 68
| Total | Total | 3.91 | 1.66 | 174
Q11. I got frustrated playing this game. | Male | Yes | 3.94 | 1.67 | 66 | 3.696 | 0.003
| Male | No | 3.90 | 1.55 | 21
| Male | Total | 3.93 | 1.63 | 87
| Female | Yes | 3.17 | 1.61 | 36
| Female | No | 2.77 | 1.48 | 47
| Female | Total | 2.94 | 1.54 | 83
| Non-binary | Yes | 3.67 | 2.08 | 3
| Non-binary | Total | 3.67 | 2.08 | 3
| Other | Yes | 2.00 | -- | 1
| Other | Total | 2.00 | -- | 1
| Total | Yes | 3.65 | 1.68 | 106
| Total | No | 3.12 | 1.58 | 68
| Total | Total | 3.44 | 1.66 | 174
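A two-factor analysis of the kind summarized in Tables 11 and 12 can be approximated with the MANOVA class in statsmodels. The snippet below is only an illustrative sketch under assumed names (a DataFrame `responses` with columns `q1`, `q5`, `q10`, `plays_on_phone`, and `gender`), not the authors' analysis script.

import pandas as pd
from statsmodels.multivariate.manova import MANOVA

def run_manova(responses: pd.DataFrame):
    """Test several post-game items jointly against two categorical
    factors (phone-gaming habit and gender) and their interaction."""
    model = MANOVA.from_formula(
        "q1 + q5 + q10 ~ plays_on_phone * gender",
        data=responses.dropna(subset=["q1", "q5", "q10"]),
    )
    return model.mv_test()  # Wilks' lambda, Pillai's trace, etc.

# Hypothetical usage:
# print(run_manova(responses))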
Table 13. Examples of quotes from women/men/non-binary/other in response to Question 12 in the textbox.
Example Quotes to Question 12 from Women
Having an introductory tutorial that gives general explanations as to how to play the game and what each icon represents. An introduction would allow the gamer to have a better understanding about the game’s goal and rules. Many games that I have seen have this sort of interactive tutorial. Additionally, an explanation of how your structure is scored would also be helpful to allow the gamer to know what actions are “worth more points” and would therefore maximize the gamer’s score (while I was playing, I never got a reward greater than one nut, and I was not sure how to improve that).
Providing more incentive to get 3 bolts. A more guided tutorial or explanation would’ve been good as well. They just toss you into the game.
giving hints to explain how to get higher scores
I think having a brief demo at the beginning would be helpful. As I said earlier, I did not grow up playing video games so I don’t always intuitively know how the controls work.
Example Quotes to Question 12 from Men
I feel like playing 3D games when I was younger helped me a lot in visualizing 3D objects in Engineering, so perhaps making the game 3D. Also, the placement of the structure on the grid was very awkward and made it frustrating to build symmetrical structures. I may be wrong about this, but structures that earned more than one bolt seemed very impractical to build in real life. While saving on material and money is important, not every type of beam (length, thickness, etc.) is always available.
Make the game more calculation based. But then that would make the game boring. But I guess that what it takes to make a game that is informative and useful.
Making it impossible to complete levels with nonsense structures. Also, adding maybe a tutorial that explains how some engineering concepts (moments, tensile/compressive strength, buckling) are relevant to the game, or making the challenges in a way such that they teach these concepts, one at a time.
Having a small tutorial section on how to use all the tools and improving the method of placing objects.
Fully explain about the game before so people know not to spend most of their time trying to get more than one nut.
Example Quotes to Question 12 from Non-Binary Participants
explain the objective of the game within the game itself
Option to show the forces in the beams numerically
Table 14. Examples of quotes from women and men participants in response to Question 13.
Example Quotes to Question 13 from Women
I liked how the simulation could be slowed down to really give the gamer an opportunity to see where the weaknesses in their structure lie. This allowed me to correct the issues and construct a better truss.
I liked the fact that some of the bars were highlighted blue or red, which I assume meant that they were on tension or compression, and that should be included in the instructions.
Example Quotes to Question 13 from Men
With some more explanation in the game about the mathematical statics counterparts to the game experience, the game would provide knowledge which can be applied to classes.
I didn’t like it. At all. There is nothing about this game that was fun nor educational, any “engineering” aspects were not explained, yet a 5-year-old could probably figure it out by trial and error. (Textbox Q13—men, Pos. 4)