Article

The Impact of Student Response Systems (SRS) on Student Achievements: A University-Scale Study with Deep Exploratory Data Analysis (EDA)

by
Ahmad Abdul-Wahhab Shahba
1,2,
Walid Soufan
1,3,*,
Omar Altwijri
1,4,
Elham Alsadoon
1,5 and
Saud Alkathiri
1,5
1
Center for Excellence in Learning and Teaching, King Saud University, Riyadh 11451, Saudi Arabia
2
Kayyali Chair for Pharmaceutical Industries, Department of Pharmaceutics, College of Pharmacy, King Saud University, P.O. Box 2457, Riyadh 11451, Saudi Arabia
3
Plant Production Department, College of Food and Agriculture Sciences, King Saud University, P.O. Box 2460, Riyadh 11451, Saudi Arabia
4
Department of Biomedical Technology, College of Applied Medical Sciences, King Saud University, P.O. Box 2460, Riyadh 11451, Saudi Arabia
5
Curriculum and Instruction Department, College of Education, King Saud University, P.O. Box 2460, Riyadh 11451, Saudi Arabia
*
Author to whom correspondence should be addressed.
Systems 2023, 11(8), 384; https://doi.org/10.3390/systems11080384
Submission received: 15 May 2023 / Revised: 20 June 2023 / Accepted: 11 July 2023 / Published: 27 July 2023
(This article belongs to the Section Systems Practice in Social Science)

Abstract

The integration of Student Response Systems (SRSs) into classroom teaching is a pioneering progression in social sciences research that has shown potential in boosting student engagement and elevating academic success. However, no extensive study has examined the impact of SRS use on academic achievement within a sizable student population with diverse cofactors, such as scientific discipline and study level. The current study conducts a comprehensive score analysis investigating the effect of SRS use on academic performance. It involved a total of 6047 male and female undergraduate students from four scientific disciplines, seven colleges, four campuses, and 13 courses covering all study levels within King Saud University. The students' scores, along with their attributes, were collected anonymously from the university system. A voluntary anonymous survey was distributed to collect students' perceptions of SRS along with their personal attributes, such as learning style and class interaction preferences. Upon data collection, the Python programming language was used exclusively for comprehensive data analysis, including grouping, validation, random sampling, visualization, and statistical analysis. The overall score analysis showed a non-significant effect of SRS use on student scores compared to the control (non-SRS) group, while the survey findings showed a significant enhancement of students' scores (in courses that utilized SRS) compared to their overall GPA. In addition, the differential score and survey analysis within various study subcategories showed significant positive effects in certain subcategories, particularly science and community colleges and four of their representative courses. SRS showed high levels of overall student satisfaction (average 4.4/5.0), which was significantly influenced by scientific discipline, preferred interaction method, and study level. Overall, SRS provides a highly engaging tool with excellent student acceptance and potential for academic performance enhancement.

1. Introduction

The emergence of digital technologies, the progress of social sciences research, and their significant impact on education have shown great potential to improve students' abilities to adapt intellectually, acquire new knowledge and skills, and sustain their contribution in the digital age [1]. Many universities place a strong emphasis on active learning in order to enhance academic achievement, and several studies indicate that incorporating technology into the educational process has increased students' educational effectiveness. The use of modern technologies in education, particularly higher education, has advanced significantly [2].
There are numerous approaches to incorporating active learning in the classroom, all of which share the common goal of actively engaging the student in the learning process within the class [3]. One widely used technology that promotes active participation is the student response system (SRS), which allows real-time interaction between the teacher and students, providing opportunities for immediate feedback and increased engagement. Accordingly, instructors can gauge student understanding and adjust their teaching strategies and styles. These systems are known by a variety of terms, such as classroom communication systems [4], audience response systems [5,6], clickers [7,8], student response systems [9], and voting systems [10,11].
The implementation of SRS has yielded numerous benefits that have enhanced the effectiveness of classroom instruction. SRS has been shown to have a positive impact on several learning outcomes, and particularly student engagement within the classroom. SRS has motivated students to attend lectures and actively participate in class discussions [12]. When using SRS, the students tend to be more attentive in the classroom because they are frequently required to answer questions during the lecture [13]. Additionally, it helps teachers improve their teaching strategies, which could lead to an overall improvement of university-level education by promoting active learning [12,14,15,16].
Regarding student achievement, the scientific evidence on the effect of SRS on exam performance is divergent [17]. Some studies reported that the use of SRS in the classroom improved students' academic performance and examination scores [15,18], as well as their learning outcomes [14,19,20] and cognitive outcomes [21]. On the other hand, some studies reported that SRS did not have a significantly positive impact on student achievement [22,23]. Although survey results showed that the majority of students enjoyed using SRS, this did not necessarily translate into a significant improvement in their overall performance [24].
From another angle, exploratory data analysis (EDA) allows a better understanding of data, including its distribution, purity, and features. The Python programming language provides robust tools for EDA, including data grouping/subgrouping, preprocessing, and manipulation, as well as attractive visualization, correlation analysis, and customized statistical analysis [25]. In addition, Python enables efficient detection and handling of duplicates, missing/incorrect values, and/or candidates that do not fulfill the study inclusion criteria within large study populations.
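To make this concrete, the following is a minimal sketch of such an EDA-style screening pipeline in pandas; the file name, the column names, and the ≥20-students criterion shown here are illustrative assumptions, not the study's actual data schema.

```python
import pandas as pd

# Minimal EDA-style screening sketch; "scores.csv" and all column
# names are hypothetical placeholders, not the study's real schema.
df = pd.read_csv("scores.csv")            # one row per student record

df = df.drop_duplicates()                 # remove duplicate records
df = df.dropna(subset=["score"])          # drop missing/incomplete scores

# Example inclusion criterion: keep only sections with >=20 students
# (mirroring the project's minimum-section-size requirement).
section_size = df.groupby("section_id")["student_id"].transform("count")
df = df[section_size >= 20]

print(df["score"].describe())             # quick distribution summary
```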
The student score is a key performance indicator (KPI) that reflects a student's achievement along their learning path. Several studies have investigated the effect of SRS on this KPI at the college/faculty level only; no extensive studies have conducted such research at the university level. In contrast, the current study assesses the effect of SRSs on student scores across a sizable group of students with a variety of academic backgrounds, university campuses, and study levels. In addition, the current study presents a novel attempt to utilize a high-level programming language, namely Python 3, for deep EDA of students' scores and perceptions.
The main objective of this study was to investigate the effect of SRS use on academic performance, including pass, failure, and prohibition rates as well as final student scores, in various courses, campuses, and scientific disciplines. The current study aims to answer the following questions: (1) To what extent did the use of student response systems (SRS) contribute to increasing students' academic achievement and test scores? (2) What are the effects of other confounding factors, such as gender and academic background, on academic achievement? (3) Do students' personal attributes, such as preferred learning style, interaction method, and study level, affect their overall satisfaction with SRS use?

2. Methods

2.1. Procedure

The study was conducted in the Fall-2022 semester and involved seven colleges within King Saud University, spanning the health, science, community, and humanities disciplines (levels 1–10, bachelor's degree, male and female campuses).
The SRS educational project was initiated and orchestrated by the Center for Excellence in Learning and Teaching at King Saud University (KSU-CELT), which was also in charge of the project's funding, implementation, data collection, procurement of SRS devices, and technical support. A call for voluntary participation in the SRS educational project was made public by KSU-CELT at the start of the Fall-2022 semester. The program was directed at all colleges and disciplines of King Saud University. Faculty members had to consent to the following conditions before using SRS in their live courses:
  • To utilize the SRS in the educational process of bachelor’s sections where at least one section must include ≥20 students;
  • To submit a final report and a faculty survey, and to encourage their students to participate in the students' survey on utilizing SRS.
Newly hired faculty members could receive financial support for their efforts in using, managing, distributing the SRS devices, collecting data, and sending reports on the project (regardless of their own or their students’ opinions of using SRS).

2.2. Characteristics of the Participants

Ten faculty members from different colleges successfully utilized the SRS within 14 undergraduate courses (Table 1). The SRS was utilized in certain sections within each course, while the other sections, which did not apply SRS, served as the control. Accordingly, a total of 7046 undergraduate students were initially enrolled in the score analysis study (Figure 1). However, the following students/sections were excluded from the score analysis as follows:
  • Five sections (202 students) showed technical errors in score calculation and/or missing/incomplete values;
  • Male campus #3 involved only six students (in one section), none of whom utilized SRS;
  • 12 sections (342 students) were registered in the I-Cal course; some of them utilized SRS in the theoretical labs (not in the lectures) and, hence, could not be distinguished from those who did not use SRS in terms of their achievement in the final exams;
  • 449 students withdrew from their courses in the early weeks of the semester.
Figure 1. Schematic representation of the data collection and analysis processes of exam scores.
Table 1. List of courses and their abbreviations.
Course Name | Abbreviation
English Language (1st level) | EL-1
Pharmaceutical calculations and liquid dosage forms | PC-LDF
Arabic Language-Writing Skills | AL-WS
Medical Terminology | MT
Professional Ethics in the Health Sector | PE-HS
Health Management | HM
Organization of Healthcare Services | OHS
Biostatistics | BIO-STAT
Introduction to Plant Production | Intro-PP
Enzymology | Enz
Molecular biology | M-BIO
Physiology | Phys
Integral Calculus | I-Cal
Linear Algebra in Business | LA-B
Accordingly, the remaining 6047 students (corresponding to 185 sections, seven colleges, and 13 courses) were considered in all the subsequent score analyses.
The demographic data showed that the study covered a wide range of bachelor's study levels, from level 1 to 10, with an equal distribution by gender (Figure 2A). The study was conducted across four university campuses, with the largest share (41%) belonging to Male campus #1 (Figure 2B). It also involved a good distribution among different disciplines, covering seven colleges that span the four scientific disciplines within the university; the majority of participating students (55%) belonged to the humanities and social sciences college, which falls under the humanities discipline (Figure 2C,D). Most importantly, 10% of the enrolled students utilized SRS within their courses, while the remaining 90% represented the control group that lacked SRS use (Figure 2E).

2.3. Assessment of Student Achievements

After the semester ended, the final exam scores were anonymously collected from the Deanship of Admission and Registration Affairs at KSU. The collected data involved the total number of registered, dropped, studied, and prohibited students within each section. In addition, it presented the number of passed and failed students along with their detailed distribution over the score ranks from D to A+.
If a student's attendance was less than 75% of the lectures and laboratory sessions assigned for a course, the student was barred from continuing that course, denied entrance to the respective final examination, considered to have failed, and given the grade DN in the course. These data were reformatted with the Python programming language to focus on three main features, namely, Pass_rate (%), Average section score, and Student_score. The pass rate was calculated as the percentage of students graded D or above, while the failure rate was calculated from students graded DN or F. The average section and student scores were calculated based on the KSU grading system (A+: 5, A: 4.75, B+: 4.5, B: 4, C+: 3.5, C: 3, D+: 2.5, D: 2, F: 1, DN: 1).
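As a concrete illustration of this reformatting step, the short pandas sketch below maps letter grades to the KSU grade points listed above and derives the pass rate and average section score; the DataFrame layout is an assumption for demonstration, not the actual exported format.

```python
import pandas as pd

# Sketch of the grade-reformatting step; the DataFrame layout is an
# illustrative assumption, but the grade points follow the KSU system.
GRADE_POINTS = {"A+": 5.0, "A": 4.75, "B+": 4.5, "B": 4.0,
                "C+": 3.5, "C": 3.0, "D+": 2.5, "D": 2.0,
                "F": 1.0, "DN": 1.0}

df = pd.DataFrame({"section": ["S1", "S1", "S1", "S2", "S2"],
                   "grade":   ["A+", "D",  "F",  "B+", "DN"]})

df["Student_score"] = df["grade"].map(GRADE_POINTS)
# Pass = graded D or above; fail = F, or DN (barred for absences).
df["passed"] = ~df["grade"].isin(["F", "DN"])

per_section = df.groupby("section").agg(
    Pass_rate=("passed", lambda s: 100 * s.mean()),
    Average_section_score=("Student_score", "mean"),
)
print(per_section)
```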

2.4. Questionnaires to Evaluate Student’s Perceptions of SRS

At the end of the semester, all students who utilized SRS were encouraged to participate in an anonymous voluntary survey to solicit their feedback on using SRS [26]. The survey was designed in Arabic and involved two main sections. The first section collected the respondents' general information, including gender, college, study level, and whether they used SRS that semester. Only students who used SRS were redirected to the second section, which collected their feedback on their preferred learning style, preferred interaction method with the lecturer, and overall satisfaction with using SRS. In addition, they were asked to list the courses in which they used SRS, their scores in these courses, and their GPA. An English translation of the original Arabic survey is presented in Appendix A.
The surveys were designed using Google Forms® and the survey link was sent by email to students [27]. Two reminders were subsequently sent to the students to encourage their participation.
A total of 460 students were surveyed, and 85 responses were received, representing a response rate of 18.5%. Among the respondents, 18 students did not use SRS and were, hence, excluded from the subsequent survey sections.

2.5. Ethical Considerations

The research tool (student survey) was revised and approved by the Standing Committee for Scientific Research Ethics (Ref No. KSU-HE-22-772). Student scores were analyzed retrospectively, having been collected as part of normal assessment during the course. Informed consent was obtained electronically from surveyed students, and the survey included a pre-statement that it was anonymous.

2.6. Data Analysis

The current study data were mainly analyzed using the Python programming language (version 3.9.13) within Jupyter Notebook (Anaconda 3, version 23.1.0). The numpy, pandas, matplotlib, seaborn, statannotations, and itertools packages were utilized for data presentation, grouping, validation, data frame manipulation, and visualization. The normality of the data was assessed by the Shapiro–Wilk and Kolmogorov–Smirnov tests (scipy.stats Python package) [28].
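A minimal sketch of these normality checks with scipy.stats is shown below on synthetic scores; the data are illustrative only, while the decision rule (p < 0.05 in either test rejects normality) matches the one applied in Section 3.1.

```python
import numpy as np
from scipy import stats

# Normality-check sketch on synthetic section scores (illustrative only).
rng = np.random.default_rng(0)
scores = rng.normal(4.1, 0.7, size=185).clip(1, 5)

sw_stat, sw_p = stats.shapiro(scores)
ks_stat, ks_p = stats.kstest(scores, "norm",
                             args=(scores.mean(), scores.std(ddof=1)))

# p < 0.05 in either test -> data treated as non-normal -> use
# non-parametric tests for group comparisons.
print(f"Shapiro-Wilk p = {sw_p:.3f}, Kolmogorov-Smirnov p = {ks_p:.3f}")
```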

2.6.1. Sampling

Preliminary inspection of the data revealed a very large difference in sample sizes between the SRS and control groups (Figure 2E). A thorough investigation showed that this deviation was mainly due to two large courses, AL-WS and EL-1 (n = 3349 students/109 sections and 1227 students/30 sections, respectively), in which the control groups were ≈158-fold and ≈42-fold the size of their counterpart SRS groups. This skewed the overall average score of each study subgroup and increased the risk of Type I error when analyzing the overall effects of study variables, owing to the unbalanced distribution of students across potential confounding factors (unequal numbers of SRS/control students in each course). To address this limitation, random sampling from the collected raw data was conducted (Table 2). In each course, only the larger group (SRS or control) was sampled, to achieve an equal number of students in the SRS and control groups within each course. The sampling was conducted with the sample function (pandas Python package), and the mean score of all sampled data was similar to that of the raw data (p > 0.05) (Table 2). Accordingly, the subsequent statistical analysis of pass rate (%) and student score was conducted on the sampled data.
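The following is a sketch of this per-course balanced down-sampling under assumed column names (course, srs_use, score); the tiny inline dataset is fabricated purely to make the example runnable.

```python
import pandas as pd

# Per-course balanced sampling sketch: within each course, the larger of
# the SRS/control groups is down-sampled to the smaller group's size.
# Data and column names are illustrative assumptions.
df = pd.DataFrame({
    "course":  ["AL-WS"] * 8 + ["Enz"] * 4,
    "srs_use": ["NO"] * 6 + ["YES"] * 2 + ["NO", "NO", "YES", "YES"],
    "score":   [4.0, 4.5, 5.0, 3.5, 4.75, 4.0, 4.5, 4.75,
                4.0, 4.5, 4.0, 5.0],
})

def balance_course(course_df, seed=42):
    srs = course_df[course_df["srs_use"] == "YES"]
    ctl = course_df[course_df["srs_use"] == "NO"]
    n = min(len(srs), len(ctl))                    # smaller group's size
    return pd.concat([srs.sample(n=n, random_state=seed),
                      ctl.sample(n=n, random_state=seed)])

balanced = df.groupby("course", group_keys=False).apply(balance_course)
print(balanced.groupby(["course", "srs_use"]).size())  # equal n per course
```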

2.6.2. Statistical Analysis

For dichotomous dependent variables (such as pass: yes/no), the data were statistically analyzed by the chi-square test of independence (in a contingency table) for two independent samples and by Bonferroni-adjusted post hoc chi-square tests for >2 samples (scipy.stats Python package) [29,30].
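As a sketch, the chi-square test of independence can be run on a 2 × 2 pass/fail by SRS/control contingency table as below; the counts are made up for illustration.

```python
from scipy.stats import chi2_contingency

# Chi-square test of independence on a 2x2 contingency table
# (rows: SRS vs. control; columns: passed vs. failed). Counts are
# fabricated for illustration.
table = [[240, 12],    # SRS group:     passed, failed
         [230, 22]]    # control group: passed, failed
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```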
For other dependent variables (discrete or continuous), the data were analyzed by the two-sided Mann–Whitney–Wilcoxon test (statannotations Python package) for two independent samples, the Wilcoxon signed-rank test (scipy.stats Python package) for two paired samples, and the Kruskal–Wallis H test followed by Dunn's post hoc test with Bonferroni correction (pingouin and scikit_posthocs Python packages) for >2 samples [31]. For large sample sizes (both n > 500), the Z-test was utilized instead of the Mann–Whitney–Wilcoxon test to avoid the inflated risk of Type I error with the latter [32,33].
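The sketch below illustrates these choices on synthetic groups, shown here with scipy.stats and scikit_posthocs equivalents rather than the statannotations plotting wrapper; the large-sample Z-test is written out by hand, since the text does not name a specific implementation.

```python
import numpy as np
import pandas as pd
from scipy import stats
import scikit_posthocs as sp

# Illustrative non-parametric comparisons on synthetic data.
rng = np.random.default_rng(1)
a = rng.uniform(2, 5, 80)   # e.g., SRS group scores
b = rng.uniform(2, 5, 80)   # e.g., control group scores
c = rng.uniform(2, 5, 80)   # e.g., a third group

# Two independent samples: two-sided Mann-Whitney-Wilcoxon test.
u, p_mw = stats.mannwhitneyu(a, b, alternative="two-sided")

# Two paired samples (e.g., SRS-course score vs. GPA per respondent):
w, p_paired = stats.wilcoxon(a, b)

# >2 samples: Kruskal-Wallis H, then Dunn's post hoc with Bonferroni.
h, p_kw = stats.kruskal(a, b, c)
long = pd.DataFrame({"score": np.concatenate([a, b, c]),
                     "group": ["a"] * 80 + ["b"] * 80 + ["c"] * 80})
dunn = sp.posthoc_dunn(long, val_col="score", group_col="group",
                       p_adjust="bonferroni")

# Large samples (both n > 500): a plain two-sample Z-test instead.
def z_test(x, y):
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    z = (x.mean() - y.mean()) / se
    return z, 2 * stats.norm.sf(abs(z))
```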
Correlation analysis was conducted with Spearman's correlation test (scipy.stats Python package) [34]. A p-value of ≤0.05 was considered statistically significant in all statistical tests.
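For completeness, here is a sketch of the Spearman correlation step, using made-up session counts and section scores in the spirit of Section 3.4:

```python
from scipy.stats import spearmanr

# Spearman correlation sketch; the data pairs (number of SRS sessions
# vs. average section score) are fabricated for illustration.
sessions = [1, 2, 3, 5, 7, 9, 11]
avg_score = [3.9, 4.0, 4.2, 4.1, 4.3, 4.2, 4.4]

rho, p = spearmanr(sessions, avg_score)
print(f"rho = {rho:.2f}, p = {p:.3f} (significant if p <= 0.05)")
```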

3. Results

3.1. General Study Findings

A total of 6047 students were enrolled in the current study within the Fall-2022 semester. The data distribution and general statistics of the study parameters are presented in Table 3. The study involved 185 sections in which the average number of studied students was 33 students per section. The average passing, prohibition, and failure rates were 95.1%, 0.4%, and 4.5%, respectively. Interestingly, the average student score was 4.1 (out of 5.0), with more than 50% of the students scoring ≥4.75 (A+ and A).
The histograms for the pass rate and the score showed right-skewed distributions, while the failure and prohibition rates showed left-skewed distributions. Both Shapiro–Wilk and Kolmogorov–Smirnov tests showed a significant p-value (p < 0.05) for all of these parameters. Accordingly, these parameters are not normally distributed and, hence, non-parametric tests were applied to test the statistical significance between different study groups [28].

3.2. Pass Rate (%)

The overall analysis of the pass rate revealed that, generally, SRS use had no significant effect on the pass rate (p = 0.42) (Figure 3). Regarding the other co-factors, the overall analysis showed that gender had no significant effect on the pass rate (Figure 4A), while the campus, scientific discipline, and course type had significant effects (Figure 4C,E and Figure 5A, respectively). In particular, Female Campus #1 showed a significantly higher pass rate compared to Male Campus #2 (Figure 4C). The science colleges showed significantly higher pass rates compared to both the health and community colleges (Figure 4E). Given the significant effects of these confounding factors, it was important to investigate the SRS effect within each of their subcategories. The differential analysis showed that SRS use had no significant effect on the pass rate within the subcategories of gender, campus, and course type (Figure 4B,D and Figure 5B, respectively). As for scientific discipline, the SRS group in the science colleges showed a significantly higher pass rate compared to the control (non-SRS) group (Figure 4F).

3.3. Student Scores

Although the SRS group showed higher student scores compared to the control (non-SRS) group, the overall statistical analysis revealed that, generally, SRS use had no significant effect on student scores (p = 0.06) (Figure 6). Regarding the other co-factors, the overall analysis showed that gender, campus, scientific discipline, and course type had significant effects on student scores (Figure 7A,C,E and Figure 8A, respectively). In particular, female students showed significantly higher scores compared to males (Figure 7A). Male Campus #2 showed significantly lower scores compared to all other campuses (Figure 7C). Interestingly, the representative health colleges showed the lowest scores and the representative humanities college the highest, with both differences being significant (Figure 7E). The BIO-STAT course (one of the community college courses) showed the lowest score, while AL-WS (the representative course of the humanities colleges) showed the highest (Figure 8A). Given the significant effects of these confounding factors, it was important to investigate the differential effect of SRS within each of their subcategories. The detailed analysis showed that SRS use had no significant effect on scores across the subcategories of gender (Figure 7B). Interestingly, SRS showed significantly higher scores compared to the control (non-SRS) group in Male Campus #1 and Female Campus #2, but significantly lower scores in Female Campus #1 (Figure 7D). SRS also showed significantly higher scores compared to the control group within the science colleges, with no significant effect in the other scientific disciplines (Figure 7F). Most importantly, the course type represented the smallest possible subset, partially eliminating the heterogeneity caused by different teaching styles, colleges/scientific disciplines, course structures, and evaluation methods. This analysis revealed that four courses (three from the science discipline and one from the community discipline) showed significantly higher scores for the SRS group. In contrast, two courses (from the community discipline) showed significantly higher scores for the control (non-SRS) group. The remaining seven courses from multiple disciplines showed no significant effect of SRS use on students' scores (Figure 8B).

3.4. The Correlation between Average Section Score and the Number of SRS Sessions/Questions

To investigate SRS use more deeply, the correlation between the average section score and the total number of SRS sessions, as well as the total number of questions that utilized SRS in each section, was studied. The results showed a non-significant (p > 0.05), weak to moderate positive correlation between the score and the total SRS sessions or questions (Figure 9A,B) [35].

3.5. Students’ Survey Results

The data distribution and general statistics of the survey parameters are presented in Table 4 and Figure 10. The survey covered a wide range of bachelor's study levels, from 1 to 9, with a peak of participants at level 5 (Table 4). It showed high levels of student satisfaction with SRS (average = 4.4/5; 83% were highly satisfied/satisfied).
Interestingly, the survey revealed that most students preferred the visual (40%) and read/write (37%) learning styles (Figure 10A). In addition, 46% of the students preferred direct interaction with the lecturer's questions through verbal discussion and oral conversation, an equal proportion (46%) preferred indirect (digital-based) interaction through SRS or mobile applications, and a minor proportion (9%) preferred indirect (paper-based) quizzes (Figure 10B).
Interestingly, the students scored an average of 4.52 (out of 5) in the courses that utilized SRS, while their average GPA (in all the courses) was 4.26 (Table 4).
Students' personal attributes, such as preferred learning style, communication method, and scientific discipline, could potentially affect their perception of educational tools like SRS. The analysis of overall SRS satisfaction (OSS) revealed no significant difference in OSS among the four learning styles (Figure 11A). In contrast, students who preferred direct interaction (through verbal discussions) showed significantly (p < 0.05) lower OSS compared to their counterparts who preferred indirect (digital-based) interaction (Figure 11B). Regarding scientific discipline, students in the science colleges showed significantly lower OSS compared to their counterparts in the community and health colleges (Figure 11C). In addition, the Spearman correlation analysis revealed a statistically significant (p < 0.0001) moderate negative correlation between OSS and study level (Figure 11D) [35]. In other words, students at initial study levels appeared to perceive SRS more favorably than their counterparts at higher study levels.
Most importantly, the survey data revealed that students scored significantly higher (p < 0.01) in courses that utilized SRS compared to their overall scores in all courses (represented by GPA) (Figure 12). Due to the significant effects of other confounding factors, it was important to investigate the SRS effect within each of their subcategories. The differential analysis of scientific discipline revealed that the SRS score enhancement was significant only in the community colleges, while the other disciplines showed no significant difference (Figure 13A). In addition, students who preferred visual learning showed significantly higher scores in SRS courses, while the other learning styles showed no significant differences (Figure 13B). Interestingly, students who preferred direct communication, as well as those who preferred indirect (digital-based) communication, scored significantly higher in SRS courses compared to their GPA (Figure 13C).

4. Discussion

As many studies report, SRS is useful in enhancing most elements of the educational process, such as attendance [36], participation, interaction, attention, and feedback [37,38,39]. The current study introduces a rare attempt to examine the effect of SRS on student achievement within a diverse university-scale student population. The use of SRSs was investigated for its potential enhancement of student achievement, including pass, failure, and prohibition rates and scores. In addition, overall student satisfaction with SRS was evaluated, along with the accompanying student attributes that might affect satisfaction levels.
The current study involved several potential confounding factors (such as gender, campus, discipline, and course type) that showed significant effects on the pass rate and/or student scores, regardless of the effect of SRS. In particular, course type showed a very strong effect on student scores, with >40% of courses differing significantly from each other. Such confounding factors may mask an actual association, or falsely demonstrate an apparent association, between the studied factor and the study outcome when no real association exists [40]. In addition, the courses were unevenly distributed between the SRS and control groups; in the raw data of the AL-WS course, the sample size of the control group was >158-fold that of its counterpart SRS group. Accordingly, the raw data were randomly sampled (within each course) to ensure an even distribution of students between the SRS and control groups, and the statistical analysis of pass rate and student scores was carried out on the randomly sampled data. In addition, the differential SRS effects were studied within each subcategory of these confounding factors.
In this study, the results showed no significant overall effect of SRS use on the pass rate, a finding that was consistent across the subcategories of gender, campus, and course type. As for the effect of SRS use on students' scores, the overall analysis showed a relatively higher score for the SRS group, but the effect was non-significant. The differential analysis across the subcategories of the other confounding factors revealed uneven effects (none, lower, or higher scores) according to gender, campus, discipline, and course type. In particular, the findings based on discipline and individual courses are the most useful to interpret, as they involve comparisons within the same course, in which similar study levels, discipline, course materials, and exam difficulty are expected. When looking at each discipline separately, the science colleges showed a significantly higher score for the SRS group compared to the control. Furthermore, the SRS groups in three courses (within the science discipline) showed significantly higher student scores compared to their counterpart control groups, and another course in the community discipline showed a significantly higher score for the SRS group. In contrast, two courses (from the community discipline) showed significantly higher scores for the control group, and the remaining seven courses from multiple disciplines showed no significant effect of SRS use on students' scores. These findings reveal a positive effect of SRS use on students' scores, but one that was uneven across course types. This unevenness might be owing to the inconsistent use of SRS by different faculty members during the semester: some educators used SRS in as little as one session (5–7 questions) within the whole semester, while others used it more extensively, up to 11 sessions (38 questions). In addition, the participating educators utilized SRS for a variety of classroom activities, including attendance, in-class activities, and pre- and post-class quizzes. Most used it to enhance student engagement and evaluate understanding during lectures, while others used it at the beginning of the lecture to assess students' prior knowledge or at the end to evaluate their performance, in addition to regular use during the lecture.
Likewise, previous studies have reported mixed effects of SRS on course grades. Following an educational activity, SRS showed immediate improvements in student recall, but these improvements did not last [15]. A global meta-analysis reported a positive (but moderate) influence of SRS on exam grades [18]. On the other hand, Liu et al. [41] reported that the use of SRS did not significantly improve students' learning achievement. In fact, the educational process, and learning achievement especially, is linked to and affected by several educational factors and strategies; for example, the development of peer groups and flipped learning activities can significantly impact learning achievement.
Examining the relationship between the average section score and the total number of SRS sessions, as well as the total number of SRS questions, showed that the two correlations were weak to moderate and not significant. This limited effect is perhaps due to the very low number of SRS sessions in some courses and/or a decline in students' interest and enthusiasm for SRS in the final sessions of the semester, as suggested by Lantz (2010) [42]. Although the correlations between score and SRS sessions/questions were not statistically significant, more intensive use of SRS within each lecture, along with more innovative activities, could lead to more sustained student interaction and, hence, greater achievement.
Although SRS demonstrated high levels of overall student satisfaction (average 4.4/5.0), it might not be every student's first choice as a tool for interaction within the lecture, depending on their preferences. In particular, highly interactive students (who tend to engage directly with their lecturer and peers through oral conversations and group discussions) might feel that SRS restricts their ability to interact to the fullest, whereas students who prefer indirect communication (texting, chatting) and/or advocates of digitalization perceived SRS more favorably. These findings correlate strongly with Li's (2020) investigation of the association between SRS effectiveness and students' communication preferences [43], which reported that SRS use had the most significant positive influence on students who prefer to communicate via instant messaging. In addition, the current study findings suggest that junior students might perceive SRS better than their senior counterparts, possibly owing to the development of teamwork, interaction, and leadership skills in senior students, which cannot be fully expressed through SRS use within the lecture. The students in the science colleges showed significantly lower OSS compared to their counterparts in the community and health colleges. These significant differences in OSS between scientific disciplines correlate with previously reported findings of different impacts on students depending on the type of specialization (college) [44].
Most importantly, the significantly higher student scores in SRS courses compared to overall GPA reflect a strong positive effect of students' SRS use on their own exam achievement. However, this score enhancement was significant only in the community discipline, which correlates well with the significantly higher OSS of the community colleges compared to their science counterparts. These findings again confirm the impact of scientific discipline and specialization type on SRS effectiveness. Although digital-based indirect communicators showed significantly higher OSS than their direct counterparts, both groups scored significantly higher in SRS courses compared to their overall GPA. Nevertheless, further studies with larger numbers of participants are warranted to reach a generalizable conclusion.
The present study investigated student scores within a sizable number of students across diverse disciplines, courses, study levels, and other student attributes, such as learning styles and communication preferences. However, the study had some limitations that warrant discussion. To evaluate the impact of any educational approach efficiently, careful steps should be taken to ensure that teachers use it almost identically, with the same effort and frequency, and for similar reasons. In the current study, the magnitude and manner of SRS use were not consistent across most of the investigated courses, although, as noted, the study showed a weak to moderate positive correlation between score and SRS use frequency. In addition, the study involved a diverse student population that was not distributed evenly across the potential co-factors, which limits the ability to isolate the real overall effect of SRS use on the learning process from such synergistic/antagonistic co-factors. To address this limitation, the collected student scores were analyzed at the smallest possible subset, namely course type, which partially eliminates the heterogeneity caused by different teaching styles, colleges/scientific disciplines, course structures, and evaluation methods. Similarly, the survey results (OSS, SRS course score, GPA) were analyzed within smaller subsets such as learning style and communication preference.
The evaluation of SRS impact based on final course grades provided a reliable, unbiased measure of academic performance. However, this variable is influenced by several other cofactors, such as student memorization, studying effort, and the way marks are distributed across different evaluation tools. In some courses, 20–30% of the course grade evaluates a student's performance in the lab and/or their practical skills, which might not be directly affected by SRS use in theoretical lectures. Future studies should also assess the SRS impact on student achievement in pre- and post-lecture quizzes and in short and final exams whose main scope is the lectures in which SRS was used.
To improve the study design, future studies should start with systematic preparation of the whole experimental environment so that a solid conclusion can be drawn that any positive/negative/neutral impact is mainly caused by the proposed educational tool. In addition, the study design should be more controlled to achieve consistent use of SRS by teachers in terms of frequency and scope. Finally, it could be valuable to explore the SRS effect in potential target groups such as students at risk of academic failure, as well as high achievers (high GPA). Future studies should also include multiple class cohorts and/or students from multiple universities. Together with a good distribution of the study population across the different subsets, this could lead to a more precise assessment of the influence of SRS on educational outcomes, thereby enhancing the generalizability of the conclusions.

5. Conclusions

The current study provides a rare attempt to examine the effect of SRS on student achievement within a diverse university-scale student population with various accompanying factors, such as study level, gender, campus, scientific discipline, and course type. The overall score analysis showed a non-significant effect of SRS use on the pass rate and student scores. However, the other confounding factors showed significant effects on the pass rate and/or scores; hence, it was important to investigate the SRS effect within each subcategory of these factors. In particular, the SRS group in the science colleges, and in three of their representative courses, showed significantly higher student scores compared to the control (non-SRS) groups.
Similarly, the survey findings showed a significant enhancement of students' scores (in courses that utilized SRS) compared to their overall GPA. However, this enhancement was significant only in certain subcategories, such as the community colleges and advocates of visual learning. In addition, SRS showed high levels of overall student satisfaction (average 4.4/5.0), which was also significantly influenced by scientific discipline, preferred interaction method, and study level. In summary, the differential score and survey analysis within the various study subcategories showed significant positive effects in certain subcategories, particularly the science and community colleges and four of their representative courses. Overall, SRS provides a potential tool for enhanced student engagement, class interaction, and exam achievement.

Author Contributions

Conceptualization, A.A.-W.S., W.S., O.A., E.A. and S.A.; Methodology, A.A.-W.S., W.S. and E.A.; Software, A.A.-W.S.; Validation, A.A.-W.S., W.S. and S.A.; Formal analysis, A.A.-W.S.; Investigation, A.A.-W.S., W.S., O.A., E.A. and S.A.; Resources, S.A.; Writing—original draft, A.A.-W.S. and W.S.; Writing—review & editing, A.A.-W.S., W.S. and S.A.; Visualization, A.A.-W.S.; Supervision, O.A., E.A. and S.A.; Project administration, O.A., E.A. and S.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Center for Excellence in Learning and Teaching, King Saud University, Riyadh, Saudi Arabia.

Institutional Review Board Statement

The research tool (student survey) was revised and approved by the Standing Committee for Scientific Research Ethics on 20 December 2022 (Ref No. KSU-HE-22-772).

Informed Consent Statement

Informed consent was obtained from all surveyed students electronically.

Data Availability Statement

The data presented in this study are available in the current article.

Acknowledgments

The authors extend their appreciation to the Center for Excellence in Learning and Teaching, King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Questionnaires to Evaluate Student’s Perceptions of SRS

  • First section: General information
  • Dear Student: Thank you for agreeing to participate in this electronic survey entitled: Measuring the Impact of Students’ Use of Student Response Systems (SRS).
  • The purpose of this questionnaire is to assess the effect of using some active learning applications on students’ academic achievement. It takes about 3 min to complete this questionnaire.
  • We assure you that all your answers are used for the purposes of scientific research, which does not include the publication of any private data on the identity of the participant.
  • If you agree to participate in this online survey, please click below to get started.
1-
Participant gender: (Please select one)
  • Male
  • Female
2-
College:………………………………………………………………….
3-
The academic level: (Please select one)
  • Level 1
  • Level 2
  • Level 3
  • Level 4
  • Level 5
  • Level 6
  • Level 7
  • Level 8
  • Level 9
  • Level 10
4-
Did you use the personal response system during the first semester of the 2022 academic year?
(Please select one)
  • Yes
  • No: (Finish and Skip to send)
The second section: for students who used Student response systems (SRS).
5-
Mention the names of the courses in which you used (SRS) during the first semester 2022:
6-
What is your most preferred learning style? (Please select one)
  • Visual Learning: (Pictures, Drawings and Charts)
  • Auditory learning: (listening to recorded lectures repeatedly)
  • Kinesthetic learning: (by touching stereoscopic shapes or conducting laboratory experiments yourself, if possible)
  • Learning through reading and writing: (by writing lectures and writing notes on them and then reading them)
7-
What is your favorite means of interaction with the lecturer? (Inside the classroom)
(Please select one)
  • Discussions and oral conversations to answer the questions of the lecturer.
  • Personal Response Devices (SRS) or mobile apps in order to collect student responses.
  • Answer the paper-based quizzes before or after the lecture.
8-
In general, are you satisfied with the use of personal response systems (SRS) in lectures? (Please select one)
  • Highly unsatisfied
  • Unsatisfied
  • Neutral
  • Satisfied
  • Highly satisfied
9-
What is your score in the courses in which (SRS) was used during the first semester 2022?
Please be careful in answering, knowing that the questionnaire does not reveal your identity. (For example: your score for each course is: A+, A, etc.).
10-
What is your cumulative GPA until the end of the first semester 2022?
Please be careful in answering, knowing that the questionnaire does not reveal your identity. (Ex: 3.56).
Send
Thank you for your fruitful cooperation

References

  1. Kuhlthau, C.C.; Maniotes, L.K.; Caspari, A.K. Guided Inquiry: Learning in the 21st Century; ABC-CLIO: Santa Barbara, CA, USA, 2015.
  2. Stukalenko, N.M.; Zhakhina, B.B.; Kukubaeva, A.K.; Smagulova, N.K.; Kazhibaeva, G.K. Studying Innovation Technologies in Modern Education. Int. J. Environ. Sci. Educ. 2016, 11, 7297–7308.
  3. Crouch, C.H.; Mazur, E. Peer instruction: Ten years of experience and results. Am. J. Phys. 2001, 69, 970–977.
  4. White, P.; Syncox, D.; Alters, B. Clicking for grades? Really? Investigating the use of clickers for awarding grade-points in post-secondary education. Interact. Learn. Environ. 2011, 19, 551–561.
  5. Connor, E. Using cases and clickers in library instruction: Designed for science undergraduates. Sci. Technol. Libr. 2011, 30, 244–253.
  6. Cain, J.; Robinson, E. A primer on audience response systems: Current applications and future considerations. Am. J. Pharm. Educ. 2008, 72, 77.
  7. Lantz, M.E.; Stawiski, A. Effectiveness of clickers: Effect of feedback and the timing of questions on learning. Comput. Hum. Behav. 2014, 31, 280–286.
  8. Velasco, M.; Çavdar, G. Teaching large classes with clickers: Results from a teaching experiment in comparative politics. PS Political Sci. Politics 2013, 46, 823–829.
  9. Klein, K.; Kientz, M. A model for successful use of student response systems. Nurs. Educ. Perspect. 2013, 34, 334–338.
  10. King, S.O. Investigating the most neglected student learning domain in higher education: A case study on the impact of technology on student behaviour and emotions in university mathematics learning. Probl. Educ. 21st Century 2016, 72, 31–52.
  11. Mathiasen, H. Digital Voting Systems and Communication in Classroom Lectures—An Empirical Study Based around Physics Teaching at Bachelor Level at Two Danish Universities. J. Interact. Media Educ. 2015, 1.
  12. Altwijri, O.; Alsadoon, E.; Shahba, A.A.; Soufan, W.; Alkathiri, S. The Effect of Using Student Response Systems (SRS) on Faculty Performance and Student Interaction in the Classroom. Sustainability 2022, 14, 14957.
  13. Strasser, N. Who wants to pass math? Using clickers in calculus. J. Coll. Teach. Learn. (TLC) 2010, 7.
  14. Chien, Y.-T.; Chang, Y.-H.; Chang, C.-Y. Do we click in the right way? A meta-analytic review of clicker-integrated instruction. Educ. Res. Rev. 2016, 17, 1–18.
  15. Hussain, F.N.; Wilby, K.J. A systematic review of audience response systems in pharmacy education. Curr. Pharm. Teach. Learn. 2019, 11, 1196–1204.
  16. Wood, R.; Shirazi, S. A systematic review of audience response systems for teaching and learning in higher education: The student experience. Comput. Educ. 2020, 153, 103896.
  17. Katsioudi, G.; Kostareli, E. A Sandwich-model experiment with personal response systems on epigenetics: Insights into learning gain, student engagement and satisfaction. FEBS Open Bio 2021, 11, 1282–1298.
  18. Castillo-Manzano, J.I.; Castro-Nuño, M.; López-Valpuesta, L.; Sanz-Díaz, M.T.; Yñiguez, R. Measuring the effect of ARS on academic performance: A global meta-analysis. Comput. Educ. 2016, 96, 109–121.
  19. Atlantis, E.; Cheema, B.S. Effect of audience response system technology on learning outcomes in health students and professionals: An updated systematic review. JBI Evid. Implement. 2015, 13, 3–8.
  20. Nelson, C.; Hartling, L.; Campbell, S.; Oswald, A.E. The effects of audience response systems on learning outcomes in health professions education. A BEME systematic review: BEME Guide No. 21. Med. Teach. 2012, 34, e386–e405.
  21. Hunsu, N.J.; Adesope, O.; Bayly, D.J. A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Comput. Educ. 2016, 94, 102–119.
  22. Liu, D.J.; Walker, J.; Bauer, T.; Zhao, M. Facilitating Classroom Economics Experiments with an Emerging Technology: The Case of Clickers. SSRN 989482. 2007. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=989482 (accessed on 1 June 2023).
  23. Ortiz, B.L. The Effects of Student Response Systems on Student Achievement and Engagement. Master's Thesis, California State Polytechnic University, Pomona, CA, USA, 2014.
  24. Hayter, J.; Rochelle, C.F. Clickers: Performance and Attitudes in Principles of Microeconomics. SSRN 2226401. 2013. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2226401 (accessed on 14 May 2023).
  25. Daza, L. A Simple Way to Analyze Student Performance Data with Python. Available online: https://towardsdatascience.com/a-simple-way-to-analyze-student-performance-data-with-python-cc09c7508c4c (accessed on 7 May 2023).
  26. Shahba, A.A.; Alashban, Z.; Sales, I.; Sherif, A.Y.; Yusuf, O. Development and Evaluation of Interactive Flipped e-Learning (iFEEL) for Pharmacy Students during the COVID-19 Pandemic. Int. J. Environ. Res. Public Health 2022, 19, 3902.
  27. Shahba, A.A.; Sales, I. Design Your Exam (DYE): A novel active learning technique to increase pharmacy student engagement in the learning process. Saudi Pharm. J. 2021, 29, 1323–1328.
  28. Mishra, P.; Pandey, C.M.; Singh, U.; Gupta, A.; Sahu, C.; Keshri, A. Descriptive statistics and normality tests for statistical data. Ann. Card. Anaesth. 2019, 22, 67–72.
  29. McHugh, M.L. The chi-square test of independence. Biochem. Med. 2013, 23, 143–149.
  30. Anderson, M.R.; Baumhauer, J.F.; DiGiovanni, B.F.; Flemister, S.; Ketz, J.P.; Oh, I.; Houck, J.R. Determining Success or Failure After Foot and Ankle Surgery Using Patient Acceptable Symptom State (PASS) and Patient Reported Outcome Information System (PROMIS). Foot Ankle Int. 2018, 39, 894–902.
  31. Nahm, F.S. Nonparametric statistical tests for the continuous data: The basic concept and the practical use. Korean J. Anesth. 2016, 69, 8–14.
  32. Zimmerman, D.W. Type I Error Probabilities of the Wilcoxon-Mann-Whitney Test and Student T Test Altered by Heterogeneous Variances and Equal Sample Sizes. Percept. Mot. Ski. 1999, 88, 556–558.
  33. Zimmerman, D.W. A Warning About the Large-Sample Wilcoxon-Mann-Whitney Test. Underst. Stat. 2003, 2, 267–280.
  34. Schober, P.; Boer, C.; Schwarte, L.A. Correlation Coefficients: Appropriate Use and Interpretation. Anesth. Analg. 2018, 126, 1763–1768.
  35. Akoglu, H. User's guide to correlation coefficients. Turk. J. Emerg. Med. 2018, 18, 91–93.
  36. Blasco-Arcas, L.; Buil, I.; Hernández-Ortega, B.; Sese, F. Using clickers in class. The role of interactivity, active collaborative learning and engagement in learning performance. Comput. Educ. 2013, 62, 102–110.
  37. Kay, R.H.; LeSage, A. Examining the benefits and challenges of using audience response systems: A review of the literature. Comput. Educ. 2009, 53, 819–827.
  38. Bullock, D.; LaBella, V.; Clingan, T.; Ding, Z.; Stewart, G.; Thibado, P. Enhancing the student-instructor interaction frequency. Phys. Teach. 2002, 40, 535–541.
  39. Bunce, D.M.; Flens, E.A.; Neiles, K.Y. How long can students pay attention in class? A study of student attention decline using clickers. J. Chem. Educ. 2010, 87, 1438–1443.
  40. Skelly, A.C.; Dettori, J.R.; Brodt, E.D. Assessing bias: The importance of considering confounding. Evid.-Based Spine-Care J. 2012, 3, 9–12.
  41. Liu, C.; Sands-Meyer, S.; Audran, J. The effectiveness of the student response system (SRS) in English grammar learning in a flipped English as a foreign language (EFL) class. Interact. Learn. Environ. 2019, 27, 1178–1191.
  42. Lantz, M.E. The use of 'clickers' in the classroom: Teaching innovation or merely an amusing novelty? Comput. Hum. Behav. 2010, 26, 556–561.
  43. Li, R. Communication preference and the effectiveness of clickers in an Asian university economics course. Heliyon 2020, 6, e03847.
  44. Barrio, C.M.; Muñoz-Organero, M.; Soriano, J.S. Can gamification improve the benefits of student response systems in learning? An experimental study. IEEE Trans. Emerg. Top. Comput. 2015, 4, 429–438.
Figure 2. The distribution of students according to (A) Gender, (B) Campus, (C) Faculty, (D) Scientific discipline, and (E) SRS use.
Figure 3. The overall effect of SRS use on pass rate (%).
Figure 4. The overall and differential effect of confounding factors, including (A,B) gender, (C,D) campus, and (E,F) scientific discipline on pass rate (%). In subfigures (A,B,D,F): a significant p-value (<0.05) was marked with an asterisk (*). In subfigures (C,E): only significant p values (<0.05) were annotated.
Figure 5. The effect of course type and SRS use on pass rate. (A) denotes the overall and (B) the differential effects. In (A): only significant p values (<0.05) were annotated.
Figure 6. The overall effect of SRS use on student scores.
Figure 7. The overall and differential effect of confounding factors, including (A,B) gender, (C,D) campus, and (E,F) scientific discipline on student scores. In subfigures (A,B,D,F): a significant p-value was marked with an asterisk (*). In subfigures (C,E): only significant p values were annotated.
Figure 8. The effect of course type and SRS use on student scores. (A) denotes the overall and (B) the differential effects. In subfigure (A): a total of 32 pairs of data showed significant differences; hence, only the lowest and highest scores were marked with the downward and upward red arrows. In Subfigure (B): a significant p-value was marked by an asterisk (*).
Figure 9. The correlation between (A) total SRS sessions, (B) total SRS questions, and average section score. n = 12.
Figure 10. The distribution of respondents based on (A) preferred learning style and (B) preferred interaction method (n = 72).
Figure 11. The effect of (A) preferred learning style, (B) preferred interaction method, (C) scientific discipline, and (D) study level on overall SRS satisfaction (n = 67).
Figure 12. The overall effect of SRS use on student score enhancement relative to their GPA.
Figure 13. The differential effects of SRS use on student score enhancement relative to their GPA based on confounding factors, including (A) scientific discipline, (B) preferred learning style, and (C) preferred communication method. A significant p-value is marked by an asterisk (*).
Table 2. Study mapping of student distribution according to different subcategories (raw vs. sampled data).
Scientific Discipline | Faculty | Course Code | SRS Use | Raw Data (n) | Sampled Data (n) | p-Value *
Community | Applied_studies_and_community_service | BIO-STAT | NO | 53 | 25 | 0.48
Community | Applied_studies_and_community_service | BIO-STAT | YES | 25 | 25 | 1.00 #
Community | Applied_studies_and_community_service | EL-1 | NO | 1199 | 28 | 0.19
Community | Applied_studies_and_community_service | EL-1 | YES | 28 | 28 | 1.00 #
Community | Applied_studies_and_community_service | MT | NO | 152 | 21 | 0.73
Community | Applied_studies_and_community_service | MT | YES | 21 | 21 | 1.00 #
Community | Applied_studies_and_community_service | PE-HS | NO | 147 | 21 | 0.36
Community | Applied_studies_and_community_service | PE-HS | YES | 21 | 21 | 1.00 #
Community | Applied_studies_and_community_service | HM | NO | 55 | 36 | 0.84
Community | Applied_studies_and_community_service | HM | YES | 36 | 36 | 1.00 #
Community | Applied_studies_and_community_service | OHS | NO | 104 | 24 | 0.56
Community | Applied_studies_and_community_service | OHS | YES | 24 | 24 | 1.00 #
Health | Medicine | Phys | NO | 146 | 146 | 1.00 #
Health | Medicine | Phys | YES | 188 | 146 | 0.72
Health | Pharmacy | PC-LDF | NO | 54 | 54 | 1.00 #
Health | Pharmacy | PC-LDF | YES | 69 | 54 | 0.76
Humanities | Humanities_and_Social_Sciences | AL-WS | NO | 3328 | 21 | 0.09
Humanities | Humanities_and_Social_Sciences | AL-WS | YES | 21 | 21 | 1.00 #
Science | Business_administration | LA-B | NO | 96 | 87 | 0.91
Science | Business_administration | LA-B | YES | 87 | 87 | 1.00 #
Science | Food_and_Agriculture_Sciences | Intro-PP | NO | 118 | 23 | 0.61
Science | Food_and_Agriculture_Sciences | Intro-PP | YES | 23 | 23 | 1.00 #
Science | Sciences | Enz | NO | 10 | 10 | 1.00 #
Science | Sciences | Enz | YES | 10 | 10 | 1.00 #
Science | Sciences | M-BIO | NO | 10 | 10 | 1.00 #
Science | Sciences | M-BIO | YES | 22 | 10 | 1.00
* The p-value represents the statistical difference in the average score between the sampled and raw data within each course. # denotes that the sampled data contain exactly the same values as the raw data.
Table 3. The descriptive statistics of the score analysis study based on sections (n = 185).
Statistic | TOTAL_REGISTERED_STUDENTS | STUDIED_Students | Passing Rate (%) | Prohibition Rate (%) | Failure Rate (%) | Average Section Score (out of 5) | A+ (%) | A (%) | B+ (%) | B (%) | C+ (%) | C (%) | D+ (%) | D (%)
mean | 35.1 | 32.7 | 95.1 | 0.4 | 4.5 | 4.1 | 32.9 | 18.0 | 13.4 | 9.4 | 6.4 | 5.2 | 4.1 | 5.7
std | 17.1 | 15.9 | 7.2 | 1.5 | 6.8 | 0.7 | 28.2 | 13.0 | 11.1 | 9.1 | 8.3 | 7.0 | 5.9 | 9.8
min | 4.0 | 3.0 | 61.5 | 0.0 | 0.0 | 1.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0
25% | 21.0 | 20.0 | 93.1 | 0.0 | 0.0 | 3.7 | 10.4 | 8.3 | 4.4 | 2.2 | 0.0 | 0.0 | 0.0 | 0.0
50% | 37.0 | 36.0 | 97.9 | 0.0 | 1.9 | 4.4 | 25.0 | 15.2 | 10.9 | 7.7 | 4.7 | 2.3 | 0.0 | 0.0
75% | 47.0 | 44.0 | 100.0 | 0.0 | 6.5 | 4.7 | 46.2 | 26.2 | 20.0 | 13.8 | 10.0 | 7.7 | 7.6 | 7.1
max | 118.0 | 108.0 | 100.0 | 8.3 | 30.8 | 5.0 | 98.1 | 66.7 | 60.0 | 50.0 | 66.7 | 33.3 | 30.0 | 50.0
Table 4. The descriptive statistics of student survey results.
Statistic | STUDY_LEVEL | Overall SRS Satisfaction (OSS) | SRS Course Score | GPA
n | 67 | 67 | 72 | 72
mean | 4.76 | 4.36 | 4.52 | 4.26
std | 2.14 | 1.18 | 0.58 | 0.57
min | 1 | 1 | 2 | 2.8
25% | 3 | 4 | 4.5 | 3.87
50% | 5 | 5 | 4.75 | 4.35
75% | 6.5 | 5 | 5 | 4.78
max | 9 | 5 | 5 | 5

