Article

Do Direct and Indirect Recommendations Facilitate Students’ Self-Regulated Learning in Flipped Classroom Online Activities? Findings from Two Studies

Jiahui Du, Khe Foon Hew and Liuyufeng Li
Faculty of Education, The University of Hong Kong, Hong Kong SAR 999077, China
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(4), 400; https://doi.org/10.3390/educsci13040400
Submission received: 28 February 2023 / Revised: 5 April 2023 / Accepted: 11 April 2023 / Published: 15 April 2023

Abstract

Self-regulated learning (SRL) is a prerequisite for successful learning. However, many students report having difficulties in completing individual online tasks outside the classroom in flipped learning contexts. Therefore, additional support for students should be provided to help them improve their SRL skills. Studies have examined the effects of prompts (e.g., questions) to facilitate SRL but have paid less attention to exploring how different types of recommendations for SRL activities may affect students’ SRL skills, course engagement and learning performance. We conducted two studies using direct and indirect recommendations for 77 undergraduate students in the faculty of education in two flipped classroom courses. The direct recommendation approach suggested specific follow-up SRL activities in various learning tasks, whereas the indirect recommendation approach provided students with general SRL hints but left them to identify what specific SRL activities they should use in the next step. To evaluate the impact of each recommendation approach, we measured the students’ self-reported SRL skills, online behaviors, course engagement and learning performance. The results suggested that direct recommendations were useful in improving students’ engagement in online SRL activities and in sustaining their motivation for SRL, while indirect recommendations played a major role in reminding students of the need to self-regulate their learning. Both types of recommendations could significantly affect the quality of students’ online learning. Finally, we discuss the theoretical and practical implications for future SRL recommendation research.

1. Introduction

Recent years have seen significant growth in online learning because of the COVID-19 pandemic. Online learning can offer students the flexibility to juggle their education, home life and careers without being confined to a fixed schedule; however, recent studies have also reported that students experience various challenges during online learning [1,2]. For example, in most cases, learning materials are simply posted online for students to access, yet many learners struggle to decide when, where and how to approach this content effectively. These learners may become demotivated and subsequently fail to self-regulate their online learning effectively [3,4]. Therefore, training students to acquire self-regulated learning (SRL) skills is of paramount importance for improving the quality of their online learning.
In education, SRL refers to “self-generated thoughts, feelings and actions that are planned and cyclically adapted to the attainment of personal goals” [5] (p. 14). Students with SRL skills are more likely to perform better in their online learning than students who lack such skills. As a result, researchers have attempted to promote SRL by implementing its frameworks in various learning contexts, such as the flipped classroom. The flipped classroom (FC), which comprises a self-paced pre-class online learning component and a teacher-facilitated in-person component, has attracted researchers’ attention because many learners report difficulties in completing their individual online tasks outside the classroom [6,7,8]. Given the growing body of research conducted in the flipped learning context, our proposed SRL approach using direct and indirect recommendations provides new insights into supporting students’ acquisition of SRL skills.
Studies supporting SRL in flipped learning contexts have mainly focused on embedding SRL prompts with predefined questions in instructional videos [9]. For example, students must respond to the SRL prompts (e.g., did you achieve your set goals?) and answer the prompted questions to continue the video lecture [9,10]. The major difference between SRL prompts and our recommendation approaches is that, in SRL prompts, students merely answer predefined questions, and no suggestions concerning follow-up SRL actions are given. These prompts may not be very helpful for students who are less proficient in SRL. The rationale for using recommendation approaches in this study is to suggest follow-up SRL strategies or activities to students based on their online learning performance.
Our research aims to explore the effect of providing students with different types of recommendations to facilitate their SRL skills. Section 2 introduces the conceptual background and proposes three research questions. Section 3 elucidates the research designs and methods of the two studies involved in this research. For each study, we analyze and discuss the findings in terms of students’ self-reported SRL skills, online behaviors, course engagement and learning performance. We present each study (Study 1 and Study 2) individually, with its own results and discussion sections; this structure allows the results to be analyzed immediately and the findings to be discussed in context. To provide an overview of the effects of the two types of recommendations, Section 6 offers a general discussion across the studies. Finally, based on our findings, we highlight some implications for future SRL recommendation designs.

2. Conceptual Background

2.1. Self-Regulated Learning

According to Zimmerman [11], there are three SRL phases (i.e., forethought, performance and reflection), each representing a process that occurs before, during or after learning. The forethought phase is the preparation stage for SRL. The activities during this phase are designed to help students plan for their future learning process and to activate the relevant aspects of their prior knowledge [12]. The performance phase emphasizes learners’ ability to monitor their learning process and to persistently engage in learning activities [13]. After each learning effort, self-regulated learners are also expected to reflect on their performance and to take action to adapt their SRL process, which Zimmerman [11] called the reflection phase. Specific SRL activities are related to each phase, and learners can choose these activities according to their current SRL phase. In recent years, several studies have suggested methods for promoting students’ SRL skills and have shown positive effects of SRL training on improving students’ ability to take responsibility for their own learning and on their learning outcomes [9,14].

2.2. SRL in Flipped Learning

The flipped classroom (FC) approach has received increasing attention in recent years. FC refers to the idea of providing students with content knowledge outside the classroom while spending more class time practicing what they have learned through in-class activities [8]. Studies have reported that the FC model has positive effects on students’ engagement, motivation and learning outcomes [6,15]. However, Humrickhouse [16] suggested that the additional workload the FC approach imposes on individual learning can cause difficulties for students who are not proficient in regulating their learning outside the classroom. As a result, students who lack SRL skills may prepare insufficiently before their subsequent in-class session, which can reduce the efficiency of their in-class learning [17,18]. Thus, promoting students’ SRL skills has become a promising way to address the challenges experienced during individual online learning activities outside the classroom.

2.3. Recommendation Approaches and Purpose of the Present Study

Recommendations were originally designed to help users obtain relevant, personalized online information [19,20]. The tremendous growth of recommendation techniques in commercial fields, such as online business, has aroused researchers’ interest in developing similar methods to support students’ online learning. One of the most promising features of using a recommendation approach to promote SRL is that the given recommendations can serve as learning guidance while preserving learners’ autonomy and self-control. Zimmerman and Martinez-Pons [21] showed that effective SRL practices require learners to have some control over their own learning process. Therefore, the facilitation of SRL “is a balancing act between external support and internal regulation” [22].
The present study intended to support students in conducting SRL by providing them with different types of recommendations. To evaluate the impact of each type of recommendation, we measured students’ self-reported SRL skills, online behaviors, course engagement and learning performance. We proposed three research questions:
  • What is the effect of the recommendation approach on students’ self-reported SRL skills?
  • How can students’ SRL skills be manifested in their online SRL-related behaviors?
  • Does the recommendation approach affect students’ course engagement and learning performance?

3. Overall Study Design and Methods

This article reports the results of two studies (Studies 1 and 2) conducted in two undergraduate courses at a large public university. All the students who were enrolled in either of the two courses were invited to participate. Both courses were conducted fully online. All students enrolled in the two courses voluntarily signed the consent form approved by the Institutional Review Board of the authors’ university. Those who enrolled in the two courses were asked to conduct their online activities outside the classroom using the Moodle learning management system.
In this study, we implemented direct and indirect recommendations in two flipped classrooms to provide a more nuanced understanding of the differences between these approaches. The direct recommendation approach used specific SRL phases and activities, while the indirect recommendation approach (i.e., providing general SRL hints) reminded the students of ways to conduct SRL but left them to determine for themselves what specific SRL activities they should practice. The recommendations in both approaches were based on the students’ actual learning performance using the online learning management platform, Moodle. For example, only students who did not complete the assigned activities were provided with either direct or indirect recommendations. Meanwhile, those who were proficient in SRL were recommended to learn about other supplementary SRL materials.

3.1. Study Designs

3.1.1. Design of Learning Activities in Terms of the Three Phases of SRL

This study provided students with recommendations using two main approaches. First, we designed the main objective of each online learning activity so that it mapped onto one of the three SRL phases described by Zimmerman [11]. In a recent review of SRL in online education, Du and Hew [23] found that one of the major limitations of previous research designs was that recommendations were implemented in only some SRL phases, without considering all three phases as a complete SRL process as described by Zimmerman [11]. For example, students received recommendations only during their goal-setting and planning phase, while no further recommendations were given during the other two SRL phases. Therefore, these attempts did not sufficiently promote students’ SRL skills in the long term. To fill these knowledge gaps, the present study sought to promote SRL by adopting direct and indirect recommendations in all three SRL phases. For example, in the forethought phase, we provided weekly overviews to recommend to students what to work on in the upcoming week (for an example, see Figure 1). These overviews enabled the students to plan their SRL process. We also included a discussion forum as a pre-class activity. Each week, the instructors posted several pre-class questions related to the new topics for the upcoming week, and the students were encouraged to share their opinions in the discussion forum. Both the weekly overviews and the pre-class activities could activate students’ prior knowledge and prepare them to learn the new content, which are essential components of students’ metacognitive self-regulation activities during the forethought phase [12,24].
During the performance phase, the students had to work on various tasks and monitor their progress to make sure that their work was completed as planned [5,8]. In this study, we suggested that students should break down their final project into weekly tasks and frequently check their performance record in a progress bar on Moodle, which displayed the completion status of their various activities on the course homepage.
After each weekly session, the students reflected on what they had learned and evaluated their learning performance [10,25]. The post-class discussion forum required the students to share their thoughts and solutions to resolve practical problems, in addition to in-depth reflections on the knowledge gained during the previous class. Students could also take a revision quiz to self-assess their learning performance every 3 weeks.

3.1.2. Design of the Performance Report

Second, we provided students with follow-up SRL recommendations via an individual performance report emailed by the instructor every 3 weeks, which consisted of three parts: activity completion status, digital badges earned and recommended SRL activities. The activity completion section displayed the uncompleted activities during the review period (see Figure 2). The badge section indicated the number of badges that each student had earned (see Section 3.1.3 for more details). Different recommendation approaches were presented in the recommendation section. Students who completed all tasks on time were considered to have higher SRL skills than their peers who did not [21]. We therefore introduced other supplementary SRL activities for the students who had completed all tasks. For example, as these students understood the importance of planning during the forethought phase, we introduced a frequently used goal-setting framework (SMART, i.e., specific, measurable, achievable, relevant and time-bound) to help them learn how to plan more effectively (for an example, see Figure 3).
In contrast, students who did not complete all online activities during the review period were provided with direct recommendations in Study 1 and indirect recommendations in Study 2. In Study 1, we recommended the SRL phases and activities that these students had overlooked in the previous 3 weeks. For instance, some students had skipped reading the weekly overview before starting the following week’s learning. We therefore identified the overlooked SRL practices in the forethought phase, emphasized the importance of setting goals and planning ahead, and recommended that these students perform the specific SRL activities of the forethought phase and complete the corresponding learning activities. In Study 2, we provided indirect recommendations: students were recommended to review all three SRL phases and activities, which simultaneously gave them a certain amount of autonomy to self-regulate their course activities. No specific recommended SRL phases or activities were provided in Study 2.
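To make the reporting logic described above concrete, the following Python sketch (our illustration only; the activity names, phase mapping and function are hypothetical and do not reproduce the actual report-generation procedure) shows how a student’s uncompleted activities could be mapped to a direct recommendation, an indirect recommendation or supplementary material.

```python
# Hypothetical sketch of the recommendation logic; names and structures are illustrative.

SRL_PHASE_OF_ACTIVITY = {
    "weekly_overview": "forethought",
    "pre_class_forum": "forethought",
    "group_wiki": "performance",
    "post_class_forum": "reflection",
}


def build_recommendation(uncompleted, approach):
    """Return the recommendation text for one student's 3-week performance report.

    uncompleted: list of activity names the student skipped during the review period.
    approach: "direct" (Study 1) or "indirect" (Study 2).
    """
    if not uncompleted:
        # Students who completed everything received supplementary SRL material,
        # e.g., an introduction to the SMART goal-setting framework.
        return "Well done! Supplementary material: the SMART goal-setting framework."

    if approach == "direct":
        # Direct recommendation: name the overlooked SRL phase(s) and the
        # specific activities to complete next.
        phases = sorted({SRL_PHASE_OF_ACTIVITY[a] for a in uncompleted})
        return (f"You overlooked the {', '.join(phases)} phase(s). "
                f"Please complete: {', '.join(uncompleted)}.")

    # Indirect recommendation: a general hint to review all three SRL phases,
    # leaving the choice of specific activities to the student.
    return ("Please review the three SRL phases (forethought, performance, "
            "reflection) and decide which activities to revisit.")


print(build_recommendation(["weekly_overview", "group_wiki"], approach="direct"))
print(build_recommendation(["weekly_overview"], approach="indirect"))
```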

3.1.3. Design of the Digital Badges

Zimmerman [11] found that SRL can be a boring and repetitive process for many students, which makes the FC context particularly challenging because students must complete activities without being fully aware of the benefits they can gain from successfully completing the tasks [8]. Zhao and Ye [26] suggested giving students extrinsic incentives to improve their motivation during online learning. Thus, it is necessary for researchers to devise ways of enhancing and maintaining students’ motivation to persist in SRL [14,27].
With this in mind, we designed two types of digital badges to reward students who performed well in the course activities. Early Bird (EB) badges were awarded to students in both Studies 1 and 2 who completed the assigned learning tasks at least 1 day before the deadline. In this way, students were encouraged to conduct SRL by planning ahead and avoiding procrastination. Three Winning Streak (WS) badges were available for each course segment over the entire semester in Studies 1 and 2, and students could earn one if they completed all of the assigned learning activities within 3 consecutive weeks. This badge was designed to encourage students to persist in SRL in terms of completing all designed activities throughout their learning process. Previous research also identified students constantly engaging in learning activities as an indicative behavior during the performance phase [13]. Table 1 summarizes all learning activities, Moodle modules and badges related to each SRL phase.
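The badge rules can be summarized in a minimal sketch (illustrative only; the function signatures are ours and do not reflect the actual Moodle badge configuration):

```python
# Illustrative badge rules: EB for early completion, WS for a 3-week completion streak.
from datetime import date, timedelta


def earns_early_bird(completed_on: date, deadline: date) -> bool:
    # EB badge: the task was completed at least one day before its deadline.
    return completed_on <= deadline - timedelta(days=1)


def earns_winning_streak(weekly_completion: list) -> bool:
    # WS badge: all assigned activities were completed in each of the
    # 3 consecutive weeks that make up one course segment.
    return len(weekly_completion) == 3 and all(weekly_completion)


print(earns_early_bird(date(2023, 3, 1), deadline=date(2023, 3, 3)))  # True
print(earns_winning_streak([True, True, False]))                      # False
```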

3.2. Research Procedures in Studies 1 and 2

In the first and last session of each course, students in both Studies 1 and 2 were asked to complete the pre- and post-SRL questionnaires, which were considered as pre- and post-SRL tests in this study. In Session 1, we visited both courses and introduced the course design and the three-phase SRL framework to the students. First, in addition to attending each weekly class session, students were required to complete the online learning activities assigned by their instructors. Digital badges would be awarded to those who performed well during the process. Second, every 3 weeks throughout the semester, their instructors would email individual performance reports to the students with recommendations for their future learning. Students were reminded to read the reports and adjust their learning accordingly. Figure 4 shows the details of the research procedure throughout the courses.

3.3. Data Collection and Analysis

In this study, we did not make direct comparisons between the two studies because different instructors taught the courses. Instead, we explored the effects of different recommendation approaches on students’ self-reported and online SRL skills, course engagement and learning performance. We adopted pretest and posttest questionnaires to compare the changes in students’ SRL skills. Because our study design emphasized the three SRL phases, we used the metacognitive scale of the Self-regulated Online Learning Questionnaire (SOL-Q) [28]. The revised version of the SOL-Q defines metacognitive activities as occurring before, during and after learning, which fits our study well compared with most questionnaires that focus on measuring particular SRL strategies. The SOL-Q metacognitive scale consists of 20 items on a 7-point Likert scale ranging from 1 (strongly disagree) to 7 (strongly agree). A paired sample t-test was conducted to compare the means of the pretest and posttest scores. By analyzing the differences between the means, we could further explore the effect of our approach on students’ overall SRL skills and on each SRL phase.
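To illustrate how such questionnaire data can be handled, the following sketch aggregates 20 Likert items into three subscale scores and an overall score (maximum 140). The item-to-subscale assignment shown here is hypothetical and serves only to demonstrate the scoring logic; the actual assignment follows the SOL-Q specification in [28].

```python
# Scoring sketch for the 20-item SOL-Q metacognitive scale (illustrative data).
import pandas as pd

# One row per student, columns q1..q20 on a 1-7 Likert scale (a single demo respondent).
responses = pd.DataFrame(
    [[5, 6, 4, 5, 5, 6, 4, 5, 5, 4, 6, 5, 5, 4, 6, 5, 4, 5, 6, 5]],
    columns=[f"q{i}" for i in range(1, 21)],
)

# Hypothetical split of the 20 items across the three SRL phases.
subscale_items = {
    "forethought": [f"q{i}" for i in range(1, 8)],
    "performance": [f"q{i}" for i in range(8, 15)],
    "reflection": [f"q{i}" for i in range(15, 21)],
}

scores = pd.DataFrame({name: responses[items].sum(axis=1)
                       for name, items in subscale_items.items()})
scores["overall"] = scores.sum(axis=1)  # maximum possible total = 20 items x 7 = 140
print(scores)
```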
To explore the effect of the recommendation approach on students’ SRL skills, we identified SRL-related online behaviors using the Moodle log data. Many recent studies have used online log data to measure students’ SRL behaviors in online environments. For example, Saint et al. [13] translated students’ online trace data into macro- and micro-level SRL processes. They argued that the micro-level analysis must “articulate a set of event categorizations which fit into a model of SRL” [13] (p. 861). In our study, the weekly activities in both courses were designed based on the three-phase SRL model [11]. Hence, we identified the relevant SRL activities and students’ online behaviors as indicators of SRL in Moodle (see Table 2). We calculated frequencies to analyze trends and to demonstrate the effect of the two recommendation approaches on students’ online SRL performance.
Finally, we explored the impact of the recommendation approaches on students’ engagement and learning performance. In each study, we divided students into two clusters based on their online behaviors, consistent with the approach of Jovanović et al. [17], who employed clustering to detect learner groups. In our study, students who completed all online activities on time by themselves, and students who made up all missing assignments after receiving the recommendations, were considered the strategic group. In contrast, those who did not adopt the recommendations were considered the nonstrategic group. We could thus better interpret the effect of our approaches by comparing student engagement and learning performance between the strategic and nonstrategic groups. We defined learner engagement based on two types of online behavior [2,15]. The first was the students’ weekly active days on the Moodle platform, calculated by dividing the total number of days on which they accessed Moodle by the number of course weeks. The second was the students’ weekly use of Moodle resources, calculated by dividing the total number of course resources accessed throughout the semester by the number of course weeks.
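The two engagement indicators can be derived from a Moodle activity log roughly as follows; this is a minimal sketch with assumed column names and demo rows, not the real Moodle export schema.

```python
# Compute weekly active days and weekly resource use from a (mock) Moodle log.
import pandas as pd

log = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s2"],
    "timestamp": pd.to_datetime(["2023-01-09 10:00", "2023-01-09 14:00",
                                 "2023-01-11 09:00", "2023-01-10 08:00"]),
    "event": ["course_viewed", "resource_viewed", "resource_viewed", "resource_viewed"],
})
COURSE_WEEKS = 12

# Weekly active days: distinct calendar days with any Moodle access, per course week.
active_days = (log.assign(day=log["timestamp"].dt.date)
                  .groupby("student")["day"].nunique() / COURSE_WEEKS)

# Weekly resource use: total resource accesses over the semester, per course week.
resource_use = (log[log["event"] == "resource_viewed"]
                .groupby("student").size() / COURSE_WEEKS)

print(pd.DataFrame({"days_per_week": active_days, "resources_per_week": resource_use}))
```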
Most studies have used students’ final course grades to measure their learning performance [2,17]. In this study, however, we focused on students’ online learning performance in Moodle by assessing the quality of their online forum activities. In Study 1, the course instructor graded the students’ posts in the online discussion forums on a 40-point basis. In Study 2, two researchers graded students’ online forum postings according to the criteria for evaluating online activities proposed in [2,29]. For each forum, students could therefore receive scores ranging from 0 (level 0) to 3 (level 3) (see Table 3). Each student’s final online activity score was the sum of all scores received in the tutorial forums, with a maximum of 27 points. Interrater agreement between the two researchers was high (92%), and disagreements were resolved through discussion.
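A simple percent-agreement check of the two raters’ scores might look as follows (the ratings are illustrative, and the exact reliability computation used in the study is not detailed here).

```python
# Percent agreement between two raters' forum levels (illustrative ratings).
rater_a = [3, 2, 2, 1, 3, 0, 2, 3, 1, 2]
rater_b = [3, 2, 1, 1, 3, 0, 2, 3, 1, 2]

agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Interrater agreement: {agreement:.0%}")  # disagreements were then discussed
```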

4. Study 1

4.1. Participants

The participants in Study 1 were 29 undergraduate students who had enrolled in a Data Mining course, which aims to equip them with an understanding of data mining concepts and techniques. The course consisted of 12 two-hour face-to-face sessions. Students were expected to attend face-to-face weekly sessions in classrooms and participate in online activities using Moodle outside the classroom.

4.2. Results

4.2.1. Self-Reported SRL Tests

The SOL-Q was used for both the pretest and the posttest, and the maximum total score on each test was 140. Three students did not complete both the pretest and the posttest; therefore, the results of the remaining 26 students were analyzed. Cronbach’s alpha showed a high level of internal consistency for each scale, ranging from 0.80 to 0.94 (see Table 4).
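For readers who wish to reproduce the reliability analysis, Cronbach’s alpha can be computed from an item-response matrix as in the following sketch (simulated responses only, not the study data).

```python
# Minimal Cronbach's alpha computation for one subscale (simulated Likert data).
import numpy as np


def cronbach_alpha(items):
    """items: a (students x items) array of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)


rng = np.random.default_rng(0)
trait = rng.integers(3, 6, size=(26, 1))                         # common "trait" per student
demo = np.clip(trait + rng.integers(-1, 2, size=(26, 7)), 1, 7)  # 7 correlated demo items
print(round(cronbach_alpha(demo), 2))
```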
To explore whether the students’ SRL skills improved after taking the course, we compared the pre- and post-SRL scores of all students. The Shapiro–Wilk test showed that the scores of both tests were normally distributed (pretest, W(26) = 0.957, p = 0.337; posttest, W(26) = 0.943, p = 0.161). As the assumption of normality was met, a paired sample t-test was conducted to compare the means of the students’ SRL scores. The results showed a significant difference between the pretest (n = 26, M = 93.84, SD = 14.42) and posttest (n = 26, M = 101.47, SD = 18.44) means (t(25) = −2.20, p = 0.037). To further explore the changes in student scores for each SRL phase, we compared the pretest and posttest scores for each of the three subscales. The mean posttest scores improved in all three subscales compared with the pretest, with significant differences in the forethought (t(25) = −2.09, p = 0.047) and reflection (t(25) = −2.73, p = 0.011) phases.
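The normality check and the paired-sample t-test reported above can be run with SciPy as shown below; the scores here are simulated stand-ins for the actual pre- and posttest totals.

```python
# Shapiro-Wilk normality check followed by a paired-sample t-test (simulated scores).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(94, 14, size=26)          # simulated pretest totals (scale max = 140)
post = pre + rng.normal(7, 12, size=26)    # simulated posttest totals

print(stats.shapiro(pre), stats.shapiro(post))  # check the normality assumption first
t, p = stats.ttest_rel(pre, post)               # paired-sample t-test on the same students
print(f"t({len(pre) - 1}) = {t:.2f}, p = {p:.3f}")
```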

4.2.2. SRL-Related Online Behavior

We separated the semester into three course segments, delimited by the dates on which the students received their individual performance reports. The proportion of students who earned the EB badge increased from around 62% to 76% by the end of the semester. About half of the students were also able to earn the WS badge for each segment, and this percentage remained relatively stable throughout the semester (Figure 5). We further explored students’ weekly SRL-related online behaviors. The percentage of students who completed the pre- and post-class discussion forums remained high during the entire semester (Figure 6). However, the number of students who read the weekly overviews decreased steadily over the following weeks.

4.2.3. Engagement and Learning Performance

As mentioned in the previous section, the students’ engagement was measured by calculating the average number of active days and accessed resources per week. As the data did not meet the assumption of normality for the independent samples t-test, we conducted the Mann–Whitney U test to compare the two groups. The results showed significant differences in students’ days with access per week (U = 38, p = 0.003) and resources accessed per week (U = 53.5, p = 0.025). Learning performance was measured based on the scores that the students received in the online activity section. The results of the Mann–Whitney U test showed a significant difference in activity scores between the strategic (median = 33) and nonstrategic (median = 26.5) groups (U = 22.5, p < 0.001) (see Table 5).
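The group comparisons above can be reproduced with SciPy’s Mann–Whitney U implementation, as in this sketch (the scores are illustrative, not the study data).

```python
# Mann-Whitney U test comparing strategic and nonstrategic activity scores (mock data).
from scipy import stats

strategic = [33, 35, 30, 34, 36, 32, 31, 33, 35, 34, 30, 33, 36, 32, 34, 33]  # n = 16
nonstrategic = [26, 28, 25, 27, 24, 29, 26, 27, 25, 28, 27, 26, 24]           # n = 13

u, p = stats.mannwhitneyu(strategic, nonstrategic, alternative="two-sided")
print(f"U = {u}, p = {p:.4f}")
```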

4.3. Discussion

Considering RQ1, we analyzed students’ self-reported SRL questionnaires to explore whether their SRL skills improved after being exposed to our intervention. The t-test results showed that there was a significant difference between the students’ pre- and post-SRL scores. We further compared their scores on the three SRL subscales. All of the mean posttest scores improved compared with the pretest scores, while the forethought and reflection subscales showed significant increases.
For RQ2, we explored students’ SRL online behaviors as shown in the Moodle log data. We matched the identified online behaviors with each SRL phase and, by analyzing the frequencies and trends, drew the following conclusions. In the forethought phase, the number of students who earned EB badges and completed pre-class activities on time remained at a high level, which indicated that students actively planned ahead as self-regulated learners. Students’ SRL behaviors during the performance phase were identified as their persistence in completing all learning tasks on time for each segment (i.e., 3 consecutive weeks). Zimmerman and Martinez-Pons [21] demonstrated that self-regulated students consistently complete their homework promptly. In our study, this was interpreted as the number of students who earned the WS badge; more than 50% earned a WS badge in each segment. This trend was consistent with the self-reported SRL test results, which indicated no significant difference between the pretest and posttest scores on the performance phase subscale. In the reflection phase, students actively completed the post-class activities over the 9 weeks of the course. Revision actions were considered a manifestation of regulating learning activities in that students actively reinforce their previous knowledge [9]. In our study, students’ revision actions during each segment revealed that they frequently adopted reflection behaviors throughout the semester.
Finally, we found positive effects of the direct recommendation approach on student course engagement and performance. Specifically, students in the strategic group, who followed our recommendations, had a higher average number of active days and accessed resources per week than the nonstrategic group. Maldonado-Mahauad et al. [30] similarly concluded that students who tended to follow the learning path provided by the course structure exhibited a higher level of engagement than other learner clusters. In addition to measuring the quantity of behaviors, such as frequencies of actions and online access, we also considered the quality of student online learning. We compared the scores that the students received from online activities between the strategic and nonstrategic groups and found a significant difference, which indicated that our SRL recommendations benefited students by improving the quality of their online learning. Similarly, Wong et al. [2] demonstrated the effectiveness of SRL prompts combined with recommendations in improving course engagement and performance in a massive open online course.

5. Study 2

5.1. Participants

The participants in Study 2 were 48 undergraduate students who had enrolled in an Information Society Issues and Policy course, which covers topics such as intellectual property and copyright and aims to teach students how to identify and critique various information policies. This course consisted of 12 two-hour face-to-face sessions. In addition, the students were expected to undertake further independent study outside the classroom to complete the online activities and assignments.

5.2. Results

5.2.1. Self-Reported SRL Tests

We used the same questionnaire as in Study 1. Twenty-two of the 48 students who participated in the course completed both the pretest and the posttest, and their responses were used for further analysis. Cronbach’s alpha for each scale ranged from 0.84 to 0.94, indicating high reliability (see Table 6). To compare the students’ pre- and post-SRL scores, we conducted the Shapiro–Wilk test to check the normality of the scores. The results showed that the scores of both tests were normally distributed (pretest, W(22) = 0.982, p = 0.947; posttest, W(22) = 0.956, p = 0.416). As the assumption of normality was met, we conducted a paired sample t-test to compare the means of the students’ pre- and post-SRL scores. The t-test results showed no significant difference between the pretest (n = 22, M = 91.64, SD = 17.41) and posttest (n = 22, M = 96.84, SD = 15.70) scores (t(21) = −1.52, p = 0.143). We further compared the mean scores for each SRL subscale between the pretest and posttest. The mean posttest scores improved in all three subscales, and a significant difference was found in the reflection subscale (t(21) = −2.19, p = 0.04).

5.2.2. SRL-Related Online Behaviors

As shown in Figure 7, the number of students who earned badges kept increasing during the course. The percentage of students who earned the WS badge increased by 15%, and starting from Segment 2, the percentage of students who earned the EB badge increased sharply. As in Study 1, we analyzed the students’ weekly SRL-related online behavior. Throughout the semester, more than half of the students were able to complete most of the pre- and post-class discussion forums. Although some of the students completed the forums afterward, the completion rates for both forums were high and stable. However, the number of students who read the weekly overviews dropped after Segment 1 (see Figure 8 for details).

5.2.3. Engagement and Learning Performance

The same methods were used to measure engagement in Study 2 as in Study 1. The results of the Mann–Whitney U test showed a significant difference in students’ days with access per week between the strategic (median = 2.43) and nonstrategic (median = 2.04) groups (U = 159.5, p = 0.035). However, no significant difference was found in the students’ resource access per week between the two groups. To evaluate the quality of student online learning, each student was given an online activity score based on their performance in weekly tutorial forums. The maximum score that a student could get was 27. The Mann–Whitney U test was conducted to compare the median scores that the students received in each group. The strategic group obtained significantly higher scores (median = 13) than the nonstrategic group (median = 10) (U = 102.5, p = 0.001) (Table 7).

5.3. Discussion

The effect of our approach on students’ self-reported SRL skills was explored in RQ1. The overall results showed no significant difference between the pretest and posttest scores, and only the reflection subscale showed a significant improvement. We then identified various SRL-related online behaviors in Moodle. In the forethought phase, the proportion of students who earned the EB badge increased notably from Segment 1 to Segment 2 and remained stable afterward. One reason for this finding could be that the students in this course showed relatively low engagement at the beginning: during Segment 1, most of the students were still figuring out the course settings. From Segment 2 onward, students were more involved in their learning process, and in Segment 3, the proportion of students who earned the EB badge was stable. As in Study 1, the number of students who read the weekly overviews dropped after Segment 1, whereas the completion rate of pre-class activities remained relatively high. Similar results have been found in previous studies that attempted to support students’ planning activities: Kizilcec et al. [31] designed plan-making prompts for the forethought phase, and their results suggested that the effect of this intervention was sustained for only the first few weeks. Furthermore, the number of students who earned the WS badge kept increasing throughout the semester, which suggested a consistently positive effect of our approach during the performance phase. In the reflection phase, a high frequency of revision actions was observed in all three course segments. These indicators corroborated the self-reported SRL results in RQ1, which showed a significant difference in the reflection subscale between the pre- and post-SRL scores.
In terms of course engagement, we found that the students who followed our recommendations had a significantly higher average number of days with access per week than those who did not. In addition, students in the strategic group tended to access more resources per week than those in the nonstrategic group, although this difference was not significant. One reason for this finding could be that the students in the strategic group were used to regularly accessing the online platform to check their learning progress, even when they had already completed all learning tasks. For example, with the progress bar displayed on the course homepage in Moodle, students did not have to access any resource pages to view their progress. Therefore, students in the strategic group might have been exposed to their progress more frequently than those in the nonstrategic group. Maldonado-Mahauad et al. [30] also indicated that strategic students preferred to explore the course content more often than the other clusters of students. However, as these students were highly strategic in their learning practice, they were likely to focus on the particular course content that they intended to work on instead of aimlessly accessing a variety of learning content. This may explain our finding that strategic students spent significantly more days on the Moodle learning platform with only slightly higher resource access. We also compared the activity scores that the students received in each group, and the results indicated that those who were exposed more often to our recommendations completed the online tasks with higher quality than those who were exposed less frequently.

6. General Discussion and Conclusions

6.1. Direct and Indirect Recommendations

We provided different types of recommendations in two studies. From an instructional design perspective, direct recommendations can be considered a form of direct instruction that aims to facilitate students’ acquisition of explicit SRL skills, whereas indirect recommendations provide students with general SRL hints and simultaneously give them a certain amount of autonomy to self-regulate their own learning activities. In Study 1, we provided students with direct recommendations on improving their SRL skills using their individual performance reports. We evaluated the students’ performance in various activities and recommended specific SRL activities for them to practice. In Study 2, we provided indirect recommendations. When students did not complete all tasks as required, they were recommended to revise the proposed SRL phases and assigned activities for each week, with no further specific SRL activities given.
Although most of the outcome variables showed improvements in both studies, Study 1 yielded more significant results in the pretest-posttest comparisons as well as in course engagement. Our direct recommendations might have had an immediate impact on motivating and guiding the students through the various learning activities. Thus, students felt supported during their individual online learning process, which increased their engagement in SRL activities and likely contributed to the better self-report test results observed in Study 1. Van den Boom et al. [32] also found that students who received prompts and specific feedback outperformed a control group on self-reported scales. In addition, Lee et al. [33] concluded that students who received adaptive metacognitive prompts together with feedback reported a higher frequency of using SRL strategies. The direct recommendations provided explicit guidance on which activities or strategies students should practice; therefore, the students who read and followed the recommendations showed higher course engagement.
The indirect recommendations given in Study 2 had a limited impact on student perceptions and engagement; however, students’ increasing tendency to engage in SRL activities during the three course segments indicated the great potential of indirect recommendations to support students’ SRL. Following the results of this research, we suggest that the use of direct recommendations may be useful in improving student engagement in online SRL activities and sustaining their motivation to conduct SRL, while the use of indirect recommendations may play a major role in increasing students’ awareness of the need to regulate their learning. Moreover, both types of recommendations can significantly affect the quality of students’ online learning.

6.2. Conclusions and Limitations

Drawing on the findings of two studies, this research examined the effect of providing students with direct and indirect recommendations to facilitate their SRL. However, our research has some limitations. First, the sample sizes in both studies were relatively small. Future studies should replicate our methods with larger samples to improve the generalizability of the learning designs. Second, we identified some SRL-related online behaviors in the Moodle learning management system; however, these indicators represented only a small set of student behaviors in Moodle. Students’ SRL behaviors during particular phases could therefore be manifested in other forms of Moodle interaction that were excluded from the current study. We suggest that future research identify more SRL behaviors via Moodle log data and include as many identifiable behaviors as possible in the analysis.

6.3. Implications for Future Research

Based on the findings of our two studies, we propose the following implications for future SRL recommendation designs. First, the benefits of both recommendation approaches rest on the premise that students are willing to read and adopt the given recommendations. Therefore, instructors or researchers should explicitly demonstrate the benefits of taking these recommendations at the start of the designed course. This explanation would be useful for students who already have high SRL skills or initial motivation to focus on the course in which they are enrolled; for those who lack such skills or beliefs, researchers could proceed in two steps: first, use external incentives to improve students’ motivation to conduct SRL activities; then, once students become used to the techniques and perceive their effectiveness in their own learning performance, their intrinsic motivation may further stimulate them to persist in SRL activities [34]. Second, direct recommendations are useful for increasing student engagement and maintaining their motivation to conduct SRL, whereas indirect recommendations show great potential for raising students’ awareness of the need to conduct SRL. Thus, we recommend that future research adopt different types of recommendations based on student abilities and course settings. For example, indirect recommendations could be given as hints to students with more proficient SRL practices, and direct recommendations could be provided to novice students to guide them explicitly in SRL practices.
The findings of both studies indicated that the number of students who read the weekly overviews kept dropping, whereas the students who completed the pre- and post-class written activities maintained a higher level of learning throughout the semester than those who did not. Carter et al. [35] and Molenaar et al. [36] both argued that a decline in student performance may be due either to insufficient skills to complete the tasks or to a slack attitude toward repeatedly applying skills that students already know. Based on these arguments, we offer two suggestions for future research. First, as Weinstein et al. [37] pointed out, it is not enough for students merely to be told what learning strategies to use. More importantly, they should have the opportunity to practice SRL during their learning process, because practical activities enable students to reflect on their knowledge gaps and skill weaknesses. Moreover, instructors can become aware of students’ problems by assessing their posted work and can promptly react to any decline in their students’ learning. Second, instructors and researchers should detect students’ competence levels by identifying indicators in behavioral data. Once these indicators show that students have already mastered particular skills, teachers should adjust their recommendations accordingly. Thus, a decline in student performance could be prevented at an earlier stage.

Author Contributions

Conceptualization, J.D. and K.F.H.; methodology, J.D. and K.F.H.; software, J.D.; validation, L.L.; formal analysis, J.D.; investigation, J.D.; resources, J.D.; data curation, J.D. and L.L.; writing—original draft, J.D.; writing—review and editing, K.F.H.; visualization, J.D.; supervision, K.F.H.; project administration, K.F.H.; funding acquisition, K.F.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially supported by a fellowship award from the Research Grants Council of the Hong Kong SAR, China (Project No. RFS2223-7H02).

Institutional Review Board Statement

This study was approved by the Human Research Ethics Committee (HREC) of the University of Hong Kong (Reference No. EA200155, approved on 11 November 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on reasonable request from the corresponding author. The data are not publicly available due to consent provided by participants on the use of confidential data.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Blau, I.; Shamir-Inbal, T.; Avdiel, O. How does the pedagogical design of a technology-enhanced collaborative academic course promote digital literacies, self-regulation, and perceived learning of students? Internet High. Educ. 2020, 45, 100722. [Google Scholar] [CrossRef]
  2. Wong, J.; Baars, M.; De Koning, B.B.; Paas, F. Examining the use of prompts to facilitate self-regulated learning in massive open online courses. Comput. Hum. Behav. 2021, 115, 106596. [Google Scholar] [CrossRef]
  3. Kalman, R.; Esparza, M.M.; Weston, C. Student views of the online learning process during the COVID-19 pandemic: A comparison of upper-level and entry-level undergraduate perspectives. J. Chem. Educ. 2020, 97, 3353–3357. [Google Scholar] [CrossRef]
  4. Pedrotti, M.; Nistor, N. How students fail to self-regulate their online learning experience. In Transforming Learning with Meaningful Technologies; EC-TEL 2019; Lecture Notes in Computer Science; Scheffel, M., Broisin, J., Pammer-Schindler, V., Ioannou, A., Schneider, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2019; Volume 11722. [Google Scholar] [CrossRef] [Green Version]
  5. Zimmerman, B.J. Attaining self-regulation. A social cognitive perspective. In Handbook of Self-Regulation; Boekaerts, M., Pintrich, P.R., Zeidner, M., Eds.; Academic Press: Cambridge, MA, USA, 2000; pp. 13–39. [Google Scholar]
  6. Al Mulhim, E.N. Flipped learning, self-regulated learning and learning retention of students with internal/external locus of control. Int. J. Instr. 2021, 14, 827–846. [Google Scholar] [CrossRef]
  7. Shih, M.; Liang, J.C.; Tsai, C.C. Exploring the role of university students’ online self-regulated learning in the flipped classroom: A structural equation model. Interact. Learn. Environ. 2019, 27, 1192–1206. [Google Scholar] [CrossRef]
  8. Yoon, M.; Hill, J.; Kim, D. Designing supports for promoting self-regulated learning in the flipped classroom. J. Comput. High. Educ. 2021, 33, 398–418. [Google Scholar] [CrossRef]
  9. Van Alten, D.C.D.; Phielix, C.; Janssen, J.; Kester, L. Effects of self-regulated learning prompts in a flipped history classroom. Comput. Hum. Behav. 2020, 108, 106318. [Google Scholar] [CrossRef]
  10. Moos, D.C.; Bonde, C. Flipping the classroom: Embedding self-regulated learning prompts in videos. Technol. Knowl. Learn. 2016, 21, 225–242. [Google Scholar] [CrossRef]
  11. Zimmerman, B.J. Becoming a self-regulated learner: An overview. Theory Into Pract. 2002, 41, 64–70. [Google Scholar] [CrossRef]
  12. Shyr, W.J.; Chen, C.H. Designing a technology-enhanced flipped learning system to facilitate students’ self-regulation and performance. J. Comput. Assist. Learn. 2018, 34, 53–62. [Google Scholar] [CrossRef]
  13. Saint, J.; Gasevic, D.; Matcha, W.; Uzir, N.A.; Pardo, A. Combining analytic methods to unlock sequential and temporal patterns of self-regulated learning. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, Main, Germany, 25–27 March 2020; pp. 402–411. [Google Scholar] [CrossRef] [Green Version]
  14. Çebi, A.; Güyer, T. Students’ interaction patterns in different online learning activities and their relationship with motivation, self-regulated learning strategy and learning performance. Educ. Inf. Technol. 2020, 25, 3975–3993. [Google Scholar] [CrossRef]
  15. Uzir, N.A.; Gasevic, D.; Matcha, W.; Jovanovic, J.; Pardo, A. Analytics of time management strategies in a flipped classroom. J. Comput. Assist. Learn 2020, 36, 70–88. [Google Scholar] [CrossRef]
  16. Humrickhouse, E. Flipped classroom pedagogy in an online learning environment: A self-regulated introduction to information literacy threshold concepts. J. Acad. Librariansh. 2021, 47, 102327. [Google Scholar] [CrossRef]
  17. Jovanović, J.; Gašević, D.; Dawson, S.; Pardo, A.; Mirriahi, N. Learning analytics to unveil learning strategies in a flipped classroom. Internet High. Educ. 2017, 33, 74–85. [Google Scholar] [CrossRef] [Green Version]
  18. Zheng, B.B.; Ward, A.; Stanulis, R. Self-regulated learning in a competency-based and flipped learning environment: Learning strategies across achievement levels and years. Med. Educ. Online 2020, 25, 1686949. [Google Scholar] [CrossRef]
  19. Benhamdi, S.; Babouri, A.; Chiky, R. Personalized recommender system for e-learning environment. Educ. Inf. Technol. 2017, 22, 1455–1477. [Google Scholar] [CrossRef]
  20. Bouihi, B.; Bahaj, M. Ontology and rule-based recommender system for e-learning applications. Int. J. Emerg. Technol. Learn 2019, 14, 4–13. [Google Scholar] [CrossRef] [Green Version]
  21. Zimmerman, B.J.; Martinez-Pons, M. Construct-validation of a strategy model of student self-regulated learning. J. Educ. Psychol. 1988, 80, 284–290. [Google Scholar] [CrossRef]
  22. Ifenthaler, D. Determining the effectiveness of prompts for self-regulated learning in problem-solving scenarios. Educ. Technol. Soc. 2012, 15, 38–52. [Google Scholar]
  23. Du, J.; Hew, K.F.T. Using recommender systems to promote self-regulated learning in online education settings: Current knowledge gaps and suggestions for future research. J. Res. Technol. Educ. 2021, 54, 557–580. [Google Scholar] [CrossRef]
  24. Jansen, R.S.; van Leeuwen, A.; Janssen, J.; Conijn, R.; Kester, L. Supporting learners’ self-regulated learning in Massive Open Online Courses. Comput. Educ. 2020, 146, 103771. [Google Scholar] [CrossRef]
  25. Rodrigues, R.L.; Ramos, J.L.C.; Silva, J.C.S.; Dourado, R.A.; Gomes, A.S. Forecasting students’ performance through self-regulated learning behavioral analysis. Int. J. Distance Educ. Technol. 2019, 17, 52–74. [Google Scholar] [CrossRef]
  26. Zhao, L.; Ye, C. Time and performance in online learning: Applying the theoretical perspective of metacognition. Decis. Sci.—J. Innov. Educ. 2020, 18, 435–455. [Google Scholar] [CrossRef]
  27. Tsai, Y.H.; Lin, C.H.; Hong, J.C.; Tai, K.H. The effects of metacognition on online learning interest and continuance to learn with MOOCs. Comput. Educ. 2018, 121, 18–29. [Google Scholar] [CrossRef]
  28. Jansen, R.S.; van Leeuwen, A.; Janssen, J.; Kester, L. Validation of the revised self-regulated online learning questionnaire. In European Conference on Technology Enhanced Learning; Springer: Berlin/Heidelberg, Germany, 2018; pp. 116–121. [Google Scholar]
  29. Huang, B.; Hew, K. Using gamification to design courses: Lessons learned in a three-year design-based study. Educ. Technol. Soc. 2021, 24, 44–63. [Google Scholar]
  30. Maldonado-Mahauad, J.; Pérez-Sanagustín, M.; Kizilcec, R.F.; Morales, N.; Munoz-Gama, J. Mining theory-based patterns from big data: Identifying self-regulated learning strategies in Massive Open Online Courses. Comput. Hum. Behav. 2018, 80, 179–196. [Google Scholar] [CrossRef]
  31. Kizilcec, R.F.; Reich, J.; Yeomans, M.; Dann, C.; Brunskill, E.; Lopez, G.; Turkay, S.; Williams, J.J.; Tingley, D. Scaling up behavioral science interventions in online education. Proc. Natl. Acad. Sci. USA 2020, 117, 14900–14905. [Google Scholar] [CrossRef]
  32. Van den Boom, G.; Paas, F.; van Merriënboer, J.J. Effects of elicited reflections combined with tutor or peer feedback on self-regulated learning and learning outcomes. Learn. Instr. 2007, 17, 532–548. [Google Scholar] [CrossRef]
  33. Lee, H.W.; Lim, K.Y.; Grabowski, B.L. Improving self-regulation, learning strategy use, and achievement with metacognitive feedback. Educ. Technol. Res. Dev. 2010, 58, 629–648. [Google Scholar] [CrossRef]
  34. Zimmerman, B.J.; Bonner, S.; Kovach, R. Developing Self-Regulated Learners: Beyond Achievement to Self-Efficacy; American Psychological Association: Washington, DC, USA, 1996. [Google Scholar]
  35. Carter, R.A.; Rice, M.; Yang, S.H.; Jackson, H.A. Self-regulated learning in online learning environments: Strategies for remote learning. Inf. Learn. Sci. 2019, 121, 321–329. [Google Scholar] [CrossRef]
  36. Molenaar, I.; Horvers, A.; Baker, R.S. What can moment-by-moment learning curves tell about students’ self-regulated learning? Learn. Instr. 2021, 72, 101206. [Google Scholar] [CrossRef]
  37. Weinstein, C.E.; Husman, J.; Dierking, D.R. Self-regulation interventions with a focus on learning strategies. In Handbook of Self-Regulation; Boekaerts, M., Pintrich, P.R., Zeidner, M., Eds.; Academic Press: Cambridge, MA, USA, 2000; pp. 727–747. [Google Scholar]
Figure 1. The weekly overview of the online activities.
Figure 2. Examples of activity completion status and digital badges earned presented in the performance report.
Figure 3. Example of the performance report for students who completed all tasks in Studies 1 and 2.
Figure 4. Research procedure.
Figure 5. Trends of earning EB and WS badges.
Figure 6. Distribution of students who completed all weekly activities (i.e., completers).
Figure 7. Trends of earning EB and WS badges.
Figure 8. Distribution of students who completed all weekly activities (i.e., completers).
Table 1. Study design summary.

SRL Phase | Learning Activity | Moodle Module | Possible Reward
Forethought | 1. A weekly overview tells students what to learn and what to work on in the upcoming week, which enables them to plan their learning. | 1. Weekly overview: a resource page for students to view | EB badge: posted in the pre- or post-class discussion forum at least 1 day before the deadline
Forethought | 2. A pre-class activity helps students activate their prior knowledge and prepare for the new learning contents. | 2. Pre-class discussion forum activity: an asynchronous interaction tool |
Performance | 1. Group Wiki enables students to break down the final project into weekly tasks so that they can monitor their weekly progress. | 1. Group Wiki: use the Wiki as a sharing and collaboration space for group projects | WS badge: complete all weekly Moodle activities (listed in the weekly overview) for every 3 consecutive weeks to get one WS badge
Performance | 2. The progress bar displays the completion status of all assignments that students must complete. | 2. Progress bar: displays students’ progress |
Reflection | 1. Post-class activities enable students to reflect on what they have learned in class and to apply that knowledge to solving real-world problems. | 1. Tutorial forum activity: using questions as a tool to reflect on SRL practice |
Table 2. Indicators of online SRL-related behaviors.

Macro-Level Process | Micro-Level Process | SRL Activities | Indicators in Moodle
Forethought | Planning | Define plans and activate prior knowledge | Read weekly overview [24]; complete pre-class discussion forum [12]; number of EB badges earned
Performance | Working on tasks | Consistently engage with learning tasks and monitor progress | Number of WS badges earned [13]
Reflection | Evaluating and reflecting | Reinforce the knowledge learned in class and assess learning performance | Complete post-class discussion forum [25]
Reflection | Revising | Revise previous learning activities | Complete previously unfinished tasks [9]
Table 3. Grading criteria for online activities [29].

Level 0: Off-topic, no submission, or submission after the due date.
Level 1 (lower level): Mere repetition or simplistic arguments.
  • Repeating question statements without adding new information or interpretation
  • Making confusing or ambiguous statements
  • Assertions without evidence, or giving an example to provide a simple explanation
Level 2 (upper level): Serious attempts to analyze an argument or list competing arguments with evidence.
  • Serious argument, such as listing factors as evidence, exploring competing arguments or citing anecdotal evidence, but with logical flaws
  • Serious argument with at least 1 creative perspective (which can shed light on the issue or bring inspiration to other students) but with logical flaws
  • Serious argument but without diverse examples to support the arguments
Level 3 (uppermost level): Serious argument with a clear logical framework.
  • Clearly analyze the cases using information policy concepts, using diverse examples and reasons to support the arguments
  • Serious argument with two or more creative perspectives (which can shed light on the issue or bring inspiration to other students) presented in a logical manner
Table 4. Self-reported SRL test results.

SRL Subscale | Pretest M | Pretest SD | Pretest α | Posttest M | Posttest SD | Posttest α | t | p | d
Forethought | 33.31 | 5.95 | 0.85 | 35.85 | 6.42 | 0.92 | −2.09 | 0.047 * | 0.41
Performance | 33.19 | 5.57 | 0.81 | 34.69 | 7.06 | 0.94 | −1.06 | 0.299 | 0.23
Reflection | 27.35 | 4.74 | 0.80 | 30.93 | 5.78 | 0.91 | −2.73 | 0.011 * | 0.68
Overall | 93.84 | 14.42 | | 101.47 | 18.44 | | −2.20 | 0.037 * | 0.46
* p < 0.05.
Table 5. Comparison of engagement and learning performance between the strategic and nonstrategic groups.

Outcome Variable | Strategic Group n | Strategic Group Median | Nonstrategic Group n | Nonstrategic Group Median | U | p
Engagement
Days with access per week | 16 | 2.68 | 13 | 1.79 | 38 | 0.003 **
Resources accessed per week | 16 | 4.25 | 13 | 3.71 | 53.5 | 0.025 *
Learning performance
Activity scores | 16 | 33 | 13 | 26.5 | 22.5 | 0.000 ***
* p < 0.05; ** p < 0.01; *** p < 0.001.
Table 6. Self-reported SRL test results.

SRL Subscale | Pretest M | Pretest SD | Pretest α | Posttest M | Posttest SD | Posttest α | t | p | d
Forethought | 31.95 | 6.33 | 0.90 | 32.80 | 5.78 | 0.91 | −0.66 | 0.517 | 0.14
Performance | 32.44 | 6.05 | 0.88 | 34.45 | 6.26 | 0.94 | −1.40 | 0.175 | 0.33
Reflection | 27.24 | 5.83 | 0.93 | 29.59 | 4.93 | 0.84 | −2.19 | 0.04 | 0.44
Overall | 91.64 | 17.41 | | 96.84 | 15.70 | | −1.52 | 0.143 | 0.31
Table 7. Comparison of engagement and learning performance between the strategic and nonstrategic groups.

Outcome Variable | Strategic Group n | Strategic Group Median | Nonstrategic Group n | Nonstrategic Group Median | U | p
Engagement
Days with access per week | 32 | 2.43 | 16 | 2.04 | 159.5 | 0.035
Resources accessed per week | 32 | 4.33 | 16 | 3.97 | 199.5 | 0.216
Learning performance
Activity scores | 32 | 13 | 16 | 10 | 102.5 | 0.001
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
