Article

What Can Off- and Online Measures Tell about Students’ Self-Regulation and Their Achievement While Learning Science Expository Hypertext

1 Faculty of Computer and Information Science, University of Ljubljana, Večna Pot 113, SI-1000 Ljubljana, Slovenia
2 Faculty of Arts, University of Ljubljana, Aškerčeva 2, SI-1000 Ljubljana, Slovenia
3 Faculty of Natural Sciences and Engineering, University of Ljubljana, Snežniška 5, SI-1000 Ljubljana, Slovenia
4 Centre for the Examination of Cognition and Learning, Educational Research Institute, Gerbičeva 62, SI-1000 Ljubljana, Slovenia
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(9), 5686; https://doi.org/10.3390/su14095686
Submission received: 23 March 2022 / Revised: 1 May 2022 / Accepted: 3 May 2022 / Published: 8 May 2022

Abstract

Self-regulated learning (SRL) plays an important role in successful learning with hypertexts. The use of appropriate SRL strategies helps students acquire new knowledge more efficiently. We investigated the use of SRL strategies in individual learning from expository science hypertext, the correlations between different measures of self-regulation, and the differences in SRL between more and less successful students. A sample of 443 ninth graders from 15 different schools participated in the study. A variety of off- and online measures were used to measure SRL. Data were collected from student traces, questionnaires, test scores, and notes. Low correlations between the off- and online measures of SRL suggest that they measure different aspects of SRL use in learning digital science texts. Student achievement in science positively correlated with their use of SRL strategies. Students with higher knowledge gains reported greater use of deep cognitive strategies and higher motivation for learning, and used a greater number of strategies in note-taking while learning. The results of this study may have practical implications for teachers who want to support student SRL and for developers of digital learning materials who want to incorporate SRL scaffolding into learning topics.

1. Introduction

The COVID-19 pandemic has profoundly changed our lives, and the use of ICT in education has increased [1]. Students’ individual learning and their use of electronic sources and expository texts in e-environments have increased significantly [2]. Effective use of digital learning materials requires well-developed self-regulatory learning skills. Digital texts are usually presented in a non-linear format [3,4]; they contain images, animations, and various information search options. Therefore, compared to learning with printed materials, e-learning requires students to use more self-regulation during learning [5,6].
Although reading comprehension is higher when reading on paper than on screen [7], and student performance and reading times are higher when learning from linear text than from hypertext [8], developing students’ self-regulated learning (SRL) improves learning efficiency in both traditional [9,10] and e-learning environments [11,12]. Research on the relationship between student self-regulation and achievement found that high school students who had to shift their learning online during the COVID-19 pandemic performed better when they had higher competencies in coping with self-regulated learning and higher motivation to learn [13]. It was also found that high school students who were supported with teacher protocols for self-regulated learning had better learning outcomes [14]. Understanding how students regulate their learning with expository hypertexts and examining the differences in self-regulation between good and poor learners may be useful for developing interventions to improve students’ skills and learning efficiency. Thus, the present study examined self-regulated learning with hypertext and the resulting gains in performance.
Several authors [15,16,17] also emphasize the importance of multi-method measurement of self-regulated learning, both to avoid the shortcomings of self-reports and to take advantage of computer technology for the online measurement of SRL [18]. Nevertheless, multi-method research remains rare, so we used different off- and online measures in our study to examine self-regulation and its relations to students’ achievement.

Research Questions

Given the increasing use of hypertext for individual learning [2] and the greater emphasis on the need for SRL, we investigated the relationship between self-reported measures and learning traces (notes, data traces from log files) of cognitive, metacognitive, and motivational self-regulation in learning with expository science hypertext. We used a multi-method approach with measures that could be used in a large-scale study with whole school classes. We were also interested in how these two types of measures relate to grade-point average (GPA) in science, and in how students’ SRL varies with their prior knowledge of the topic they are learning and with their knowledge growth during learning.
Specifically, the following research questions were asked.
  • What are the correlations between various off- and online measures of self-regulation in individual learning of science hypertext?
  • What are the correlations between different off- and online measures of self-regulation and GPA in science?
  • How does the SRL behavior of low- and high-achieving students and of students with lower or higher prior knowledge differ in learning expository hypertext?

2. Theoretical Background

2.1. Self-Regulated Learning

Self-regulated learning (SRL) is an active process in which individuals set goals, monitor their learning process, and regulate it according to goals and contextual demands [19]. It involves a feedback loop that reduces the difference between the actual and target states. Most models of self-regulated learning [19,20,21,22,23] assume that the purposeful use of specific processes, strategies, or responses is directed toward improving academic performance. They also assume that SRL involves cognitive, metacognitive, and motivational/affective processes, knowledge about these processes, and strategies for carrying them out [24].
According to Zimmerman’s cyclical model [25], SRL is a dynamic process that includes activities before, during, and after learning. The relationships between the processes in the phases of SRL are causal and cyclical. Before learning, motivational aspects (such as interest, task value, learner self-efficacy) influence learners’ decisions to set learning goals and plan learning activities. During learning, learners control themselves, the tasks, and the environment: they monitor their learning, selectively direct their attention, and use different strategies to remember and solve learning tasks. After learning, they undergo self-evaluation and make attributions for success or failure that trigger various positive or negative emotions (e.g., satisfaction, pride, shame, fear) that influence self-regulation in further learning. The processes in each phase affect processes in the next phase. Cognitive, metacognitive, and motivational/affective processes are constantly interwoven.

2.2. Measurement of SRL

It is challenging to measure the (meta)cognitive and motivational/affective processes and their components in SRL. No existing measure alone can capture the full complexity of this dynamic process and its contents. In general, SRL measures can be divided into offline and online measures [26,27]. Offline measures, e.g., self-report questionnaires and interviews, attempt to capture self-regulation before or after the completion of the learning process. Such measures are subject to social desirability bias and students’ inability to access higher-order cognitive processes during learning due to memory failures, distortions, or interpretive reconstruction [27,28,29]. Online measures, on the other hand, attempt to capture self-regulation in real time while learning is in progress. Some of the online measures are unobtrusive and do not interfere with student learning [26], e.g., traces collected by computer software during learning (time spent on a particular page or content, clicks on hyperlinks, scrolling, and returns to a previous page). From these traces, inferences can be made about how students monitor and update their understanding or how much effort they put into learning [26]. In contrast, the think aloud method can reveal students’ cognitive, metacognitive, and motivational processes, but it can be difficult for students to use and is not appropriate for studies that involve entire classes. Students can also take notes as they learn. Notes are a rather intrusive measure, but reveal a great deal about students’ self-regulation, particularly the learning strategies they use. Offline measures can be general or task-specific, whereas online measures are always task-specific [16].
Studies of the convergent validity of different SRL measures show inconsistent results. Correlations between offline measures vary from low to high [30,31,32], with most results in the low to moderate range. Results are also mixed for online measures. Some authors report moderate to high correlations between different online measures, e.g., for strategies derived from writing journals [33] and from think aloud methods and systematic observations [34]. Research on correlations between off- and online measures also shows mixed results, ranging from low to moderate correlations between self-report questionnaires and notes, to no to moderate correlations between self-reports and the think aloud method [16], to high correlations between note-taking and think aloud protocols [35] and between self-reports and traces in the study material [36].
Several authors [15,16,17] emphasized the importance of measuring self-regulation using multi-methods in order to avoid the drawbacks of using single measures and to gain valid insight into students’ actual approach to learning. However, research using multiple methods is still scarce. Thus, in our study, we used a multi-method approach and task-specific on- and offline measures of SRL that can be used in research in a school context with whole classes. We used self-reports as offline measures and student notes and log files as online measures.

2.3. Research on Cognitive, Metacognitive, and Motivational/Affective Processes in SRL

SRL involves cognitive, metacognitive, and motivational/affective processes [19,20,23]. Cognitive strategies such as rehearsal, elaboration, and organization [37] are used in processing the content and encoding the information in long-term memory for later retrieval. Rehearsing strategies include repeating information without transforming it (e.g., verbatim repetition, copying, or underlining text). Elaboration strategies include integrating different parts of the material (e.g., linking earlier information to current information, locating key concepts, summarizing, paraphrasing content). Organization strategies are used to find relationships among important concepts in the material (e.g., connecting concepts, grouping, ordering, hierarchizing). Dent and Koenka [28] found low to moderate correlations between offline measures of cognitive strategies and learning performance (average r = 0.10) and moderate to large correlations for online measures (average r = 0.30); they found higher correlations for science and social studies than for math and languages. Correlations of performance with rehearsing strategies were lower than with elaboration or organization strategies [28,38].
In the present study, students’ self-reports of strategies used in learning were used as offline measures. We also analyzed students’ note-taking during learning as an online indicator of their cognitive strategies. During note-taking, students use surface strategies (e.g., copying parts of the content) or deep strategies (elaboration and organization) to reconstruct the learning material [39]. Kobayashi [40] found that note-taking resulted in better academic performance than listening or reading (d = 0.75) and mental reviewing (d = 0.77).
Self-regulation is not possible without reflection on one’s own cognitive processes. Metacognitive processes, as an essential component of SRL, include thinking about the task, one’s learning goals and abilities to accomplish the task, and the strategies used in learning. SRL models [19,20,23] emphasize four groups of metacognitive processes related to learning performance, namely, planning, self-monitoring, self-control, and self-evaluation. Offline measures of these processes have negligible to moderate correlations (r = 0.36) with performance, whereas they are much higher for online measures (r = 0.45–0.90) [32]. The highest correlations with performance were found for planning [28], monitoring [41,42,43], self-checking [28], evaluation, and reflection [42], and the lowest for self-control [43]. In the present study, students’ self-reports of their use of metacognitive strategies served as offline measures, and information from log files (e.g., data on returning to different pages as an indicator of monitoring and control, clicking hyperlinks to check terms in the glossary as an indicator of control) served as online measures.
Knowledge of tasks and strategies is important but not sufficient to successfully complete the learning process and gain deep understanding and higher achievement. Students have to actually use the strategies, which depends on their motivation. Lin, Zhang, and Zheng [44] found that highly motivated students were more likely to use a variety of learning strategies in online courses and that higher levels of strategy use positively correlated with higher learning achievement. Jackson [45] found the highest correlations between course grades in chemistry, mathematics, and physics and self-efficacy, task value, effort regulation, and time and study management. In Hattie and Donoghue’s summary of SRL interventions [42], the greatest impact on student achievement was found for self-efficacy (d = 0.90), followed by increasing task value (d = 0.46) and improving attitudes toward the content (d = 0.35). In the present study, we assessed students’ motivation by asking them about their interest in the topic at the beginning and immediately after learning. At the end of learning, we also asked them about their effort during learning and the motivational strategies they used. We used the information about the time they spent on the learning task as an online indicator of student effort [26].
Studies also examined the impact of students’ prior knowledge on the use of learning strategies and performance. Ghiasvand [46] found that higher performing students used more cognitive and metacognitive strategies and that metacognitive strategies measured offline were a better predictor of their performance than cognitive strategies. Bråten and Samuelstuen [47] found no difference between students with low and high prior knowledge in reported strategies (memorizing, elaborating, organizing, monitoring) used while reading for tests. Taub and colleagues [48] found differences in the use of metacognitive strategies but not in the use of cognitive strategies. Paans and colleagues [49] found that students’ prior knowledge as well as their motivation was a positive predictor of their performance, but their online navigation strategies did not mediate this relationship. Given the contradictory results, the present study also sought to find out the differences in SRL between students with higher and lower prior knowledge and between students with higher and lower achievement gains.

3. Materials and Methods

3.1. Participants

A sample of 443 ninth graders (224 girls and 219 boys) from 25 classes in 15 schools in Slovenia participated in the study. Their mean age was 14.38 years (SD = 0.40). Students who did not complete the requested questionnaires and students with special educational needs were excluded from the study. The final sample consisted of 364 students (197 girls and 167 boys). Their mean age was 14.35 years (SD = 0.37).

3.2. Learning Unit

The learning unit Eyes and Color Perception was taken from the accredited Slovenian digital textbook Chemistry in Life, used as an elective subject for ninth grade [50], and slightly redesigned. The topic included the structure of the human eye, the process of visual perception, and the role of rods and cones in the retina. The importance of vitamin A, β-carotene, and retinal was explained, and their chemical structure formulas and the chemical processes involved were shown. Light and photons and the concept of mimicry in animals were explained as well.
The hypertext consisted of 1073 words and a further 226 words in the glossary, which was available for additional explanations of the main concepts presented (e.g., electromagnetic waves, photons). The unit was divided into six chapters that included text, 20 pictures and photos, 5 schemes of chemical structures, and 1 video clip (on the water solubility of β-carotene). Students worked through the unit step by step, clicking on one chapter at a time. When they were finished with all the chapters, they could review the content of the unit. Figure 1 shows part of the learning unit content (the sixth chapter) displayed within the e-learning environment. The titles of all six chapters are also shown.

3.3. E-Learning Environment

A dedicated e-learning environment, developed for the purposes of this research (Figure 1), allowed students’ activities to be tracked as they navigated through the hypertext. Interactions with the system were recorded, such as clicking on the glossary link to get an explanation of certain terms, or operating the controls to play the video clip. All recorded events were time-stamped, as were the beginning and end of the session. The e-learning environment also included self-report questionnaires and a posttest on the topic.
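For illustration only, the sketch below shows how such time-stamped interaction events might be represented and recorded; the event names and fields are hypothetical and are not taken from the actual environment.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LogEvent:
    """One tracked interaction (hypothetical schema)."""
    student_id: str      # anonymized participant ID
    event_type: str      # e.g., "chapter_opened", "glossary_click", "video_play"
    target: str          # e.g., chapter title, glossary term, video file
    timestamp: datetime  # server-side time stamp

@dataclass
class SessionLog:
    """Collects all events of one learning session, including session start and end."""
    events: List[LogEvent] = field(default_factory=list)

    def record(self, student_id: str, event_type: str, target: str = "") -> None:
        self.events.append(
            LogEvent(student_id, event_type, target, datetime.now(timezone.utc))
        )

# Example: a student starts a session, opens a chapter, and looks up a glossary term.
log = SessionLog()
log.record("S042", "session_start")
log.record("S042", "chapter_opened", "chapter-6")
log.record("S042", "glossary_click", "photon")
```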

3.4. Measures

Data were gathered from three different data sources: log files, self-reports of motivation and strategy use, and analysis of students’ notes. We also measured their knowledge of the selected topic before and after learning.
Test of Color Perception: The knowledge test contained five open-ended questions (e.g., “Explain the importance of β-carotene to human health”.) and eight multiple-choice questions with five response options, the last of which was always “I don’t know”. Students’ answers to the open-ended questions were scored on a three-point scale: no answer or incorrect answer (0), partially correct answer (1), and correct answer (2). Answers to the multiple-choice questions were scored as correct (1 point) or incorrect (0 points). Two independent examiners scored all open-ended responses and resolved any discrepancies. Before learning, students completed this test in paper-and-pencil form, while after learning they completed it in the e-learning environment. On average, students scored 2.82 points (SD = 2.09) on the pretest and 8.43 points (SD = 3.45) on the posttest. The Cronbach’s α coefficient of internal consistency for the posttest was 0.75.
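For reference, the reported internal-consistency coefficient is conventionally computed with Cronbach’s standard formula, where k is the number of items, σ²ᵢ the variance of item i, and σ²_X the variance of the total test score:

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{i}}{\sigma^{2}_{X}}\right)
```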
Learning Strategies Checklist: We developed this list based on the self-regulation models of Pintrich [19] and Zimmerman [25]. Five experts in educational psychology compiled a list of 33 cognitive, metacognitive, and motivational strategies related to the phases before, during, and after learning and independently classified them into seven categories (disagreements about classification were resolved through discussion until consensus was reached): surface cognitive (1 item, “I quietly repeated the learning content while learning”.); deep cognitive (organization and elaboration) (9 items, e.g., “While learning, I thought about how I could apply the learning content in life”.); metacognitive (MC) planning (4 items, e.g., “I thought about what I wanted to know about this topic before I started learning”.); MC monitoring (5 items, e.g., “I monitored whether I paid attention to the topic during learning”.); MC regulation (8 items, e.g., “I looked up new and unfamiliar words in the glossary”.); MC evaluation (3 items, e.g., “After learning, I asked myself if I knew the content well”.); and motivational strategies (3 items, e.g., “I encouraged myself during learning to continue until the end”.).
Motivation to Learn: Before students started learning, they rated on a 7-point scale (1—not at all, 7—very much) how much they would like to learn about color perception. After they finished learning, they rated on the same scale how much they would like to learn about color perception in the future. They also indicated how engaged they were in the content compared to their usual learning behavior, using a 3-point scale (1—less, 2—same, 3—more engaged). When designing scales, we carefully considered the characteristics that could affect the data quality (e.g., scale’s length, clarity of questions, reference points).
Data Traces from Log Files: We measured (i) the total time spent viewing the content from the beginning to the end of the unit, (ii) the total time spent reviewing the content before the knowledge (post)test, (iii) the time spent watching the video, and (iv) the number of clicks on links in the text that connected concepts to their explanation in the glossary. Time was measured in seconds.
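A minimal sketch of how these four measures could be derived from time-stamped log events is given below; the event-type names ("session_start", "review_start", "posttest_start", "video_play", "video_stop", "glossary_click") and the log format are assumptions for illustration, not the actual logging schema.

```python
from datetime import datetime
from typing import Dict, List

# Each event is assumed to be a dict like
# {"type": "glossary_click", "timestamp": datetime(...)}; the type names are illustrative.
def trace_measures(events: List[dict]) -> Dict[str, float]:
    """Derive the four data-trace measures for one student (times in seconds)."""
    first = {}  # first occurrence of each event type
    for e in events:
        first.setdefault(e["type"], e["timestamp"])

    time_first_pass = (first["review_start"] - first["session_start"]).total_seconds()
    time_review = (first["posttest_start"] - first["review_start"]).total_seconds()

    plays = [e["timestamp"] for e in events if e["type"] == "video_play"]
    stops = [e["timestamp"] for e in events if e["type"] == "video_stop"]
    time_video = sum((stop - start).total_seconds() for start, stop in zip(plays, stops))

    glossary_clicks = sum(1 for e in events if e["type"] == "glossary_click")

    return {
        "time_first_pass": time_first_pass,
        "time_review": time_review,
        "time_video": time_video,
        "glossary_clicks": glossary_clicks,
    }
```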
Cognitive Learning Strategies Revealed by Students’ Notes: The notes were classified into six categories by two coders according to a coding scheme based on Weinstein and Mayer’s cognitive strategy model [37]: (i) basic rehearsal strategies (writing down single concepts/words, “telegraphic” input); (ii) complex rehearsal strategies (word-by-word transcription of one or more sentences); (iii) basic elaboration strategies (forming a sentence linking two or more different concepts, associations, mnemonics, answering questions); (iv) complex elaboration strategies (paraphrasing, summarizing, using analogies, turning words into drawings); (v) basic organization strategies (grouping information based on common features but without hierarchy); and (vi) complex organization strategies (grouping information based on common features in a hierarchy or sequence of events). Each strategy switch (change from one strategy to another) was coded as a separate occurrence of a strategy. Two coders independently coded one third of the notes. The initial agreement between the coders was 90.6% (Krippendorff’s α = 0.87). Although this agreement was already high, the coders discussed the discrepancies and made additional adjustments to their coding criteria. After each coder coded the notes according to the agreed-upon criteria, the level of agreement was 98.5% (Krippendorff’s α = 0.98).
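Agreement coefficients of this kind can be computed, for example, with the open-source krippendorff Python package; the sketch below uses invented codes purely to illustrate the calculation and is not necessarily the tool the authors used.

```python
# pip install krippendorff numpy
import numpy as np
import krippendorff

# Rows = coders, columns = coded note segments; values are nominal strategy codes
# 1-6 (basic/complex rehearsal, elaboration, organization); np.nan = segment not coded.
# The codes below are made-up example data.
reliability_data = np.array([
    [2, 2, 4, 6, 1, 3, np.nan, 5],  # coder 1
    [2, 2, 4, 5, 1, 3, 2,      5],  # coder 2
])

alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha = {alpha:.2f}")
```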
Grade Point Average (GPA) for Science: The topic of color perception is related to chemistry, biology, and physics; thus, the science GPA was computed as the average of students’ self-reported final grades in these three subjects, which range from 1 (unsatisfactory) to 5 (excellent).

3.5. Data Collection

After the ethics committee confirmed that the study met all ethical criteria and standards, informed consent was obtained from students and their parents. All participants with informed consent were given a unique ID, and data were collected under that ID. The students’ teachers managed the assignment of IDs. After data collection in the schools was completed, the information linking the participating students’ IDs to their names was destroyed. Thus, the names of the participants were never shared with the researchers and were not part of the data collected.
A detailed protocol for conducting the study was developed to ensure equal conditions in all field sessions and comparable results. During each field session, the agreed-upon procedure was strictly followed and documented.
Data collection was conducted during regular classes in two sessions. In the first session (September and October 2019), demographic data and pretest scores were collected. In the second session two months later (in November and December 2019), students learned about color perception from the unit in the online learning environment, completed the posttest, and assessed their motivation and strategy use. Laptops with the same configuration, HD resolution, external mouse, and connection to the Internet were used. The unit was displayed in a maximized window of the Chrome browser. Students’ interaction with the system was recorded during learning and continuously stored in the database on the server. Students were also allowed to take notes while learning.
Students were given verbal instructions and then tried to learn as much as possible independently (there was no time limit, but the content of the unit was planned for about 40 min of work). During the session, they could obtain technical support as well as additional explanations of procedures, but there was no help on the topic they had to learn.
Collected data used in this study can be found in the Supplementary Material.

3.6. Data Analysis

Descriptive statistics of the different measures and of science performance are shown in Table A1. Histograms and the Shapiro–Wilk test indicated a non-normal distribution for all variables (p < 0.001). Spearman correlation coefficients were calculated between the different off- and online measures and with science GPA. To avoid alpha accumulation due to multiple testing, the Šidák correction at a 0.024% alpha error rate was used to determine significant correlations.
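As an illustrative sketch of this step (not the authors’ actual analysis script), the Spearman matrix and the Šidák-adjusted per-test threshold can be computed as follows; with 21 variables there are 210 pairwise tests, which reproduces the roughly 0.024% threshold mentioned above.

```python
# pip install numpy pandas scipy
import numpy as np
import pandas as pd
from scipy import stats

def sidak_alpha(n_tests: int, family_alpha: float = 0.05) -> float:
    """Per-test alpha level under the Sidak correction."""
    return 1.0 - (1.0 - family_alpha) ** (1.0 / n_tests)

def spearman_with_sidak(df: pd.DataFrame, family_alpha: float = 0.05):
    """Spearman correlations for all variable pairs plus Sidak-corrected significance flags."""
    n_vars = df.shape[1]
    n_tests = n_vars * (n_vars - 1) // 2
    alpha = sidak_alpha(n_tests, family_alpha)
    rho, p = stats.spearmanr(df, nan_policy="omit")
    rho = pd.DataFrame(rho, index=df.columns, columns=df.columns)
    significant = pd.DataFrame(p < alpha, index=df.columns, columns=df.columns)
    return rho, significant, alpha

# Example with simulated data: 364 students, 21 variables.
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.random((364, 21)), columns=[f"v{i}" for i in range(1, 22)])
rho, sig, alpha = spearman_with_sidak(data)
print(f"Per-test alpha: {alpha:.5f}")  # about 0.00024, i.e., 0.024%
```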
Then, students’ knowledge gain was calculated as the difference between the posttest and pretest scores. The sample was divided into groups with low and with some pretest knowledge based on the median of the pretest scores (Mdn = 3). A further split was made into low and high gain groups based on the median gain (Mdn = 6). The combination of the two splits resulted in four groups of students: (i) Group A (low prior knowledge, low gain) with 89 students (40 girls and 49 boys, mean age 14.41 years, SD = 0.41); (ii) Group B (low prior knowledge, high gain) with 91 students (51 girls and 40 boys, mean age 14.40 years, SD = 0.37); (iii) Group C (some prior knowledge, low gain) with 92 students (48 girls and 44 boys, mean age 14.29 years, SD = 0.36); and (iv) Group D (some prior knowledge, high gain) with 92 students (58 girls and 34 boys, mean age 14.29 years, SD = 0.34).
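The 2 × 2 grouping described above amounts to two median splits; a minimal sketch is shown below (the column names are hypothetical, and whether the original split used > or ≥ the median is not stated, so > is assumed).

```python
import pandas as pd

def assign_groups(scores: pd.DataFrame) -> pd.Series:
    """Assign students to groups A-D by median splits on pretest score and knowledge gain.

    Expects columns 'pretest' and 'posttest' (hypothetical names).
    """
    gain = scores["posttest"] - scores["pretest"]
    some_prior = scores["pretest"] > scores["pretest"].median()  # Mdn = 3 in the study
    high_gain = gain > gain.median()                             # Mdn = 6 in the study
    labels = {
        (False, False): "A (low prior knowledge, low gain)",
        (False, True):  "B (low prior knowledge, high gain)",
        (True, False):  "C (some prior knowledge, low gain)",
        (True, True):   "D (some prior knowledge, high gain)",
    }
    return pd.Series(
        [labels[(sp, hg)] for sp, hg in zip(some_prior, high_gain)],
        index=scores.index, name="group",
    )
```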
The frequency distributions for most variables in the four groups were non-normal, and the homogeneity of variances was not confirmed by Levene’s test for several of the variables. Therefore, we used the multRM and RM functions from the MANOVA.RM package [51] for a multivariate semi-parametric ANOVA with test statistics that are robust to non-normal error terms and unequal variances. p-values were assessed using the random permutation method on all data. In the multiple post hoc univariate ANOVAs, the effects were tested with the Šidák correction at a 0.24% alpha error rate.

4. Results

Table 1 shows students’ learning traces and self-reports on the cognitive, metacognitive, and motivational strategies they used during self-regulated learning with expository science hypertext. Students’ self-reports of SRL strategy use indicated that the majority of students (94%) used deep cognitive strategies. More than half (55%) also reported using surface cognitive strategies, while almost all students reported using metacognitive strategies: 79% used planning, 89% monitoring, 93% regulating, and 72% evaluating their learning process. However, many students (70%) did not use any motivational strategy. Before learning, students’ motivation to learn the topic was relatively high: 35% of them were eager to learn about color perception and 43% were neutral. After learning, motivation was lower: only 30% of the students expressed interest in learning more about color perception in the future, 34% were neutral, and the proportion of unmotivated students increased to 36%. Student engagement in learning was low: only 10% said they were more engaged compared to usual learning, while 45% were equally engaged, and 45% were less engaged. On average, students took 16.9 min to complete the unit, and only 1.8 min to go through it again before moving on to the knowledge test. On average, they opened the glossary only 2.2 times, while the unit provided seven links to five different terms in the glossary.
Of the 364 students, 282 (77%) wrote notes while learning. In their notes, they used about four deep cognitive strategies on average, but there was a large variability present in the sample. Students most frequently used complex rehearsal strategies (i.e., they rewrote entire sentences from the hypertext). This could be due to the novelty of the topic, as indicated by the very low scores on the pretest. Deep learning strategies (complex elaboration, basic and complex organization used to link and reconstruct incoming information) were also frequently used in note-taking. The strategies of basic rehearsal and basic elaboration were rarely used.

4.1. Correlations between Different On- and Offline Measures of Self-Regulation

Means, standard deviations, and Spearman correlation coefficients for all variables examined are shown in Table 1. We defined variables 1–10 as offline measures and variables 11–20 as online measures of SRL.
Most correlations between self-reported offline strategies (variables 1–7 in Table 1) were significantly positive and low to moderate, but correlations of the number of metacognitive regulation with the number of deep strategies and metacognitive monitoring strategies used during learning were high. Thus, some students reported using more learning strategies of different types on average than others. Although these correlations suggest that the former are simply better learners, they may also indicate self-report bias. The correlations between motivation at the beginning and end of learning were high, while the correlations between ratings of engagement during learning and motivation at the end of learning were low (see correlations between variables 8–10 in Table 1). Moreover, the correlations between ratings of engagement during learning and motivation at the beginning were low and non-significant. This is somewhat surprising, as one would expect students who are more motivated to report higher levels of engagement. Low correlations could again indicate a bias in reporting their own motivation to learn, or the fact that their learning outcomes in the study were not assessed and graded as they are in school.
As for the correlations between the different online measures used, only one statistically significant but small positive correlation was found between the online data traces. The time spent completing the learning correlated with the number of times the glossary was opened. Correlations between online measures of cognitive strategies used in the notes varied from low to high. The highest positive correlation was found between the use of complex rehearsal and complex elaboration strategies. In general, students whose notes contained more elaboration strategies were also more likely to use organization strategies. Students who spent more time taking notes also took more time to complete learning. Time spent learning correlated with more extensive use of all types of note-taking strategies, except for the use of basic rehearsal and basic elaboration strategies.
Correlations between offline measures of self-regulation strategies and motivation (variables 1–10) and online traces (variables 11–14) were predominantly low (Table 1). In general, we found low positive correlations between time to completion and deep cognitive and metacognitive planning and regulation strategies used during learning, as well as with student motivation at the beginning and end of learning, suggesting that the more different strategies students used and the higher their motivation to learn, the longer it took them to complete the learning task. Motivational strategies measured offline showed low correlations with the time spent watching the video, while metacognitive monitoring and regulation strategies correlated with the frequency of looking up the glossary. Some correlations between strategies measured offline (variables 1–7) and online with notes (variables 15–20) were low positive. Self-reported deep strategies correlated with all strategies that emerged from the notes except basic rehearsal and basic elaboration, while metacognitive regulation correlated with basic and complex organization.
In summary, the correlations between the SRL offline measures were mostly moderate, but varied from low to high. The correlations between online measures also varied from low to high. The correlations between the offline and online measures were rare and predominantly low positive.

4.2. Correlations between Different On- and Offline Measures of Self-Regulation and Achievement

We found low positive statistically significant correlations between GPA in science and one offline and some online measures of SRL (Table 1). Compared to students with a lower GPA, students with a higher GPA reported using a slightly greater number of deep cognitive learning strategies, which was reflected in their actual note-taking, where they used more basic and complex organization strategies. They also used the glossary more frequently.

4.3. Differences in SRL Behavior between Higher- and Lower-Performing Students

Next, we wanted to examine the relationship between offline and online measures of SRL with students’ prior knowledge of the topic they were learning and their knowledge growth during learning. Offline and online measures of SRL were compared in four groups of students defined by their prior knowledge (low vs. some) and learning gains (low vs. high). Table 2 shows the comparison of the different measures in the four groups.
The students in Group D who had some prior knowledge of the topic and gained a lot from the learning also had the highest GPAs in science. Of all the groups, this group showed the highest use of several SRL strategies (surface and deep cognitive strategies, monitoring, and regulating), their initial motivation and engagement were the highest, they devoted the longest amount of time to learning, and their use of different note-taking strategies was (among) the highest. In contrast, Group A (low prior knowledge, low gain) had the lowest science GPA and scores on many off- and online SRL measures. This group also reported the most frequent use of motivational strategies. It is also worth noting that Group C (some prior knowledge, low gain) used monitoring and evaluation the least of all groups. For all measures examined, the results for Group B (low prior knowledge, high gain) were somewhere in the middle of the results for the other groups.
A multivariate semi-parametric analysis using a Wald-type statistic (WTS) showed a statistically significant effect of both prior knowledge, WTS(21) = 84.35, p < 0.001 (resampled p < 0.001), and gain, WTS(21) = 108.90, p < 0.001 (resampled p < 0.001), whereas the interaction between the two was not statistically significant, WTS(21) = 30.79, p = 0.077 (resampled p = 0.119). The results of post hoc semi-parametric ANOVAs are shown in the rightmost columns of Table 2. Both prior knowledge and knowledge gain were associated with a higher GPA in science. In contrast to prior knowledge that did not show much effect on the studied SRL measures, knowledge gain was related to several offline and online measures of SRL. Compared to lower knowledge gain, higher gain was associated with more frequent use of deep cognitive strategies, higher motivation at the beginning of learning, higher engagement in learning, more time spent with the unit, and more frequent use of the complex cognitive strategies (rehearsing, elaboration, and organization) in note-taking.

5. Discussion

In the present study, we investigated students’ SRL when learning with science texts. To measure cognitive, metacognitive, and motivational SRL processes, we used a combination of off- and online measures.

5.1. The Relationship between Offline and Online Measures of SRL

The first aim of our study was to examine the relationship between different offline and online measures of SRL. We found that correlations between measures of the same type were generally higher than correlations between different types of SRL measures. While the correlations between different offline measures as well as the correlations between online measures varied from low to high, the correlations between offline and online measures were mostly low. This means that self-reports do not accurately reflect student behavior during the learning process. Low correlations between measures of the same type may also indicate that we have measured constructs that are heterogeneous and covary only to a limited extent. In addition, some of the offline measures were discrete single-indicator variables, which may have resulted in lower correlations.
Correlations among Offline Measures of SRL. The finding that correlations between offline measures (i.e., self-report questionnaire on self-regulation strategies and motivation to learn) were mostly low to moderate is consistent with existing findings for offline cognitive [31] and metacognitive strategies [32]. The high correlation (r = 0.57) between metacognitive monitoring and metacognitive regulation, both of which are activated in the second phase of the SRL process [23,25], supports the notion that metacognitive regulation cannot occur without careful monitoring of the level of understanding of the learning material. Correlations between offline motivation variables and other (meta)cognitive dimensions except motivational engagement were predominantly low, similar to several other studies [30,31,52].
In our study, students reported that they used deep cognitive strategies to some extent, including elaboration and organization of information in the text, and that they consciously monitored their learning process through the use of metacognitive strategies such as planning, monitoring, regulating, and evaluating. However, they reported that they rarely used motivational strategies. Since they reported high motivation to learn about visual perception both at the beginning and at the end of learning (even though their ratings decreased slightly), it could be concluded that the topic was interesting to them. On the other hand, only about three-quarters of the students took notes, and on average, they did not spend much time reviewing the unit before they started the knowledge test and did not check the meaning of all the new terms in the glossary. They indicated that their overall engagement during learning was, on average, lower than during usual learning in a school context. It is possible that in the classroom, teachers take on the role of motivators, which is why motivational strategies are rarely used by the students themselves (and were also not used in the present study). Since the learning task in our study was not part of the regular curriculum and was not assessed or graded, students may also have perceived it as less valuable. It has previously been found that the value of the task is related to students’ effort regulation [30,53].
Correlations between Online Measures of SRL. The negligible to small positive correlations between the log data suggest that these measures relate to different aspects of students’ SRL. Time spent studying and watching videos could be an indicator of students’ learning effort, while time spent opening glossaries and reviewing material could be an indicator of students’ metacognitive control [26]. On the other hand, the strategies students use to take notes are indicators of different but related cognitive processes [37]. This is reflected in the low to high correlations found between most of the cognitive strategies in the notes. Moderate to high correlations between time spent learning and the use of surface and deep learning strategies in the notes are consistent with research that has found that correlations between online measures of SRL were higher than between offline measures [28], and point to the need to use more online methods in SRL research.
Students who were more motivated to learn took longer to complete the learning process. Their notes were more detailed and contained more strategies of different types. The time spent on learning can therefore also be a relatively good indicator of the motivation to learn. It can provide an online measure of students’ efforts [26].
Correlations between Off- and Online Measures of SRL. Correlations between off- and online measures were predominantly low in our study. Rogiers and colleagues [16] found low to moderate correlations between self-reports and taking notes, but in a study by Bråten and Samuelstuen [36] these correlations were high. The lower correlations found in our study could be due to differences in the presentation of the learning tasks. Both previous studies used expository paper texts, whereas our study used hypertext. Research has already shown that students learn differently when learning from hypertexts than from paper texts [7,8]. Correlations may also depend on the topic studied. We used a text about visual perception, whereas Rogiers et al. [16] used a text about the production of chewing gum and Bråten and Samuelstuen [36] used a text about aspects of socialization in different cultures. Our text was also longer than the other two texts.

5.2. The Relationship between SRL Measures and Students’ Achievement

The second aim of our study was to examine the relationships between SRL measures and student performance. First, we examined the relationship between SRL measures and GPA in science. Most off- and online SRL measures were positively related to student achievement, but correlations were low and mostly non-significant. Although previous studies [28,32,43,44,45] have shown that higher use of self-regulation in learning is generally associated with higher achievement, we found no statistically significant correlations except for self-reported deep cognitive strategies and organization strategies revealed in the notes. This may be because students’ GPAs in science were relatively high (median 4 out of 5) and the variability in scores was low, which may then be reflected in the non-significant or low correlations.
Several other studies reported low to moderate correlations between GPA and offline measures of SRL and performance [30,31,52]. There are studies that have found slightly higher correlations of performance with online SRL measures, e.g., the correlation 0.30 with both cognitive and metacognitive strategies [28] or the correlations 0.50 for cognitive and 0.62 for metacognitive strategies [32]. The lower correlations found in our study can be attributed to the specific task we used. For example, Veenman and colleagues [32] used a discovery learning task (modified Otter task) that required students to experiment with different variables and hypothesize about the results. Thus, they likely needed many more metacognitive skills to experiment and draw conclusions than in our study, which required students to compare, elaborate, and rehearse information in an expository science text in order to remember as much as possible. The results of the different studies may also differ slightly due to the different online measurement methods. Bråten and Samuelstuen [36] used the traces students left in their notes and assessed the quality and quantity of their organization and underlining/highlighting in the text. However, the correlations of PISA literacy results with organization (0.35 for quality and 0.29 for quantity) were quite similar to ours (0.30 for basic organization and 0.29 for complex organization). In their study, the correlations between PISA results and the quality and quantity of underlining/highlighting were low and non-significant (−0.02 and −0.03), similar to ours for basic and complex elaboration (0.14 and 0.15), although our category of elaboration strategies was broader than theirs.

5.3. The Differences in the SRL of Students with Different Prior Knowledge and Learning Gains

In our study, we also examined the differences in SRL measures in relation to two dimensions—students’ prior knowledge and their gain in knowledge after learning from science hypertext.
Our results showed no significant differences in offline and online measures in relation to students’ prior knowledge. This contradicts other studies that showed a higher use of metacognitive monitoring strategies [47,48] and judgement of learning [48] among students with higher prior knowledge. A clearer picture of the effects of prior knowledge on the use of SRL strategies would likely be observed if the groups with low and some prior knowledge differed more on this variable (the average difference between the groups with low and with some prior knowledge in our study was only 3.23 points out of a total of 18 points).
The results related to achievement gain revealed that, in general, students with higher achievement gains on the posttest outperformed students with lower gains in SRL use. Differences were found in several offline and online measures. The effect size for time spent in learning, complex elaboration, and complex organization was above 0.039, which corresponds to a Cohen’s d of more than 0.40. This is an effect size that according to Hattie [9] can be considered as having significant practical implications for school learning. The effect of the knowledge gain was generally larger for the online measures than for the offline measures, supporting the claim that online measures have a higher predictive power for learning outcomes [28,32].
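For readers who want to verify this conversion: assuming the effect size reported here is a (partial) eta squared, a common conversion to Cohen’s d is

```latex
d = 2\sqrt{\frac{\eta^{2}}{1-\eta^{2}}},
\qquad
d = 2\sqrt{\frac{0.039}{1-0.039}} \approx 0.40 .
```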
In our study, students who gained more from learning used more complex cognitive strategies when taking notes than students with lower gains. They wrote down more complex passages of text, but more importantly, they integrated different parts of the material—summarizing the new information and applying it to real-life situations (e.g., writing where knowledge about β-carotene can be used in everyday life); comparing different pieces of information (e.g., comparing the molecules of β-carotene, vitamin A, and retinal); and organizing the information (e.g., outlining the stages of visual perception)—and thus added the new information to the existing knowledge in meaningful ways. Use of these strategies led to deep learning [39] and better outcomes.
In contrast to previous research on metacognitive strategies and achievement [19,20,23], in our study students with higher knowledge gains did not report using significantly more metacognitive strategies than students with lower gains. As mentioned earlier, self-assessment of metacognitive strategies is not necessarily a valid indicator of self-regulation. Other studies have also found that the association with performance is higher for online measures than for offline measures of SRL [32].
Among the motivational dimensions of SRL, students’ motivation at the beginning of learning, their engagement, and especially time spent on learning distinguished between low and high knowledge gain students. These results once again demonstrated the strong interconnectedness of cognitive and motivational aspects of self-regulation in learning [19,20]. It is not enough to know the strategies; to gain something from learning, students also need to use them, and this depends on their motivation, especially its behavioral aspects such as active engagement and time devoted to learning.

5.4. Limitations and Suggestions for Future Research

Our study has several limitations. First, we used an expository hypertext from the field of applied science, so it is difficult to generalize the results to other types of texts. The results are also likely limited to the population of ninth graders.
Second, in order to achieve high ecological validity while controlling the learning conditions and computer environment, the learning process was conducted in schools. However, since the learning took place during regular school hours and students’ participation in the study was voluntary, the situation for them was different from individually studying for a test. They did not spend much time on the topic, so we believe that on average they did not take the learning very seriously. This could also be reflected in their use of SRL strategies. Future studies should consider using learning for a grade to increase student motivation.
Third, to measure the strategies, we used a self-report checklist that has been validated by several researchers in the field of self-regulation. However, it would be advisable to further improve the instrument (e.g., by adding more items to measure basic rehearsal). Student motivation and affective processes during learning could be captured by adding more offline items or by using online measures that record, for example, flow during learning, such as physiological measures (skin conductance and temperature, heart rate variability, respiratory rate) or students’ emotional facial expressions. Future studies should also consider online measures of metacognitive strategies, e.g., the think aloud method or the experience sampling method. For effective use of such methods, however, students would need to be intensively trained to provide detailed descriptions of their thoughts and sensations during learning, and rich descriptions might be difficult to obtain from students with lower achievement and (meta)cognitive abilities, probably limiting the use of such methods to specific groups of students. A task developed by Ristić-Dedić [54], in which students need to explain to their peers what needs to be done to learn a selected topic successfully, may be promising; however, this method is an offline measure of metacognitive strategies. At the moment, neurophysiological methods, which might be used as online measures in the future, are still too underdeveloped to provide insight into students’ metacognitive processes during learning.

6. Conclusions

In the digital age, individual learning with hypertext is becoming more common, and students’ academic performance increasingly depends on their ability to self-regulate. The present study contributes to the knowledge of self-regulation when students learn with expository hypertexts. The multi-method approach we used is still rare, especially with expository hypertexts as opposed to problem-based learning with technology, and this can be considered a significant contribution to SRL research. The larger number of significant correlations of online than of offline measures with student achievement, as well as the larger and more frequently significant differences between knowledge-gain groups on online than on offline measures, confirm the advantages of including online measures for valid measurement of SRL, especially combining online data from log files and student notes. In future studies, an e-environment that allows online note-taking could make analysis more convenient for researchers and teachers, who could obtain information about students’ SRL and use it for interventions to improve it. Further validity studies on both types of measures are also needed in the future.
Our results suggest the importance of cognitive, metacognitive, and motivational processes for better performance. Students with higher achievement gains reported and used more deep cognitive strategies (complex rehearsal, elaboration, and organization), and their motivation was reflected in greater time spent learning, their higher initial motivation, and their engagement during learning. Deep cognitive strategies and motivation were more important than their initial knowledge of the topic. These findings have several practical implications for classroom practice. First, they suggest that teachers should help students learn metacognitive and deep cognitive strategies that promote higher levels of knowledge. An important part of promoting SRL is showing students how the effort they put into their individual learning, along with their engagement and time spent learning, translates into the acquisition of more organized knowledge and better grades. At the same time, teachers should support students’ motivation to learn by identifying their interests and selecting interesting and challenging hypertexts for individual learning. Hypertext authors could also use this information to develop hypertexts with embedded cognitive, metacognitive, and motivational scaffolds to support students’ SRL and create long-lasting learning effects.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/su14095686/s1, Material S1.zip: Data presented in the study.

Author Contributions

Conceptualization, A.P., L.K., A.H., B.B.P., K.D.S., A.G., M.M., S.P., T.P., M.P.L. and C.P.; methodology, A.P., L.K., A.H., B.B.P., K.D.S., A.G., S.P., T.P., M.P.L. and C.P.; software, C.B., Ž.L. and M.P.; validation, A.P., L.K. and K.D.S.; formal analysis, A.K., A.P., L.K., A.H. and B.B.P.; investigation, A.K., C.B., K.D.S., A.G., Ž.L., S.P., M.P., M.P.L. and C.P.; resources, K.D.S., A.G., M.M. and C.P.; data curation, A.K., A.P., L.K., K.D.S., M.P. and C.P.; writing—original draft preparation, A.K., A.H., B.B.P., C.B., K.D.S., A.G., Ž.L., M.P., T.P., M.P.L. and C.P.; writing—review and editing, A.K., A.P., L.K., M.M., S.P. and C.P.; visualization, A.K. and A.P.; supervision, M.M. and C.P.; project administration, A.K., B.B.P., K.D.S. and A.G.; funding acquisition, C.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was conducted as part of the basic research project Effectiveness of Different Types of Scaffolds in Self-Regulated e-Learning [project number J5-9437], funded by the Slovenian Research Agency (Javna agencija za raziskovalno dejavnost RS) from the state budget. The APC was also funded by the Slovenian Research Agency (Javna agencija za raziskovalno dejavnost RS).

Institutional Review Board Statement

All subjects gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of Faculty of Arts, University of Ljubljana (approval date: 16 October 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available in the supplementary material.

Acknowledgments

We thank all participating schools, students, and teachers for their contribution to this study.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

Appendix A

Table A1. Descriptive statistics for variables included in research.

SRL Variables | M | SD | Mdn | s² | Min | Max | Skewness | Kurtosis
Offline measures of strategies (checklists)
1. Cognitive surface strategies | 0.55 | 0.50 | 1.00 | 0.25 | 0 | 1 | −0.20 | 1.04
2. Cognitive deep strategies | 3.73 | 2.16 | 4.00 | 4.67 | 0 | 9 | 0.25 | 2.39
3. Metacognitive planning strategies | 1.37 | 1.02 | 1.00 | 1.03 | 0 | 4 | 0.38 | 2.40
4. Metacognitive monitoring strategies | 2.28 | 1.42 | 2.00 | 2.02 | 0 | 5 | 0.16 | 2.14
5. Metacognitive regulation strategies | 3.65 | 2.28 | 4.00 | 5.18 | 0 | 8 | 0.12 | 1.93
6. Metacognitive evaluation strategies | 1.23 | 0.99 | 1.00 | 0.99 | 0 | 3 | 0.28 | 2.00
7. Motivational strategies | 0.60 | 0.87 | 0.00 | 0.76 | 0 | 3 | 1.24 | 3.40
Offline measures of motivation (scales)
8. Motivation before starting to learn | 4.14 | 1.28 | 4.00 | 1.65 | 1 | 7 | −0.28 | 3.45
9. Motivation to continue with learning | 3.81 | 1.49 | 4.00 | 2.23 | 1 | 7 | −0.08 | 2.67
10. Level of engagement in learning | 1.65 | 0.65 | 2.00 | 0.43 | 1 | 3 | 0.52 | 2.30
Online measures (data traces)
11. Time spent for going through the unit for the first time | 1015.63 | 426.01 | 953.00 | 181,487.14 | 19 | 2417 | 0.54 | 3.01
12. Time spent for reviewing the unit | 110.69 | 150.19 | 43.50 | 22,557.50 | 0 | 977 | 2.24 | 9.31
13. Time spent on video viewing | 33.04 | 35.13 | 22.00 | 1234.01 | 0 | 434 | 5.61 | 52.59
14. Number of times for opening glossary | 2.24 | 2.60 | 1.00 | 6.76 | 0 | 15 | 1.22 | 4.57
Online measures of strategies (notes)
15. Basic rehearsal | 0.15 | 0.41 | 0.00 | 0.17 | 0 | 2 | 2.86 | 10.82
16. Complex rehearsal | 1.17 | 1.21 | 1.00 | 1.46 | 0 | 6 | 1.21 | 4.63
17. Basic elaboration | 0.14 | 0.40 | 0.00 | 0.16 | 0 | 2 | 2.89 | 11.03
18. Complex elaboration | 0.86 | 1.11 | 0.00 | 1.23 | 0 | 6 | 1.31 | 4.45
19. Basic organization | 0.49 | 0.84 | 0.00 | 0.71 | 0 | 4 | 1.83 | 5.82
20. Complex organization | 0.41 | 0.76 | 0.00 | 0.58 | 0 | 4 | 2.00 | 6.64
Science performance
21. Grade-point average in science | 3.76 | 0.98 | 4.00 | 0.96 | 2 | 5 | −0.31 | 1.85
N = 364.
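The descriptive statistics in Table A1 can be recomputed from the dataset provided in the Supplementary Materials. The sketch below is only an illustration and assumes a tidy CSV layout with one column per variable; the file and column names (srl_data.csv, deep_cognitive) are placeholders rather than the names actually used in Material S1.zip. Because the kurtosis values of the roughly symmetric variables lie near 3, the table appears to report Pearson (non-excess) kurtosis, hence fisher=False; exact values may also differ slightly depending on the bias correction applied.

```python
import pandas as pd
from scipy.stats import skew, kurtosis

# Placeholder file and column names -- the dataset in Material S1.zip may differ.
df = pd.read_csv("srl_data.csv")
x = df["deep_cognitive"].dropna()

descriptives = {
    "M": x.mean(),
    "SD": x.std(ddof=1),                    # sample standard deviation
    "Mdn": x.median(),
    "s2": x.var(ddof=1),                    # sample variance
    "Min": x.min(),
    "Max": x.max(),
    "Skewness": skew(x),
    "Kurtosis": kurtosis(x, fisher=False),  # Pearson kurtosis; normal distribution = 3
}
print({k: round(float(v), 2) for k, v in descriptives.items()})
```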

References

1. Pozo, J.-I.; Pérez Echeverría, M.-P.; Cabellos, B.; Sánchez, D.L. Teaching and Learning in Times of COVID-19: Uses of Digital Technologies during School Lockdowns. Front. Psychol. 2021, 12, 1511.
2. Li, C.; Lalani, F. The COVID-19 Pandemic Has Changed Education Forever. This Is How. World Economic Forum. 2020. Available online: https://www.weforum.org/agenda/2020/04/coronavirus-education-global-covid19-online-digital-learning/ (accessed on 25 August 2021).
3. Azevedo, R.; Cromley, J.G. Does Training on Self-Regulated Learning Facilitate Students’ Learning with Hypermedia? J. Educ. Psychol. 2004, 96, 523–535.
4. Yen, M.H.; Chen, S.; Wang, C.Y.; Chen, H.L.; Hsu, Y.S.; Liu, T.C. A framework for self-regulated digital learning (SRDL). J. Comput. Assist. Learn. 2018, 34, 580–589.
5. Azevedo, R. Computer Environments as Metacognitive Tools for Enhancing Learning. Educ. Psychol. 2005, 40, 193–197.
6. Devolder, A.; van Braak, J.; Tondeur, J. Supporting self-regulated learning in computer-based learning environments: Systematic review of effects of scaffolding in the domain of science education. J. Comput. Assist. Learn. 2012, 28, 557–573.
7. Kong, Y.; Seo, Y.S.; Zhai, L. Comparison of reading performance on screen and on paper: A meta-analysis. Comput. Educ. 2018, 123, 138–149.
8. Blom, H.; Segers, E.; Knoors, H.; Hermans, D.; Verhoeven, L. Comprehension and navigation of networked hypertexts. J. Comput. Assist. Learn. 2018, 34, 306–314.
9. Hattie, J.A.C. Visible Learning: A Synthesis of over 800 Meta-Analyses Relating to Achievement; Routledge: London, UK, 2009.
10. Perry, V.; Albeg, L.; Tung, C. Meta-Analysis of Single-Case Design Research on Self-Regulatory Interventions for Academic Performance. J. Behav. Educ. 2012, 21, 217–229.
11. Broadbent, J.; Poon, W.L. Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. Internet High. Educ. 2015, 27, 1–13.
12. Kim, J.-K.; Kim, D.-J. Meta-Analysis on Relations between E-Learning Research Trends and Effectiveness of Learning. Int. J. Smart Home 2013, 7, 35–48.
13. Berger, F.; Schreiner, C.; Hagleitner, W.; Jesacher-Rößler, L.; Roßnagl, S.; Kraler, C. Predicting Coping with Self-Regulated Distance Learning in Times of COVID-19: Evidence from a Longitudinal Study. Front. Psychol. 2021, 12, 3627.
14. Cai, R.; Wang, Q.; Xu, J.; Zhou, L. Effectiveness of Students’ Self-Regulated Learning during the COVID-19 Pandemic. Sci. Insigt. 2020, 34, 175–182.
15. Desoete, A. Multi-method assessment of metacognitive skills in elementary school children: How you test is what you get. Metacogn. Learn. 2008, 3, 189.
16. Rogiers, A.; Merchie, E.; Van Keer, H. What they say is what they do? Comparing task-specific self-reports, think-aloud protocols, and study traces for measuring secondary school students’ text-learning strategies. Eur. J. Psychol. Educ. 2020, 35, 315–332.
17. Veenman, M.V.J. Learning to Self-Monitor and Self-Regulate. In Handbook of Research on Learning and Instruction; Mayer, R.E., Alexander, P.A., Eds.; Routledge: New York, NY, USA, 2011; p. 520.
18. Li, S.; Chen, G.; Xing, W.; Zheng, J.; Xie, C. Longitudinal clustering of students’ self-regulated learning behaviors in engineering design. Comput. Educ. 2020, 153, 103899.
19. Pintrich, P.R. The Role of Goal Orientation in Self-Regulated Learning. In Handbook of Self-Regulation; Boekaerts, M., Pintrich, P.R., Zeidner, M., Eds.; Academic Press: San Diego, CA, USA, 2000; pp. 451–502.
20. Boekaerts, M. Self-regulated learning: A new concept embraced by researchers, policy makers, educators, teachers, and students. Learn. Instr. 1997, 7, 161–186.
21. Efklides, A. Interactions of Metacognition with Motivation and Affect in Self-Regulated Learning: The MASRL Model. Educ. Psychol. 2011, 46, 6–25.
22. Winne, P.H.; Hadwin, A.F. Studying as Self-Regulated Learning. In Metacognition in Educational Theory and Practice; The Educational Psychology Series; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 1998; pp. 277–304.
23. Zimmerman, B.J. Investigating Self-Regulation and Motivation: Historical Background, Methodological Developments, and Future Prospects. Am. Educ. Res. J. 2008, 45, 166–183.
24. Panadero, E. A Review of Self-regulated Learning: Six Models and Four Directions for Research. Front. Psychol. 2017, 8, 422.
25. Zimmerman, B.J. Attaining Self-Regulation: A Social Cognitive Perspective. In Handbook of Self-Regulation; Boekaerts, M., Pintrich, P.R., Zeidner, M., Eds.; Academic Press: San Diego, CA, USA, 2000; pp. 13–39.
26. Schraw, G. Measuring Self-Regulation in Computer-Based Learning Environments. Educ. Psychol. 2010, 45, 258–266.
27. Veenman, M.V.J. Alternative assessment of strategy use with self-report instruments: A discussion. Metacogn. Learn. 2011, 6, 205–211.
28. Dent, A.L.; Koenka, A.C. The Relation between Self-Regulated Learning and Academic Achievement Across Childhood and Adolescence: A Meta-Analysis. Educ. Psychol. Rev. 2016, 28, 425–474.
29. Winne, P.H.; Jamieson-Noel, D. Exploring students’ calibration of self reports about study tactics and achievement. Contemp. Educ. Psychol. 2002, 27, 551–572.
30. Komarraju, M.; Nadler, D. Self-efficacy and academic achievement: Why do implicit beliefs, goals, and effort regulation matter? Learn. Individ. Differ. 2013, 25, 67–72.
31. Mega, C.; Ronconi, L.; De Beni, R. What makes a good student? How emotions, self-regulated learning, and motivation contribute to academic achievement. J. Educ. Psychol. 2014, 106, 121–131.
32. Veenman, M.V.J.; Bavelaar, L.; De Wolf, L.; Van Haaren, M.G.P. The on-line assessment of metacognitive skills in a computerized learning environment. Learn. Individ. Differ. 2014, 29, 123–130.
33. Glogger, I.; Schwonke, R.; Holzäpfel, L.; Nückles, M.; Renkl, A. Learning strategies assessed by journal writing: Prediction of learning outcomes by quantity, quality, and combinations of learning strategies. J. Educ. Psychol. 2012, 104, 452–468.
34. Veenman, M.; Cleef, D.V. Measuring metacognitive skills for mathematics: Students self-reports versus on-line assessment methods. ZDM 2019, 51, 691–701.
35. Merchie, E.; Van Keer, H. Using on-line and off-line measures to explore fifth and sixth graders’ text-learning strategies and schematizing skills. Learn. Individ. Differ. 2014, 32, 193–203.
36. Bråten, I.; Samuelstuen, M.S. Measuring strategic processing: Comparing task-specific self-reports to traces. Metacogn. Learn. 2007, 2, 1–20.
37. Weinstein, C.E.; Mayer, R.E. The Teaching of Learning Strategies. In Handbook of Research on Teaching; Wittrock, M., Ed.; Macmillan: New York, NY, USA, 1986; pp. 315–327.
38. Akyol, G.; Sungur, S.; Tekkaya, C. The contribution of cognitive and metacognitive strategy use to students’ science achievement. Educ. Res. Eval. 2010, 16, 1–21.
39. Marton, F.; Saljo, R. On qualitative differences in learning: I. Outcome and process. Br. J. Educ. Psychol. 1976, 46, 4–11.
40. Kobayashi, K. Combined Effects of Note-Taking/-Reviewing on Learning and the Enhancement through Interventions: A meta-analytic review. Educ. Psychol. 2006, 26, 459–477.
41. Greene, J.A.; Azevedo, R. A macro-level analysis of SRL processes and their relations to the acquisition of a sophisticated mental model of a complex system. Contemp. Educ. Psychol. 2009, 34, 18–29.
42. Hattie, J.A.C.; Donoghue, G.M. Learning strategies: A synthesis and conceptual model. NPJ Sci. Learn. 2016, 1, 16013.
43. Roebers, C.M.; Krebs, S.S.; Roderer, T. Metacognitive monitoring and control in elementary school children: Their interrelations and their role for test performance. Learn. Individ. Differ. 2014, 29, 141–149.
44. Lin, C.-H.; Zhang, Y.; Zheng, B. The roles of learning strategies and motivation in online language learning: A structural equation modeling analysis. Comput. Educ. 2017, 113, 75–85.
45. Jackson, C.R. Validating and Adapting the Motivated Strategies for Learning Questionnaire (MSLQ) for STEM Courses at an HBCU. AERA Open 2018, 4, 2332858418809346.
46. Ghiasvand, M.Y. Relationship between learning strategies and academic achievement; based on information processing approach. Procedia-Soc. Behav. Sci. 2010, 5, 1033–1036.
47. Bråten, I.; Samuelstuen, M.S. Does the Influence of Reading Purpose on Reports of Strategic Text Processing Depend on Students’ Topic Knowledge? J. Educ. Psychol. 2004, 96, 324–336.
48. Taub, M.; Azevedo, R.; Bouchet, F.; Khosravifar, B. Can the use of cognitive and metacognitive self-regulated learning strategies be predicted by learners’ levels of prior knowledge in hypermedia-learning environments? Comput. Hum. Behav. 2014, 39, 356–367.
49. Paans, C.; Segers, E.; Molenaar, I.; Verhoeven, L. The quality of the assignment matters in hypermedia learning. J. Comput. Assist. Learn. 2018, 34, 853–862.
50. Jamšek, S.; Sajovic, I.; Godec, A.; Vrtačnik, M.; Wissiak Grm, K.S.; Boh Podgornik, B.; Glažar, S.A. Chemistry 9. In Textbook for Chemistry in the 9th Grade of Primary School; National Education Institute Slovenia: Ljubljana, Slovenia, 2014.
51. Friedrich, S.; Konietschke, F.; Pauly, M. Resampling-Based Analysis of Multivariate Data and Repeated Measures Designs with the R Package MANOVA.RM. R J. 2019, 11, 380–400.
52. Pintrich, P.R.; de Groot, E.V. Motivational and self-regulated learning components of classroom academic performance. J. Educ. Psychol. 1990, 82, 33–40.
53. Schwinger, M.; Steinmayr, R.; Spinath, B. How do motivational regulation strategies affect achievement: Mediated by effort management and moderated by intelligence. Learn. Individ. Differ. 2009, 19, 621–627.
54. Ristić Dedić, Z. Metacognitive knowledge in relation to inquiry skills and knowledge acquisition within a computer-supported inquiry learning environment. Psihol. Teme 2014, 23, 115–141.
Figure 1. The user interface of the e-learning environment. Part of the learning unit content is shown, together with the six chapter titles that open on click. The user has to log into the system, and their learning progress is saved. A link to the glossary (named SLOVAR) is always available in the top menu bar.
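The online trace measures reported in the tables (e.g., time spent going through the unit and the number of glossary openings) were derived from the activity logs of this e-learning environment. The log schema is not reproduced here, so the following is a purely hypothetical sketch of how such measures might be aggregated from timestamped events; the file name, event labels, and columns are invented for illustration.

```python
import pandas as pd

# Hypothetical event log: one row per logged action.
# Assumed columns: student_id, timestamp, event ("page_view", "glossary_open", ...).
log = pd.read_csv("trace_log.csv", parse_dates=["timestamp"])

# Number of glossary openings per student (cf. SRL variable 14).
glossary_opens = (
    log[log["event"] == "glossary_open"]
    .groupby("student_id")
    .size()
    .rename("glossary_opens")
)

# Crude estimate of time spent in the unit, in seconds: span between the first and
# last logged event (a real analysis would separate the first pass from later reviews).
time_in_unit = (
    log.groupby("student_id")["timestamp"]
    .agg(lambda t: (t.max() - t.min()).total_seconds())
    .rename("time_in_unit_s")
)

traces = pd.concat([glossary_opens, time_in_unit], axis=1).fillna(0)
print(traces.head())
```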
Table 1. Descriptive statistics and Spearman correlation coefficients for the examined variables.

SRL Variables | M | SD | Scale | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20
Offline measures of strategies (checklists)
1. Cognitive surface strategies | 0.55 | 0.50 | 0–1
2. Cognitive deep strategies | 3.73 | 2.16 | 0–9 | 0.23 **
3. Metacognitive planning strategies | 1.37 | 1.02 | 0–4 | 0.19 | 0.43 **
4. Metacognitive monitoring strategies | 2.28 | 1.42 | 0–5 | 0.23 ** | 0.47 ** | 0.38 **
5. Metacognitive regulation strategies | 3.65 | 2.28 | 0–8 | 0.22 ** | 0.53 ** | 0.39 ** | 0.57 **
6. Metacognitive evaluation strategies | 1.23 | 0.99 | 0–3 | 0.13 | 0.43 ** | 0.44 ** | 0.42 ** | 0.41 **
7. Motivational strategies | 0.60 | 0.87 | 0–3 | −0.03 | 0.14 | 0.19 | 0.21 ** | 0.12 | 0.25 **
Offline measures of motivation (scales)
8. Motivation before starting to learn | 4.14 | 1.28 | 1–7 | 0.10 | 0.32 ** | 0.19 ** | 0.19 | 0.25 ** | 0.24 ** | 0.04
9. Motivation to continue with learning | 3.81 | 1.49 | 1–7 | 0.14 | 0.31 ** | 0.25 ** | 0.28 ** | 0.33 ** | 0.30 ** | 0.04 | 0.61 **
10. Level of engagement in learning | 1.65 | 0.65 | 1–3 | 0.13 | 0.09 | 0.13 | 0.14 | 0.08 | 0.20 ** | −0.07 | 0.17 | 0.22 **
Online measures (data traces)
11. Time spent for going through the unit for the first time | 1015.63 | 426.01 | 0– | 0.18 | 0.28 ** | 0.22 ** | 0.13 | 0.24 ** | 0.15 | 0.07 | 0.27 ** | 0.28 ** | 0.16
12. Time spent for reviewing the unit | 110.69 | 150.19 | 0– | 0.11 | 0.11 | 0.01 | 0.02 | 0.11 | 0.04 | −0.06 | 0.03 | 0.03 | −0.02 | −0.05
13. Time spent on video viewing | 33.04 | 35.13 | 0– | 0.01 | 0.03 | 0.16 | 0.08 | 0.11 | 0.09 | 0.21 ** | 0.06 | 0.06 | −0.02 | 0.17 | 0.05
14. Number of times for opening glossary | 2.24 | 2.60 | 0– | 0.13 | 0.16 | 0.10 | 0.21 ** | 0.41 ** | 0.04 | 0.01 | 0.08 | 0.11 | −0.04 | 0.20 ** | 0.10 | 0.01
Online measures of strategies (notes)
15. Basic rehearsal | 0.15 | 0.41 | 0– | 0.01 | 0.03 | −0.05 | 0.03 | 0.04 | −0.11 | −0.06 | −0.03 | −0.04 | −0.02 | −0.02 | 0.02 | −0.01 | 0.06
16. Complex rehearsal | 1.17 | 1.21 | 0– | 0.10 | 0.24 ** | 0.19 | 0.04 | 0.11 | 0.13 | 0.02 | 0.14 | 0.18 | 0.13 | 0.53 ** | 0.00 | 0.01 | 0.03 | −0.07
17. Basic elaboration | 0.14 | 0.40 | 0– | −0.04 | 0.13 | −0.04 | 0.04 | 0.03 | −0.08 | −0.05 | 0.05 | 0.06 | 0.06 | 0.05 | 0.01 | −0.03 | 0.05 | 0.22 ** | 0.07
18. Complex elaboration | 0.86 | 1.11 | 0– | 0.03 | 0.24 ** | 0.18 | 0.06 | 0.06 | 0.08 | 0.01 | 0.14 | 0.15 | 0.16 | 0.47 ** | −0.09 | 0.02 | 0.10 | −0.03 | 0.64 ** | 0.15
19. Basic organization | 0.49 | 0.84 | 0– | 0.10 | 0.28 ** | 0.11 | 0.09 | 0.20 ** | 0.06 | 0.00 | 0.13 | 0.16 | 0.03 | 0.34 ** | 0.10 | 0.00 | 0.10 | 0.08 | 0.26 ** | 0.20 ** | 0.30 **
20. Complex organization | 0.41 | 0.76 | 0– | 0.05 | 0.24 ** | 0.08 | 0.14 | 0.24 ** | 0.14 | 0.01 | 0.12 | 0.16 | 0.09 | 0.40 ** | 0.05 | −0.01 | 0.15 | 0.16 | 0.21 ** | 0.17 | 0.29 ** | 0.40 **
Science performance
21. Grade-point average in science | 3.76 | 0.98 | 1–5 | 0.08 | 0.24 ** | 0.06 | 0.14 | 0.16 | 0.03 | −0.11 | 0.14 | 0.11 | 0.06 | 0.13 | 0.06 | −0.06 | 0.22 ** | 0.13 | 0.06 | 0.14 | 0.15 | 0.30 ** | 0.29 **
** p < α (α was set at 0.00024 due to multiple tests carried out). N = 364.
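The significance threshold in the note to Table 1 (α = 0.00024) is consistent with a Bonferroni-type division of 0.05 by the 210 pairwise correlations among the 21 variables (0.05/210 ≈ 0.000238), although the exact correction procedure is not restated here. A minimal sketch of how such a Spearman correlation matrix could be computed and flagged, assuming the data sit in a DataFrame with one numeric column per variable (the file name is a placeholder):

```python
import pandas as pd
from scipy.stats import spearmanr

# Placeholder file name; one numeric column per SRL variable is assumed.
df = pd.read_csv("srl_data.csv")

n_vars = df.shape[1]
n_tests = n_vars * (n_vars - 1) // 2       # 21 variables -> 210 correlations
alpha = 0.05 / n_tests                     # ~0.00024, matching the table note

rho, p = spearmanr(df, nan_policy="omit")  # Spearman rho and p-value matrices
corr = pd.DataFrame(rho, index=df.columns, columns=df.columns)
sig = pd.DataFrame(p < alpha, index=df.columns, columns=df.columns)

print(corr.round(2))
print(sig)  # True marks correlations significant at the adjusted alpha
```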
Table 2. Means (and standard deviations in parentheses) of different offline and online measures of self-regulated learning (SRL) and the results of semi-parametric ANOVAs for testing the effects of prior knowledge and knowledge gain on SRL measures.

SRL Variables | A (Low/Low), n = 89, M (SD) | B (Low/High), n = 91, M (SD) | C (Some/Low), n = 92, M (SD) | D (Some/High), n = 92, M (SD) | Prior Knowledge, WTS (ω²) | Knowledge Gain, WTS (ω²)
Offline measures of strategies (checklists)
1. Cognitive surface strategies | 0.45 (0.50) | 0.57 (0.50) | 0.55 (0.50) | 0.62 (0.49) | 2.16 (0.006) | 3.23 (0.009)
2. Cognitive deep strategies | 3.00 (1.95) | 3.90 (2.17) | 3.71 (2.07) | 4.27 (2.27) | 5.89 (0.015) | 10.91 ** (0.029)
3. Metacognitive planning strategies | 1.30 (0.88) | 1.43 (1.16) | 1.33 (0.94) | 1.42 (1.07) | 0.01 (<0.001) | 1.09 (0.003)
4. Metacognitive monitoring strategies | 2.11 (1.50) | 2.44 (1.38) | 2.02 (1.27) | 2.55 (1.49) | 0.01 (<0.001) | 8.44 (0.023)
5. Metacognitive regulation strategies | 3.04 (2.18) | 3.88 (2.23) | 3.68 (2.30) | 3.99 (2.32) | 2.52 (0.007) | 5.80 (0.016)
6. Metacognitive evaluation strategies | 1.40 (1.01) | 1.21 (0.99) | 0.93 (0.94) | 1.37 (0.98) | 2.26 (0.006) | 1.35 (0.004)
7. Motivational strategies | 0.78 (1.01) | 0.63 (0.90) | 0.52 (0.80) | 0.49 (0.75) | 4.56 (0.012) | 0.98 (0.003)
Offline measures of motivation (scales)
8. Motivation before starting to learn | 3.84 (1.45) | 4.25 (1.36) | 4.02 (1.14) | 4.42 (1.10) | 1.72 (0.005) | 9.24 * (0.025)
9. Motivation to continue with learning | 3.48 (1.55) | 4.00 (1.51) | 3.74 (1.50) | 4.02 (1.37) | 0.79 (0.002) | 6.58 (0.018)
10. Level of engagement in learning | 1.52 (0.66) | 1.68 (0.68) | 1.55 (0.62) | 1.83 (0.62) | 1.81 (0.005) | 10.38 * (0.028)
Online measures (data traces)
11. Time spent for going through the unit for the first time | 867.24 (450.24) | 1101.02 (427.20) | 922.84 (389.26) | 1167.51 (367.48) | 2.01 (0.005) | 30.94 ** (0.079)
12. Time spent for reviewing the unit | 97.46 (151.32) | 110.23 (144.49) | 105.23 (155.92) | 129.40 (149.42) | 0.73 (0.002) | 1.37 (0.004)
13. Time spent on video viewing | 33.56 (32.91) | 32.40 (48.15) | 36.37 (31.96) | 29.86 (23.21) | 0.00 (<0.001) | 1.08 (0.003)
14. Number of times for opening glossary | 1.91 (2.47) | 2.15 (2.44) | 2.10 (2.47) | 2.77 (2.94) | 2.20 (0.006) | 2.86 (0.008)
Online measures of strategies (notes)
15. Basic rehearsal | 0.07 (0.25) | 0.16 (0.43) | 0.16 (0.43) | 0.20 (0.50) | 2.16 (0.006) | 2.29 (0.006)
16. Complex rehearsal | 0.79 (0.98) | 1.35 (1.28) | 1.08 (1.13) | 1.46 (1.31) | 2.53 (0.006) | 14.56 ** (0.038)
17. Basic elaboration | 0.04 (0.26) | 0.16 (0.43) | 0.18 (0.44) | 0.17 (0.43) | 3.19 (0.008) | 1.71 (0.004)
18. Complex elaboration | 0.46 (0.81) | 0.95 (1.05) | 0.79 (1.01) | 1.23 (1.36) | 7.49 (0.019) | 16.67 ** (0.043)
19. Basic organization | 0.25 (0.68) | 0.49 (0.81) | 0.48 (0.88) | 0.73 (0.92) | 7.22 (0.019) | 8.26 (0.022)
20. Complex organization | 0.15 (0.44) | 0.52 (0.91) | 0.28 (0.56) | 0.70 (0.91) | 4.20 (0.010) | 25.85 ** (0.066)
Science performance
21. Grade-point average in science | 3.09 (0.82) | 3.72 (0.92) | 3.76 (1.00) | 4.45 (0.65) | 60.31 ** (0.126) | 52.55 ** (0.111)
Notes. Groups are labelled by prior knowledge/knowledge gain. In the semi-parametric ANOVAs, the degrees of freedom were 1 for both main effects. * p < α (α was set at 0.0024 due to multiple tests carried out). ** p < 0.001.
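The Wald-type statistics (WTS) in Table 2 come from semi-parametric two-way ANOVAs with prior knowledge and knowledge gain as between-subjects factors, presumably computed with the R package MANOVA.RM [51]. The sketch below is not equivalent to that analysis; it only illustrates the 2 × 2 design with a classical two-way ANOVA in Python, and the column names (prior_knowledge, gain_group, deep_cognitive) are placeholders. The adjusted α of 0.0024 in the table note is consistent with dividing 0.05 by the 21 outcome variables tested.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Placeholder long-format data: one row per student, group factors coded as strings.
df = pd.read_csv("srl_data.csv")

# Classical 2 x 2 between-subjects ANOVA (main effects only) -- an approximation;
# the paper reports resampling-based Wald-type statistics and omega-squared instead.
model = smf.ols("deep_cognitive ~ C(prior_knowledge) + C(gain_group)", data=df).fit()
print(anova_lm(model, typ=2))

alpha = 0.05 / 21  # ~0.0024, the adjusted threshold used in Table 2
```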
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
