Article
Peer-Review Record

Impacts on Student Learning and Skills and Implementation Challenges of Two Student-Centered Learning Methods Applied in Online Education

Sustainability 2022, 14(15), 9625; https://doi.org/10.3390/su14159625
by Lama Soubra 1,*, Mohammad A. Al-Ghouti 1, Mohammed Abu-Dieyeh 1, Sergio Crovella 1 and Haissam Abou-Saleh 1,2
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 4 June 2022 / Revised: 30 July 2022 / Accepted: 1 August 2022 / Published: 5 August 2022

Round 1

Reviewer 1 Report

There is an interesting premise at the heart of this article, which is likely to make an important contribution to the literature and to practice. Most parts are nicely written. Below I present a number of suggested changes while trying to be helpful.

Please be careful with some claims. For example, the abstract mentions: ‘Online education became prevalent during the Covid-19 pandemic imposing thus the need to employ effective online teaching strategies’. You need to be careful here for a number of reasons, as follows:

1. Where is online learning prevalent? In which continent/country? The application of online learning during the pandemic has been diffused differently in various parts of the world. Not all countries were successful in their effort.

2. Online education has been very successful (and has facilitated educational inclusion) for several years. It is not simply a phenomenon that emerged during Covid-19. The reason it became successful (once again) during the pandemic is that certain technologies, learning platforms, interactive resources, learning spaces, and other components were already available. Therefore, claiming that the use of online learning is attributed to Covid-19 is quite misleading.

3. ‘…thus the need to employ effective online teaching strategies’. There is always a need to evaluate pedagogy and learning strategies, for both face-to-face and online learning, with or without Covid-19. It is not Covid-19 that generated this need; it has always been present.

As you can see, by using one single sentence from your abstract, it becomes clear that your paper should be deeply and carefully reviewed to be considered for publication. The same applies to other parts of the paper. For example, when you claim: ‘The transition from traditional face-to-face learning to online education encountered many challenges related to the flexibility of available learning platforms and digital tools; acquaintance with Information and Communication Technologies (ICT) for both educators and the learners’ (page 2), once again you have to understand that this is not the case in every university. For example, there are universities that achieved a smooth transition during Covid, exactly because they had established experience in online learning (alongside their face-to-face offerings) prior to Covid. And of course, there are universities that refused to get on the ‘online train’ prior to the pandemic and had to struggle during the transition. But once again, the problem here does not concern online learning as such, but rather universities’ earlier decisions on whether to engage with online learning prior to the pandemic. This distinction should be made clear. Towards this end, I strongly advise you to read and utilize the following papers in your analysis:

Boca, G.D. Factors Influencing Students’ Behavior and Attitude towards Online Education during COVID-19. Sustainability 2021, 13, 7469. https://doi.org/10.3390/su13137469

Doukanari, E., Ktoridou, D., Efthymiou, L. and Epaminonda, E. The Quest for Sustainable Teaching Praxis: Opportunities and Challenges of Multidisciplinary and Multicultural Teamwork. Sustainability 2021, 13, 7210. https://doi.org/10.3390/su13137210


Efthymiou L. & Zarifis A. (2021). ‘Modeling Students’ Voice for Enhanced Quality in Online Management Education’, The International Journal of Management Education, vol.19, iss.2, pp.1-16. Available from: https://doi.org/10.1016/j.ijme.2021.100464

 

Concerning Methodology and Methods, I believe the analysis is clear, although I would suggest another round of copy editing to improve lapses in English expression and make the research process clearer to the reader. For example, a sentence I really like is the following: ‘In the invitation, students were explicitly told that participation in the surveys is voluntary. It will not affect the instructor/student relationships or students' grades. The collected data will be treated confidentially and will be used to serve the purpose of the study only’ (page 12). Here, you could explain a bit more about issues relating to anonymity, consent, confidentiality, and students’ ability to withdraw from the research without consequences.

Good presentation of findings and good use of graphic material.

Finally, another round of copy editing will further strengthen the analysis.

 

Finally, I do consider your paper suitable for publication in ‘Sustainability’. However, I believe that the suggested changes will help you present a more detailed analysis. I look forward to reading the enriched (and final) version.

Author Response

Response to Reviewer One

 

Comments

There is an interesting premise at the heart of this article, which is likely to make an important contribution to the literature and to practice. Most parts are nicely written. Below I present a number of suggested changes while trying to be helpful.

Please be careful with some claims. For example, the abstract mentions: ‘Online education became prevalent during the Covid-19 pandemic imposing thus the need to employ effective online teaching strategies’. You need to be careful here for a number of reasons, as follows:

Point 1: Where is online learning prevalent? In which continent/country? The application of online learning during the pandemic has been diffused differently in various parts of the world. Not all countries were successful in their effort.

Response 1: Thank you for raising this point. This point was clarified by adding the countries where online learning became prevalent, i.e., the GCC countries (abstract, lines 15-16).

Point 2: Online education has been very successful (and has facilitated educational inclusion) for several years. It is not simply a phenomenon that emerged during Covid-19. The reason it became successful (once again) during the pandemic is that certain technologies, learning platforms, interactive resources, learning spaces, and other components were already available. Therefore, claiming that the use of online learning is attributed to Covid-19 is quite misleading.

Response 2: A small paragraph on online education was introduced in the introduction section, highlighting that online education existed and was successful before the pandemic. In addition, the role of COVID-19 in accelerating the transition towards online education was clarified (lines 68-86).

 

Point 3: ‘…thus the need to employ effective online teaching strategies’. There is always a need to evaluate pedagogy and learning strategies, for both face-to-face and online learning, with or without Covid-19. It is not Covid-19 that generated this need; it has always been present.

Response 3: This sentence was deleted.

Point 4: As you can see, by using one single sentence from your abstract, it becomes clear that your paper should be deeply and carefully reviewed to be considered for publication. The same applies to other parts of the paper. For example, when you claim: ‘The transition from traditional face-to-face learning to online education encountered many challenges related to the flexibility of available learning platforms and digital tools; acquaintance with Information and Communication Technologies (ICT) for both educators and the learners’ (page 2), once again you have to understand that this is not the case in every university. For example, there are universities that achieved a smooth transition during Covid, exactly because they had established experience in online learning (alongside their face-to-face offerings) prior to Covid. And of course, there are universities that refused to get on the ‘online train’ prior to the pandemic and had to struggle during the transition. But once again, the problem here does not concern online learning as such, but rather universities’ earlier decisions on whether to engage with online learning prior to the pandemic. This distinction should be made clear. Towards this end, I strongly advise you to read and utilize the following papers in your analysis:

 

 

Response 4: The distinction between the various experiences of universities in the transition towards online education during the pandemic was made. Finally, some of the proposed references were used and included (lines 68-86):

  • Boca, G.D. Factors Influencing Students’ Behavior and Attitude towards Online Education during COVID-19. Sustainability 2021, 13, 7469. https://doi.org/10.3390/su13137469
  • Frecker, K.; Bieniarz, E. Why Online Education Is Here to Stay. Available online: https://www.lighthouselabs.ca/en/blog/why-education-is-moving-online-for-good (accessed on July 2022).
  • Moore, R.L.; Fodrey, B.P.; Piña, A.A.; Lowell, V.L.; Harris, B.R. Distance Education and Technology Infrastructure: Strategies and Opportunities. In Leading and Managing e-Learning; Springer: Berlin/Heidelberg, Germany, 2018; pp. 87–100.

Point 5: Concerning Methodology and Methods, I believe the analysis is clear, although I would suggest another round of copy editing to improve lapses in English expression and make the research process clearer to the reader. For example, a sentence I really like is the following: ‘In the invitation, students were explicitly told that participation in the surveys is voluntary. It will not affect the instructor/student relationships or students' grades. The collected data will be treated confidentially and will be used to serve the purpose of the study only’ (page 12). Here, you could explain a bit more about issues relating to anonymity, consent, confidentiality, and students’ ability to withdraw from the research without consequences.

Response 5: Thank you for raising this point. We did a second round of copy editing, and we added/modified some tables to make the process clearer to the reader. We explained more about how data were collected from the students and treated (lines 339-346 and 380-382).

Good presentation of findings and good use of graphic material.

Point 6: Finally, another round of copy editing will further strengthen the analysis.

Response 6: Another round of copy editing was done.

 

Finally, I do consider your paper suitable for publication in ‘Sustainability’. However, I believe that the suggested changes will help you present a more detailed analysis. I look forward to reading the enriched (and final) version.

 

Reviewer 2 Report

The authors have presented interesting work on the impacts of problem-based learning (PBL) and Just-in-Time Teaching (JiTT) and use a variety of assessment tools to strengthen their comparisons. This work is suitable for the journal; however, a more thorough literature review needs to be done regarding PBL and JiTT. For example, the introduction primarily contains references to online teaching; the manuscript would benefit from dedicated literature review sections on these two strategies. The authors should emphasize recent work using PBL and JiTT in an online setting and use that to frame their work. Some recommended references are:

1. Ramachandran, R., Bernier, N.A., Mavilian, C.M., Izad, T., Thomas, L. and Spokoyny, A.M., 2021. Imparting Scientific Literacy through an Online Materials Chemistry General Education Course. Journal of Chemical Education, 98(5), pp.1594-1601.

2. Dominguez, M., DiCapua, D., Leydon, G., Loomis, C., Longbrake, E.E., Schaefer, S.M., Becker, K.P., Detyniecki, K., Gottschalk, C., Salardini, A. and Encandela, J.A., 2018. A neurology clerkship curriculum using video-based lectures and just-in-time teaching (JiTT). MedEdPORTAL, 14, p.10691.

3. Erickson, S., Neilson, C., O’Halloran, R., Bruce, C. and McLaughlin, E., 2021. ‘I was quite surprised it worked so well’: Student and facilitator perspectives of synchronous online Problem Based Learning. Innovations in Education and Teaching International, 58(3), pp.316-327.

4. Gavrin, A., Watt, J.X., Marrs, K. and Blake Jr, R.E., 2004. Just-in-time teaching (JITT): Using the web to enhance classroom learning. Computers in education journal, 14(2), pp.51-60.

5. Carter, P., 2009, May. An experiment with online instruction and active learning in an introductory computing course for engineers: JiTT meets CS1. In Proceedings of the 14th western Canadian conference on computing education (pp. 103-108).

Author Response

Response to Reviewer Two

Comments

The authors have presented interesting work on the impacts of problem-based learning (PBL) and Just-in-Time Teaching (JiTT) and use a variety of assessment tools to strengthen their comparisons. This work is suitable for the journal; however, a more thorough literature review needs to be done regarding PBL and JiTT.

Point 1: For example, the introduction primarily contains references to online teaching; the manuscript would benefit from dedicated literature review sections on these two strategies. The authors should emphasize recent work using PBL and JiTT in an online setting and use that to frame their work. Some recommended references are:

  1. Ramachandran, R., Bernier, N.A., Mavilian, C.M., Izad, T., Thomas, L. and Spokoyny, A.M., 2021. Imparting Scientific Literacy through an Online Materials Chemistry General Education Course. Journal of Chemical Education, 98(5), pp.1594-1601.
  2. Dominguez, M., DiCapua, D., Leydon, G., Loomis, C., Longbrake, E.E., Schaefer, S.M., Becker, K.P., Detyniecki, K., Gottschalk, C., Salardini, A. and Encandela, J.A., 2018. A neurology clerkship curriculum using video-based lectures and just-in-time teaching (JiTT). MedEdPORTAL, 14, p.10691.
  3. Erickson, S., Neilson, C., O’Halloran, R., Bruce, C. and McLaughlin, E., 2021. ‘I was quite surprised it worked so well’: Student and facilitator perspectives of synchronous online Problem Based Learning. Innovations in Education and Teaching International, 58(3), pp.316-327.
  4. Gavrin, A., Watt, J.X., Marrs, K. and Blake Jr, R.E., 2004. Just-in-time teaching (JITT): Using the web to enhance classroom learning. Computers in education journal, 14(2), pp.51-60.
  5. Carter, P., 2009, May. An experiment with online instruction and active learning in an introductory computing course for engineers: JiTT meets CS1. In Proceedings of the 14th western Canadian conference on computing education (pp. 103-108).

Response 1: Two sections dedicated to PBL and JiTT were added after the introduction section, and some of the suggested references were used to frame the current work (lines 124-177).

Reviewer 3 Report

The paper discusses the impacts of adopting two different approaches of student-centered instructional strategies in online settings in comparison with these approaches in conventional learning settings (face-to-face classrooms). For that, the authors gathered both qualitative and quantitative information from 134 students who took courses using these different approaches and statistically compared their performances.

The work explores an important aspect of online learning, and it is especially relevant considering the current trends and shifts in educational approaches caused by the COVID-19 pandemic.

The paper, however, presents a number of issues that need improvement. First, there are numerous words and sentences that run together (several words throughout the paper are not separated). This can be easily fixed, but it seriously compromises the reading and review of the paper. The paper also mentions two Annexes (A and B) related to the surveys that are not included at the end of the paper. I understand the authors had the intention to include the annexes (as they mentioned in the text). Some other formatting issues were also observed (large parts of pages left blank, tables starting in the middle of the page, different paragraphs with different line spacing). Table 2 splits the last row in two, giving the impression that BIO 452 and Molecular Analytical Techniques are different courses.

After reading the paper I still have a number of questions that I believe need more clarification inside the paper.

1) In my opinion, it is a bit difficult to follow how many students participated in the experiments for each specific approach and which approaches were applied in each discipline. Did all 134 students participate in both the PBL and JiTT classrooms? How many students participated in the LBL classroom? Was JiTT introduced in the same courses where PBL was applied? I believe a general table explaining this would help the reader better understand the overall picture of the experiment.

2) Were the different courses taught by the same professors? Were the same materials available to all the students in these courses? How might these issues impact the results of the experiments? Were the same quizzes and final exams used for these different classrooms, or did different professors prepare different exams?

3) At the end of section 1, the authors propose a number of research questions that are not directly tackled at the end of the paper. Although I understand the discussion involves the research questions, it is important to answer each one of them directly, preferably referring to the quantitative and qualitative results of the experiments while answering them.

4) The comparison between continuous variables was made using Student's t-test. Do these variables follow a normal distribution? Which are these continuous variables? The scores? It is better to state which test is being presented in the tables. Is the analysis of variance mentioned in line 329 the ANOVA test?

5) The instruments used measure different dimensions (learning the subject matter, intrinsic interest in learning, etc.). How were the instruments quantitatively validated? Did the authors test the internal correlation between the different questions that measure the same dimension beforehand?

6) In section 3.6, the authors conclude that PBL and JiTT had significant positive impacts on the long-term, but not in the short-term. Which experiment allowed the authors to conclude that?

Clarify the meaning of QU in line 232 (I believe it is the name of the University).

Author Response

Response to Reviewer Three

 

Comments

The paper discusses the impacts of adopting two different approaches of student-centered instructional strategies in online settings in comparison with these approaches in conventional learning settings (face-to-face classrooms). For that, the authors gathered both qualitative and quantitative information from 134 students who took courses using these different approaches and statistically compared their performances. The work explores an important aspect of online learning, and it is especially relevant considering the current trends and shifts in educational approaches caused by the COVID-19 pandemic.

 

Point 1: The paper, however, presents a number of issues that need improvement. First, there are numerous words and sentences that run together (several words throughout the paper are not separated). This can be easily fixed, but it seriously compromises the reading and review of the paper. The paper also mentions two Annexes (A and B) related to the surveys that are not included at the end of the paper. I understand the authors had the intention to include the annexes (as they mentioned in the text). Some other formatting issues were also observed (large parts of pages left blank, tables starting in the middle of the page, different paragraphs with different line spacing). Table 2 splits the last row in two, giving the impression that BIO 452 and Molecular Analytical Techniques are different courses.

Response 1: The formatting issues are not related to the originally submitted paper; they emerged from the formatting conducted by the editorial office. In addition, Annexes A and B were mentioned by mistake and are meant to be Table 6 and Table 7.

 

After reading the paper I still have a number of questions that I believe need more clarification inside the paper.

Point 2: In my opinion, it is a bit difficult to follow how many students participated in the experiments for each specific approach and which approaches were applied in each discipline. Did all 134 students participate in both the PBL and JiTT classrooms? How many students participated in the LBL classroom? Was JiTT introduced in the same courses where PBL was applied? I believe a general table explaining this would help the reader better understand the overall picture of the experiment.

Response 2: Thank you for this suggestion. A table (Table 2) was created to help answer the raised questions and present a summary of the experiment. The table describes, for each course, the implemented instructional strategies, the number of participating students, and the modules where the instructional strategies were implemented.

 

Point 3: Were the different courses taught by the same professors? Were the same materials available to all the students in these courses? How might these issues impact the results of the experiments? Were the same quizzes and final exams used for these different classrooms, or did different professors prepare different exams?

 

Response 3: All included courses were run in one section, and each course was taught by the same instructor, who implemented the two/three instructional strategies. Therefore, all students enrolled in the same course had the same quizzes and exams. In addition, the researchers reviewed the exam questions to ensure that they assessed recall of facts as well as critical thinking and problem-solving abilities. Moreover, a committee of experts in the field reviewed the questions to ensure they had the same difficulty level for each section.

 

Point 4: At the end of section 1, the authors propose a number of research questions that are not directly tackled at the end of the paper. Although I understand the discussion involves the research questions, it is important to answer each one of them directly, preferably referring to the quantitative and qualitative results of the experiments while answering them.

Response 4: Thank you for raising this point. The research questions were tackled directly at the end of the paper, in the first paragraph of the conclusion section. (Lines 661-670)

 

Point 5: The comparison between continuous variables was made using Student's t-test. Do these variables follow a normal distribution? Which are these continuous variables? The scores? It is better to state which test is being presented in the tables. Is the analysis of variance mentioned in line 329 the ANOVA test?

Response 5: The statistical analysis part was rewritten to make it clearer to the reader. The scores from the quizzes and exams, as well as those calculated from the surveys, were normally distributed. The continuous variables are the test and survey scores. The tests used were added to the tables. The analysis of variance was replaced by the ANOVA test (lines 385-390).
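To illustrate the kind of comparison described in this response, below is a minimal sketch in Python. The use of SciPy and the score arrays are assumptions for illustration only, not the study's actual software or data; it simply shows a normality check followed by an independent-samples t-test and a one-way ANOVA.

# Illustrative sketch only; the score arrays below are hypothetical placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores_pbl = rng.normal(78, 8, size=45)   # hypothetical quiz scores, PBL module
scores_jitt = rng.normal(76, 8, size=44)  # hypothetical quiz scores, JiTT module
scores_lbl = rng.normal(75, 8, size=45)   # hypothetical quiz scores, LBL module

# Normality check for each group before applying parametric tests.
for name, group in [("PBL", scores_pbl), ("JiTT", scores_jitt), ("LBL", scores_lbl)]:
    w, p = stats.shapiro(group)
    print(f"{name}: Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

# Two-group comparison with an independent-samples Student's t-test.
t_stat, p_t = stats.ttest_ind(scores_pbl, scores_lbl)
print(f"PBL vs LBL: t = {t_stat:.2f}, p = {p_t:.3f}")

# Three-group comparison with a one-way ANOVA.
f_stat, p_f = stats.f_oneway(scores_pbl, scores_jitt, scores_lbl)
print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p_f:.3f}")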

Point 6: The instruments used measure different dimensions (learning the subject matter, intrinsic interest in learning, etc.). How were the instruments quantitatively validated? Did the authors test the internal correlation between the different questions that measure the same dimension beforehand?

Response 6: Thank you for raising this point. The internal consistency of the whole survey, as well as of its different sections, was determined using Cronbach's alpha coefficient. The values are presented in two tables (Tables 6 and 7) that were added to the Materials and Methods section (lines 334-338).
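As an illustration of the internal-consistency check described in this response, below is a minimal sketch in Python that computes Cronbach's alpha from its standard formula. The function name, the Likert-scale response matrix, and the group sizes are hypothetical and not taken from the study's survey data.

# Illustrative sketch only; the response matrix below is hypothetical.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # items: matrix of shape (n_respondents, n_items) for one survey dimension.
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert responses: 30 students x 4 items of one dimension.
rng = np.random.default_rng(1)
base = rng.integers(2, 6, size=(30, 1))                        # each student's overall tendency
responses = np.clip(base + rng.integers(-1, 2, size=(30, 4)), 1, 5)

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")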

Point 7: In section 3.6, the authors conclude that PBL and JiTT had significant positive impacts on the long-term, but not in the short-term. Which experiment allowed the authors to conclude that?

Response 7: We meant by this that PBL and JiTT were not superior to LBL for short-term learning. It is the comparison between the test scores (quizzes and final exams) that allowed us to draw this conclusion. Indeed, since there was no significant difference between the quiz scores when the course materials were taught using PBL or JiTT and those observed when the module was taught using LBL, it can be concluded that PBL and JiTT are as effective as LBL in terms of short-term knowledge acquisition. The statement was rewritten to make it clearer (lines 535-539).

 

Point 8: Clarify the meaning of QU in line 232 (I believe it is the name of the University)

Response 8: Yes, indeed, it is the name of the university. This was clarified in the text (line 316).

Round 2

Reviewer 3 Report

The authors attended to all the improvements required by the reviewer.

I still think that the answers to the research questions should be clearly addressed with the same letters (A, B, C, D, ...) as they were identified in the Introduction section.

 

Author Response

Response to Reviewer Three

 

Comments

The authors attended to all the improvements required by the reviewer.

 

Point 1: I still think that the answers to the research questions should be clearly addressed with the same letters (A, B, C, D, ...) as they were identified in the Introduction section.

Response 1: We added the same letters to the answers of the research questions as identified in the Introduction section.
