Article
Peer-Review Record

Development and Validation of a Critical Thinking Assessment-Scale Short Form

Educ. Sci. 2022, 12(12), 938; https://doi.org/10.3390/educsci12120938
by Rita Payan-Carreira 1,*, Ana Sacau-Fontenla 2, Hugo Rebelo 3, Luis Sebastião 3 and Dimitris Pnevmatikos 4
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 18 November 2022 / Revised: 13 December 2022 / Accepted: 15 December 2022 / Published: 19 December 2022

Round 1

Reviewer 1 Report

With great interest and pleasure, I read your manuscript. It is well written and nicely positioned in the current literature on critical thinking.

Nevertheless, I have some suggestions:

- In the manuscript you describe different conceptualisations of critical thinking, and you mention critical thinking skills as well as dispositions. However, throughout the text you sometimes use critical thinking and critical thinking skills as synonyms, while in other places you clearly distinguish skills from dispositions. Also, the short-form CTSAS is a test of critical thinking skills, but you sometimes use it as if it were equivalent to critical thinking. I think the text would gain clarity if you positioned yourself more clearly.

- An intriguing sentence starts on line 86 (“There is little support that …”). I would have liked to see the ideas in that sentence taken up in the discussion, as they touch on a key aspect of the validity of the test.

- You refer to Cole and Gonyea (2010) when stating that self-report data might not be consensual. However, what Cole and Gonyea did was compare actual test scores (e.g., on the SAT), which are (in theory at least) known to the student, with what students said they had obtained. What you do in the CTSAS is something else: you ask students to report on their behaviour. The question remains how accurately students can assess their own behaviour (e.g., how do we know whether students correctly assess that they ‘provides reason of rejecting another’s claim’ (item 50)?). An elaboration on that aspect of validity would be a good addition to the discussion. The strong internal consistencies of the scale do not guarantee that students’ self-assessments match a teacher’s assessment of those students’ behaviour or skills.

- The short form of the CTSAS was tested with different student groups who all took the test once, if I understood correctly. In the last sentence you state that it is a tool that can be used to survey changes in CrT skills after instructional actions. Do you have data from students taking the test twice? How do you know that you can assess changes with this test?

- Some sentences are very long, which makes them hard to understand. For example, a very long sentence starts on line 69. It says that two different approaches are used, but because of the sentence’s length it is unclear what precisely the two approaches are.

Author Response

The authors greatly appreciate the time and effort put into the review of the proposed manuscript. We really appreciate your comments and suggestions and attempt to address your concerns in full:

Comment 1, regarding the use of critical thinking and critical thinking skills either as synonyms or as terms distinguishing skills from dispositions.

In fact, we sometimes use critical thinking in general terms, but we understand that this may raise some concerns regarding the consistency of the approach. We therefore revised the MS and changed the terms accordingly. We hope this is now clear.

Comment 2, regarding the sentence starting on line 86: it derives from the comments of researchers who argue that performance assessment would better reflect the use or transfer of skills (including CrT as a soft skill) into real work situations. During the production of the MS, it became detached from the sequential line of thought. It has been revised and moved closer to its correct place in the text (lines 86 to 90 in the new version of the MS).

Comment 3. With respect to your comment on the Cole and Gonyea study, the sentence was elaborated to present the possible biases that can occur in self-report questionnaires, and a comment was introduced about the possibility of mitigating the impact of such biases when they are acknowledged during the design of a self-report instrument. An additional comment was also added to the discussion.

In comment 4, you ask whether or not we have data from students taking the test multiple times. We implemented some learning activities with those students, which are still ongoing, and we plan to ask the students to answer the questionnaire again. That would allow us to test the ability of the scale to monitor (or not) variations in CrT skills associated with the pedagogical interventions. As we do not yet have results from that study, we changed the sentence.

In your last comment, you point out the existence of some long and not-so-clear sentences in the MS. We tried to track them down and correct them.

Some additional changes were introduced: we identified a duplicated reference in the list, as well as other minor issues in the reference list, which were corrected. All modifications except those in the reference list were signalled using track changes in Word (Office).

We hope that the MS is now a good fit and that it can be accepted for publication. Again, we are grateful for your help in strengthening the new version of the manuscript.

Reviewer 2 Report

The manuscript is clear, relevant to the field, and presented in a well-structured manner. The results provide an advancement of the current knowledge in critical thinking research. The methods, tools, and software are described with sufficient detail to allow another researcher to reproduce the results. Part of the discussion and the conclusions are consistent with the evidence and arguments presented. The conclusions are interesting for the readership of the journal.

Author Response

The authors greatly appreciated the time and effort put into the review of the proposed manuscript. We really appreciate your comments.

Reviewer 3 Report

The authors have done a great job. The validation of the instrument is presented in a detailed and coherent way in the paper. The validated instrument will be useful for further research on critical thinking.

Author Response

The authors greatly appreciated the time and effort put into the review of the proposed manuscript. We really appreciate your comments.
