Special Issue "Advances in Metacognition, Learning, and Reactivity"
A special issue of Journal of Intelligence (ISSN 2079-3200).
Deadline for manuscript submissions: closed (30 April 2023)
Interests: test-enhanced learning; spacing and interleaving; metamemory monitoring and control; development of meta-awareness; translating principles from cognitive sciences into educational practice
Interests: factors influencing children’s learning ability and mental health; metacognition; family socioeconomic status; learning strategies
Confucius said, “To know what you know and what you do not know, that is true knowledge.” Accurately monitoring what we know and what we do not know plays a fundamental role in effective learning, because individuals typically regulate their study activities according to their metacognitive monitoring (Bjork, Dunlosky, & Kornell, 2013; Finn, 2008). Previous studies have demonstrated that human metacognitive monitoring is far from perfect and is susceptible to a variety of illusions and biases (Undorf, Navarro-Báez, & Zimdahl, 2022). For instance, although practice testing and interleaved learning have been established as powerful study strategies, learners tend to metacognitively underappreciate the merits of these strategies, leading to their underuse during self-regulated learning (Kornell & Bjork, 2008; Rivers, 2021). Hence, it is critically important to understand what mechanisms underlie metacognitive monitoring, what factors constrain monitoring accuracy, and how monitoring accuracy can be improved.
In previous metacognition studies, researchers typically instructed participants to make a metacognitive judgment (e.g., a judgment of learning or a confidence rating) while, or after, they studied each item or answered each question (Rhodes & Tauber, 2011). Such studies implicitly assume that these item-by-item judgments are passive measures of metacognition and have no impact on the underlying cognitive processes being monitored. However, an emerging body of recent research demonstrates that these metacognitive judgments can (at least in some situations) reactively alter the very processes being judged—a phenomenon known as the reactivity effect (Double, Birney, & Walker, 2018; Shi et al., 2022; Soderstrom, Clark, Halamish, & Bjork, 2015; Zhao et al., 2022). These findings suggest that trial-by-trial judgments may not be unbiased measures of metacognition, and they highlight the urgent need to uncover the cognitive underpinnings of reactivity, which should help researchers develop effective methods to eliminate (or reduce) reactivity in future metacognition research.
This Special Issue has two main aims. The first is to further explore the mechanisms underlying metacognitive monitoring, the factors constraining monitoring accuracy, and interventions to promote monitoring accuracy (e.g., mitigating metacognitive bias and promoting the self-use of effective learning strategies). The second is to examine why memory is reactive to metacognitive judgments. Studies that explore the practical use of reactivity in learning settings are also welcome.
The Editors of this Special Issue from Journal of Intelligence invite contributions that present experimental findings, neuroscientific results, computational models, innovative theoretical perspectives, and systematic reviews (e.g., meta-analyses) that contribute to advancing our understanding of the aforementioned research questions.
In particular, the Editors invite contributions regarding the following topics:
- Mechanisms underlying metacognitive monitoring;
- Factors affecting metacognitive judgments (e.g., judgments of learning and confidence ratings);
- Interventions to calibrate monitoring accuracy;
- Links among metacognition, study habits, and learning outcomes;
- Interventions to promote self-use of effective study strategies (e.g., testing, interleaving) during self-regulated learning;
- Mechanisms underlying the reactivity effect of metacognitive judgments;
- Methods to eliminate (or reduce) reactivity in metacognition research;
- Practical use of reactivity in learning settings.
- Bjork, R. A., Dunlosky, J., & Kornell, N. (2013). Self-regulated learning: Beliefs, techniques, and illusions. Annual Review of Psychology, 64, 417–444. doi:10.1146/annurev-psych-113011-143823.
- Double, K. S., Birney, D. P., & Walker, S. (2018). A meta-analysis and systematic review of reactivity to judgements of learning. Memory, 26(6), 741–750. doi:10.1080/09658211.2017.1404111.
- Finn, B. (2008). Framing effects on metacognitive monitoring and control. Memory & Cognition, 36(4), 813–821. doi:10.3758/mc.36.4.813.
- Kornell, N., & Bjork, R. A. (2008). Learning concepts and categories: Is spacing the “enemy of induction”? Psychological Science, 19, 585–592. doi:10.1111/j.1467-9280.2008.02127.x.
- Rhodes, M. G., & Tauber, S. K. (2011). The influence of delaying judgments of learning on metacognitive accuracy: A meta-analytic review. Psychological Bulletin, 137, 131–148. doi:10.1037/a0021705.
- Rivers, M. L. (2021). Metacognition about practice testing: A review of learners’ beliefs, monitoring, and control of test-enhanced learning. Educational Psychology Review, 33, 823–862. doi:10.1007/s10648-020-09578-2.
- Shi, A., Xu, C., Zhao, W., Shanks, D. R., Hu, X., Luo, L., & Yang, C. (2022). Judgments of learning reactively facilitate visual memory by enhancing learning engagement. Psychonomic Bulletin & Review. doi:10.3758/s13423-022-02174-1.
- Soderstrom, N. C., Clark, C. T., Halamish, V., & Bjork, E. L. (2015). Judgments of learning as memory modifiers. Journal of Experimental Psychology: Learning, Memory, and Cognition, 41, 553–558. doi:10.1037/a0038388.
- Undorf, M., Navarro-Báez, S., & Zimdahl, M. F. (2022). Metacognitive illusions. In R. F. Pohl (Ed.), Cognitive Illusions: Intriguing Phenomena in Thinking, Judgment, and Memory (3rd ed., p. 307). Routledge.
- Zhao, W. L., Li, B., Shanks, D. R., Zhao, W., Zheng, J., Hu, X., & Yang, C. (2022). When judging what you know changes what you really know: Soliciting metamemory judgments reactively enhances children’s learning. Child Development, 93, 405–417. doi:10.1111/cdev.13689.
Dr. Chunliang Yang
Prof. Dr. Liang Luo
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Intelligence is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- metacognitive monitoring
- judgments of learning
- confidence rating
- learning efficiency
- self-use of effective study strategies