Article

Assessing the Efficacy of an Accessible Computing Curriculum for Students with Autism Spectrum Disorders

1 School of Computer Science, Information, and Engineering Technology, Youngstown State University, Youngstown, OH 44555, USA
2 Potential Development Program, Youngstown, OH 2405, USA
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2024, 8(2), 11; https://doi.org/10.3390/mti8020011
Submission received: 22 January 2024 / Revised: 2 February 2024 / Accepted: 5 February 2024 / Published: 9 February 2024

Abstract
There is a limited amount of research dedicated to designing and developing computing curricula specifically tailored for students with autism spectrum disorder (ASD), and thus far, no study has examined the effectiveness of an accessible computing curriculum designed specifically for students with ASD. The goal of this study is to evaluate the effectiveness of an accessible curriculum in improving the learning of computational thinking concepts (CTCs) such as sequences, loops, parallelism, conditionals, operators, and data, as well as the development of proficiency in computational thinking practices (CTPs) including experimenting and iterating, testing and debugging, reusing and remixing, and abstracting and modularizing. The study involved two groups of seventh-grade students with ASD: one group received instruction using the accessible curriculum, while the other was taught with the original curriculum. Evaluation of students’ CTCs included the analysis of pretest and posttest scores for both groups, and their CTPs were assessed through artifact-based interview scores. The results indicated improvement in both groups concerning the learning of CTCs, with no significant difference between the two curricula. However, the accessible computing curriculum demonstrated significant enhancements in students’ proficiency in debugging and testing, iterating and experimenting, modularizing and abstracting, as well as remixing and reusing. The findings suggest the effectiveness of accessible computing curricula for students with ASD.

1. Introduction

Computational thinking (CT) is a crucial skill for the 21st century. CT involves systematically applying abstraction, decomposition, algorithmic design, generalization, and evaluation to address problems, gain a deeper understanding of situations, and articulate values more effectively, leading to the creation of automated solutions implemented either by digital or human computing devices [1]. Understanding computer systems will be pivotal for discoveries and innovations across all fields and pursuits [2]. Hence, all students should learn CT alongside fundamental skills like reading, writing, and arithmetic [3]. Research indicates that CT fosters improved student interest, knowledge, and skills in computing [4,5,6,7,8]. Additionally, it improves quantitative and critical thinking abilities [9] and supports abstract thinking, generalization, and the construction of convincing arguments [10]. Furthermore, it facilitates the near and far transfer of problem-solving skills, enhancing spatial reasoning skills [11], comprehension of algorithmic flow [12], and even predicting future academic success [13,14]. Consequently, previous studies have integrated CT into many different subjects across various grade levels for typical students [6,7,8,9,10,11,12,13,14]. One particular CT curriculum was designed and developed for students with ASD, ensuring accessibility [15]. However, there is a lack of research into the effectiveness of an accessible CT curriculum specifically designed for students with ASD. This study aims to evaluate the efficacy of a CT curriculum designed for seventh-grade students with ASD and implemented at an inner-city school for students with ASD.
The goal is to measure the accessible CT curriculum’s efficacy in enhancing students’ learning of CTCs (loops, sequences, conditionals, parallelism, data, and operators) and in improving their development of fluency in CTPs (debugging and testing, iterating and experimenting, modularizing and abstracting, and remixing and reusing). The study compares two student groups: one taught using the accessible CT curriculum and the other one following the original one.
Numerous studies have demonstrated the successful integration of CT curriculums across various subjects and grade levels, and several key findings have emerged. For example, performance in CT among college freshmen has been shown to forecast future academic achievement [13], and a significant correlation has been found between students’ computational abilities and their overall academic performance [14]. Additionally, CT has the potential to improve both quantitative and critical thinking skills [9] as well as the ability to abstract, generalize, and articulate persuasive arguments [10]. Moreover, a CT-centered curriculum has been observed to positively impact students’ perceptions, interests, and knowledge of computing [6,8]. Investigations have also revealed that coding contributes to the near and far transfer of problem-solving skills and enhances spatial reasoning abilities among fifth- and sixth-grade students [11]. Furthermore, the integration of CT into middle and high school classes has been found to improve knowledge related to algorithmic flow [12] and enhance CT skills among middle school students [4].
Consequently, many studies have designed and implemented computer science curricula for typical students. Nevertheless, there is a lack of research focused on designing and developing inclusive computer science curricula tailored to the needs of students with special needs. One such study [16] involved high school students formally diagnosed with learning disabilities or attention deficit hyperactivity disorder (ADHD). It identified challenges in teaching computer science (CS) to students with ADHD and proposed and tested adjustments toward an accessible CS course; successful adjustments addressing barriers in language, written expression, reading, attention, and mathematics were made and tested [16]. Another of the few such studies [15] aimed to adapt and accommodate an original CT curriculum to make it accessible to students with ASD (autism spectrum disorder).

2. Materials and Methods

2.1. The Original and Accessible Computing Curriculum

The original curriculum utilized in this study is the creative computing curriculum designed by [17]. This curriculum [17,18] is composed of seven units, each containing six sessions, for a total of forty-two sessions (see Table 1).
It provides instructional activities, accompanying resources centered on Scratch, learning objectives, and downloadable materials. The curriculum is designed around three primary dimensions of CT: computational thinking practices (CTPs), computational thinking concepts (CTCs), and computational thinking perspectives. CTCs encompass loops, sequences, conditionals, parallelism, data, and operators [17,18]. Meanwhile, CTPs involve debugging and testing, iterating and experimenting, modularizing and abstracting, and remixing and reusing [17,18].
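Concretely, several of the CTCs named above can be seen in a few lines of code. The sketch below is illustrative only: in the curriculum itself these concepts appear as Scratch blocks rather than text code, and the function here is invented for the example.

```python
def count_even(numbers):
    """Data: a list comes in; a counter variable stores state."""
    count = 0                 # sequence: statements execute in order
    for n in numbers:         # loop: repeat once per element
        if n % 2 == 0:        # conditional: branch on a test (operators: %, ==)
            count += 1
    return count

print(count_even([1, 2, 3, 4]))  # → 2
```

A Scratch project exercises the same concepts visually, e.g., a "repeat" block (loop) containing an "if" block (conditional) that updates a variable (data).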
However, the CT perspectives dimension is excluded from this study because its instructional activities aim to enhance students’ understanding of themselves, their relationships with others, and their comprehension of the surrounding technological world. Given the challenging nature of these activities for students with ASD and the neurological impairments of the participants, this dimension was not measured in the study.
The accessible computing curriculum designed by [15], based on the creative computing curriculum, was utilized in this study. It comprises six units, each containing six sessions, for a total of thirty-six sessions (see Table 1). Each session includes eight instructional components: pre-teaching activities, session schedule, session learning objectives, visual handouts, instructional activities, instructional videos, work evaluation rubrics, and reflection prompts. However, only the first four units were implemented in this study.

2.1.1. Session Schedule

Every one of the 36 sessions presents the same sequence of instructions, although the extent and scope vary depending on the session’s topics. Each session is initiated with a session schedule designed to alleviate any potential anxiety experienced by the students. This schedule serves to inform students about the anticipated activities and the allotted time for each task within the session. The session schedule is presented on a dedicated page, allowing teachers to print and display it on classroom walls or distribute copies to be placed on students’ desks. The duration of tasks and breaks is to be customized according to individual attention spans and behavioral requirements.

2.1.2. Pre-Teaching Activities

Following the session schedule, the next instructional element is the pre-teaching activities. These activities are specifically aimed at students facing more severe cognitive challenges, those experiencing difficulty grasping presented information, and individuals struggling to keep pace with instructions due to social, psychological, behavioral, or other reasons. Pre-teaching activities offer additional support and preparation for these students. They encompass three key instructional components: session terms, topics, and session expectations. To assist students in understanding unfamiliar terms related to the sessions, descriptions and visual representations (symbols) are provided. The descriptions of these terms and all instructional content presented to students are adjusted to match the students’ reading grade levels. Moreover, by outlining session expectations, students are presented with what is anticipated of them during the session. This helps alleviate anxiety and adequately prepares them for the upcoming session activities.

2.1.3. Session Learning Objectives

In contrast to the original curriculum, which provides a single set of objectives, the accessible curriculum introduces two sets of objectives. The first set, referred to as “session objectives”, serves to communicate the instructional target of the session to the classroom teacher. The second set, labeled as “learning objectives”, is designed to inform students about the outcomes they will attain by the session’s conclusion. These learning objectives are presented at a reading level appropriate for the students’ grade and comprehension abilities and are designed to be specific, measurable, attainable, and observable.

2.1.4. Instructional Activities

The original curriculum’s instructional activities were simplified and segmented into several smaller sections for easier management. These modifications aimed to accommodate students with a wide range of characteristics: visual, verbal, and kinesthetic learners; those comfortable working independently or in various group sizes; and students with visual or verbal response styles or who prefer to respond to peers, the class, or a notebook. One instructional activity introduced involves modeling, in which students complete the session activities alongside classroom teachers. Another has classroom teachers pair students of distinct characteristics with undergraduate students, facilitating one-on-one collaboration on session activities. A third permits students who prefer independent study to work on their own.

2.1.5. Visual Handouts

Twenty-four visual handouts have been designed and developed specifically for guiding students through step-by-step instructions to accomplish session projects and/or tasks. These visual guides have been formatted as individual PDF documents, allowing teachers to conveniently print and display them on classroom walls or place them on student desks.

2.1.6. Instructional Videos

Approximately 60 instructional videos were recorded specifically for students who prefer visual learning methods for the presentation of information. All these videos have been uploaded onto a YouTube channel [19] as an integral part of the accessible curriculum. Guidance and instructions detailing the optimal utilization of these instructional videos have been provided within the instructional documents for the sessions.

2.1.7. Reflection Prompts

Reflection prompts have been designed and developed for teachers to kickstart discussions or stimulate written or visual responses from students aligned with the session’s learning objectives. Within each session, one set of reflection prompts presented verbally and accompanied by visual symbols is designed to accommodate students with ASD. Moreover, following each reflection prompt, ample space has been allocated for students to provide their written and/or visual responses.

2.1.8. Work Evaluation Rubrics

To assess students’ accomplishments in meeting the session learning objectives, a dedicated rubric was designed and developed. Thirty-six rubrics in total were made available in PDF format. These rubrics were designed to assess student achievement across three levels of prompts for each item. The evaluation criteria comprised nine ratings to assess students’ performance: No Attempt Independently, No Attempt with a Visual or Virtual Cue, No Attempt with Physical Assistance, Insufficient Attempt Independently, Insufficient Attempt with a Visual or Virtual Cue, Insufficient Attempt with Physical Assistance, Complete Independently, Complete with a Visual or Virtual Cue, and Complete with Physical Assistance. These rubrics were employed by classroom teachers to assess students’ comprehension of CTCs at the end of each session. In total, 168 rubrics were gathered, with the number of students evaluated per session varying between 6 and 12.
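The nine ratings form a 3 × 3 grid: attempt level (no attempt, insufficient attempt, complete) crossed with prompt level (physical assistance, visual or virtual cue, independent). A hypothetical numeric coding of that grid (the paper does not specify one) might look like:

```python
# Hypothetical ordinal coding of the nine rubric ratings (a 3 x 3 grid);
# the actual scoring scheme is not specified in the paper.
ATTEMPT = {"No Attempt": 0, "Insufficient Attempt": 1, "Complete": 2}
PROMPT = {"with Physical Assistance": 0, "with a Visual or Virtual Cue": 1,
          "Independently": 2}

def rubric_score(attempt: str, prompt: str) -> int:
    """Map an (attempt, prompt) rating pair to a single 0-8 ordinal score."""
    return ATTEMPT[attempt] * 3 + PROMPT[prompt]

print(rubric_score("Complete", "Independently"))  # → 8
```

Weighting attempt level above prompt level is one of several defensible choices; any monotone coding of the grid would serve for ordinal analysis.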

3. Participants

Twenty-three seventh-grade students diagnosed with ASD, with an average age of 13 and drawn from two inner-city schools (PDMS and RCA), participated in this research. All students involved had the ability to use computer applications on Apple iPads. All seventh-grade students in both PDMS and RCA who volunteered to participate were included in the study.

4. Research Methods and Procedures

The independent variable in this study is the grouping, with PDMS serving as the experimental group and RCA as the control group. The experimental group comprised twelve seventh-grade students with ASD from PDMS, exposed to an accessible CT curriculum. In contrast, the control group consisted of eleven seventh-grade students with ASD from RCA, exposed to the original CT curriculum without modifications. The dependent variables encompassed pretest and posttest assessments designed to gauge students’ learning of CTCs, along with interview scores used to assess students’ proficiency development in CTPs.
In the course of this study, a total of twenty-four out of the thirty-six instructional sessions from the accessible CT curriculum were carried out. Each session spanned two 45 min classes held on two distinct days within a week. These sessions continued for a span of 24 weeks, totaling 48 teaching sessions, starting in September 2021 and concluding at the end of May 2022. In addition to these 48 teaching sessions, an extra ten class sessions were dedicated to assessment activities. Specifically, four class sessions were reserved at the beginning and end of the academic year for the administration of the pretest and posttest assessments. Additionally, two class sessions were allocated at the end of the academic year for conducting artifact-based interviews. As a result, the entire intervention comprised 58 class sessions conducted over approximately 30 weeks.

4.1. Data Analysis

The pretest and posttest were utilized to assess students’ learning of CTCs, and the artifact-based interviews were conducted to analyze the development of fluency in CTPs.

4.1.1. Pretest and Posttest

During both the pretest and posttest phases of the study, identical tests were administered to evaluate students’ proficiency in formulating and solving problems using computational thinking concepts (CTCs). Each test comprised 28 multiple-choice items, each presenting one correct answer among four choices. The pretest was administered at the outset of the study during the initial teaching session, and the posttest was given after the last teaching session. To examine the effect of the accessible CT curriculum on students’ learning of CTCs, an ANCOVA was conducted on the data collected from both groups: the dependent variable was the posttest score adjusted for the pretest score as a covariate, and the independent variable was group membership. The pretest and posttest scores represented the following CTCs: loops, sequences, conditionals, parallelism, data, and operators. We analyzed the differences through descriptive statistics and ANCOVA.
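The adjustment that ANCOVA performs can be sketched in a few lines: each group's posttest mean is shifted by the pooled within-group regression slope times the distance between that group's pretest mean and the grand pretest mean. The sketch below uses made-up scores and group names, not the study's data.

```python
def ancova_adjusted_means(groups):
    """groups: dict name -> list of (pretest, posttest) pairs."""
    all_pre = [p for g in groups.values() for p, _ in g]
    grand_pre = sum(all_pre) / len(all_pre)
    # pooled within-group slope of posttest on pretest
    num = den = 0.0
    for g in groups.values():
        mp = sum(p for p, _ in g) / len(g)
        mq = sum(q for _, q in g) / len(g)
        num += sum((p - mp) * (q - mq) for p, q in g)
        den += sum((p - mp) ** 2 for p, _ in g)
    slope = num / den
    # shift each group's posttest mean to the grand pretest mean
    return {name: sum(q for _, q in g) / len(g)
            - slope * (sum(p for p, _ in g) / len(g) - grand_pre)
            for name, g in groups.items()}

demo = {"A": [(1, 2), (2, 3), (3, 4)], "B": [(2, 3), (3, 4), (4, 5)]}
print(ancova_adjusted_means(demo))  # both adjusted means come out to 3.5
```

In the demo, group B's raw posttest mean is a point higher, but once pretest differences are removed the adjusted means are identical, which is exactly the comparison the ANCOVA's F test evaluates.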

4.1.2. Artifact-Based Interviews

Toward the end of the project in May, fifteen students (seven students at PDMS and eight students at RCA) worked for about two weeks on computing projects to present at a showcase day. The computing projects included games, digital stories, and interactive music projects. The week after the showcase day, students were interviewed in an attempt to understand how they utilized CTPs in their projects. The rubric developed by [17] was used, after adjusting it for students with ASD, to assess students’ development of fluency in CTPs; the CTP scores were the ratings from these artifact-based interviews. In addition to eliminating five questions from the original rubric, resulting in ten total questions, the language and scope of the questions were simplified to accommodate students with ASD. The adjusted rubric consists of a few questions for each aspect of CTPs and three columns to rate responses as low, medium, or high, with specific criteria defined for each rating. Each interview took about 10 to 20 min, depending on the student and the student’s computing project. A multivariate analysis of variance (MANOVA) was run to determine whether there was any statistically significant difference between the means of the two groups (PDMS and RCA) on their artifact-based interview scores.

5. Results

The objective of the study was to evaluate the effectiveness of the accessible computing curriculum in improving the learning of CTCs and the development of fluency in CTPs among seventh-grade students at PDMS and RCA. The independent variable in the study was group membership, with one group of seventh-grade students from PDMS and the other from RCA. These groups were exposed to different curriculums: the PDMS group to the accessible CT curriculum and the RCA group to the original CT curriculum. Both groups underwent the intervention for the same duration. The study analyzed the pretest and posttest scores, along with artifact-based interview scores, for both groups.

5.1. Pretest-Posttest Scores

Table 2 shows descriptive statistics for the pretest and posttest scores of the PDMS and RCA groups. Although 12 students took the pretest at PDMS, only 8 of them took the posttest. The same 11 students took both the pretest and posttest at RCA. As seen in Table 2, the mean scores at both the pretest and posttest were higher for students at PDMS than for students at RCA.
An independent samples t-test conducted on the pretest scores (see Table 3) showed no statistically significant difference between the two groups, PDMS and RCA. The Shapiro-Wilk test results (see Table 4) demonstrated that the pretest scores of both groups were normally distributed, meeting the normality assumption for parametric tests. Additionally, Levene’s test (see Table 5) revealed no significant difference in the variances of the pretest scores between the two groups, upholding the assumption of homogeneity of variance for the independent samples t-test.
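Levene's test, used here to check equal variances, is itself a one-way ANOVA run on each score's absolute deviation from its group mean; a large F (small p) signals unequal variances. A minimal sketch with hypothetical data:

```python
def levene_w(*samples):
    """Classic Levene statistic: one-way ANOVA F computed on absolute
    deviations from each group's mean. Data below are hypothetical."""
    devs = []
    for s in samples:
        m = sum(s) / len(s)
        devs.append([abs(x - m) for x in s])
    k = len(devs)
    n_total = sum(len(d) for d in devs)
    grand = sum(x for d in devs for x in d) / n_total
    ss_between = sum(len(d) * (sum(d) / len(d) - grand) ** 2 for d in devs)
    ss_within = sum(sum((x - sum(d) / len(d)) ** 2 for x in d) for d in devs)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

print(levene_w([1, 2, 3], [1, 3, 5]))  # ≈ 0.8 for these samples
```

A common variant (Brown-Forsythe) centers on the group median instead of the mean, which is more robust to outliers; either would serve the assumption check described above.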
An ANCOVA test was conducted to find out if the experiment, the accessible CT curriculum, made any difference in students’ learning of CTCs. As seen in Table 6, the groups were not statistically significantly different at the posttest on their knowledge of CTCs after adjusting for pretest scores.
To find out whether the research project across the groups, regardless of the type of CT curriculum implemented (accessible or original), had a statistically significant effect on students’ learning of CTCs, a paired samples t-test was conducted. The descriptive statistics in Table 7 show that the mean scores across groups were higher at the posttest than at the pretest. The Shapiro-Wilk normality test shows that the scores on the pretest and posttest were normally distributed; Q–Q plots also supported normality, and box plots showed no outliers. The paired samples t-test (see Table 8) showed a statistically significant one-sided result (M = −1.579, SE = 0.814; t(18) = −1.940, p = 0.034). The level of significance (alpha) was set at 0.05.
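The paired t statistic is simply the mean pre-to-post difference divided by its standard error (note that −1.579/0.814 ≈ −1.940, which is why the 0.814 above reads as a standard error). A minimal sketch with hypothetical scores, not the study's data:

```python
import math

def paired_t(pre, post):
    """One-sample t on the paired differences (hypothetical data below)."""
    diffs = [b - a for a, b in zip(pre, post)]  # post - pre gains
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    se = sd / math.sqrt(n)
    return mean / se, n - 1  # t statistic and degrees of freedom

t, df = paired_t([1, 2, 3, 4], [2, 4, 3, 6])
print(round(t, 3), df)  # → 2.611 3
```

With df = 18 as in Table 8, 19 paired scores entered the test, consistent with the 8 PDMS and 11 RCA students who completed both assessments.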

5.2. Interviews

To examine the effects of the accessible CT curriculum on students’ development of fluency in CTPs, as measured by their interview scores in debugging and testing, iterating and experimenting, modularizing and abstracting, and remixing and reusing, a MANOVA test was run. This test analyzed the two groups (PDMS and RCA, the independent variable) over each dependent variable individually and over all of them together as a combined CTP construct. The significance level (alpha) for these analyses was set at 0.05, so results were considered statistically significant if the p-value was less than 0.05. Wilks’ Lambda was employed to ascertain the overall multivariate significance of the dependent variables across the groups. Furthermore, to investigate whether the accessible curriculum had any impact on individual CTP scores, separate ANOVAs were conducted, with group as the independent variable and each CTP construct as the dependent variable. This allowed the researchers to determine whether the two groups differed on each CTP score separately.
Descriptive statistics (see Table 9) of the interview scores for both groups (PDMS and RCA) showed that the mean scores were higher for students in PDMS on each CTP individually (debugging and testing, iterating and experimenting, modularizing and abstracting, and remixing and reusing) and on the composite CTP score. Levene’s test of homogeneity of variance (see Table 10) showed that the assumption of equal variances held for students’ CTP composite scores and all individual CTP scores except abstracting and modularizing.
The ANOVA statistics (tests of between-groups effects), as presented in Table 11, demonstrated that the groups were statistically significantly different on every CTP individually: debugging and testing (F(1, 13), p = 0.004; partial eta squared = 0.493; power = 0.907), iterating and experimenting (F(1, 13), p = 0.010; partial eta squared = 0.408; power = 0.791), modularizing and abstracting (F(1, 13), p = 0.008; partial eta squared = 0.432; power = 0.828), and remixing and reusing (F(1, 13), p = 0.003; partial eta squared = 0.510; power = 0.925). However, because the homogeneity of variance assumption was violated for the abstracting and modularizing CTP, robust tests of equality of means were also conducted (see Table 12); these reaffirmed that the groups (PDMS and RCA) were statistically significantly different on abstracting and modularizing as well.
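For a design with these degrees of freedom, F and partial eta squared are interconvertible via partial η² = (df_effect · F) / (df_effect · F + df_error), which makes the reported effect sizes easy to cross-check; for example, a partial η² of 0.493 with df = (1, 13) corresponds to F ≈ 12.6. A small sketch:

```python
def f_from_partial_eta_sq(eta_sq, df_effect, df_error):
    """Invert partial eta^2 = (df_effect*F) / (df_effect*F + df_error)."""
    return (eta_sq / (1 - eta_sq)) * (df_error / df_effect)

def partial_eta_sq_from_f(f, df_effect, df_error):
    """Forward direction: effect size from an F statistic."""
    return (df_effect * f) / (df_effect * f + df_error)

print(round(f_from_partial_eta_sq(0.493, 1, 13), 2))  # → 12.64
```

The same conversion applied to the other three partial η² values (0.408, 0.432, 0.510) recovers the F statistics behind the p-values reported for those CTPs.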
Multivariate test results, as seen in Table 13, did not show a statistically significant effect (F(4, 10), p = 0.088; Wilks’ Lambda = 0.476; partial eta squared = 0.524; observed power = 0.542) on the composite CTP scores in differentiating the two groups (PDMS and RCA).

6. Discussion

6.1. Pretest–Posttest Scores

Based on the analysis conducted using an independent sample t-test on the pretest scores (see Table 3), it is reasonable to conclude that, at the beginning of the study, the groups were not statistically significantly different in their knowledge of CTCs.
At the end of the study, the ANCOVA results (see Table 6) showed that the groups did not differ statistically significantly on the posttest after adjusting for pretest scores; that is, neither the accessible nor the original CT curriculum proved more effective than the other, contrary to the hypothesis of this study. There may be a few reasons for this result. One is that four students at PDMS took the pretest but not the posttest. A second is that not only is every student with autism different, but the two groups also differed considerably in their autism diagnostics, their place on the autism spectrum, their executive functioning skills, and their learning preferences. For example, as seen in Table 14, students at RCA generally had higher mean BRIEF 2 [20] scores for their executive functioning skills. A third, and perhaps more important, reason is that the two teachers at RCA involved in this study were free to adjust the original CT curriculum to suit their students and were not asked to refrain from changing how its instructions were presented. They therefore adjusted their implementation of the original CT curriculum to ensure that students learned the targeted CTCs by the specific sessions in which they were introduced.
The paired samples t-test conducted to explore whether the research study had a statistically significant impact across the groups indicated a significant effect in one direction: posttest scores were statistically higher than pretest scores, as presented in Table 8. This means the intervention, that is, the implementation of a CT curriculum (accessible or original), had a measurable and beneficial impact on students’ skills and knowledge of CTCs.

6.2. Interviews

The results presented in Table 11 indicate that students at PDMS, as compared to students in RCA, showed statistically significant improvements in their fluency of CTPs in all four dimensions: debugging and testing, iterating and experimenting, modularizing and abstracting, and remixing and reusing. This suggests that the accessible CT curriculum, compared to the original CT curriculum, was more effective in improving students’ proficiency in each dimension of computational thinking practice. In other words, an accessible CT curriculum would better benefit students with ASD in improving their CTPs in each dimension. On the other hand, there was no statistically significant difference in effectiveness between the accessible CT curriculum and the original CT curriculum in enhancing students’ fluency in CTPs when all CTPs were considered collectively, as presented in Table 13. That is, even though the groups exhibited differences in individual CTPs, these differences did not manifest when all CTPs were considered as a single construct, which may be explained by the small sample size.

7. Conclusions

The study aimed to evaluate an accessible CT curriculum for ASD students, focusing on improving their understanding of CTCs and fluency in CTPs. It used pretests, posttests, and interviews to assess their progress. The experimental group received the accessible curriculum, while the control group used the original one.
Students in the PDMS and RCA groups did not differ significantly in their knowledge of CTCs at the pretest. When comparing the two groups on their posttest scores, no statistically significant difference was found, indicating that the accessible CT curriculum and the original one did not differ significantly in their impact on CTC learning. However, when comparing pretest scores to posttest scores across both groups (PDMS and RCA), a statistically significant improvement was observed, suggesting that all participating students with ASD enhanced their learning of CTCs from the pretest to the posttest. This result aligns with the prior literature, which reported successful and effective implementation of computing curriculums for mainstream students [6,8,9,10,11,13,14], students with ADHD [16], and students with ASD [15]. That is, computing curriculums, whether designed for mainstream students or adjusted for students with ASD, are effective in increasing the learning of CTCs in students with ASD.
Individual ANOVA tests on each CTP, based on interviews conducted with fifteen students (seven in PDMS and eight in RCA) to assess their fluency development in CTPs, revealed a significant difference between the two groups on all individual CTP scores, indicating that the accessible CT curriculum was significantly more effective than the original one in improving fluency in debugging and testing, iterating and experimenting, modularizing and abstracting, and remixing and reusing. The multivariate analysis of variance, however, did not show a statistically significant overall effect when all CTPs were considered together as a single construct, possibly because of the small sample size. The individual results are congruent with the findings by [15,16], who reported successful implementation of computing curriculums for students with learning differences. Overall, the findings of this study point toward the effectiveness of an accessible curriculum in improving the development of fluency in CTPs in students with ASD.
A future study could examine the effectiveness of an accessible computing curriculum across different grade levels. Additionally, exploring the impact of involving more students may lead to different outcomes in the learning of CTCs, application of CTCs, and development of fluency in CTPs. Lastly, further research could investigate the correlation between curriculum sessions and students’ artifact scores to assess its influence on CTP fluency.

Author Contributions

Conceptualization, A.A. and M.L.B.; methodology, A.A.; formal analysis, A.A. and R.I.; investigation, A.A., M.L.B., G.V.B. and K.P.; resources, A.A. and M.L.B.; data curation, A.A., M.L.B. and G.V.B.; writing—original draft preparation, A.A. and R.I.; writing—review and editing, A.A., G.V.B. and K.P.; visualization, A.A. and R.I.; supervision, A.A., M.L.B., G.V.B. and K.P.; project administration, A.A., M.L.B., G.V.B. and K.P.; funding acquisition, A.A. and M.L.B. All authors have read and agreed to the published version of the manuscript.

Funding

This material is based upon work supported by the National Science Foundation under Grant No. 2031427. Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the view of the National Science Foundation.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board (or Ethics Committee) of Youngstown State University (protocol #005-21, approved 28 October 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets presented in this article are not readily available owing to privacy considerations and the need to protect the participating minors, who have special characteristics. Requests to access the datasets should be directed to aarslanyilmaz@ysu.edu.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Woollard, J.; Selby, C. Refining an Understanding of Computational Thinking. 2014. Available online: http://eps.cc.ysu.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=edsair&AN=edsair.od.......348..1f41eef6be5e67cbda0c347d77ce33ae&site=eds-live (accessed on 2 March 2020).
  2. Lockwood, J.; Mooney, A. Computational Thinking in Education: Where Does It Fit? A Systematic Literary Review. Int. J. Comput. Sci. Educ. Sch. 2018, 2, 41–60. [Google Scholar] [CrossRef]
  3. Wing, J. Computational Thinking. Commun. ACM 2006, 49, 33–35. [Google Scholar] [CrossRef]
  4. Brackmann, C.P.; Román-González, M.; Robles, G.; Moreno-León, J.; Casali, A.; Barone, D. Development of Computational Thinking Skills through Unplugged Activities in Primary School. In Proceedings of the 12th Workshop on Primary and Secondary Computing Education, WiPSCE ’17, Nijmegen, The Netherlands, 8–10 November 2017; ACM: New York, NY, USA, 2017; pp. 65–72. [Google Scholar] [CrossRef]
  5. Folk, R.; Lee, G.; Michalenko, A.; Peel, A.; Pontelli, E. GK-12 DISSECT: Incorporating Computational Thinking with K-12 Science without Computer Access. In Proceedings of the Frontiers in Education Conference, FIE, El Paso, TX, USA, 21–24 October 2015; Volume 2014, pp. 1–8. [Google Scholar] [CrossRef]
  6. Goldberg, D.S.; Grunwald, D.; Lewis, C.; Feld, J.A.; Hug, S. Engaging Computer Science in Traditional Education: The ECSITE Project. In Proceedings of the ACM Conference on Innovation and Technology in Computer Science Education, ITiCSE’12, Haifa, Israel, 3–5 July 2012; pp. 351–356. [Google Scholar] [CrossRef]
  7. Hambrusch, S.; Hoffmann, C.; Korb, J.T.; Haugan, M.; Hosking, A.L. A Multidisciplinary Approach towards Computational Thinking for Science Majors. SIGCSE Bull. Inroads 2009, 41, 183–187. [Google Scholar] [CrossRef]
  8. Burgett, T.; Folk, R.; Fulton, J.; Peel, A.; Pontelli, E.; Szczepanski, V. DISSECT: Analysis of Pedagogical Techniques to Integrate Computational Thinking into K-12 Curricula. In Proceedings of the Frontiers in Education Conference, FIE, El Paso, TX, USA, 21–24 October 2015; Volume 2014, pp. 1–9. [Google Scholar] [CrossRef]
  9. Qin, H. Teaching Computational Thinking through Bioinformatics to Biology Students. SIGCSE Bull. Inroads 2009, 41, 188–191. [Google Scholar] [CrossRef]
  10. Jenkins, J.T.; Jerkins, J.A.; Stenger, C.L. A Plan for Immediate Immersion of Computational Thinking into the High School Math Classroom through a Partnership with the Alabama Math, Science, and Technology Initiative. In Proceedings of the Annual Southeast Conference, Tuscaloosa, AL, USA, 29–31 March 2012; pp. 148–152. [Google Scholar] [CrossRef]
  11. Miller, R.B.; Kelly, G.N.; Kelly, J.T. Effects of Logo Computer Programming Experience on Problem Solving and Spatial Relations Ability. Contemp. Educ. Psychol. 1988, 13, 348–357. [Google Scholar] [CrossRef]
  12. Grover, S.; Cooper, S.; Pea, R. Assessing Computational Learning in K-12. In Proceedings of the 2014 Innovation and Technology in Computer Science Education Conference, Uppsala, Sweden, 21–25 June 2014; pp. 57–62. [Google Scholar] [CrossRef]
  13. Haddad, R.; Kalaani, Y. Can Computational Thinking Predict Academic Performance? In Proceedings of the 5th IEEE Integrated STEM Education Conference, Princeton, NJ, USA, 7 March 2015; pp. 225–229. [Google Scholar] [CrossRef]
  14. De Oliveira, O.L.; Del Val Cura, L.M.; Do Carmo Nicoletti, M. Quantitative Correlation between Ability to Compute and Student Performance in a Primary School. In Proceedings of the 45th ACM Technical Symposium on Computer Science Education, SIGCSE 2014, Atlanta, GA, USA, 5–8 March 2014; pp. 505–510. [Google Scholar] [CrossRef]
  15. Arslanyilmaz, A.; Briley, M.; Loto, M.B.; Fernberg, C.; Beadle, G.; Coldren, J. An Accessible Computing Curriculum for Students with Autism Spectrum Disorders. In Proceedings of Society for Information Technology & Teacher Education International Conference; Langran, E., Archambault, L., Eds.; Association for the Advancement of Computing in Education: Waynesville, NC, USA, 2021; pp. 17–23. [Google Scholar]
  16. Wille, S.; Century, J.; Pike, M. Exploratory Research to Expand Opportunities in Computer Science for Students with Learning Differences. Comput. Sci. Eng. 2017, 19, 40–50. [Google Scholar] [CrossRef]
  17. Brennan, K.; Balch, C.; Chung, M. Creative Computing: Scratch Curriculum Guide. 2014. Available online: https://www.scratched.gse.harvard.edu/guide (accessed on 31 December 2017).
  18. Brennan, K.; Resnick, M. New Frameworks for Studying and Assessing the Development of Computational Thinking. In Proceedings of the 2012 Annual Meeting of the American Educational Research Association, Vancouver, BC, Canada, 13–17 April 2012. [Google Scholar]
  19. ISAC YouTube Channel; Youngstown, OH. 2021. Available online: https://www.youtube.com/channel/UCE2RvGLMnVWDZun7YH6bE8g (accessed on 1 September 2023).
  20. Gioia, G.A.; Isquith, P.K.; Guy, S.C.; Kenworthy, L. Behavior Rating Inventory of Executive Function. Available online: https://www.parinc.com/products/pkey/24 (accessed on 22 August 2021).
Table 1. Unit topics covered in both the accessible and original CT curricula.
| Unit | Topics |
| 1 | Introducing Scratch, Scratch Accounts, Design Journal, Scratch Surprise, Scratch Studio, Critique Group |
| 2 | Programmed to Dance, Step by Step, Ten Blocks, My Studio, Debug It, About Me |
| 3 | Performing Scripts, Build-A-Band, Orange Square—Purple Circle, It’s Alive, Debug It!, Music Video |
| 4 | Characters, Conversations, Scenes, Debug It!, Create Construction, Pass It On |
| 5 | Dream Game List, Starter Games, Score, Extensions, Interactions, Debug It! |
| 6 | Know Want Learn, Round Two, Advanced Concepts, Hardware & Extensions, Activity Design, My Debug It! |
| 7 | Project Pitch, Project Planning Design Sprint, Project Feedback, Project Check-In, Showcase |
Table 2. Descriptive statistics on pretest and posttest.
| School | Pretest Mean | Pretest Std. Dev. | Pretest N | Posttest Mean | Posttest Std. Dev. | Posttest N |
| PDMS | 8.58 | 2.392 | 12 | 10.25 | 2.659 | 8 |
| RCA | 6.82 | 2.040 | 11 | 8.82 | 2.676 | 11 |
| Total | 7.74 | | | 9.42 | 2.694 | 19 |
Table 3. Independent samples t-test on pretest.
| | t | df | One-Sided p | Two-Sided p | Mean Difference | Std. Err. Dif. | 95% CI Lower | 95% CI Upper |
| Equal variances assumed | 1.89 | 21 | 0.036 * | 0.072 | 1.765 | 0.931 | −0.172 | 3.702 |
* p < 0.05, one-tailed.
Table 4. Shapiro-Wilk test results.
| | School | Statistic | df | Sig. |
| Pretest | PDMS | 0.942 | 12 | 0.521 |
| | RCA | 0.947 | 11 | 0.601 |
Table 5. Levene’s homogeneity of variance test across groups.
| | Levene Statistic | df1 | df2 | Sig. |
| Pretest | 0.276 | 1 | 21 | 0.605 |
Table 6. Test of between-subjects effect (ANCOVA) after adjusting for pretest scores.
| Source | Type III Sum of Squares | df | Mean Square | F | Sig. | Partial Eta Squared |
| Pretest | 0.726 | 1 | 0.726 | 0.097 | 0.760 | 0.006 |
| PDMS vs. RCA | 9.615 | 1 | 9.615 | 1.278 | 0.275 | 0.074 |
R Squared = 0.078 (Adjusted R Squared = −0.037).
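The ANCOVA reported in Table 6 can be outlined as a comparison of nested least-squares regressions: the group effect is tested after the pretest covariate has been entered. The scores and group coding below are hypothetical placeholders, and `ancova_f` is an illustrative helper rather than the authors' statistical-software procedure:

```python
import numpy as np

# Hypothetical pre/post scores and group labels (0 = one school,
# 1 = the other); placeholder values, not the study's data.
pre = np.array([9., 7., 10., 8., 6., 11., 9., 5., 7., 8., 6., 9., 7., 10.])
post = np.array([11., 8., 12., 9., 7., 12., 10., 7., 8., 10., 7., 10., 9., 11.])
group = np.array([0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1], dtype=float)

def ancova_f(y, covariate, factor):
    """F test for `factor` after adjusting for `covariate` (one-way
    ANCOVA), computed by comparing nested least-squares regressions."""
    n = len(y)
    X_full = np.column_stack([np.ones(n), covariate, factor])
    X_red = np.column_stack([np.ones(n), covariate])
    # Residual sum of squares for a given design matrix.
    sse = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    df_resid = n - X_full.shape[1]
    F = (sse(X_red) - sse(X_full)) / (sse(X_full) / df_resid)
    return F, df_resid

F, df = ancova_f(post, pre, group)
```

The F statistic for the group term is then referred to an F(1, df) distribution, matching the one-degree-of-freedom group row in Table 6.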
Table 7. Descriptive statistics for paired samples t-test.
| Source | Mean | St. Err. | Std. Dev. | N | Shapiro-Wilk Statistic | df | Sig. |
| Pretest | 7.84 | 0.574 | 2.519 | 19 | 0.957 | 19 | 0.507 |
| Posttest | 9.42 | 0.618 | 2.694 | 19 | 0.966 | 19 | 0.694 |
Table 8. Paired samples t-test.
| Source | Mean | Std. Dev. | Std. Err. Mean | 95% CI Lower | 95% CI Upper | t | df | Significance (One-Sided) |
| Pretest − Posttest | −1.579 | 3.548 | 0.814 | −3.289 | 0.131 | −1.94 | 18 | 0.034 * |
* p < 0.05, one-tailed.
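As a worked illustration of the paired comparison in Table 8, the sketch below runs a one-sided paired-samples t-test in SciPy on simulated pretest/posttest scores for nineteen students (placeholder values generated at random, not the study's data):

```python
import numpy as np
from scipy import stats

# Simulated scores for the same 19 students before and after instruction;
# the gain distribution (mean ~1.6, sd ~3.5) loosely mirrors Table 8 but
# the individual values are hypothetical.
rng = np.random.default_rng(0)
pre = rng.integers(4, 12, size=19).astype(float)
post = pre + rng.normal(1.6, 3.5, size=19)

# One-sided paired t-test: alternative="less" tests whether the mean of
# (pre - post) is negative, i.e., whether posttest scores are higher.
res = stats.ttest_rel(pre, post, alternative="less")
```

With 19 paired observations the test has 18 degrees of freedom, as in Table 8; `res.statistic` and `res.pvalue` hold the t value and one-sided p value.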
Table 9. Descriptive statistics on the artifact-based interview scores.
| CTP | School | N | Mean | Std. Dev. | Std. Err. | 95% CI Lower | 95% CI Upper | Minimum | Maximum |
| Experimenting and Iterating | PDMS | 7 | 2.619 | 0.756 | 0.286 | 1.919 | 3.319 | 1.00 | 3.00 |
| | RCA | 8 | 1.416 | 0.792 | 0.279 | 0.755 | 2.078 | 1.00 | 3.00 |
| Testing and Debugging | PDMS | 7 | 2.500 | 0.866 | 0.327 | 1.699 | 3.301 | 1.00 | 3.00 |
| | RCA | 8 | 1.250 | 0.463 | 0.164 | 0.863 | 1.637 | 1.00 | 2.00 |
| Reusing and Remixing | PDMS | 7 | 2.571 | 0.787 | 0.297 | 1.844 | 3.230 | 1.00 | 3.00 |
| | RCA | 8 | 1.313 | 0.531 | 0.188 | 0.870 | 1.756 | 1.00 | 2.50 |
| Abstracting and Modularizing | PDMS | 7 | 2.333 | 0.882 | 0.333 | 1.518 | 3.149 | 1.00 | 3.00 |
| | RCA | 8 | 1.208 | 0.470 | 0.166 | 0.816 | 1.601 | 1.00 | 2.33 |
Notes: PDMS = Potential Development Middle School; RCA = Rich Center for Autism; CTPs: Computational Thinking Practices.
Table 10. Levene’s test of homogeneity of variance.
| | Levene Statistic | df1 | df2 | Sig. |
| Experimenting and Iterating | 0.121 | 1 | 13 | 0.734 |
| Testing and Debugging | 4.286 | 1 | 13 | 0.059 |
| Reusing and Remixing | 1.297 | 1 | 13 | 0.275 |
| Abstracting and Modularizing | 7.188 | 1 | 13 | 0.019 * |
* p < 0.05.
Table 11. Test of between-subject effects.
| Dependent Variable | Type III Sum of Squares | df | Mean Square | F | Sig. | Partial Eta Squared | Observed Power |
| Experimenting and Iterating | 5.397 | 1 | 5.397 | 8.975 | 0.010 * | 0.408 | 0.791 |
| Testing and Debugging | 5.833 | 1 | 5.833 | 12.639 | 0.004 * | 0.493 | 0.907 |
| Reusing and Remixing | 5.917 | 1 | 5.917 | 13.535 | 0.003 * | 0.510 | 0.925 |
| Abstracting and Modularizing | 4.725 | 1 | 4.725 | 9.894 | 0.008 * | 0.432 | 0.828 |
* p < 0.05.
Table 12. Robust test of equality of means.
| Dependent Variable | Test | Statistic | df1 | df2 | Sig. |
| | Brown-Forsythe | 12.824 | 1 | 10.321 | 0.005 * |
| Abstracting and Modularizing | Welch | 9.129 | 1 | 8.875 | 0.015 * |
| | Brown-Forsythe | 9.129 | 1 | 8.875 | 0.015 * |
* p < 0.05.
Table 13. Between-subjects effect (MANOVA) on all CTPs combined.
| Effect | Value | F | Hypothesis df | Error df | Sig. | Partial Eta Squared | Noncent. Parameter | Observed Power |
| Wilks’ Lambda | 0.476 | 2.76 | 4.00 | 10.0 | 0.088 | 0.524 | 11.02 | 0.542 |
Table 14. BRIEF 2 scores.
| School | BRI | ERI | CRI | Composite |
| PDMS | 4.514 | 3.386 | 4.271 | 6.97727 |
| RCA | 4.852 | 3.068 | 4.195 | 7.39935 |
Notes. PDMS: Potential Development Middle School; RCA: Rich Center for Autism; BRI: behavior regulation index; ERI: emotional regulation index; CRI: cognitive regulation index.
