New Insights in Learning Analytics

A special issue of Analytics (ISSN 2813-2203).

Deadline for manuscript submissions: closed (31 December 2023) | Viewed by 6437

Special Issue Editors


Prof. Dr. Vivekanandan Suresh Kumar
Guest Editor
School of Computing and Information Systems, Faculty of Science and Technology, Athabasca University, Edmonton, AB, Canada
Interests: big data learning analytics; artificial intelligence; self-regulated and co-regulated learning; human-computer interaction; causal modelling

Prof. Dr. Kinshuk
Guest Editor
College of Information, The University of North Texas, Denton, TX, USA
Interests: learning analytics; learning technologies; mobile, ubiquitous, and location-aware learning systems; cognitive profiling; interactive technologies

Special Issue Information

Dear Colleagues,

Akin to collecting intelligence on individuals for national security purposes, learning analytics (LA) collects intelligence on humans and provides insightful frameworks for its proper application. Observing learning engagements in study episodes, LA agents and sensors collect ethically clean, time-sensitive data over these episodes. From these securely communicated datasets, LA then extracts information and creates models that yield actionable insights. Specific actions can be carried out in light of these insights, and the impact of those actions serves as a measure of the quality of learning. Generating intelligence that helps humans succeed in learning is the singular goal of learning analytics.

Learning Science explores the nature of learning as an interweaving of theories, models, algorithms, traits, datasets, practices, and insights originating from areas such as computing, mathematics, health, education, sociology, the humanities, and statistics. Learning analytics is an emerging area of study within learning science. LA offers a meta-level view of the quality of human learning with respect to influencing variables of interest. Technology enhancement, cognition and meta-cognition, pedagogy and andragogy, end-user competences, and learning environments, as well as governance and policymaking, are some of the key influencing variables.

In general, LA derives intelligence on the quality of human learning from educational data. Meta-level views in LA include measurements of, and relations among, utility, optimality, generalizability, intervention, explainability, and customizability, among other ‘unknown common truths’ pertaining to the quality of human learning. For instance, LA intelligence concerns learners’ capacity, challenges, progress, and accomplishments, as well as the effectiveness of learning contexts, approaching the elusive question of causality: learning quality as a function of contextual factors such as instructional effectiveness and adaptability. As another example, LA intelligence on technology enhancement concerns computational entities, learning machines, and technological gadgets, among others, and their impact on the quality of learner interaction.

This Special Issue invites research contributions on learning analytics. Whether the research traces skill development or metacognition in human learning, and whether it focuses on face-to-face, online, or blended environments, this Issue seeks to share new insights in the field of LA.

Topics of interest include:

A. Learning analytic theory and science

  1. Analysis of unstructured and semi-structured data;
  2. Security, privacy, and veracity of analytics;
  3. Ethics of learning analytics;
  4. Scalability of machine learning and data mining for analytics;
  5. Computing infrastructure for analytics—cloud, grid, autonomic, stream, mobile, high-performance computing;
  6. Search in learning traces;
  7. Artificial intelligence in learning analytics;
  8. Uncertainty handling in analytics;
  9. IoT and learning analytics.

B. Applications of learning analytics

  1. Detecting students’ approach to learning;
  2. Analytics in academic administration;
  3. Analytics in complex training;
  4. Gaming analytics and sports analytics;
  5. Evidence-driven instruction in interdisciplinary and single-discipline areas;
  6. Big data and educational technology;
  7. Analytics in academic strategic planning;
  8. Cultural analytics;
  9. Large-scale social networks;
  10. Data literacy;
  11. Technological literacy and analytics;
  12. Human literacy and analytics.

C. Techniques and technology in learning analytics

  1. Evidence-driven mixed-initiative learning;
  2. Data-intensive learning and instructional design;
  3. Emerging standards in learning analytics;
  4. Sentiment analysis;
  5. Large-scale productivity analysis;
  6. Analytic infrastructure for academic institutions;
  7. Scalable knowledge management;
  8. Research methods for learning analytics;
  9. Immersive learning.

Prof. Dr. Vivekanandan Suresh Kumar
Prof. Dr. Kinshuk
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Analytics is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (4 papers)

Research

16 pages, 984 KiB  
Article
Learner Engagement and Demographic Influences in Brazilian Massive Open Online Courses: Aprenda Mais Platform Case Study
by Júlia Marques Carvalho da Silva, Gabriela Hahn Pedroso, Augusto Basso Veber and Úrsula Gomes Rosa Maruyama
Analytics 2024, 3(2), 178-193; https://doi.org/10.3390/analytics3020010 - 03 Apr 2024
Viewed by 1039
Abstract
This paper explores the dynamics of student engagement and demographic influences in Massive Open Online Courses (MOOCs). The study analyzes multiple facets of Brazilian MOOC participation, including re-enrollment patterns, course completion rates, and the impact of demographic characteristics on learning outcomes. Using survey data and statistical analyses from the public Aprenda Mais Platform, this study reveals that MOOC learners exhibit a strong tendency toward continuous learning, with a majority re-enrolling in subsequent courses within a short timeframe. The average completion rate across courses is 42.14%, with learners maintaining consistent academic performance. Demographic factors, notably race/color and disability, are found to influence enrollment and completion rates, underscoring the importance of inclusive educational practices. Geographical location also affects students’ decisions to enroll in and complete courses, highlighting the necessity for region-specific educational strategies. The research concludes that a diverse array of factors, including content interest, personal motivation, and demographic attributes, shape student engagement in MOOCs. These insights are vital for educators and course designers in creating effective, inclusive, and engaging online learning experiences.
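
The engagement metrics discussed above are straightforward to derive from enrollment records. The following is a minimal sketch in Python/pandas of how completion and re-enrollment rates might be computed; the column names and toy data are illustrative assumptions, not the Aprenda Mais schema.

    # Hypothetical sketch: completion and re-enrollment rates from
    # enrollment records. Column names and data are assumptions.
    import pandas as pd

    df = pd.DataFrame({
        "student_id": [1, 1, 2, 3, 3, 3],
        "course_id":  ["A", "B", "A", "A", "B", "C"],
        "completed":  [True, False, True, True, True, False],
    })

    # Completion rate per course (the paper reports a 42.14% average overall).
    print(df.groupby("course_id")["completed"].mean())

    # Share of learners who re-enrolled in at least one further course.
    courses_per_student = df.groupby("student_id")["course_id"].nunique()
    print((courses_per_student > 1).mean())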

17 pages, 1756 KiB  
Article
Code Plagiarism Checking Function and Its Application for Code Writing Problem in Java Programming Learning Assistant System
by Ei Ei Htet, Khaing Hsu Wai, Soe Thandar Aung, Nobuo Funabiki, Xiqin Lu, Htoo Htoo Sandi Kyaw and Wen-Chung Kao
Analytics 2024, 3(1), 46-62; https://doi.org/10.3390/analytics3010004 - 17 Jan 2024
Viewed by 773
Abstract
A web-based Java programming learning assistant system (JPLAS) has been developed for novice students to study Java programming by themselves while enhancing code reading and code writing skills. One type of implemented exercise problem is the code writing problem (CWP), which asks students to create source code that can pass the given test code. The correctness of an answer code is validated by running it against the test code on JUnit. In previous works, a Python-based answer code validation program was implemented to assist teachers. It automatically verifies the source codes from all the students for one test code and reports the number of test cases passed by each code in a CSV file. While this program plays a crucial role in checking the correctness of code behaviors, it cannot detect code plagiarism, which can often happen in programming courses. In this paper, we implement a code plagiarism checking function in the answer code validation program and present its application results in a Java programming course at Okayama University, Japan. This function first removes the whitespace characters and the comments using regular expressions. Next, it calculates the Levenshtein distance and a similarity score for each pair of source codes from different students in the class. If the score is larger than a given threshold, the pair is regarded as plagiarism. Finally, it outputs the scores, together with the student IDs, as a CSV file. For evaluation, we applied the proposed function to a total of 877 source codes for 45 CWP assignments submitted by 9 to 39 students and analyzed the results. It was found that (1) CWP assignments asking for shorter source codes generate higher scores than those asking for longer codes, due to the use of test codes; (2) proper thresholds differ by assignment; and (3) some students often copied source codes from certain other students.
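
For readers unfamiliar with the technique, the following is a minimal sketch in Python of the pairwise check the abstract describes: normalize each submission by stripping comments and whitespace, compute the Levenshtein distance for every pair of students, and flag pairs whose similarity exceeds a threshold. The normalization regexes, the similarity formula, and all names here are assumptions rather than the authors’ exact implementation.

    # Sketch of the pairwise plagiarism check described in the abstract.
    # Regexes, similarity formula, and the 0.8 threshold are assumptions.
    import re
    from itertools import combinations

    def normalize(code: str) -> str:
        """Strip Java comments and all whitespace before comparison."""
        code = re.sub(r"/\*.*?\*/", "", code, flags=re.DOTALL)  # block comments
        code = re.sub(r"//.*", "", code)                        # line comments
        return re.sub(r"\s+", "", code)

    def levenshtein(a: str, b: str) -> int:
        """Edit distance via the classic dynamic-programming recurrence."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i]
            for j, cb in enumerate(b, 1):
                curr.append(min(prev[j] + 1,                 # deletion
                                curr[j - 1] + 1,             # insertion
                                prev[j - 1] + (ca != cb)))   # substitution
            prev = curr
        return prev[-1]

    def similarity(a: str, b: str) -> float:
        """1.0 means identical after normalization."""
        a, b = normalize(a), normalize(b)
        if not a and not b:
            return 1.0
        return 1.0 - levenshtein(a, b) / max(len(a), len(b))

    def flag_pairs(codes: dict[str, str], threshold: float = 0.8):
        """Yield (student, student, score) for every pair above the threshold."""
        for (s1, c1), (s2, c2) in combinations(codes.items(), 2):
            score = similarity(c1, c2)
            if score > threshold:
                yield s1, s2, score

This also suggests why shorter assignments yield higher scores in the authors’ results: when a test code tightly constrains the structure of a correct short answer, independent solutions converge, which is why per-assignment thresholds are needed.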

22 pages, 358 KiB  
Article
Learning Analytics in the Era of Large Language Models
by Elisabetta Mazzullo, Okan Bulut, Tarid Wongvorachan and Bin Tan
Analytics 2023, 2(4), 877-898; https://doi.org/10.3390/analytics2040046 - 16 Nov 2023
Viewed by 2807
Abstract
Learning analytics (LA) has the potential to significantly improve teaching and learning, but there are still many areas for improvement in LA research and practice. The literature highlights limitations at every stage of the LA life cycle, including scarce pedagogical grounding and poor design choices in the development of LA; challenges in the implementation of LA with respect to the interpretability of insights, prediction, and the actionability of feedback; and a lack of generalizability and strong practices in LA evaluation. In this position paper, we advocate for empowering teachers in developing LA solutions. We argue that this would enhance the theoretical basis of LA tools and make them more understandable and practical. We present some instances where process data can be utilized to comprehend learning processes and generate more interpretable LA insights. Additionally, we investigate the potential implementation of large language models (LLMs) in LA to produce comprehensible insights, provide timely and actionable feedback, enhance personalization, and support teachers’ tasks more extensively.

17 pages, 543 KiB  
Article
Can Oral Grades Predict Final Examination Scores? Case Study in a Higher Education Military Academy
by Antonios Andreatos and Apostolos Leros
Analytics 2023, 2(4), 836-852; https://doi.org/10.3390/analytics2040044 - 02 Nov 2023
Viewed by 967
Abstract
This paper investigates the correlation between oral grades and final written examination grades in a higher education military academy. A quantitative, correlational methodology utilizing linear regression analysis is employed. The data consist of undergraduate telecommunications and electronics engineering students’ grades in two courses offered during the fourth year of studies and span six academic years. Course One covers 2017–2022; for Course Two, period 1 spans 2014–2018 and period 2 spans 2019–2022. In Course One, oral grades are obtained by means of a midterm exam. In Course Two, period 1, 30% of the oral grade comes from homework assignments and lab exercises, while the remaining 70% comes from a midterm exam. In Course Two, period 2, oral grades are the result of various alternative assessment activities. In all cases, the final grade results from a traditional written examination given at the end of the semester. Correlation and predictive models between oral and final grades were examined. The results of the analysis demonstrated that (a) under certain conditions, oral grades based more or less on midterm exams can be good predictors of final examination scores; and (b) oral grades obtained through alternative assessment activities cannot predict final examination scores.
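
As a sketch of the kind of analysis described, the following Python snippet fits a simple linear regression of final examination grades on oral grades and reports the Pearson correlation; the toy numbers are illustrative, not the study’s data.

    # Illustrative correlational analysis: regress final written examination
    # grades on oral grades. The grades below are made up, not the study's data.
    import numpy as np
    from scipy import stats

    oral = np.array([6.5, 7.0, 5.5, 8.0, 9.0, 4.5, 7.5, 6.0])
    final = np.array([6.0, 7.2, 5.0, 7.8, 8.5, 5.2, 7.0, 6.4])

    res = stats.linregress(oral, final)
    print(f"final ~ {res.slope:.2f} * oral + {res.intercept:.2f}")
    print(f"Pearson r = {res.rvalue:.2f}, R^2 = {res.rvalue**2:.2f}, "
          f"p = {res.pvalue:.3g}")

    # A strong, significant r supports using oral grades as predictors; a weak r,
    # as the paper found for alternative assessments, does not.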
