Article

Design, Validation and Implementation of a Questionnaire to Assess Teenagers’ Digital Competence in the Area of Communication in Digital Environments

by Ana Iglesias-Rodríguez 1,*, Azucena Hernández-Martín 1, Yolanda Martín-González 2 and Patricia Herráez-Corredera 1

1 Department of Didactics, Organization and Research Methods, Faculty of Education, University of Salamanca, 37008 Salamanca, Spain
2 Department of Library and Information Science, Faculty of Translation and Documentation, University of Salamanca, 37008 Salamanca, Spain
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(12), 6733; https://doi.org/10.3390/su13126733
Submission received: 18 May 2021 / Revised: 8 June 2021 / Accepted: 10 June 2021 / Published: 14 June 2021
(This article belongs to the Special Issue Digital Teaching Competences for Sustainable Development)

Abstract: This article describes the process of design, validation, and implementation (N = 609) of a questionnaire drawn up ad hoc to assess the digital competence of compulsory education students (ages 11 to 13) in the area of communication. The test measures students' knowledge, skills, and attitudes in the six competences that make up the area of communication, as established in DigComp, the Framework for Developing and Understanding Digital Competence in Europe: interacting through new technologies, sharing information and content, online citizen participation, collaboration through digital technologies, netiquette, and digital identity management. The purposes of the study are to design and validate an instrument to assess compulsory education students' digital competences in the area of communication based on their knowledge, skills, and attitudes, and to analyse the instrument's psychometric characteristics, with special emphasis on its reliability and validity. The method consisted of applying various psychometric validation techniques and analysing the results through descriptive statistics. Items show adequate discrimination and difficulty indices. Validity was established through expert judgement and factor analysis of the test. The conclusion stresses the pressing need for education centres to provide students with adequate educational-communicative training.

1. Introduction

Knowing what the digital competences of citizens of all ages are, and the extent to which they use them with awareness and appropriateness, is an issue that has raised keen interest at the global level [1,2]. Nevertheless, such interest makes no sense if citizens lack adequate digital literacy [3] to cope in an ever more digital society, where they should be able to use the Internet and any other digital technologies within their reach throughout their lives in an ethical, responsible, and safe way [4]. Currently, digital literacy and digital competence are closely interrelated and mutually dependent terms, so that, as observed by various authors [5,6,7,8,9], digital literacy might be defined as the set of literacies a citizen should master to manage in twenty-first century society, bearing in mind that success in this regard requires the appropriate use of digital competences, which are defined by the DigComp project as:
A set of knowledge, skills, attitudes, strategies, and values that are required when using ICT and digital media to perform tasks, solve problems, communicate, manage information, collaborate, create and share content, and build knowledge effectively, efficiently, appropriately, critically, creatively, autonomously, flexibly, ethically, and reflectively for work, leisure, participation, learning, socializing, consuming, and empowerment [8] (p. 30).
The DigComp project (European Digital Competence Framework for Citizens) was first published in 2013; its 2016 update by the Joint Research Centre (JRC), named 'DigComp 2.0', was followed in 2017 by the publication of 'DigComp 2.1', which is presented as a tool designed to improve citizens' digital competences both in Europe and in the Member States. It is structured around five descriptive dimensions ((1) the competence areas; (2) the competences relevant to each area; (3) the proficiency levels for each competence; (4) examples of knowledge, skills, and attitudes for each competence; and (5) the applicability of the competence to different educational and learning purposes), and includes five areas of competence (information and data literacy, communication and collaboration, digital content creation, safety, and problem solving), which, in turn, gather 21 sub-competences related to key learnings for citizens' participation in twenty-first century society.
Facing the complexity and multidimensionality of digital literacy in the twenty-first century inevitably requires proficiency in digital and information competences. This can only be achieved by providing current generations with appropriate training for their acquisition, development, and use anytime and anywhere, given the countless possibilities for online information, exchange, and interaction that are offered and fostered by the digital world, the only environment where the boundaries of human action can be extended [10,11]. Communication plays a major role throughout this process, since living in a multifaceted, datified, and ever-changing information and knowledge society is giving rise to different needs and ways of learning that demand a new form of literacy [5,12,13], one able to meet the requirements of a multimodal, hypertextual, or nonlinear type of communication [14,15]. Thus, it is increasingly pressing for citizens across the world to have an adequate command of communicative, media, information, and digital competences [16,17,18,19] if they wish to be a part of the global network that keeps them permanently connected. The versatility, constant fluctuation, and ubiquity of the surrounding information mean that those who wish to access it must be able to put into practice a combination of knowledge and know-how so that their own skills may materialize for the benefit of all [11]. In other words, their competences and skills should allow them to manage information and data in an appropriate manner [20,21,22], which involves: developing their personal learning by using a wealth of resources, activities, and information sources (Personal Learning Environment, PLE); personally managing contents and communication with others throughout the learning process, that is, their Personal Learning Network (PLN); making critical and responsible use of information [23]; and solving problems that may arise in virtual environments, always displaying a willingness to collaborate and a comprehensive collective intelligence [9,24,25].
Numerous teachers and policy-makers in the area of education believe that, merely by being born in the twenty-first century under the label 'digital natives', students innately have the conditions, qualities, skills, and competences required to interact with digital media, so that they would need no training or additional reinforcement in this area [26,27]. However, reality indicates that most teenage 'digital natives' act on intuition and that, now more than ever, they need training and preparation to ensure that they develop adequate digital literacy and competence and are able to operate in the context of digital culture [28,29,30], since at these ages:
Education becomes more active because students are required to manage information, arrange communication, take into account the role of others and observe the social and cultural functions that are involved in online education. All this becomes imperative not only at the didactic but also at the pedagogical level [11] (p. 45).

1.1. Importance of Communicative Competence and Information Literacy in Compulsory Education Students

Nowadays, most teenagers have experimented with social communication networks and media to a greater or lesser extent [31,32], the quality of these experiences depending on the activities and benefits sought [33].
Therefore, endeavours should be made from the sphere of education to train present and future generations in the acquisition and implementation of basic communication competences (information literacy), so that they do not lose touch with the true picture of the society and cultural environment in which they live and grow, a picture that can become altered or distorted as a consequence of the connected, hypertextual, and over-informed reality that surrounds them [34].
Communication with others entails the implementation of skills that are part of the communicative act itself, such as participation and collaboration.
Participating in social media, regardless of whether it is for personal, educational, or professional purposes, or a combination of all of them, favours and facilitates social interaction, civic engagement, information retrieval and processing, academic performance, and professional success [35,36,37] among users with diverse and multidisciplinary experiences, who are able to lay solid foundations for gradually achieving greater academic or professional consolidation based on the collaborative work generated and the continuous reflection involved in the exchange of messages. All of this occurs in the awareness that, "[…] It is through participation that getting involved and taking action in social life can be achieved, and it comes forth as a tool at the service of citizenship […]" [38] (p. 139).
Likewise, communication and collaboration significantly change young people’s way to access, interact with, and interpret information, making it necessary to train them in the best ways of “communicating in digital environments, sharing resources through web-based tools, connecting with others and collaborating using digital tools, interacting and participating in communities and networks; [and of working on] intercultural awareness” [39] (p. 23) if they wish to create new knowledge in the context of Open Science.
Whether this is fulfilled will largely depend on each individual's interest, attitude, and ability to make adequate use of the digital technology and communication tools within their reach [40]. These tools are often used as a personal learning network (PLN) that gathers interests on specific topics that may prove useful for the creation and implementation of interdisciplinary contents, while granting access to new perspectives, ideas, and experiences and allowing one to connect with people with common interests or needs [41,42,43,44,45]. Similarly, users might not always be clearly aware that, while using such tools can increase their connectivity, it does not necessarily improve the degree of collaboration among them.
To paraphrase [46] (p. 2), it could, therefore, be assumed that social media create a favourable atmosphere for users to exchange and share resources and ideas and establish collaboration links, and that they contribute to people being connected and informed so that they can participate in real time regardless of their geographical location and its corresponding time zone, offering the possibility to use these media to weave a communicative network among them.
Given the relevance of this process, it is only natural that communication, being inevitably related to interaction with others (both synchronous and asynchronous), to compliance with netiquette protocols, and to collaboration (a necessary competence for participating and providing input in groups working with ICT, and even for creating social networks [47]), requires high levels of energy to be kept alive through the weaving of a good network culture [48]. This is a culture that can be built, rebuilt, and updated through the talent, imagination, audacity, and intelligence of Internet users [49]. Hence the need to personalize and individualize digital learning environments (PLEs) [50,51], which, according to [52,53], are made up of the different tools, information sources, connections, and activities that each person regularly uses in everyday life for learning purposes. Many of these tools are based on what [52] calls social software, understood as "software that lets people rendezvous, connect or collaborate by use of a computer network" (p. 4), involving three basic cognitive processes: reading, thinking, and sharing [54]. A fourth process that is currently essential could be added: social presence, understood as the feeling of belonging to a community that strengthens and promotes learning while at the same time favouring a dynamic of positive social relationships based on help, encouragement, and support among its members [55,56,57].
In the same vein, [5] (p. 23) believe that PLEs consist of three main elements: (1) reading tools and strategies: the information sources that are accessed and offer such information in the form of an object or artefact (blogs, video channels, RSS…); (2) tools and strategies for reflecting: the environments or services where information can be transformed (places to post, analyse, recreate, or publish); and (3) relationship tools and strategies: environments where interaction with others for learning purposes takes place.
In short, these are mechanisms to share and reflect as a community through personal learning networks (PLNs) made up of the tools, mental processes, and activities that allow their users to share, reflect, discuss, and reconstruct with other knowledge (and questions), as well as the attitudes that foster and feed such exchange [58] (p. 717).
Consequently, it could be said that users who actively participate in certain social and personal learning networks are characterized by being virtual subjects with multiple, almost endless, possibilities to carry out any sort of information transaction thanks to the speed with which it can be accessed, accumulated, and transmitted, which is leading to the building of new personal and collective identities in virtual environments, understood as places and events within the Internet [59,60].

1.2. Digital Literacy and Competence: Previous Studies and Current Scenario

To achieve the goal referred to in this study, there is a pressing need for proficiency in digital competence, since being technologically able has become indispensable for surviving in the knowledge society and for acting in it in a critical manner. As pointed out by the Organisation for Economic Co-operation and Development (OECD) [61], mastery of basic information and communication tools through the Internet is crucial for access to culture, to the services offered by social institutions, to citizen participation, and to quality of life, while it also improves countries' competitiveness and economy.
Despite the interest raised by this matter, the findings yielded by research in the area confirm the existence of difficulties, not only in terms of integrating technologies into the classroom and including them in the teaching-learning process [13,62,63,64,65], but also when it comes to developing digital competences at the different education levels [13,66]. Hence, for example, studies such as that by [47], focused on the assessment of ICT literacy in K-12 education to define its differences and similarities, conclude that most tests measure knowledge and skills in the information area, to the detriment of those that measure knowledge and skills in the areas of communication, collaboration, safety, or problem solving. For their part, the authors of [67], in two studies conducted to determine the skills of the Dutch population aged 18 to 80 when using the Internet for more than just email exchanges, found that their level of theoretical and practical knowledge of the Internet was quite high, but that their information skills and, especially, their strategic knowledge of the Internet were rather questionable. This study provided evidence that a theoretical and practical understanding of the Internet's basic aspects is not enough to make effective use of it, and that proficiency in digital competences is required if citizens want to use this medium effectively.
Equally, the study conducted by the Telefónica Foundation, King Juan Carlos University, and Tuenti [68] on young people's attitudes and behaviour towards new forms of participation in social media shows that, while teenagers acknowledge the advantages and potential risks involved in the use of social networks, including threats to the privacy and safety of their personal data, and mobilize in support of causes they feel are important, they are also aware of their shortcomings concerning digital tools, thereby revealing their need for training to use them successfully. Moreover, the study carried out by [69], which analysed media competence in the citizenry as a whole, concludes that there is a pressing need for education centres to include a media education subject to train students to become educated, active, and aware citizens, able to interpret messages in a thoughtful and critical manner and to express themselves with at least a minimum of appropriateness and creativity.
In this context, when venturing, as has been done here, into the study of digital competence in the area of communication, what stands out most is that mastering it is an essential and decisive aspect that should be addressed with students at the compulsory education stages. Making sensible use of technology and media, participating and collaborating in them in an adequate manner, and knowing and understanding the ethical values that every digital citizen should hold and respect are matters of great concern for childhood and youth protection policies, and suitable education and training in them should therefore be offered. This supports the duty of analysing this competence in the cultural niche where it develops, since research on it in compulsory secondary education is scarce, as are the instruments fit for such purposes.
Therefore, this study presents an instrument designed ad hoc to assess the digital competence of students aged 11 to 13 in a specific area, that of communication, using alternative assessment models focused on the problem-based learning method as a reference [70]. More specifically, attention is focused on the design, validation, and analysis of the psychometric characteristics of the instrument at issue, stressing its reliability and validity, as this is one of the objectives envisaged in the project it forms part of, which is funded by the Spanish Ministry of Economy and Competitiveness MINECO/FEDER (Ref. EDU2015-67975-C3-3-P).

2. Materials and Methods

A basic descriptive analysis of the area of communication was conducted in the domains of knowledge, skills, and attitudes. The data were subjected to an exploratory factor analysis to validate the constructs, examine internal consistency, and determine the number of factors underlying the domain. The difficulty (or ease) index of the items was also verified based on the number of correct answers. Finally, to study the reliability of the test designed to assess this area, Cronbach's alpha was calculated.

2.1. Study Objectives

The objectives of this study were: (i) to design and validate an instrument for assessing the digital competences in the area of communication of compulsory education students (aged 11–13), taking into account their knowledge, skills, and attitudes; and (ii) to analyse the instrument's psychometric characteristics, paying special attention to reliability and validity.

2.2. Participants

The student sample was selected through stratified random sampling of the primary and secondary education centres of two Spanish provinces, taking into account centre ownership (public or state-subsidised private) and location (rural or urban). The study included 18 education centres and a total sample of 609 students aged between 11 and 13, balanced in terms of gender (49% boys, 51% girls), with most being in their last year of primary education (85% in the sixth grade of primary education, compared to 16% in the first grade of compulsory secondary education).

2.3. Instrument

The DigComp model and other alternative assessment models focused on the problem-based learning method [70] were used as the basis to design an instrument aimed at measuring the level of digital competence of compulsory education students (aged 11–13) in the area of communication, according to: (i) their knowledge, skills, and attitudes in such area; (ii) the six dimensions that make up the area of communication (interacting through new technologies, sharing information and contents, online citizen participation, collaborating through digital technologies, netiquette, and managing digital identity); and (iii) item difficulty (basic, intermediate, and advanced).
It is a known fact that competence assessment means that students should be able to solve the problem-situations (tasks) they are given in contexts as close as possible to reality, by using their knowledge (what they know), skills (what they can do), and attitudes (what they do).
In the case at hand, these three elements that make up digital competence are measured, on the one hand, through items that present situations in which students must make decisions and choose a certain answer (knowledge and skills) and, on the other hand, through a Likert-type scale (attitudes). It was considered relevant to phrase the different questions as problem situations as close to students' reality as possible, which is why it can be stated that the test is in line with the problem-based learning approach: starting from a given situation, students are asked to choose among different answers that, as formulated, provide information regarding their level of competence.
The final version of the test involved the prior adaptation of the DigComp indicators model to the age and characteristics of the sample population, as well as the preliminary validation of the test by a group of experts. This prior adaptation gave rise to a model made up of 89 indicators, which was subsequently validated through expert judgement by a total of 11 experts who were ICT coordinators in public and state-subsidised private education centres and worked with teachers and students at the education levels from which the sample was drawn. These experts were in charge of assessing the relevance, pertinence, and transparency of the indicators. This validation led to the design of an initial test made up of 64 items: 25 knowledge and skills items (an objective test with four answer alternatives) and 39 attitude items (a Likert-type scale with five answer alternatives). These 64 items were, once more, submitted for peer review and for assessment by experts on the topic, on the basis of which both content and wording were modified where necessary.
The resulting pilot test was administered anonymously to 75 students in their sixth year of primary education (aged 11–12) and first year of compulsory secondary education (aged 12–13) from 10 Spanish education centres. Questionnaires were accessed digitally. After discarding or revising certain items whose difficulty level was too high, or for which there were indications that students had not fully understood what they were being asked, the final test consisted of 24 items: 8 for knowledge, 10 for skills, and 6 for attitudes, which, as mentioned, measure the six competences that make up the area of communication: interacting through new technologies, sharing information and contents, online citizen participation, collaborating through digital technologies, netiquette, and managing digital identity. Of the 18 knowledge and skills items, 7 were basic in terms of difficulty level, 8 intermediate, and 3 advanced. These difficulty levels are clearly defined in the DigComp framework itself, which, like this study, establishes three levels (basic, intermediate, and advanced), used here for confirmation purposes.
The knowledge and skills items are presented as an objective test with four possible answers, only one of which is correct. Answers were coded dichotomously (1 = right answer; 0 = wrong answer), so the highest score that can be achieved in this part of the test is 18 points. The pilot study of the attitudes scale created in the previous stage made the final selection of six items possible. The test is available for use in future studies at http://hdl.handle.net/10366/140240 (accessed on 10 May 2021).
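As an illustration of this scoring scheme, the following minimal Python sketch (not the software used in the study) computes the 0–18 knowledge and skills total from dichotomously coded responses; the column names are hypothetical.

```python
import pandas as pd

# Hypothetical column names for the 8 knowledge and 10 skills items,
# each already coded 1 = right answer, 0 = wrong answer.
KNOWLEDGE_ITEMS = [f"k{i}" for i in range(1, 9)]
SKILLS_ITEMS = [f"s{i}" for i in range(1, 11)]

def knowledge_skills_total(responses: pd.DataFrame) -> pd.Series:
    """Sum of the 18 dichotomous items; the possible range is 0 to 18."""
    return responses[KNOWLEDGE_ITEMS + SKILLS_ITEMS].sum(axis=1)
```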

2.4. Procedure and Application

To proceed with the application of the test, the permission of the education authorities and of the Ethics Committee of the University of Salamanca (Spain) was obtained. A digital version of the test was designed via a website built specifically for this purpose, named ECODIES (https://www.ecodies.es/) (accessed on 13 June 2021), which facilitated the collection of the answers of the students participating in the study.
The first step for participation in the project was the preparation of a call that was subsequently sent to the selected education centres, requesting the participation of students in the sixth grade of primary education and/or first grade of compulsory secondary education. The second step, since the sample was made up of minors, was to inform both students and their parents or legal guardians. Thirdly, the permission of the families and children was obtained through protocols prepared ad hoc by the researchers. The education centres that were to participate in the study undertook the collection of the permission forms. Finally, the questionnaire was administered during teaching hours, as requested by the teachers who also decided to collaborate in the research.
The instrument was applied to a sample of students from randomly selected Spanish education centres, under the assumptions of quantitative research (objective test and attitude scale). The study design is descriptive and cross-sectional, since information gathering took place at a single period in time, specifically, during the months of February, March, and April 2019, within the academic year 2018–2019.

2.5. Data Analysis

The analysis of the data obtained focused on checking the instrument's psychometric characteristics in order to gauge its validity and reliability. For this purpose, the knowledge and skills items (knowledge and skills test) and those that make up the attitudes scale were studied separately. Subsequently, the test was studied as a whole to examine any possible gender-related bias.
Data analysis includes descriptive item analysis, item correlations, item discrimination and reliability, and the analysis of the test for structural reliability and validity.
The analyses were performed using SPSS v.21 and the CORRECTOR 1.2 program developed by Professor José Luis Gaviria. This software works as an add-in for MS Excel and allows the analysis of objective tests and Likert-type scales, providing information about each item: variance, point-biserial correlation, difficulty index, etc.
Furthermore, in order to carry out the reliability analyses of those items that were dichotomous in nature, an Excel spreadsheet was used, which is based on a matrix of tetrachoric correlations [71].
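For readers without access to SPSS or the CORRECTOR add-in, the item statistics described in this section (difficulty index, variance, and point-biserial discrimination) can be approximated with a short Python sketch such as the one below; it assumes a DataFrame of 0/1 item scores and is illustrative rather than a reproduction of the authors' tooling. The tetrachoric-correlation-based reliability mentioned above is not reproduced here, as it requires specialised routines.

```python
import pandas as pd
from scipy import stats

def item_statistics(items: pd.DataFrame) -> pd.DataFrame:
    """Classical item analysis for dichotomous (0/1) items: difficulty index
    (proportion of correct answers), item variance, and discrimination as the
    point-biserial correlation between the item and the rest-of-test score."""
    total = items.sum(axis=1)
    rows = []
    for col in items.columns:
        rest = total - items[col]  # total score with the item removed
        r_pb, p_value = stats.pointbiserialr(items[col], rest)
        rows.append({"item": col,
                     "difficulty": items[col].mean(),
                     "variance": items[col].var(ddof=0),
                     "point_biserial": r_pb,
                     "p_value": p_value})
    return pd.DataFrame(rows)
```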

3. Results and Discussion

3.1. Analysis of the Test as a Whole

The complete test covers the three dimensions assessed (knowledge, skills, and attitudes), so all 24 items were taken into account: 8 for knowledge, 10 for skills, and 6 for attitudes.
Table 1 shows that the levels students achieved in attitudes are far above those of the areas of knowledge and skills. The latter two are located around the midpoint of the scale, while attitudes appear higher.

3.2. Item Analysis of the Knowledge and Skills Test

3.2.1. Descriptive Analysis of Items and Difficulty Index

Table 2 shows the data obtained for each item (mean and standard deviation). Items are dichotomous (1 = right, 0 = wrong), so the maximum possible mean is 1. The competence area of each item (knowledge or skills) is also specified.
The mean obtained in the test by the sample population was 9.6, around the midpoint of the scale. The analysis of the means obtained for each item shows that the lowest score is 0.22 (item 4) and the highest is 0.81 (item 1), which evidences variability in item difficulty, item 4 being the most difficult (only 22% of the subjects answered it correctly) and item 1 the easiest (answered correctly by 81% of the subjects). The content of these items leads us to think that teenagers know that, when they receive a friend request through a social network from a stranger, they should decline it because they do not know the person (item 1), but they do not know very well which media they might use to share a video with their colleagues (item 4).
Table 2 also shows the statistics of the items grouped into the six competences identified (C1 to C6). Each of these variables was created by summing the scores of the three items that measure the corresponding competence. Average scores lie around the midpoint of the scale, ranging from 1.26 for the lowest competence (collaborating through digital technologies) to 1.96 for the highest (netiquette). The conclusion that can be drawn is that students' knowledge and skills are best in netiquette, followed by interacting through new technologies and managing digital identity in second place, online citizen participation in third place, sharing information and contents in fourth place, and collaborating through digital technologies in fifth and last place.

3.2.2. Item Correlation Analysis

Pearson correlation analysis was applied to identify the relationships among the 18 items included in the test. As can be observed in Table 3, correlations are mostly positive, although correlations between items are not significant. Items 3 and 4 stand out because of their negative correlation with seven of the remaining items.
The correlations among the six competences (Table 4) yielded some low and some moderate values, most of them highly significant given the size of the sample.

3.2.3. Analysis of Item Discrimination, Difficulty and Reliability

Table 5 shows, for each knowledge and skills item, the statistics used to establish its discrimination index (point-biserial correlation) and difficulty index, as well as the mean and variance of the scale if the item is removed.
According to right answer percentages (difficulty index), a distribution of the items into three difficulty levels or competence levels can be established (Table 6).
Figure 1 plots the items on coordinates representing item difficulty and discrimination index (point-biserial correlation). It shows that half of the items fall in the target zone, with average difficulty indices and adequate discrimination indices, except for certain very easy items (6 and 15). The easy items are justified from a conceptual perspective because they address relevant and basic questions in the assessment of the competence, and topics that are often dealt with in education centres, for example, through informative talks carried out in collaboration with members of security bodies (Local Police); this is a clear indicator that most students have already acquired such knowledge and skills.
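A difficulty/discrimination map of this kind can be drawn from the item statistics above; the sketch below is illustrative, and the 0.4–0.6 band used to mark the "target zone" is an assumption rather than the exact cut-off used in Figure 1.

```python
import matplotlib.pyplot as plt

def plot_difficulty_vs_discrimination(stats_df):
    """Scatter plot of item difficulty against point-biserial discrimination,
    with an illustrative medium-difficulty band highlighted."""
    fig, ax = plt.subplots()
    ax.scatter(stats_df["difficulty"], stats_df["point_biserial"])
    for _, row in stats_df.iterrows():
        ax.annotate(str(row["item"]), (row["difficulty"], row["point_biserial"]))
    ax.axvspan(0.4, 0.6, alpha=0.15)  # assumed "target zone" of medium difficulty
    ax.set_xlabel("Difficulty index (proportion of correct answers)")
    ax.set_ylabel("Discrimination (point-biserial correlation)")
    return fig
```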

3.2.4. Test Reliability Analysis

To study the reliability of the knowledge and skills test, the KR-20 coefficient was calculated, given the dichotomous nature of the items, yielding a value of 0.547 (equivalent to Cronbach's alpha for dichotomous items). This internal consistency index might be considered acceptable for this type of competence assessment instrument. After testing the significance assumption [72] (p. 60), it can be established that the alpha value obtained is significant.
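The KR-20 coefficient reported here can be computed directly from its definition; the following sketch assumes a respondents × items matrix of 0/1 scores and uses the population variance of total scores, which may differ marginally from the SPSS output.

```python
import numpy as np

def kr20(items: np.ndarray) -> float:
    """Kuder-Richardson 20: internal consistency for dichotomous items,
    equivalent to Cronbach's alpha when items are scored 0/1."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    p = items.mean(axis=0)  # proportion of correct answers per item
    q = 1.0 - p
    total_variance = items.sum(axis=1).var(ddof=0)
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_variance)
```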

3.3. Analysis of the Items of the Attitude Scale

Attitudes were assessed using a Likert scale (1 = strongly disagree, 5 = strongly agree) made up of the following items:
  • I decline any social media friendship requests from strangers.
  • I think carefully about what information I am going to share before doing so.
  • I take care of maintaining appropriate behaviour when interacting online.
  • I like to mind, respect, and value the work published by others and shared through the Internet.
  • I am ready to report any cyberbullying situation, whether I or anybody else is the victim.
  • I am aware of the impact of everything I publish on the Internet and of how fast it can become viral and go beyond my scope of control.

3.3.1. Statistical Description of the Attitudes Scale

Below are the statistical descriptions corresponding to the items that make up the scale (Table 7), specifically the lowest and highest scores achieved, the means, standard deviations, skewness, and kurtosis.
High averages can be observed for all items, especially numbers 2 and 5 (I think carefully about the information I am going to share before doing so and I am ready to report any cyberbullying situation, whether I or anybody else is the victim), which also show marked skewness. The leptokurtic kurtosis (above 3) of these items suggests that values are highly concentrated around their means.
The total average score on the scale is 26.05, which indicates that teenagers' attitude in the area of communication is very positive.
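The descriptives in Table 7 can be obtained with a sketch such as the following; it assumes a DataFrame with the six 1–5 Likert items and reports kurtosis in the Pearson form (normal distribution = 3), consistent with the "above 3" criterion used in the text.

```python
import pandas as pd
from scipy import stats

def attitude_descriptives(att: pd.DataFrame) -> pd.DataFrame:
    """Minimum, maximum, mean, SD, skewness, and (Pearson) kurtosis for each
    Likert item; the scale total per respondent (att.sum(axis=1)) ranges 6-30."""
    return pd.DataFrame({
        "min": att.min(),
        "max": att.max(),
        "mean": att.mean(),
        "sd": att.std(ddof=1),
        "skewness": att.apply(stats.skew),
        "kurtosis": att.apply(lambda s: stats.kurtosis(s, fisher=False)),
    })
```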

3.3.2. Correlation Analysis for Attitude Items

After examining the relationships between the items included in the attitudes scale, the conclusion is that all of them show significant correlations with the rest of the elements and with the scale total, the correlation with the total being the highest, with values above 0.60 in all cases (Table 8).
Likewise, correlations are higher between items 2 and 3 (related to netiquette) and between items 3 and 6 (related to sharing information and contents).

3.3.3. Item Discrimination and Reliability Analysis

To assess the discrimination index of each attitude item based on the point-biserial correlation, the variable was dichotomized so that scores of 4 and 5 were classed as "positive attitude" and the remaining scores (1, 2, and 3) as "non-positive attitude". The data are shown in Table 9, which indicates that correlations are positive, with adequate discrimination indices.
The reliability of the attitudes scale was calculated, yielding Cronbach's alpha = 0.727, which can be described as high given the small number of items.
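The two analyses in this subsection (dichotomisation with point-biserial discrimination, and Cronbach's alpha for the six-item scale) can be sketched as follows; correlating each dichotomised item with the corrected total (total minus the item) is an assumption, as the text does not specify the criterion variable.

```python
import pandas as pd
from scipy import stats

def attitude_item_analysis(att: pd.DataFrame) -> pd.DataFrame:
    """Scores of 4-5 are recoded as 'positive attitude' (1) and 1-3 as
    'non-positive' (0); each recoded item is then correlated (point-biserial)
    with the corrected total of the original 1-5 scale."""
    positive = (att >= 4).astype(int)
    total = att.sum(axis=1)
    rows = []
    for col in att.columns:
        r_pb, p_value = stats.pointbiserialr(positive[col], total - att[col])
        rows.append({"item": col, "point_biserial": r_pb, "p_value": p_value})
    return pd.DataFrame(rows)

def cronbach_alpha(att: pd.DataFrame) -> float:
    """Cronbach's alpha for the six 1-5 Likert attitude items."""
    k = att.shape[1]
    item_variances = att.var(axis=0, ddof=1)
    total_variance = att.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)
```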

3.4. Reliability and Validity of the Test

It should be recalled that the reliability of the knowledge and skills test (18 items) yielded a value of Cronbach's alpha = 0.54. The second step was to calculate the reliability of the attitudes scale, which yielded Cronbach's alpha = 0.72.
The validity study for the test as a whole considered two types of validity: content and structural. In this regard, the items included in the test are assumed to be a representative sample of the content to be assessed, since they conform to the competence indicators model based on the DigComp model, drawn up by the research team and validated by experts, as mentioned above (http://hdl.handle.net/10366/139409, accessed on 10 May 2021). The items included in the test were created on the basis of this indicators model, adapted to the sample's age range, and subsequently subjected to the judgement of experts and a discussion group until the test's final version was produced, after rephrasing item wording and answers where necessary. Thus, the set of items that make up the test was validated to measure knowledge, skills, and attitudes in the competence area of communication, following the guidelines of the DigComp theoretical-conceptual model. On the other hand, item analysis has proved the test's capacity for discrimination and its adjustment to the target population.
The study of structural validity was approached through a principal component factor analysis, taking the six knowledge and skills competences and the six attitude items as variables. The analysis of the prior conditions for conducting the factor analysis yielded a KMO value of 0.780, which, being close to 0.80, can be considered an indication of adequate sampling. Bartlett's test of sphericity is highly significant (Chi-square = 999.245 with 66 degrees of freedom, p < 0.001), proving a linear relationship between the variables. It was also observed that the skewness of each item took values between −1 and +1, which supports the suitability of the analysis.
The principal component analysis was carried out to obtain the eigenvalues (λ) of the components and retain those factors with λ ≥ 1. The first factor explains 24.48% of the variance and the second adds 12.77%, so that the first two factors together explain 37.25% of the variance of the factor matrix. Table 10, which shows the component matrix, clearly reflects how the attitude items load on the first factor, whereas the knowledge/skills competences load on the second factor.
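The adequacy checks and component extraction described in these two paragraphs can be approximated with the sketch below. The KMO and Bartlett statistics rely on the third-party factor_analyzer package (an assumption: it is not the software used by the authors), and the eigenvalue ≥ 1 rule is applied to the correlation matrix of the twelve variables.

```python
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def structural_validity(variables: pd.DataFrame) -> dict:
    """Sampling adequacy (KMO), Bartlett's test of sphericity, and principal-
    component eigenvalues for the six competence scores plus six attitude items."""
    chi_square, p_value = calculate_bartlett_sphericity(variables)
    _, kmo_overall = calculate_kmo(variables)
    eigenvalues = np.linalg.eigvalsh(variables.corr().to_numpy())[::-1]  # largest first
    pct_variance = 100 * eigenvalues / eigenvalues.sum()
    return {
        "kmo": kmo_overall,
        "bartlett_chi2": chi_square,
        "bartlett_p": p_value,
        "eigenvalues": eigenvalues,
        "pct_variance_per_component": pct_variance,
        "n_factors_retained": int((eigenvalues >= 1).sum()),  # Kaiser criterion
    }
```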
Finally, when studying the characteristics of the test, particular attention was paid to checking that the items showed no gender bias. This was addressed by calculating Chi-square values using crosstabs for each of the items included in the test. The results show that there are no significant differences with regard to the gender variable, except in item 1 (54% right answers achieved by girls against 46% by boys), item 7 (55% right answers achieved by boys against 45% by girls), item 8 (52% right answers achieved by boys against 48% by girls), and item 15 (46% right answers achieved by boys against 54% by girls), with a higher percentage of right answers for girls (56% against 44%). As regards attitudes, there are no significant differences for the gender variable either, except in attitude 6, where students tend to be indifferent when it comes to thinking about what information they are about to share before doing so (74% of the boys against 26% of the girls).
These results lead to the conclusion that the test is fully valid for both groups.
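This gender-bias check can be reproduced with a chi-square test on the gender × right/wrong crosstab of each item; the sketch below is illustrative, and the column names are hypothetical.

```python
import pandas as pd
from scipy.stats import chi2_contingency

def gender_bias_check(df: pd.DataFrame, item_columns, gender_column="gender") -> pd.DataFrame:
    """Chi-square test of independence between gender and the right/wrong
    (or positive/non-positive) coding of each item."""
    rows = []
    for col in item_columns:
        table = pd.crosstab(df[gender_column], df[col])
        chi2, p_value, dof, _ = chi2_contingency(table)
        rows.append({"item": col, "chi2": chi2, "dof": dof, "p_value": p_value})
    return pd.DataFrame(rows)
```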
In the light of the above, we must emphasise that the lack of instruments for assessing the main elements that make up competences, that is, teenagers' knowledge, skills, and attitudes, especially in the area of communication, makes it necessary to design specific tests that are not based solely on students' self-perceptions of the issue. This is why this article is devoted to the analysis of a test targeted at students aged 11 to 13, custom-designed to achieve this goal and taking the DigComp model as a reference [8]. In view of the findings, it may be concluded that it is a valid and reliable test that allows an adequate classification of subjects according to their competence level.
In this particular case (the area of communication), six competences were included for assessment: interacting through new technologies, sharing information and contents, online citizen participation, collaboration through digital technologies, netiquette, and managing digital identity. Most of the correlations between them are significant at 0.001.
Factor analysis yields two clearly differentiated factors, the first related to knowledge and skills and the second to attitudes. Item analyses were conducted separately, one consisting of an objective test and the other of a Likert-type scale test. The conclusion that can be drawn is that the items have good discrimination levels and are balanced in terms of difficulty, except for items 6 and 15, which have proved excessively easy.
The analysed test includes items with different competence levels: (i) eight basic level items (five easy and three very easy; percentage > 60% right answers); (ii) six intermediate level items (moderate difficulty; percentages between 40 and 60% right answers); and (iii) four advanced level items (two difficult and two very difficult; percentage < 30% right answers).
After applying the test to a sample of 609 students (aged 11–13), we found that their competence level in the area of communication as regards knowledge and skills is moderate (x̄ = 9.65 on a 0 to 18 scale; SD = 2.873), whereas it is quite high where attitudes are concerned (x̄ = 26.05 on a 6 to 30 scale; SD = 4.534).
As a final remark, it should be noted that some of the limitations of this study could stem from possible difficulties in students' understanding of the questions and possible answers in the test, which might have affected the scores achieved. A possible solution could be to design even more specific tasks to verify teenagers' acquisition and development of these skills.

4. Conclusions

The results of the assessment reveal the need for students to acquire and use these types of skills, especially those related to collaborating through digital technologies, which include being aware of the digital tools they can use to work cooperatively and being capable of correcting text documents using the track-changes option. They also require further training in sharing information and data, especially when it comes to knowing which media they can use to share videos, contents, data, or resources, as well as in matters revolving around online citizen participation, such as knowing how to access and use the digital services available. In this regard, authors such as [73] agree with this need for training in the appropriate use of digital competences, since they consider that teenagers are only capable of performing certain basic digital activities, such as using smartphones or accessing information and communicating using the Internet, but are not prepared for the other activities that characterize Web 2.0 and datified environments, such as creating content or publishing information and/or data. For their part, the authors of [74] speak of the myth of the digital native, referring to the ease with which this group manages to access and coexist with technologies without this meaning that they know how to use them. This is also supported by the studies of [75], whose results show that young people's digital skills are not inherent and require training.
Hence, these findings reveal that there is a pressing need for education centres to place special emphasis on providing students with adequate educational-communicative training to foster "the development of capacities that may allow them to interact with media in a knowledgeable and creative manner" [27] (p. 36). Furthermore, for communication to be successful, those engaged should be aware of the different digital communication media available; be familiar with communication software packages, how they work, and their advantages and shortcomings depending on context and recipient; know which resources can be publicly shared and their value, that is, be knowledgeable about how technologies and media allow different forms of participation and collaboration for content creation to obtain shared benefits; and be aware of ethical issues such as digital identity and the rules of digital interaction. Bearing this in mind, the data reported by Eurostat [76] reveal that 45% of the population of the European Union between the ages of 16 and 74 lack sufficient digital skills. This circumstance is what underlies the training divide: it is currently not enough to be skilled in ICT use and Internet access; it is also essential to acquire mastery of digital tools and contents in the form of competences, which is the central notion in the new digital environments [8,77].

5. Patents

Two patents resulted from the research presented in this manuscript:
Evaluation of the digital competence of students (ECODIES). Patent Nº: SA-6-19. Date granted: 21/01/2018. Country of registration: Spain, Castilla y León. Registration date: 2019. Authors: Hernández, A.; Cabezas, M.; Iglesias, A.; Casillas, S.; Martín, M.; García-Valcárcel, A.; Basilotta, V.; González, L.
Proposal for indicators to assess the digital competence of students using the DigComp model (INCODIES) as a reference. Patent Nº: SA-7-19. Date granted: 21/01/2018. Country of registration: Spain, Castilla y León. Registration date: 2019. Authors: Hernández, A.; Cabezas, M.; Iglesias, A.; Casillas, S.; Martín, M.; García-Valcárcel, A.; Basilotta, V.; González, L.

Author Contributions

A.I.-R. edited the paper, supervised the process and wrote Section 1, Section 3, Section 4 and Section 5. A.H.-M. structured the paper and wrote Section 2, Section 4 and Section 5. Y.M.-G. supervised the process and wrote Section 1, Section 2 and Section 4. P.H.-C. supervised the process and wrote Section 2 and Section 5. All authors equally contributed to write and review this paper. All authors have read and agreed to the published version of the manuscript.

Funding

Article produced in the framework of the research and development project "Evaluación de las competencias digitales de los estudiantes de Educación Obligatoria y estudio de la incidencia de variables socio-familiares" (Assessment of the digital competences of compulsory education students and study of the impact of social and family variables), developed by the Research-Innovation in Educational Technology Group of the University of Salamanca (GITE-USAL) and funded by the Ministry of Economy and Competitiveness as part of the State Programme for the Promotion of Scientific and Technical Research of Excellence of the Spanish Government (EVADISO, EDU2015-67975-C3-3-P, MINECO/FEDER).

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Ethical Committee of the University of Salamanca.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Selwyn, N. Reconsidering Political and Popular Understandings of the Digital Divide. New Media Soc. 2004, 6, 341–362. [Google Scholar] [CrossRef]
  2. Hatlevik, O.E.; Christophersen, K.-A. Digital competence at the beginning of upper secondary school: Identifying factors explaining digital inclusion. Comput. Educ. 2013, 63, 240–247. [Google Scholar] [CrossRef]
  3. Serafín, C.; Depešová, J.; Bánesz, G. Understanding digital competences of teachers in Czech Republic. EJST 2019, 15, 125–132. [Google Scholar]
  4. Berson, M.J.; Berson, I.R. Developing thoughtful “Cybercitizens”. Soc. Stud. Young Learn. 2004, 16, 5–8. [Google Scholar]
  5. Martin, A. A european framework for digital literacy. Nord. J. Digit. Lit. 2006, 1, 151–161. [Google Scholar] [CrossRef]
  6. Dussel, I. Los Nuevos Alfabetismos en el Siglo XXI: Desafíos Para la Escuela. 2008. Available online: https://postitulosecundaria.infd.edu.ar/archivos/repositorio/500/748/Dussel_nuevos_alfabetismos.pdf (accessed on 15 July 2020).
  7. Belshaw, D.A.J. What is digital literacy? Ph.D. Thesis, Durham University, Durham, UK, 2011. [Google Scholar]
  8. Ferrari, A. DigComp: A Framework for Developing and Understanding Digital Competence in Europe; Publications Office of the European Union: Luxembourg, 2013. [Google Scholar]
  9. Pérez-Escoda, A.; Aguaded, I.; Rodríguez-Conde, M.J. Generación digital v.s. escuela analógica. Competencias digitales en el currículum de la educación obligatoria. Digit. Educ. Rev. 2016, 30, 165–183. [Google Scholar]
  10. Suárez-Guerrero, C. ¿Con quién aprender? Cuad. Pedag. 2013, 435, 78–81. [Google Scholar]
  11. Plaza-de la Hoz, J. Cómo mejorar el papel de las TIC para promover una educación empoderadora en el desarrollo sostenible. Aloma. Rev. Psicol. Ciènc. Educ. Esport 2018, 36, 43–55. [Google Scholar]
  12. Crocket, L.; Jukes, I.; Churches, A. Literacy Is Not Enough: 21st Century Fluencies from the Digital Age; Corwin: Thousand Oaks, CA, USA, 2011. [Google Scholar]
  13. Gewerc, A.; Montero, L. Conocimiento profesional y competencia digital en la formación del profesorado. El caso del Grado de Maestro en Educación Primaria. RELATEC 2015, 14, 31–43. [Google Scholar]
  14. Cebrián-Herreros, M.; Cebri, M. Interactive Communication in the Cybermedia. Comunicar 2009, 17, 15–24. [Google Scholar] [CrossRef]
  15. Scolari, C. Hipermediaciones: Elementos de Una Teoría de la Comunicación Digital Interactiva; Gedisa: Barcelona, Spain, 2008. [Google Scholar]
  16. Ferrés, J.; Piscitelli, A.; Ferr, J. Media Competence. Articulated Proposal of Dimensions and Indicators. Comunicar 2012, 19, 75–82. [Google Scholar] [CrossRef]
  17. Gutiérrez-Martín, A.; Tyner, K. Media Education, Media Literacy and Digital Competence. Comunicar 2012, 19, 31–39. [Google Scholar] [CrossRef]
  18. Aguaded, I. From infoxication to the right to communicate. Comunicar 2014, 21, 07–08. [Google Scholar] [CrossRef]
  19. Van Dijk, J.; Van Deursen, A. Digital Skills: Unlocking the Information Society; Palgrave Macmillan: New York, NY, USA, 2014. [Google Scholar]
  20. Marzal, M.A. Evolución conceptual de la alfabetización en información a partir de la alfabetización múltiple en su perspectiva educativa y bibliotecaria. Investig. Bibliotecol. 2009, 23, 129–160. [Google Scholar]
  21. Lloyd, A. Information Literacy Landscapes: Information Literacy in Education, Workplace and Everyday Contexts; Chandos Publishing: Oxford, UK, 2008. [Google Scholar]
  22. Martín, Y.; Iglesias, A. Alfabetização da dados: Projetando um novo cenário de treinamento para o contexto universitário. RICI 2021, 14, 318–330. [Google Scholar] [CrossRef]
  23. Himanen, P. The Hacker Ethic and the Spirit of the Information Age; Random House Inc.: New York, NY, USA, 2001. [Google Scholar]
  24. Guitert, M.; Pérez-Mateo, M. La colaboración en la red: Hacia una definición de aprendizaje colaborativo en entornos virtuales. Teor. Educ. 2013, 14, 10–31. [Google Scholar]
  25. Escoda, A.P.; Conde, M.J.R. Evaluación de las competencias digitales autopercibidas del profesorado de Educación Primaria en Castilla y León (España). Rev. Investig. Educ. 2016, 34, 399–415. [Google Scholar] [CrossRef] [Green Version]
  26. Kirschner, P.A.; De Bruyckere, P. The myths of the digital native and the multitasker. Teach. Teach. Educ. 2017, 67, 135–142. [Google Scholar] [CrossRef]
  27. Mateus, J.-C.; Hernández-Breña, W. Design, Validation, and Application of a Questionnaire on Media Education for Teachers in Training. J. New Approaches Educ. Res. 2019, 8, 34–41. [Google Scholar] [CrossRef]
  28. Calvani, A.; Fini, A.; Ranieri, M.; Picci, P. Are young generations in secondary school digitally competent? A study on Italian teenagers. Comput. Educ. 2012, 58, 797–807. [Google Scholar] [CrossRef] [Green Version]
  29. Somyürek, S.; Coşkun, B.K. Digital competence: Is it an innate talent of the new generation or an ability that must be developed? Br. J. Educ. Technol. 2013, 44, E163–E166. [Google Scholar] [CrossRef]
  30. Peñalva-Vélez, A.; Napal, M.; Mendioroz, A.M. Competencia digital y alfabetización digital de los adultos (profesorado y familias). Int. J. New Educ. 2018, 1, 1–13. [Google Scholar] [CrossRef] [Green Version]
  31. Pérez, M.S.; Ortiz, M.G.; Flores, M.M. Redes sociales en Educación y propuestas metodológicas para su estudio. Cienc. Docencia Tecnol. 2015, 26, 188–206. [Google Scholar]
  32. Anderson, M.; Jiang, J. Teens, Social Media & Technology 2018. Pew Research Center, 2018. Available online: https://www.pewinternet.org/2018/05/31/teens-social-media-technology-2018/ (accessed on 15 July 2020).
  33. Floridi, L. The Fourth Revolution. How the Infosphere Is Reshaping Human Reality; Oxford University Press: Oxford, UK, 2014. [Google Scholar]
  34. Arroyo-Sagasta, A. Competencias en comunicación y colaboración en la formación de docentes. Rev. Mediterr. Comun. 2017, 8, 277–285. [Google Scholar] [CrossRef] [Green Version]
  35. Zhong, Z.-J. From access to usage: The divide of self-reported digital skills among adolescents. Comput. Educ. 2011, 56, 736–746. [Google Scholar] [CrossRef]
  36. Pagani, L.; Argentin, G.; Gui, M.; Stanca, L. The impact of digital skills on educational outcomes: Evidence from performance tests. Educ. Stud. 2016, 42, 137–162. [Google Scholar] [CrossRef] [Green Version]
  37. Siddiq, F.; Scherer, R. Is there a gender gap? A meta-analysis of the gender differences in students’ ICT literacy. Educ. Res. Rev. 2019, 27, 205–217. [Google Scholar] [CrossRef]
Figure 1. Item distribution according to difficulty/easiness indices and discrimination (rpb).
Table 1. Summary of descriptive data for the three competence areas (N = 609).

| Dimension | Lowest | Highest | Mean | Standard Deviation |
|---|---|---|---|---|
| KNOWLEDGE (max. 8; 8 dichotomous items: 0/1) | 0 | 8 | 4.27 | 1.575 |
| SKILLS (max. 10; 10 dichotomous items: 0/1) | 0 | 10 | 5.38 | 1.885 |
| ATTITUDES (max. 30; 6 Likert-type items: 1 to 5) | 6 | 30 | 26.05 | 4.534 |
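The three dimension scores summarised in Table 1 are simple sums of their items (8 dichotomous knowledge items, 10 dichotomous skills items, and 6 Likert-type attitude items scored 1 to 5). As an illustration only, and not the authors' actual analysis script, a minimal Python sketch of this scoring, assuming a DataFrame `answers` whose hypothetical columns k1–k8, s1–s10 and a1–a6 hold the item responses:

```python
import pandas as pd

def score_dimensions(answers: pd.DataFrame) -> pd.DataFrame:
    """Sum item scores into the three dimension totals reported in Table 1.

    Column names (k1-k8, s1-s10, a1-a6) are illustrative, not the
    instrument's actual variable names.
    """
    knowledge = [f"k{i}" for i in range(1, 9)]    # 8 dichotomous items (0/1)
    skills = [f"s{i}" for i in range(1, 11)]      # 10 dichotomous items (0/1)
    attitudes = [f"a{i}" for i in range(1, 7)]    # 6 Likert items (1-5)
    return pd.DataFrame({
        "KNOWLEDGE (max. 8)": answers[knowledge].sum(axis=1),
        "SKILLS (max. 10)": answers[skills].sum(axis=1),
        "ATTITUDES (max. 30)": answers[attitudes].sum(axis=1),
    })

# scores = score_dimensions(answers)
# scores.describe()  # minimum, maximum, mean and SD comparable to Table 1
```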
Table 2. Descriptive analysis of the items included in the test.

| Item | Competence Area | Mean (x̄) | Standard Deviation (SD) |
|---|---|---|---|
| C1.—INTERACTING THROUGH NEW TECHNOLOGIES | | 1.71 | 0.801 |
| 1. Knows how to deal with a friend request made by a stranger through social media | Skills | 0.81 | 0.390 |
| 2. Knows what should be done when writing messages using different communication tools | Knowledge | 0.61 | 0.489 |
| 3. Knows how to proceed when making a presentation | Skills | 0.29 | 0.456 |
| C2.—SHARING INFORMATION AND CONTENTS | | 1.45 | 0.768 |
| 4. Knows where to share a video | Knowledge | 0.22 | 0.416 |
| 5. Knows how to email a paper or task | Skills | 0.44 | 0.497 |
| 6. Knows what contents or resources could be shared on the Internet | Knowledge | 0.78 | 0.411 |
| C3.—ONLINE CITIZEN PARTICIPATION | | 1.57 | 0.964 |
| 7. Is aware of the media to use on the Internet to achieve as much citizen participation as possible in the shortest time possible | Skills | 0.38 | 0.485 |
| 8. Knows how to access and use online services | Knowledge | 0.66 | 0.473 |
| 9. Knows how to approach and use the online services available | Skills | 0.53 | 0.500 |
| C4.—COLLABORATING THROUGH DIGITAL TECHNOLOGIES | | 1.26 | 0.915 |
| 10. Is familiar with the digital tools that can be used for cooperative working | Knowledge | 0.33 | 0.472 |
| 11. Knows how to draw up an online working document in a cooperative manner | Skills | 0.47 | 0.500 |
| 12. Knows how to correct a text document using the track changes option | Skills | 0.45 | 0.498 |
| C5.—NETIQUETTE | | 1.96 | 0.928 |
| 13. Knows what should be done when a message includes multiple recipients | Knowledge | 0.58 | 0.494 |
| 14. Knows what to do upon receiving a chain message offering gifts | Skills | 0.60 | 0.490 |
| 15. Knows how to respond when faced by a cyberbullying situation | Skills | 0.78 | 0.413 |
| C6.—MANAGING DIGITAL IDENTITY | | 1.70 | 0.902 |
| 16. Understands friendship relationships both inside and outside the Internet | Skills | 0.62 | 0.485 |
| 17. Is aware of the consequences of having a negative identity on the Internet | Knowledge | 0.63 | 0.484 |
| 18. Is aware of the benefits of having several digital identities | Knowledge | 0.45 | 0.498 |
| TOTAL (max. 18 points), N = 609 | | 9.65 | 2.873 |
Table 3. Correlations between the knowledge and skills items.

Items 1–10 (columns 1–10):

| | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Item 1 | 1.000 | 0.183 | −0.023 | −0.099 | −0.005 | 0.128 | −0.112 | −0.039 | −0.001 | −0.061 |
| Item 2 | | 1.000 | −0.042 | −0.041 | 0.017 | 0.054 | 0.016 | 0.089 | 0.105 | 0.057 |
| Item 3 | | | 1.000 | −0.032 | 0.021 | 0.083 | 0.137 | 0.055 | 0.036 | 0.061 |
| Item 4 | | | | 1.000 | −0.013 | 0.000 | 0.074 | 0.062 | 0.048 | 0.182 |
| Item 5 | | | | | 1.000 | 0.015 | 0.064 | 0.137 | 0.077 | 0.055 |
| Item 6 | | | | | | 1.000 | 0.020 | 0.194 | 0.167 | 0.007 |
| Item 7 | | | | | | | 1.000 | 0.132 | 0.116 | 0.093 |
| Item 8 | | | | | | | | 1.000 | 0.221 | 0.012 |
| Item 9 | | | | | | | | | 1.000 | 0.110 |
| Item 10 | | | | | | | | | | 1.000 |

Items 1–10 (columns 11–18):

| | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 |
|---|---|---|---|---|---|---|---|---|
| Item 1 | 0.009 | 0.001 | 0.035 | 0.083 | 0.164 | −0.016 | −0.006 | 0.014 |
| Item 2 | −0.011 | −0.109 | 0.058 | 0.103 | 0.145 | −0.006 | 0.101 | 0.009 |
| Item 3 | 0.058 | −0.009 | −0.006 | 0.016 | 0.036 | −0.013 | −0.052 | −0.008 |
| Item 4 | −0.072 | −0.028 | 0.094 | 0.062 | 0.062 | −0.010 | 0.062 | −0.081 |
| Item 5 | 0.115 | 0.116 | 0.047 | 0.026 | 0.086 | −0.006 | 0.019 | 0.100 |
| Item 6 | 0.081 | 0.078 | 0.129 | 0.049 | 0.178 | 0.097 | 0.033 | 0.043 |
| Item 7 | 0.128 | 0.067 | 0.012 | 0.065 | 0.018 | 0.052 | 0.127 | −0.029 |
| Item 8 | 0.106 | 0.160 | 0.111 | 0.089 | 0.179 | 0.057 | 0.088 | −0.015 |
| Item 9 | 0.139 | 0.037 | 0.130 | 0.122 | 0.182 | 0.097 | 0.121 | 0.006 |
| Item 10 | 0.106 | 0.046 | 0.062 | −0.035 | 0.064 | −0.038 | −0.026 | 0.144 |

Items 11–18 (columns 11–18):

| | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 |
|---|---|---|---|---|---|---|---|---|
| Item 11 | 1.000 | 0.089 | 0.043 | 0.053 | 0.152 | −0.072 | 0.062 | 0.079 |
| Item 12 | | 1.000 | 0.132 | 0.118 | 0.133 | 0.059 | 0.152 | 0.028 |
| Item 13 | | | 1.000 | 0.077 | 0.186 | 0.026 | 0.042 | 0.134 |
| Item 14 | | | | 1.000 | 0.229 | −0.028 | 0.162 | 0.018 |
| Item 15 | | | | | 1.000 | 0.156 | 0.264 | 0.098 |
| Item 16 | | | | | | 1.000 | 0.128 | 0.026 |
| Item 17 | | | | | | | 1.000 | 0.050 |
| Item 18 | | | | | | | | 1.000 |
Table 4. Correlations between competences and attitudes.

| | C1 | C2 | C3 | C4 | C5 | C6 | TOTAL |
|---|---|---|---|---|---|---|---|
| C1—Interacting through new technologies | 1 | 0.040 | 0.093 * | 0.072 | 0.170 ** | 0.010 | 0.062 |
| C2—Sharing information and contents | | 1 | 0.243 ** | 0.170 ** | 0.203 ** | 0.081 * | 0.116 ** |
| C3—Online citizen participation | | | 1 | 0.231 ** | 0.227 ** | 0.137 ** | 0.147 ** |
| C4—Collaborating through digital technologies | | | | 1 | 0.191 ** | 0.157 ** | 0.096 * |
| C5—Netiquette | | | | | 1 | 0.227 ** | 0.225 ** |
| C6—Managing digital identity | | | | | | 1 | 0.119 ** |
| TOTAL ATTITUDE | 0.062 | 0.116 ** | 0.147 ** | 0.096 * | 0.225 ** | 0.119 ** | 1 |
* Significant correlation at 0.05 (bilateral). ** Significant correlation at 0.01 (bilateral).
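The starred values in Table 4 combine a correlation coefficient with a two-tailed significance test at the 0.05 and 0.01 levels. The following sketch shows one way such a matrix could be produced; it assumes Pearson correlations and a DataFrame `scores` holding the six competence scores and the total attitude score (the function and variable names are illustrative, not the authors' code):

```python
import pandas as pd
from scipy.stats import pearsonr

def starred_correlations(scores: pd.DataFrame) -> pd.DataFrame:
    """Pairwise Pearson correlations with * (p < 0.05) and ** (p < 0.01) flags."""
    cols = scores.columns
    out = pd.DataFrame(index=cols, columns=cols, dtype=object)
    for a in cols:
        for b in cols:
            r, p = pearsonr(scores[a], scores[b])   # two-tailed test by default
            stars = "**" if p < 0.01 else "*" if p < 0.05 else ""
            out.loc[a, b] = f"{r:.3f}{stars}"
    return out

# scores would contain columns such as C1 ... C6 and TOTAL_ATTITUDE;
# a nonparametric alternative (e.g., Spearman) could equally be used.
```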
Table 5. Statistics for knowledge and skills items.

| | Discrimination Index (rpb) | Difficulty Index (% Right Answers) | Mean of the Scale If Element Is Removed | Scale Variance If Element Is Removed | Corrected Total Element Correlation | Cronbach's Alpha If Element Is Removed |
|---|---|---|---|---|---|---|
| Item 1 | 0.18 | 81.3% | 8.84 | 8.012 | 0.040 | 0.554 |
| Item 2 | 0.32 | 60.8% | 9.05 | 7.590 | 0.158 | 0.537 |
| Item 3 | 0.21 | 29.4% | 9.36 | 7.908 | 0.053 | 0.554 |
| Item 4 | 0.19 | 22.2% | 9.43 | 7.969 | 0.047 | 0.554 |
| Item 5 | 0.32 | 44.2% | 9.21 | 7.588 | 0.153 | 0.538 |
| Item 6 | 0.36 | 78.5% | 8.87 | 7.562 | 0.231 | 0.526 |
| Item 7 | 0.33 | 37.8% | 9.28 | 7.555 | 0.173 | 0.534 |
| Item 8 | 0.43 | 66.3% | 8.99 | 7.296 | 0.287 | 0.514 |
| Item 9 | 0.45 | 52.5% | 9.13 | 7.201 | 0.299 | 0.511 |
| Item 10 | 0.30 | 33.5% | 9.32 | 7.652 | 0.145 | 0.539 |
| Item 11 | 0.37 | 47.5% | 9.18 | 7.427 | 0.212 | 0.527 |
| Item 12 | 0.39 | 44.8% | 9.21 | 7.394 | 0.226 | 0.525 |
| Item 13 | 0.38 | 58.0% | 9.07 | 7.407 | 0.224 | 0.525 |
| Item 14 | 0.36 | 60.03% | 9.05 | 7.466 | 0.204 | 0.529 |
| Item 15 | 0.53 | 78.2% | 8.87 | 7.168 | 0.413 | 0.496 |
| Item 16 | 0.27 | 62.4% | 9.03 | 7.723 | 0.110 | 0.546 |
| Item 17 | 0.39 | 62.6% | 9.03 | 7.412 | 0.230 | 0.524 |
| Item 18 | 0.28 | 45.3% | 9.20 | 7.710 | 0.107 | 0.547 |
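Table 5 reports standard classical test theory statistics. A hedged sketch of how comparable values could be computed from a 0/1 response matrix follows; it uses the rest score (total minus the item) for the point-biserial discrimination index, which is one common convention and not necessarily the exact procedure used by the authors:

```python
import pandas as pd
from scipy.stats import pointbiserialr

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of item scores (columns = items)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def item_analysis(responses: pd.DataFrame) -> pd.DataFrame:
    """Classical item analysis for dichotomous (0/1) items.

    responses: one row per student, one 0/1 column per item (hypothetical layout).
    """
    rows = []
    total = responses.sum(axis=1)
    for col in responses.columns:
        rest = total - responses[col]                   # score without the item
        rpb, _ = pointbiserialr(responses[col], rest)   # discrimination index
        rows.append({
            "item": col,
            "difficulty_pct": 100 * responses[col].mean(),   # % right answers
            "discrimination_rpb": rpb,
            "alpha_if_removed": cronbach_alpha(responses.drop(columns=col)),
        })
    return pd.DataFrame(rows)
```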
Table 6. Item difficulty of the knowledge and skills test.

| Items | % Right Answers | Criterion | Description | Level |
|---|---|---|---|---|
| 1, 6, 15 | 81, 79, 78 | >70 | Very easy | Basic (8 items) |
| 2, 8, 14, 16, 17 | 61, 66, 60, 62, 63 | 60–70 | Easy | Basic (8 items) |
| 5, 9, 11, 12, 13, 18 | 44, 53, 48, 45, 58, 45 | 40–60 | Moderate | Intermediate (6 items) |
| 7, 10 | 38, 34 | 30–40 | Difficult | Advanced (4 items) |
| 3, 4 | 29, 22 | <30 | Very difficult | Advanced (4 items) |
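The banding used in Table 6 can be read as a simple threshold rule on the percentage of right answers. A minimal sketch (the handling of the exact boundaries at 60, 40 and 30 is an assumption, since the table only gives the ranges):

```python
def classify_difficulty(pct_right: float) -> tuple[str, str]:
    """Map a percentage of right answers to the criterion and level of Table 6."""
    if pct_right > 70:
        return "Very easy", "Basic"
    if pct_right >= 60:
        return "Easy", "Basic"
    if pct_right >= 40:
        return "Moderate", "Intermediate"
    if pct_right >= 30:
        return "Difficult", "Advanced"
    return "Very difficult", "Advanced"

# For example, classify_difficulty(81) returns ("Very easy", "Basic")
# and classify_difficulty(22) returns ("Very difficult", "Advanced").
```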
Table 7. Statistical description of the attitude items (attitude dimension of the competence).

| | Lowest | Highest | Average | Standard Deviation | Skew | Kurtosis |
|---|---|---|---|---|---|---|
| Attitude 1 | 1 | 5 | 4.37 | 1.204 | −1.836 | 2.072 |
| Attitude 2 | 1 | 5 | 4.60 | 0.772 | −2.497 | 7.121 |
| Attitude 3 | 1 | 5 | 4.34 | 0.908 | −1.652 | 2.953 |
| Attitude 4 | 1 | 5 | 4.11 | 1.037 | −1.346 | 1.599 |
| Attitude 5 | 1 | 5 | 4.43 | 0.984 | −2.014 | 3.759 |
| Attitude 6 | 1 | 5 | 4.20 | 1.029 | −1.324 | 1.254 |
| TOTAL (N = 609) | 6 | 30 | 26.05 | 4.534 | −2.467 | 9.152 |
Table 8. Analysis of correlation between the attitude items.

| Items | 1 | 2 | 3 | 4 | 5 | 6 | Total |
|---|---|---|---|---|---|---|---|
| Attitude 1 | 1.000 | 0.384 ** | 0.333 ** | 0.109 ** | 0.294 ** | 0.302 ** | 0.646 |
| Attitude 2 | | 1.000 | 0.482 ** | 0.309 ** | 0.284 ** | 0.331 ** | 0.671 ** |
| Attitude 3 | | | 1.000 | 0.359 ** | 0.257 ** | 0.422 ** | 0.704 ** |
| Attitude 4 | | | | 1.000 | 0.259 ** | 0.392 ** | 0.614 ** |
| Attitude 5 | | | | | 1.000 | 0.299 ** | 0.608 ** |
| Attitude 6 | | | | | | 1.000 | 0.702 ** |
| TOTAL ATTITUDE | 0.646 ** | 0.671 ** | 0.704 ** | 0.614 ** | 0.608 ** | 0.702 ** | 1 |
** Significant correlation at 0.01 (bilateral).
Table 9. Total statistics for attitude items.

| | Discrimination Index (rpb) | Difficulty Index (% Correct Answers) | Scale Mean If Element Is Removed | Scale Variance If Element Is Removed | Corrected Total Element Correlation | Cronbach's Alpha If Element Is Removed |
|---|---|---|---|---|---|---|
| Attitude 1 | 0.44 | 21.68 | 10.540 | 0.404 | 0.230 | 0.714 |
| Attitude 2 | 0.33 | 21.45 | 11.714 | 0.538 | 0.321 | 0.676 |
| Attitude 3 | 0.38 | 21.71 | 10.999 | 0.552 | 0.344 | 0.665 |
| Attitude 4 | 0.38 | 21.94 | 11.268 | 0.403 | 0.235 | 0.707 |
| Attitude 5 | 0.36 | 21.62 | 11.463 | 0.408 | 0.171 | 0.704 |
| Attitude 6 | 0.42 | 21.84 | 10.589 | 0.523 | 0.290 | 0.670 |
Table 10. Matrix of main components.

| Variables | Component 1 | Component 2 |
|---|---|---|
| C1.—INTERACTING THROUGH NEW TECHNOLOGIES | −0.036 | 0.397 |
| C2.—SHARING INFORMATION AND CONTENTS | 0.116 | 0.508 |
| C3.—ONLINE CITIZEN PARTICIPATION | 0.107 | 0.601 |
| C4.—COLLABORATING THROUGH DIGITAL TECHNOLOGIES | 0.002 | 0.586 |
| C5.—NETIQUETTE | 0.204 | 0.625 |
| C6.—MANAGING DIGITAL IDENTITY | 0.061 | 0.448 |
| Attitude 1 | 0.547 | 0.227 |
| Attitude 2 | 0.707 | 0.119 |
| Attitude 3 | 0.733 | 0.130 |
| Attitude 4 | 0.645 | −0.099 |
| Attitude 5 | 0.544 | 0.145 |
| Attitude 6 | 0.724 | −0.017 |
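Table 10 reports the loadings of the six competences and the six attitude items on two principal components. A sketch of how a two-component solution could be extracted follows; standardising the variables and leaving the solution unrotated are assumptions, not necessarily the choices made in the study:

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def component_matrix(data: pd.DataFrame, n_components: int = 2) -> pd.DataFrame:
    """Loadings of each variable on the first n principal components."""
    z = StandardScaler().fit_transform(data)            # standardise variables
    pca = PCA(n_components=n_components).fit(z)
    # Loadings = eigenvectors scaled by the square root of the eigenvalues
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    return pd.DataFrame(
        loadings,
        index=data.columns,
        columns=[f"Component {i + 1}" for i in range(n_components)],
    )

# data would hold the six competence scores (C1-C6) and the six attitude items.
```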
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
