Produced with Scholar

Project: Educational Theory Practice Analysis

Project Overview

Project Description

Project Requirements

The peer-reviewed project will include five major sections, with relevant sub-sections to organize your work using the CGScholar structure tool.

BUT! Please don’t use these boilerplate headings. Make them specific to your chosen topic, for instance: “Introduction: Addressing the Challenge of Learner Differences”; “The Theory of Differentiated Instruction”; “Lessons from the Research: Differentiated Instruction in Practice”; “Analyzing the Future of Differentiated Instruction in the Era of Artificial Intelligence”; “Conclusions: Challenges and Prospects for Differentiated Instruction.”

Include a publishable title, an Abstract, Keywords, and Work Icon (About this Work => Info => Title/Work Icon/Abstract/Keywords).

Overall Project Word Length – At least 3500 words (the concentration of words should be on theory/concepts and educational practice)

Part 1: Introduction/Background

Introduce your topic. Why is this topic important? What are the main dimensions of the topic? Where in the research literature and other sources do you need to go to address this topic?

Part 2: Educational Theory/Concepts

What is the educational theory that addresses your topic? Who are the main writers or advocates? Who are their critics, and what do they say?

Your work must be in the form of an exegesis of the relevant scholarly literature that addresses and cites at least 6 scholarly sources (peer-reviewed journal articles or scholarly books).

Media: Include at least 7 media elements, such as images, diagrams, infographics, tables, embedded videos (either uploaded into CGScholar or embedded from other sites), web links, PDFs, datasets, or other digital media. Be sure these are well integrated into your work. Explain or discuss each media item in the text of your work. If a video is more than a few minutes long, refer to specific points with time codes or to the particular aspects of the media object that you want your readers to focus on. Caption each item sourced from the web with a link. You don’t need to include media in the references list – that should be mainly for formal publications such as peer-reviewed journal articles and scholarly monographs.

Part 3 – Educational Practice Exegesis

You will present an educational practice example, or an ensemble of practices, as applied in clearly specified learning contexts. This could be a reflection on a practice in which you have been involved, one you have read about in the scholarly literature, or a new or unfamiliar practice which you would like to explore. While not as detailed as in the Educational Theory section of your work, this section should be supported by scholarly sources. There is no strict minimum number of scholarly sources; six more, in addition to those for Section 2, is a reasonable target.

This section should include the following elements:

Articulate the purpose of the practice. What problem were they trying to solve, if any? What were the implementers or researchers hoping to achieve and/or learn from implementing this practice?

Provide detailed context of the educational practice applications – what, who, when, where, etc.

Describe the findings or outcomes of the implementation. What occurred? What were the impacts? What were the conclusions?

Part 4: Analysis/Discussion

Connect the practice to the theory. How does the practice that you have analyzed in this section of your work connect with the theory that you analyzed on the previous section? Does the practice fulfill the promise of the theory? What are its limitations? What are its unrealized potentials? What is your overall interpretation of your selected topic? What do the critics say about the concept and its theory, and what are the possible rebuttals of their arguments? Are its ideals and purposes hard, easy, too easy, or too hard to realize? What does the research say? What would you recommend as a way forward? What needs more thinking in theory and research of practice?

Part 5: References (as a part of and subset of the main References Section at the end of the full work)

Include citations for all media and other curated content throughout the work (below each image and video)

Include a references section of all sources and media used throughout the work, differentiated between your Learning Module-specific content and your literature review sources.

Include a References “element” or section using APA 7th edition with at least 10 scholarly sources and media sources that you have used and referred to in the text.

Be sure to follow APA guidelines, including sentence-case article titles, title-case journal titles (first letter of each major word capitalized), and italicized journal titles and volume numbers.


Comparing Digital Skills Frameworks

Section 1: Introduction

As digital technology develops and is adopted around the world, various governments and organizations have identified a need to incorporate education about technology into their school curricula and training programs. Teachers are concerned that students without strong digital skills will face tough competition in the job markets of the future, and governments are concerned that populations without digital skills will struggle to compete in an increasingly digital economy.

Digital skills are exactly that – skills or competences regarding the use of digital technology. As this paper shows, these skills go beyond the core functionality of hardware and software to include areas such as communication, collaboration, privacy, data, security, and social identity. Section 4 of this paper provides a comprehensive list of these skills.

Benefits of a framework

Beyond simply being ‘nice to have’, Ferrari (2013) argues that competence related to digital technology is now a ‘life skill’, similar to literacy and numeracy. Despite general acknowledgement of this reality in developed economies, a significant digital divide persists between the ‘haves and have nots’ – those with the skills needed to productively use digital technology and those without such skills (van Dijk, 2020).

Ferrari (2013) argues that inclusion in a digital-first world is determined not by access and use but rather by knowledge, skills, and attitudes regarding digital technology. The solution is then a matter of providing education, and digital skills frameworks are a crucial first step in that direction.

A well-constructed digital skills framework serves the same goal as any other type of mandate or recommendation regarding educational content. For schools, companies, or governments that want to enhance the digital skills of their students, employees, or citizens, a framework provides a set of skills for those organizations to teach. Since digital technology and the ways it is used change rapidly (relative to subjects such as mathematics or literature), any knowledge that an individual educator gained during their own time as a student may have little utility for modern students. Further, individual educators may not be up to date with current digital technology trends or with research on what digital skills students are likely to need in the future.

Digital skills frameworks then serve a critical role. Technology specialists and researchers develop frameworks as guides for educators and policy makers, and work to keep the frameworks current and relevant as new technology is developed or society uses technology in new ways. Educators and policy makers, for their part, interpret these frameworks in ways that make them relevant for their students or constituents.

School curricula development is just one possible application of a digital skills framework. Training programs for workers displaced by digital disruption in the economy, or for the elderly who have been caught off guard by the rapid adoption of new technology, can use frameworks to develop effective, targeted curricula.

If a framework is well established, then researchers can develop inventories to gather self-reported information from individuals about their current proficiency with digital technologies. Similarly, researchers can develop low-stakes diagnostic tests to assess an individual’s digital skills.

Such inventories and diagnostic tools can be used as pretests and post-tests to determine the effectiveness of a framework. They can be used to compare samples within a population to analyze differences in digital skills between grade levels, schools, regions and countries. The results can be used to analyze the various digital skills curricula and identify the most effective ones in order to share their methods with other teachers.
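To make this concrete, the snippet below sketches how such a pre/post comparison of inventory scores might be computed. It is a minimal illustration using entirely hypothetical scores, not data from any actual instrument discussed in this paper:

```python
from statistics import mean, stdev
from math import sqrt

def paired_gain(pre, post):
    """Mean gain and paired t-statistic for matched pre/post inventory scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    # t = mean difference divided by the standard error of the differences
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return mean(diffs), t

# Hypothetical self-reported digital skills scores (0-100 scale)
# before and after a digital skills course
pre  = [55, 62, 48, 70, 66, 53, 59, 61]
post = [63, 70, 55, 74, 72, 61, 66, 65]

gain, t_stat = paired_gain(pre, post)
print(f"mean gain = {gain:.2f}, paired t = {t_stat:.2f}")
```

A real study would, of course, use a validated instrument and an appropriate sampling design; the point here is only that a well-defined framework makes such quantitative comparisons possible at all.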

Governments can use these tests to compare economic classes, ethnic groups or any other type of demographic category in order to improve educational policies or allocate resources. Companies could use the results of these diagnostic tests to choose where to locate their offices in order to take advantage of a localized talent pool.

In short, planning effective education and making informed policy decisions related to digital skills requires that we have a reliable way to define and measure the results of our efforts. A comprehensive and detailed framework can make this possible.

Current state of frameworks and technology education

The current state of digital skills frameworks could best be described as an embarrassment of mediocre choices. One report from UNESCO researchers identified no fewer than 47 frameworks (Law et al., 2018, p. 8). Unfortunately, rather than each being unique and offering a meaningfully different perspective on digital skills education, many of these frameworks overlap significantly. Further complicating the matter, many frameworks have been developed relatively recently, and very few have had studies conducted to validate their constructs or to develop measurement tools. Despite this, from among the choices available, I wish to determine which framework best suits my needs as an educator.

From the perspective of an educator, lesson planning has two critical aspects: what to teach, and how to teach it. For education about technology, digital skills frameworks address the first aspect by providing a set of skills for educators to impart. For the other critical aspect, how to teach it, educators can refer to various best practices established by the STEM education field such as scaffolding, productive struggle, and collaborative discussions.

Scaffolding is the practice of providing educational support for students to allow them to accomplish tasks that would otherwise be too difficult for them (Wood et al., 1976).

Productive struggle, from the field of mathematics education, refers to “expend[ing] effort to make sense of mathematics, to figure something out that is not immediately apparent” (Hiebert & Grouws, 2007). The concept of productive struggle adapts well from mathematics education to technology education, given the similarity of the two fields. Both involve a student attempting to work through a practical task to a clear end or completed state. Students have a wide variety of tools at their disposal, but it is not necessarily clear which tools should be used for the task. Additionally, the function of the tools is based on abstract rules that are not immediately apparent from their surface-level representations and must be learned through trial and error.

Collaborative discussions are a tool for students to help each other through the productive struggle process (Israel et al., 2015). Using a set of questions provided by a collaborative discussion framework, students can ask each other for assistance and collectively work through any challenges they have in completing a task. The discussion process produces ideas and motivation to help students work through the problem and achieve results that the participants would not be able to achieve if they worked alone.

With a well-established framework, these general best practices could be adapted to develop specialized best practices for each specific skill; these could serve as the technology education equivalents of mathematics teachers using pizza or cake slices to teach the concept of fractions.

Personal interest

My personal interest in this topic relates to my professional work. I teach a digital skills course that’s part of an undergraduate program at a business school. I wanted to benchmark my course design against other leading business schools, but most don’t have core courses that cover productivity software as widely as I do. Fortunately, digital skills frameworks can serve that role instead. My personal motivation for this paper is to find a digital skills framework that can serve as a guide for improving the course I teach.

My school partnered with an NGO based in Seoul called the Center for Digital Literacy (CDL) to help develop an earlier version of the digital skills course I now teach. CDL applied their digital skills framework to this earlier version of the course, which is how I became familiar with their framework. Part of my goal with this paper is to see how their framework compares against the frameworks from larger organizations.

In an earlier paper, I analyzed three digital skills frameworks of note: the ISTE’s Standards for Students, the EU’s DigComp 2.2, and UNESCO’s Digital Literacy Global Framework.

In this paper, I expand my analysis to include two additional frameworks: IEEE’s DQ Framework, and the Center for Digital Literacy’s Digital Literacy Framework.

Research question #1: Which digital skills framework is best suited for teachers who want to include instruction on digital skills in their curriculum?

Research question #2: Which digital skills framework is best suited for researchers who want to assess digital skills of a target population or assess the effectiveness of a given digital skills curriculum?

Section two of this paper offers a detailed overview of these two additional frameworks. Section three follows with an analysis of both frameworks. Section four provides a comprehensive list of the digital skills found in any of the five frameworks and compares which skills are present in, or absent from, the most frameworks.

A note about terminology: The frameworks analyzed use a variety of terms to refer to digital skills individually or as a whole. These include “digital skills”, “digital competencies”, “digital competency”, “digital literacy” and “indicators”, but these terms, as used within their respective frameworks, are not distinct from one another in any practical manner. When discussing individual frameworks, this paper uses the same terms as the frameworks’ authors. When discussing digital skills in a general sense, this paper uses the term “digital skills”.

I should offer a disclosure before discussing the frameworks in depth. I work closely with CDL and I have helped them translate their framework and associated assessment tools into English. I don’t consider myself biased towards their digital skills framework over others; I agree with some parts and disagree with others. Also, CDL has not yet published their framework or an explanation of their philosophy and justifications for the design of their framework. I provide their entire framework as an appendix for the reader’s reference. Other discussions about the intent or purpose of the framework are based on personal communications I’ve had with CDL’s directors.

Section 2: Description of DQ Framework and CDL Framework

DQ Framework

The first framework to introduce is the Standard for Digital Intelligence – Framework for Digital Literacy, Skills, and Readiness (also referred to by the authors as the ‘DQ Framework’). It was developed by the Coalition for Digital Intelligence, a partnership between the IEEE Standards Association, the World Economic Forum (WEF), the Organisation for Economic Co-operation and Development (OECD), and the DQ Institute. It was published in 2020 by the IEEE Standards Association as IEEE Std 3527.1 (IEEE Computer Society, 2021).

The following video is the announcement of the launch of the Coalition for Digital Intelligence.

Media embedded December 16, 2023

The DQ Institute is a private nonprofit organization registered in the United States and in Singapore. The organization describes itself as “dedicated to setting global standards for digital intelligence” (DQ Institute, 2023a). The DQ Institute published an earlier version of the framework as the DQ Global Standards Report 2019 (Park, 2019).

The other members of the Coalition for Digital Intelligence have higher profiles. The IEEE Standards Association develops industry standards related to electrical and electronics engineering (IEEE SA, 2023). The WEF is an international non-governmental organization for public-private cooperation composed of corporate ‘partners’ (World Economic Forum, 2023). The OECD is an intergovernmental organization composed of 38 member states, primarily located in Europe, North America and South America (OECD, 2023).

The DQ Framework uses the term ‘digital intelligence’, which it defines as

“... a comprehensive set of technical, cognitive, meta-cognitive, and socio-emotional competencies that enable individuals to face the challenges of and harness the opportunities of digital life.” (IEEE Computer Society, 2021, p. 11)

The DQ Framework separates digital intelligence into eight primary categories, each connected to a guiding principle grounded in respect, as used in the Universal Declaration of Human Rights. These are listed in Table 1. The framework describes its overall guiding principle as “respect for human rights, dignity, and worth of the person in all areas of their digital life” (IEEE Computer Society, 2021, p. 15).

Table 1: DQ Framework’s eight types of digital intelligence
Digital intelligence area | Guiding principle
Digital identity | Respect for oneself
Digital use | Respect for time and the environment
Digital safety | Respect for life
Digital security | Respect for property
Digital emotional intelligence | Respect for others
Digital communication | Respect for reputation and relationships
Digital literacy | Respect for knowledge
Digital rights | Respect for rights

The DQ Framework’s eight areas are each described at three proficiency levels, referred to as “levels of maturity”. From lowest to highest, these are Digital Citizenship, Digital Creativity, and Digital Competitiveness. The authors refer to each of the 24 resulting combinations as a distinct competency. These are shown in Figure 1.

Figure 1: DQ Framework’s 24 competences (Park, 2019)

 

Each competency has a general descriptor and is further detailed with a list of associated knowledge, skills, and attitudes. An example of this for the competency related to online communication and collaboration is provided in Table 2 and Table 3.

Table 2: DQ Framework’s communication and collaboration competency
Digital intelligence area | Competence | Descriptor
Digital Communication | DQ14 Online communication and collaboration | Using technology effectively to communicate and collaborate collectively, including at a distance.

Table 3: Knowledge, skills and attitudes for DQ Framework’s communication and collaboration competency

Knowledge | Skills | Attitudes
Individuals understand how their online interactions might affect others’ feelings and recognize how others can be influenced by their online interactions (e.g., effects of online trolls). | Individuals demonstrate socio-emotional skills by showing sensitivity to and respect for the perspectives and emotions of others. | Individuals demonstrate an awareness of and compassion for the feelings, needs, and concerns of others online.

The stated goal of the DQ Framework is to aid the global coordination of improving digital literacy and competency by developing a standard that can be adopted by all stakeholders – including governments, companies and schools – and customized to fit local needs (IEEE Computer Society, 2021, p. 13).

The DQ Framework’s authors also state that it is an aggregation of over twenty leading digital skills frameworks, thus attempting to become a single framework that encompasses all of the digital skills identified by other major frameworks. The DQ Institute’s website provides an interactive tool that shows, for each of the skill areas included in the framework, which other major frameworks include that same skill (DQ Institute, 2023b). The referenced digital skills frameworks include each of the major frameworks discussed earlier in this paper: the EU’s DigComp, UNESCO’s Digital Literacy Global Framework, and the ISTE’s Standards for Students.

The DQ Framework was first published by the DQ Institute in 2019 under the name “DQ Global Standards Report 2019” (DQ Institute, 2019). After this publication, the DQ Institute partnered with branches of the IEEE, WEF, and OECD to form the Coalition for Digital Intelligence and publish the framework as the “DQ Framework”, IEEE Std 3527.1 (IEEE Computer Society, 2021).

The IEEE version of the framework stated that it would be reviewed annually, with updates made as necessary (IEEE Computer Society, 2021, p. 9). However, no subsequent updates have been published, and the coalition appears to no longer be active. As of this writing (November 2023), the website for the Coalition for Digital Intelligence, at www.coalitionfordigitalintelligence.org, resolves to a DNS error. Data captured by the Wayback Machine suggests this has been the case since early 2021 (Internet Archive, 2023). The cause is unclear, but it is possible that the coalition has disbanded and that future updates to the standard will not be developed. Relatedly, no validation studies or practical extensions of the framework have been published.

(Note: One day before submitting this paper, the DQ Institute’s website updated with an expanded version of the DQ Framework. However, it has not been published as an update to the existing IEEE standard. In order to not delay my submission of this paper any further, I made the decision to not carry out an analysis of this expanded version at this time.)

Separately, it is worth noting that since 2020 the DQ Institute has published an annual report called the Child Online Safety Index (cf. Park et al., 2020). The report tracks how exposed children are to “cyber-risks”, such as bullying, exposure to highly violent or sexual media, and requests from online contacts to meet in person or exchange sexual photos. It also tracks the efforts of governments to mitigate such risks. The insights contained in the report appear to have greatly influenced the content of the DQ Framework; this point is explored in the analysis section.

CDL Framework

The second framework to introduce is the Digital Literacy Framework developed by the Center for Digital Literacy (CDL). CDL is a non-profit organization based in Seoul, South Korea. Its stated mission is to “Promote a better life for individuals and contribute to creating a better world for all,” and to create “digital humanitarians, ... people who use digital technology wisely for the health of society” (Center for Digital Literacy, 2023). CDL created an updated version of their Digital Literacy Framework (referred to as the “CDL Framework” from here onward) this year but have not yet published it on their website. The authors provided me with a copy of the framework through personal communication (I. J. Park, personal communication, November 20, 2023).

In the following video, CDL’s chairman describes the organization’s vision of digital literacy.

Media embedded December 16, 2023

 

The CDL Framework defines digital literacy as “All the necessary competencies for life in the era of digital transformation” (I. J. Park & M. E. Kim, personal communication, 2023). This is then broken down into seven literacy areas, shown in Table 4.

Table 4: CDL Framework’s areas of digital literacy
Digital literacy area

Digital Technology Literacy
Digital Data Literacy
Digital Content Literacy
Digital Media Literacy
Digital Communication Literacy
Digital Community Literacy
Digital Wellness Literacy

 

Each of the seven areas has a descriptor. Each area also has six skills (referred to as “indicators”), and each of these indicators also has an associated descriptor. CDL has represented these indicators and literacy areas visually as a house, as seen in Figure 2.

Figure 2: Visual representation of the CDL Framework

 

Table 5 provides an example of this for the ‘Communication literacy’ indicator.

Table 5: CDL Framework’s communication indicator
Digital literacy area | Indicator | Descriptor
Digital Communication Literacy | Communication literacy | Basic literacy skills of listening, speaking, reading, and writing.

The CDL Framework is relatively recent. The current version was created in the summer of 2023 and is a significant change from an earlier version that is available on CDL’s website. The framework does not yet have the kinds of details – such as levels of proficiency, or the specific knowledge, skills, and attitudes associated with each indicator – that are found in leading frameworks from large organizations.

However, CDL has developed a 65-item inventory to measure each indicator included in the framework. Earlier this year, the organization partnered with two researchers to conduct a validation study of the inventory (Kim et al., 2023). This is a step towards academic acceptance that many large frameworks have not taken.
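Validation studies of this kind typically report internal-consistency statistics such as Cronbach’s alpha. As a rough sketch of what that computation involves (using made-up responses, not CDL’s actual data), alpha can be computed directly from item scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item).

    Uses population variance; the item/total variance ratio, and hence
    alpha, is the same whether population or sample variance is used.
    """
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# Hypothetical responses: 3 items rated 1-5 by 5 respondents
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Reliability figures like this are what allow a framework’s inventory to be taken seriously as a measurement instrument rather than an informal questionnaire.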

Section 3: Analysis of DQ Framework & CDL Framework

DQ Framework

The DQ Framework emphasizes health and safety, and child safety in particular, both in the included competences and in the report that introduces the framework. Competence 3, behavioural cyber-risk management, lays out many of the common risks that one can be exposed to online, including bullying, harassment, and stalking. Competence 5, digital empathy, describes skills related to understanding others’ emotions and knowing what impact our behaviours have on others. Competence 2, balanced use of technology, and competence 10, healthy use of technology, describe balanced use of technology and understanding its impacts on mental and physical health. Other competences describe identifying and avoiding harmful content, managing privacy, and managing one’s digital footprint. Digital footprint refers to the “trails of information and corresponding metadata” produced by online activity that influence a user’s reputation (IEEE Computer Society, 2021, p. 23).

The emphasis on child safety stems from the Child Online Safety Index (Park et al., 2020), a report published by the DQ Institute (one of the organizations that participated in developing the DQ Framework). This report is primarily a measure of how well each government addresses the issue of ‘cyber-risks’ faced by children online, such as education on the types of risks that exist, and having systems in place to address any specific cases of harmful content or behaviour that do occur.

However, the report also includes the results of a global survey of over 145,000 children across 30 countries about the types of risky behaviour they have engaged in online. Figure 3, from the report, provides a summary of the findings.

Figure 3: Child safety related findings from Park et al. (2020)

 

The report highlights what seems to be a blind spot in the concerns held by parents and educators. Media discussions about problematic online behaviour often centre on teenagers. Below that age, the assumption seems to be that limiting screen time is sufficient and that parents don’t need to pay particular attention to which online social communities their children participate in. The report’s findings, however, suggest that children are able to find and participate in all types of online communities from a young age.

In any case, the findings from the Child Online Safety Index appear to have strongly influenced the choice of competences to include in the DQ Framework.

A second point of interest of the DQ Framework is the use of guiding principles for each of the eight digital intelligence areas, as outlined in Table 1. Some of this amounts to borrowing human rights phrasing in order to impart a sense of importance. For example, the DQ Framework lists ‘respect for time’ and ‘respect for reputation’ as guiding principles for the areas of digital use and digital communication, respectively. However, none of the UN’s core human rights conventions – the Universal Declaration of Human Rights (United Nations General Assembly, 1948), the International Covenant on Civil and Political Rights (United Nations General Assembly, 1966), and the International Covenant on Economic, Social and Cultural Rights (United Nations General Assembly, 1966) – describe a right to time or a right to reputation.

The difference between the three tiers of proficiency included in the DQ Framework – Digital Citizenship, Digital Creativity, Digital Competitiveness – is large. Table 6 provides an example of this for the Digital Safety skill area. Given these differences, the tiers could be alternatively described as ‘beginner’, ‘intermediate’, and ‘advanced’, or ‘child’, ‘teen/adult’, and ‘industry professional’.

Table 6: DQ Framework’s digital safety intelligence area and associated competences and descriptors
Digital intelligence area | Competency | Descriptor
Digital safety | DQ3 Behavioral cyber-risk management | Identifying, mitigating, and managing cyber-risks (e.g., cyberbullying, harassment, and stalking) that relate to personal online behaviors.
Digital safety | DQ11 Content cyber-risk management | Identifying, mitigating, and managing content cyber-risks online (e.g., harmful user-generated content, racist/hateful content, image-based abuse).
Digital safety | DQ19 Commercial and community cyber-risk management | Identifying, mitigating, and managing commercial or community cyber-risks online, such as organizational attempts to exploit individuals financially or through ideological persuasion (e.g., embedded marketing, online propaganda, and gambling).

As Table 6 shows, the DQ Framework attempts to be applicable both to children, offering methods to mitigate the risks described earlier, and to adults and higher-proficiency individuals. While admirable, I question the necessity of including the highest tier of proficiency (labeled ‘Digital Competitiveness’). The use case that digital skills frameworks most naturally lend themselves to is general education – describing the skills that all young children should have, that all students should have, or that all workers in the modern economy should have, for example. By contrast, the DQ Framework’s highest tier of proficiency includes competencies like the following:

DQ20: Organizational cyber security management
Recognizing, planning, and implementing organizational cyber security defenses.

Such skills are highly specialized and generally possessed only by network security specialists. It is difficult to imagine that, now or in the near future, an adult working in a typical service-economy role (a ‘white-collar worker’) would require or even meaningfully benefit from possessing advanced cyber security skills. A digital skills framework cannot reasonably describe a set of skills that the general population would benefit from having while also outlining the curriculum needed to train workers for the types of highly skilled technical roles that a large organization requires. If the DQ Framework is attempting the latter, it is glaringly incomplete: it does not describe skills related to learning hardware or new software, and it makes only brief mention of fundamental concepts for IT experts such as programming and algorithmic thinking.

CDL Framework

As noted earlier, the entire CDL Framework is provided in Appendix 1.

The Center for Digital Literacy’s Digital Literacy Framework (‘CDL Framework’) does not (yet) break each indicator down into levels of proficiency. Notably, rather than describing a general intermediate level of proficiency, the framework’s descriptors target a decidedly above-intermediate level.

Take, for instance, indicator 3.4, ‘Creating content’:

Generating information and creative ideas, and creating various types of high-quality, popular content by making full use of various digital technologies, tools, and techniques.

This is also noticeable in indicator 2.5, ‘Visualizing data’:

Translating complex data into intuitively understandable forms and conveying information effectively using various graphs and visualization techniques.

Targeting a higher level of proficiency with the general purpose descriptors for each indicator gives a clearer sense of the authors’ vision for highly developed digital literacy. At the same time, this produces a tradeoff with accessibility for anyone looking to create a curriculum or assessment tool for learners with low technological proficiency.

Another aspect of the CDL Framework that stands out when compared with other digital skills frameworks is the number of skills (also referred to as ‘indicators’ or ‘competences’ in other frameworks). The DQ Framework, introduced previously in this paper, has 8 competence areas. (If the proficiency levels in the DQ Framework are considered as separate competences, as the DQ Framework’s authors do, then it has 24 competences.) Other leading digital skills frameworks include the ISTE’s Standards for Students, which has 28 skills, and DigComp 2.2, which has 21 competences. The CDL Framework, with seven skill areas and six indicators for each, has a total of 42 indicators.

Having more skills than other frameworks is not necessarily a positive. An effective framework is wide-reaching but also sufficiently concise, reducing unnecessary complexity by combining similar skills into a single item or removing items that are not essential. This is especially important for digital skills frameworks: there is already an almost endless number of skills that could be included, and new skills appear as digital technology advances.

This paper explores this point by examining some of the indicators included in the CDL Framework that are not commonly found in other digital skills frameworks. For reference, Section 4 contains a full keyword comparison of the CDL Framework and other selected frameworks.

Independent learning indicators

First, I’ll describe the indicators related to learning. Indicator 1.4, ‘Trying and assessing new technologies’, covers learning about technology itself. However, the framework includes another indicator related to learning in general, i.e. taking advantage of technological resources and tools to enhance learning of any topic. This is described in indicator 1.5, Independent Learning:

Independently learning new things without being taught, evaluating one's own learning outcomes and processes, and managing and developing learning capabilities.

Notably, neither ‘digital’ nor ‘with technology’ appears in the descriptor. (This is a departure from many leading digital skills frameworks.) According to the framework’s author, Il Jun Park of the Center for Digital Literacy (CDL), omitting these phrases was an intentional choice. He argues that the vocational learning model of the past, in which students acquired a set of specialized knowledge and skills that provided employment opportunities for their entire working life, is no longer realistic. Developments in digital technology have shortened the shelf life of vocational skills, and today’s students will need to continually learn (or ‘upskill’) in order to maintain a set of skills that remain relevant in the job market. Whether or not these workers use digital technology to support their learning, a need for continual learning has become the norm in modern developed economies, according to Park (personal communication, November 20th, 2023).

This broader perspective that Park takes, which is also shared by CDL as an organization, is the reason for choosing to not include ‘digital’ in many of the indicator descriptors. Tellingly, the framework defines digital literacy as:

All the competencies needed for life in the era of digital transformation.

As seen with the Independent Learning indicator above, the scope of this framework goes beyond competencies directly related to understanding or using technology. An additional example of this expanded scope is found in the description of the Digital Media Literacy skill area:

The competency to understand media; search, record, store, share, and use information through media; adhere to ethics; and critically assess media.

The rationale behind defining the indicators broadly is understandable, but this choice risks failing to sufficiently highlight the difference between digital and non-digital versions of the same core concept. Commonly accessible digital learning resources and methods can differ significantly from non-digital resources covering the same content. Digital media, such as social media, function very differently from traditional media. By taking the broadest possible definition of these concepts and others like them, the framework loses specificity and focus on the digital part of the digital era.

Arguing this point further: a well-structured school curriculum would already include content on meta-cognition, teaching students to be aware of their own learning habits and results. Similarly, media literacy is an established area with various curricula that educators can reference to incorporate relevant media skills into their lessons. If general concepts from the field of media literacy transfer to the digital realm, then digital media literacy need only outline how these skills relate to digital media in particular.

Interpersonal skills indicators

A second category of indicators not commonly found in other frameworks are those related to interpersonal skills, such as empathy, persuasion, managing conflicts, and emotional regulation. Similar to the skills described in the previous section, the descriptors for these skills make no direct mention of digital environments, digital communities, or digitally-mediated relationships.

Taking a narrow view, the inclusion of these skills can be justified without too much trouble. Digital media and tools put us in contact with far more people, with more diverse backgrounds and perspectives, than we would encounter without the use of internet-based communications. At the same time, such communications often take place in environments that are very different from the environments our species evolved in. In these situations, many forms of non-verbal communication that we rely on during in-person communication – including body language, posture, eye contact, facial expressions, tone of voice, and touch – are made less effective or impossible by the communication channel in use.

Understanding the minds of others in a conversation becomes more difficult when non-verbal cues aren’t available, so intentional empathizing, by actively trying to understand the perspectives of others, becomes more important. Interacting with people with whom we share fewer experiences and views can lead to more conflicts and disagreements, so managing such situations also becomes more important, and strategies for doing so within a digital community may differ in meaningful ways from those employed in in-person communities.

CDL, however, takes a broader perspective by viewing the inclusion of ‘digital’ in descriptors for soft skills as harmfully limiting the scope of the related concepts. Park provides three arguments to support this position.

First, digital interactions and communication are not a small subset of communication or a specialized use case. Many relationships between people in the modern world involve digital interactions and in-person interactions, and many relationships are entirely mediated by digital communication tools. People fluidly mix analogue and digital communication, such as chatting with a friend in person and then picking up the conversation later through text message, with little perceived difference between the forms. To Park, digital and analogue communication are functionally combined.

Second, the lines separating digital and analogue communication are increasingly blurred as technology advances. Tools such as video calls and conferences, combined with high-resolution displays, high-bandwidth connections, and high-quality microphones and cameras, provide many of the same affordances as analogue communication. Both the objective reality and the subjective experience of these interactions become a mix of digital and analogue.

Third, people who were born before the era of mass adoption of smart devices have needed to acquire digital communication skills. By contrast, people born in the era of ever-present smart computing are generally highly familiar with digital communication but may need to acquire analogue communication skills. For instance, preferring to communicate through text messages can lead to problems in sensitive conversations that would be better suited to a phone call or an in-person meeting. To Park, digital literacy includes the ability to appropriately judge when to use digital tools and when to use analogue tools instead. In this context, the philosophy behind CDL’s definition of digital literacy – “All the competencies needed for life in the era of digital transformation.” – becomes clearer.

Section 4: Keyword coding & comparison

This paper, together with its companion paper (my work from the previous semester), analyzes five digital skills frameworks in depth, describing their vision, structure and unique elements, while also critiquing various aspects of the design of each.

Another way to compare these frameworks is to comprehensively list the skills outlined in each, providing an at-a-glance comparison of which skills are or are not included in each framework.

Before presenting my own listing of the skills described in each framework, I provide two examples of similar listings and comparisons done by other researchers, namely Law et al. (2018) and Lee and Fanea-Ivanovici (2022).

Law et al. (2018)

Law et al. (2018) is a report that presents a proposal for a new digital skills framework, which the authors titled the ‘Digital Literacy Global Framework’. The authors produced the report on behalf of UNESCO as an effort to establish a tool for measuring progress toward SDG 4’s indicator 4.4. (This indicator includes “technical … skills, for employment, decent jobs and entrepreneurship”.)

Law et al. compared nine digital skills frameworks as part of their literature review. The individual skills they compared against are those from the framework they propose in the report. Their methodology was to identify the keywords from each skill (n = 21) and skill area (n = 7) in their own framework, and then count each instance of those keywords appearing in the analyzed frameworks. For instance, if the term ‘communication’ appeared in multiple areas of an analyzed framework, each of those instances was counted. This methodology produced a table that gives a sense of how much emphasis each skill or skill area received in the analyzed frameworks. Figure 4 is my recreation of their result. Figure 5 shows the same data, colour-coded as a ‘heat map’. (The colour scheme I’ve used is inspired by the ‘Ironbow’ thermal palette popularized by the thermal cameras produced by the company Flir. For reference, Figure 6 shows a sample image using this palette.)

Figure 4: Keyword coding of nine digital skills frameworks based on Digital Literacy Global Framework proposed by Law et al. (2018) – Table 3 from Law et al. (2018)
Figure 5: Data from Figure 4, with colour coding to show relative frequency of selected keywords
Figure 6: Reference for Figure 5 – An image from a thermal camera using the ‘ironbow’ thermal palette to indicate relative heat. Yellow represents the hottest part of the image, while purple represents the coldest part of the image. (Flir, 2023)
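The counting step described above can be sketched in a few lines of code. This is a minimal illustration only: the keyword list and framework texts below are placeholders, not the actual data from Law et al. (2018), and simple substring matching, as used here, would also count variants such as ‘communications’.

```python
# Minimal sketch of the keyword-counting methodology described above.
# The keywords and framework texts are illustrative placeholders only.

keywords = ["communication", "collaboration", "privacy", "content"]

frameworks = {
    "Framework A": "digital communication and collaboration; content creation; communication norms",
    "Framework B": "protecting privacy; developing digital content",
}

def keyword_counts(text, keywords):
    """Count how many times each keyword appears in a framework's text."""
    lowered = text.lower()
    return {kw: lowered.count(kw) for kw in keywords}

# One row per framework, as in the emphasis table recreated in Figure 4.
table = {name: keyword_counts(text, keywords) for name, text in frameworks.items()}

for name, counts in table.items():
    print(name, counts)
```

Each row of the resulting table corresponds to one analyzed framework, and higher counts indicate greater emphasis on that keyword, which is what the heat-map colouring in Figure 5 visualizes.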

 

The authors summarized the findings by noting that the following competences appeared in the highest number of frameworks:

  • Hardware and software operations
  • Information and data literacy
  • Interacting through digital technologies
  • Developing digital content
  • Copyright and licenses
  • Protecting personal data and privacy
  • Identifying digital competence gaps

Additionally, Law et al. found that keywords related to the following competences appeared most frequently, and note that each relates to technical competence with productivity software, most often used on a desktop computer:

  • Developing digital content
  • Browsing, searching and filtering data, information and digital content

Lee & Fanea-Ivanovici (2022)

A second thematic comparison is provided in Lee & Fanea-Ivanovici (preprint, 2022). The researchers analyzed 15 digital skills frameworks, produced a comprehensive list of every skill listed in at least one of the analyzed frameworks, and noted which frameworks each of those skills appears in. They identified 43 unique skills. Figure 7 shows the results of their comparison. The digital skills frameworks are listed in chronological order in the table, from earliest to most recent.

Figure 7: Keyword coding of 15 digital skills frameworks – Table 7 from Lee & Fanea-Ivanovici (2022)

 

The authors’ primary conclusion is that the number of skills included in frameworks significantly increased from 2003 to 2022. They interpret this result as reflecting the expansion in the ways digital technology is used over that period.

However, the data presented by the authors provide weak support for this conclusion. The most recent digital skills framework in their analysis, CDL’s Digital Literacy Framework (analyzed in Section 3), is unusual in the breadth of its scope. If the CDL Framework and the earliest framework in their analysis, COQS, are removed, the data for the 13 remaining frameworks show no pattern of increasing skill counts from 2013 to 2019. Figure 8 visualizes this result.

Figure 8: Total number of skills included in frameworks analyzed in Lee & Fanea-Ivanovici (2022). The earliest and most recent frameworks included in the study have been removed from this chart. A line of best fit is shown in light blue.
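The trend check behind this kind of chart can be reproduced with a simple ordinary-least-squares fit. The (year, skill count) pairs below are illustrative placeholders, not the actual values from Lee & Fanea-Ivanovici (2022); the point is only that a near-zero slope indicates no clear increase over time.

```python
# Sketch of fitting a line of best fit to (year, skill count) data.
# The data points are illustrative placeholders only.

years = [2013, 2015, 2016, 2017, 2018, 2019]
skill_counts = [22, 25, 21, 24, 23, 22]

def least_squares_slope(xs, ys):
    """Slope of the ordinary least-squares line through the points (xs, ys)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

slope = least_squares_slope(years, skill_counts)
print(f"trend: {slope:+.2f} skills per year")  # a slope near zero suggests no trend
```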

Keyword comparison

For this paper, I’ve carried out my own keyword comparison of the five frameworks stated in the introduction: the European Union’s DigComp 2.2, UNESCO’s Digital Literacy Global Framework (DLGF), ISTE’s Standards for Students (ISTE-S), DQ Institute’s DQ Framework, and Center for Digital Literacy’s Digital Literacy Framework (CDL Framework).

The methodology employed for this comparison was a keyword coding of each primary skill / competence listed in each of the five frameworks, based on the name of the skill itself and the descriptor provided within the framework. The keywords are listed according to broad thematic groupings, based on my determination.

Two of the frameworks, DigComp 2.2 and DQ Framework, include additional detail for each competence in the form of a listing of the knowledge, skills and attitudes associated with each competence. DigComp 2.2 includes even further detail in the form of a description of each competence across eight tiers of competence. None of this was considered when coding for keywords. This choice was made for two reasons.

First, the name and primary description of each competence provide the best insight into the intent of each framework’s authors. Three of the five frameworks analyzed do not (yet) have a detailed published specification that goes beyond the primary skills and one 20- to 40-word descriptor for each. Including the additional material provided by the authors of DigComp 2.2 or the DQ Framework would amount to giving more consideration to the intent of those two frameworks.

Second, for any educator or policy maker looking to make use of one of these frameworks, the competences and their primary descriptions are what they would spend most of their time considering.

The results of my keyword comparison follow. Figure 9 shows keywords that I’ve related to ‘software and hardware’ and ‘information, media & data’. Figure 10 shows keywords related to ‘communication, content, collaboration, community’ and ‘health, safety & ethics’. Figure 11 shows keywords related to ‘identity’, ‘programming & computational thinking’, ‘learning’ and ‘design’. (A spreadsheet version of this data is available at
https://docs.google.com/spreadsheets/d/1S-x0SHprZr1peURxUOTCXdQH1HsqpzQj29LN01ZjbXk/edit?usp=sharing.)

Figure 9: Presence of keywords related to ‘software and hardware’ and ‘information, media & data’ within selected digital skills frameworks
Figure 10: Presence of keywords related to ‘communication, content, collaboration, community’ and ‘health, safety & ethics’ within selected digital skills frameworks
Figure 11: Presence of keywords related to ‘identity’, ‘programming & computational thinking’, ‘learning’ and ‘design’ within selected digital skills frameworks
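The coding behind these figures amounts to building a keyword-presence matrix and tallying keywords per framework. A minimal sketch follows; the keyword assignments shown are illustrative placeholders reflecting the kind of coding choices involved, not my actual coding data.

```python
# Sketch of the presence-matrix construction behind Figures 9-11 and Table 7.
# The framework names are real; the keyword assignments are illustrative only.

coded_skills = {
    "DigComp 2.2": {"browsing", "evaluating information", "netiquette"},
    "CDL Framework": {"browsing", "independent learning", "empathy", "visualizing data"},
}

# Union of all keywords identified across the frameworks.
all_keywords = sorted(set().union(*coded_skills.values()))

# Presence matrix: True where a framework includes the keyword.
matrix = {
    kw: {fw: kw in skills for fw, skills in coded_skills.items()}
    for kw in all_keywords
}

# Per-framework keyword totals, as reported in Table 7.
totals = {fw: len(skills) for fw, skills in coded_skills.items()}
print(totals)
```

Rendering the boolean matrix as a grid, one row per keyword and one column per framework, gives the at-a-glance comparison format used in the figures.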

 

With 38 skills identified, the CDL Framework includes many more skills than the other frameworks analyzed for this paper. Table 7 provides a count of identified skills for each framework.

Table 7: Count of unique skill keywords included in selected digital skills frameworks
Framework Number of skills
DigComp 2.2 24
DLGF 24
DQ Framework 21
ISTE-S 25
CDL Framework 38

 

Various surface-level differences between the frameworks become apparent when reviewing the keyword comparison. Only the DQ Framework and the CDL Framework have multiple skills related to data. The DQ Framework has no competences describing general skills for using software and hardware, or for programming and computational thinking. Design process skills appear only in ISTE-S.

However, having read each of these frameworks in detail and written analyses of each, I do not find the results of the keyword analysis particularly informative. A simple keyword analysis hides much of the detail needed to effectively evaluate and compare these frameworks. Several examples of this follow.

The CDL Framework includes many more skills than the other frameworks partly because it splits what other frameworks cover in a single skill into multiple skills. For example, the CDL Framework has three separate skills: ‘accepting technology’, ‘understanding benefits and harms [of technology]’, and ‘using technology to its fullest’. Most other frameworks have one skill referencing productive use of technology, and that skill can be interpreted to include acceptance and an understanding of pros and cons, eliminating the need to describe these as separate skills. If the CDL Framework offered a detailed breakdown of its indicators, similar to how the DQ Framework details each of its competences with knowledge, skills and attitudes, it would be possible to assess whether creating these three separate indicators provides a meaningful improvement over other frameworks. Since the CDL Framework does not include this level of detail, the difference appears to be one of categorization.

The other factor behind the CDL Framework’s larger skill count is the high proficiency level targeted in its descriptions, which leads to descriptors that state many component skills. Each of the other frameworks describes a general level of proficiency within the primary descriptions of its skills.

DigComp 2.2, by contrast, yielded 24 skill keywords in my comparison. However, DigComp is part of the Key Competences for Lifelong Learning, a collection of frameworks developed by the EU. Taken together, these frameworks include many of the competences shown in Figures 9, 10 and 11, including mental and physical health, social skills, learning habits, core (traditional) literacy skills, and data analysis and data-based decision making. If included, the collection of frameworks would have more competences than the CDL Framework. Deciding exactly where to draw the line between what to include and exclude in this analysis is a somewhat arbitrary decision that skews the data one way or another.

A separate reason for not fully trusting the keyword comparison is its limited ability to capture the intent of the authors. In a young field such as digital skills, definitions of core terms have not yet been well established, and interpretations of those terms can differ greatly. For instance, the CDL Framework is the only framework analyzed for this paper without any keywords regarding ‘modifying content’, despite having six highly detailed indicators dedicated to digital content. The CDL Framework’s author, Il Jun Park, argues that ‘creating content’ and ‘modifying content’ are not meaningfully different, so a separate indicator related to ‘modifying content’ is unnecessary (Park I.J., personal communication, November 20th, 2023).

Finally, a keyword comparison obscures the areas of relative focus of each framework, and in doing so it hides their strengths. The keyword analysis doesn’t make clear the focus of the ISTE-S on students in the classroom or the emphasis in DigComp 2.2 on vocational skills. The gaps that the authors of the DQ Framework identified and sought to address (child safety), or those addressed by the DLGF (low-proficiency learners with limited access to technology), are invisible in the keyword analysis results.

As a concluding note, this keyword analysis process was highly informative for me. I carried out the analysis believing that the result would offer great insight into the compared frameworks. In fact, the individual, in-depth analysis of each framework provided far more information, but I only know that now, after conducting both analyses.

Summary

I turn now to the research questions introduced at the start of this paper.

Research question #1: Which digital skills framework is best suited for teachers who want to include instruction on digital skills in their curriculum?

For a teacher looking for the most practical and accessible framework to reference, the DQ Framework would be the better fit. It provides detailed breakdowns of knowledge, skills and attitudes for each competence, and an experienced teacher can use these details as productive starting points to build lessons around. The CDL Framework, in its current form, does not provide such details, meaning a teacher would have to put in a significant amount of work to create these breakdowns of core skills to impart before developing a lesson plan to teach those skills.

Research question #2: Which digital skills framework is best suited for researchers who want to assess digital skills of a target population or assess the effectiveness of a given digital skills curriculum?

For a researcher looking to assess the digital skills of a population, the CDL Framework provides better support. The authors have published an internal validation study on a set of inventory questions that can be used to quickly determine the self-reported skills of study participants. However, beyond this self-report question set, no fuller assessment tool is currently associated with the framework.


References

Center for Digital Literacy. (2023). Center for Digital Literacy. https://sites.google.com/view/cdlorg/

Dijk, J. van. (2020). The Digital Divide. John Wiley & Sons.

DQ Institute. (2023a). About. DQ Institute. https://dqinstitute.org/about/

DQ Institute. (2023b). Digital Literacy. DQ Institute. https://dqinstitute.org/global-standards/

Ferrari, A. (2013). DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe. JRC Publications Repository. https://doi.org/10.2788/52966

Hiebert, J., & Grouws, D. A. (2007). The effects of classroom mathematics teaching on students’ learning. Second Handbook of Research on Mathematics Teaching and Learning, 1(1), 371–404.

IEEE Computer Society. (2020). IEEE Standard for Digital Intelligence (DQ)–Framework for Digital Literacy, Skills, and Readiness. IEEE Std 3527.1-2020, 1–47. https://doi.org/10.1109/IEEESTD.2021.9321783

IEEE SA. (2023). IEEE - About us. https://standards.ieee.org/about/

Internet Archive. (2023). Squarespace - Website Expired. https://web.archive.org/web/20210223043922/https://www.coalitionfordigitalintelligence.org/

Israel, M., Pearson, J. N., Tapia, T., Wherfel, Q. M., & Reese, G. (2015). Supporting all learners in school-wide computational thinking: A cross-case qualitative analysis. Computers & Education, 82, 263–279. https://doi.org/10.1016/j.compedu.2014.11.022

Kim, H. C., Lim, J. Y., Park, I., & Kim, M. (2023). A Study on the Development and Validation of Digital Literacy Measurement for Middle School Students. Journal of The Korea Society of Computer and Information, 28(9), 177–188. http://dx.doi.org/10.9708/jksci.2023.28.09.177

Law, N., Woo, D., de la Torre, J., & Wong, G. (2018). A Global Framework of Reference on Digital Literacy Skills for Indicator 4.4.2. UNESCO Institute for Statistics. https://unevoc.unesco.org/home/Digital+Competence+Frameworks/lang=en/id=4

Lee, Y.-T., & Fanea-Ivanovici, M. (2022). An Exploratory Study of Digital Literacy Frameworks: A Comparative Analysis (SSRN Scholarly Paper 4088293). https://doi.org/10.2139/ssrn.4088293

OECD. (2023). List of OECD Member countries - Ratification of the Convention on the OECD. https://www.oecd.org/about/document/ratification-oecd-convention.htm

Park, Y. (2019). DQ Global Standards Report 2019 (p. 61). https://www.dqinstitute.org/wp-content/uploads/2019/03/DQGlobalStandardsReport2019.pdf

Park, Y., Gentile, D. A., Morgan, J., He, L., Allen, J. J., Jung, S. M., Chua, J., & Koh, A. (2020). 2020 Child Online Safety Index: Findings & Methodology. DQ Institute. https://www.dqinstitute.org/impact-measure/

Picking a Thermal Color Palette. (2023). Teledyne Flir. https://www.flir.com/discover/industrial/picking-a-thermal-color-palette/

United Nations General Assembly. (1948). Universal Declaration of Human Rights. United Nations General Assembly. https://www.ohchr.org/en/human-rights/universal-declaration/translations/english

United Nations General Assembly. (1966a). International Covenant on Civil and Political Rights. United Nations General Assembly. https://www.ohchr.org/en/instruments-mechanisms/instruments/international-covenant-civil-and-political-rights

United Nations General Assembly. (1966b). International Covenant on Economic, Social and Cultural Rights. United Nations General Assembly. https://www.ohchr.org/en/instruments-mechanisms/instruments/international-covenant-economic-social-and-cultural-rights

Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Child Psychology & Psychiatry & Allied Disciplines, 17(2), 89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

World Economic Forum. (2023). Our Mission. https://www.weforum.org/about/world-economic-forum/


Appendix 1: Center for Digital Literacy’s Digital Literacy Framework (‘CDL Framework’) (2023)

Digital intelligence area

Competency Descriptor
Digital Literacy All the competencies needed for life in the era of digital transformation.
1 Digital Technology Literacy The competency to accept new technologies without fear, to understand pros and cons of technology, their functions, and their uses, and to productively use them in an integrated way to solve real-life problems.
1.1 Accepting technology Being curious about and willing to learn and accept new digital knowledge and technology without anxiety or unfounded fear.
1.2 Understanding concepts, functions and trends of technology Understanding fundamental digital concepts, quickly learning various functions of each piece of software and device, and quickly identifying the latest digital trends.

1.3 Understanding benefits and harms of technology Understanding the positive and negative effects of new digital technologies, and working to maximize the positive effects while minimizing the negative effects.

1.4 Trying and assessing new technologies Exploring and experiencing new digital technologies (including devices, tools, media, and methods) at an early stage, evaluating the pros and cons of each technology, and determining which technologies are necessary and appropriate for oneself and one's community.

1.5 Independent learning Independently learning new things without being taught, evaluating one's own learning outcomes and processes, and managing and developing learning capabilities.

1.6 Using technology to its fullest potential Using a wide range of digital technologies; productively integrating technology into work, study and daily life; and using technology to develop new solutions.

2 Digital Data Literacy The competency to efficiently and ethically collect, analyze, interpret, and visualize data based on an understanding of data principles, and to make rational and creative decisions, and develop strategies and plans to solve real-life problems.
2.1 Understanding data Understanding the significance and value of data; identifying its attributes, types, formats, and quality; and understanding how data is generated and managed.

2.2 Adhering to data ethics Adhering to data ethics principles and procedures in every aspect of collecting, processing, analyzing, and using data.
2.3 Gathering data Efficiently collecting, processing, and effectively managing data to address clear, specific, and solvable problem definitions and hypotheses.
2.4 Analyzing & interpreting data Analyzing collected data based on logical thinking, deriving meaningful information based on subject area knowledge, and wisely interpreting data based on past experience.

2.5 Visualizing data Translating complex data into intuitively understandable forms and conveying information effectively using various graphs and visualization techniques.

2.6 Using data to make decisions Making rational decisions based on information obtained through data analysis and interpretation, suggesting creative alternatives, and solving problems by establishing strategies and plans.

3 Digital Content Literacy The competency to consume diverse content, judge authenticity and bias of content, plan and create ethical content with a sense of social responsibility, and protect the rights of one's own content.
3.1 Consuming balanced content Understanding the importance of balanced digital content consumption, finding content by oneself, through media functions, and system algorithms, and consuming content of various fields, perspectives, and formats in a broad and balanced manner.

3.2 Understanding authenticity of information & bias Actively avoiding biased thinking by being aware of one's own information biases, separating truths from falsehoods in content, and distinguishing between factual information and mere claims.
3.3 Planning content Creating digital content by systematically planning in advance, measuring and evaluating the produced content, and using those results to revise and improve plans for future content.

3.4 Creating content Generating information and creative ideas, and creating various types of high-quality, popular content by making full use of various digital technologies, tools, and techniques.
3.5 Adhering to content creation ethics

Making use of only reliable and accurate information,

respecting human rights, individual privacy, and diversity,

and striving to have a positive impact on society when creating content.


 
3.6 Protecting copyright of own content Protecting one's own copyrights through legal systems, technologies, and alternative licensing methods while sharing and promoting one's content widely.

4 Digital Media Literacy
The competency to understand media; to search, record, store, share, and use information through media; to adhere to media ethics; and to critically assess media.

4.1 Understanding media
Understanding the characteristics and formats, social functions and influence, and business models and profit structures of digital media.

4.2 Adhering to media ethics
Understanding the social roles, values, and responsibilities of digital media and putting them into practice when using personal media.

4.3 Making productive use of media
Utilizing digital media for a variety of purposes, including information gathering, communication, cultural exchange, daily activities, self-promotion, and financial transactions.

4.4 Searching for information
Efficiently searching for information using digital media and finding the key information among the search results.

4.5 Critically assessing media
Making independent and objective judgments by combining various facts and perspectives on information obtained from digital media.

4.6 Recording, storing and sharing information
Recording, storing, and sharing data and information safely, efficiently, and effectively through various forms and methods.
5 Digital Communication Literacy
The competency to empathize with and express sympathy for individuals and groups in digital environments, resolve online issues and conflicts, and collaborate digitally while efficiently and effectively solving any problems that arise.

5.1 Communication literacy
Basic literacy skills of listening, speaking, reading, and writing, as well as the ability to communicate effectively in various ways through each type of media.

5.2 Empathy
Taking interest in others, understanding their positions, empathizing with their emotions, and communicating this empathy through both verbal and non-verbal means.

5.3 Persuasion
Influencing the thoughts, feelings, and behaviors of others through effective messaging and practical actions.

5.4 Managing disagreements and disputes
Discovering disagreements and disputes early, analyzing their causes and expected effects, responding efficiently and effectively according to plans and procedures, and resolving them wisely.

5.5 Resolving conflicts
Analyzing the causes of conflict based on a correct understanding of conflict, practicing appropriate conflict response methods, and wisely resolving conflicts and turning crises into opportunities.

5.6 Collaboration
Using effective types of collaboration and efficient methods of communication to solve problems and achieve goals based on a proper understanding of collaboration.

6 Digital Community Literacy
The competency to be considerate of and respect others in digital environments; follow laws, norms, and etiquette in digital interactions; contribute to digital communities through effort and dedication; and live in harmony with others as citizens of the digital age.

6.1 Respect and consideration
Recognizing diversity and viewing others' thoughts, feelings, positions, and situations with respect and consideration.

6.2 Digital etiquette
Understanding correct digital etiquette for each digital device, medium, and tool, and reflecting this etiquette in one's communication and behaviour.

6.3 Adhering to laws and regulations
Adhering to domestic and international laws and discipline-specific regulations in order to maintain social order, protect individual rights and interests, and promote fair competition.

6.4 Social participation
Participating and engaging in various aspects of society, such as politics, economy, and culture, and thereby contributing to innovation and to solving social problems.

6.5 Fostering social inclusion
Forgiving others' mistakes and wrongdoings, and using one's abilities and resources to practice giving and volunteering.

6.6 Upholding social responsibilities
Understanding one's role and responsibility as a citizen of a digital society, contributing to the creation of social values, and contributing to the well-being and sustainable development of the social community.

7 Digital Wellness Literacy
The competency to manage one's use of technology and protect oneself based on a healthy personal and social identity, to use technology to foster positive life outcomes, and to use digital technology to help realize one's dreams.

7.1 Managing personal identity
Presenting oneself to digital communities while maintaining one's identity across various social relationships, based on solid self-awareness and well-formed values.

7.2 Managing health, habits and social identity
Achieving balance among study, work, and leisure, maintaining a healthy lifestyle, and managing one's public image and reputation effectively.

7.3 Self-protection
Recognizing personal risks and dangers related to the use of digital technology, protecting oneself from them, and seeking help from others and from professionals when needed.

7.4 Emotional regulation
Recognizing and regulating one's emotions, maintaining resilience in negative situations, comforting oneself, and explaining and expressing one's emotions.

7.5 Self-improvement
Achieving self-improvement by using innovative technology to support goals, ideas, plans, and actions.

7.6 Self-actualization
Recognizing one's own potential, realizing life's dreams and plans through technology, and improving the quality of one's life.