Learning analytics in Primary and Secondary Education in Spain: A systematic review of the literature
Belén Donate-Beby
Departamento de Informática y Automática, Universidad de Salamanca, Salamanca, España
https://orcid.org/0000-0002-2722-1140 | belendonate@usal.es
Francisco José García-Peñalvo
Departamento de Informática y Automática, Universidad de Salamanca, Salamanca, España
https://orcid.org/0000-0001-9987-5584 | fgarcia@usal.es
Daniel Amo-Filva
Departamento de Ingeniería, Universidad Ramón Llull, Barcelona, España
https://orcid.org/0000-0002-4929-0438 | daniel.amo@salle.url.edu
Received: 29/09/2023 | Accepted: 16/10/2023
Abstract
Learning analytics is defined as the measurement, collection, analysis, and presentation of data about learners and their contexts to understand and optimize learning and the environments in which it occurs. Although its usefulness could be fundamental for understanding students’ learning processes, there is no clear framework on the current state of development of learning analytics in Spanish K-12 education. The present work aims to increase knowledge of the empirical state of the question through a Systematic Literature Review (SLR). The methodology follows the indications provided by the PRISMA procedure. As a result, 16 papers have been selected and analyzed using different research indicators. The most significant findings within the selected papers are a lack of research in which teachers have maintained an active role in the development of Learning Analytics in the natural educational context, and a tendency toward using Game Learning Analytics to predict and improve student engagement and performance in different knowledge areas or competencies.
Keywords
Learning methods, education, innovation behavior, data processing, data visualization.
Resumen
Learning Analytics (LA) is defined as the measurement, collection, analysis, and presentation of data about learners and their environment in order to understand and optimize learning and educational contexts. Despite its potential importance, a clear framework for LA in Primary and Secondary Education in Spain is lacking. This study seeks to enrich knowledge through a Systematic Literature Review (SLR) following the PRISMA procedure. Sixteen scientific articles were analyzed, revealing a lack of active teacher participation in the development of LA in conventional educational settings. In addition, a tendency was found to use LA to predict and improve student engagement and performance through educational games in various areas.
Palabras clave
Learning methods, education, innovation behavior, data processing, data visualization.
The COVID-19 pandemic has led to significant changes in primary and secondary education, posing major challenges for schools around the world. In early 2020, more than 1.5 billion students were forced to learn remotely due to school closures (UNESCO, 2020). This emergency prompted educational institutions to adopt digitalized educational models for distance learning. Although initially only an emergency response measure, this move towards emergency remote teaching (ERT) has the potential to become a permanent part of the education system once its effectiveness during lockdown has been measured (Hodges et al., 2020).
These changes have led to adjustments in teaching methods, requiring teachers and students to adapt to new digital learning environments. These adaptations include blended learning approaches, online assessments, and digital skills development (García-Peñalvo et al., 2021; García-Peñalvo & Corell, 2020).
To respond to these changing challenges, researchers have developed methods based on data analytics to improve teaching and learning processes. Learning Analytics (LA) and Educational Data Mining (EDM) have become essential tools (Siemens, 2012). While EDM relies heavily on computing and automation (López Zambrano et al., 2021), LA goes further by requiring human interpretation and judgment. Learning Analytics, as defined at the first LAK conference (Society for Learning Analytics Research [SoLAR], n.d.), includes measuring, collecting, analyzing, and presenting data about learners and their contexts to understand and optimize learning environments. Essentially, LA leverages Big Data and data mining techniques to improve learning assessment through human interpretation (Serrano-Laguna et al., 2014). It has also been shown to be instrumental in addressing the problem of school dropout (Khalil & Ebner, 2015).
The development of the field of LA is strongly influenced by networks of researchers. In the Spanish context, the Spanish Network of Learning Analytics (SNOLA) stands out. This research community is recognized by the Ministry of Economy and Competitiveness and aims to promote collaboration between researchers, businesses, educational institutions, and international networks. It also seeks to disseminate and organize initiatives related to learning analytics on a national scale (Spanish Network Of Learning Analytics [SNOLA], n.d.).
SNOLA encompasses 34 research streams, including predictive LA systems, visual analytics, multimodal and contextual data, support for active learning strategies, LA for learning design (LD), support for assessment, and the analysis of emotions and feelings.
Incorporating learning analytics and predictive models in K-12 education can enable early identification of students at risk of failure, allowing teachers to devote more time to these students and apply novel strategies to improve their skills (Figueiredo et al., 2019; García-Peñalvo et al., 2018).
One area of research is predictive learning analytics systems, which focus on predicting learning variables such as student performance or risks associated with the learning process (Peña-Ayala, 2018). However, multimodal learning analytics (MMLA) is gaining ground due to its ability to leverage data from various educational applications to improve learning and contextual understanding (Celdrán et al., 2020).
Another area of research focuses on supporting active learning strategies, promoting collaborative learning, adaptive learning, peer feedback, social learning, and flipped classrooms. Research is also moving towards developing indicators to assess 21st-century skills and evaluate work teams. Additionally, there is an increasing emphasis on using LA for learning design (LD), exemplified by the work of Michos et al. (2018) and Michos & Hernández-Leo (2020), who sought to evaluate and establish a framework for implementing teacher-oriented LA according to the principles of Learning Design.
In particular, Game Learning Analytics (Ruipérez-Valiente & Kim, 2020) is an emerging trend. It explores game parameters and reveals the potential of LA by providing insight into how students approach learning tasks beyond their performance. Early research, such as the European Commission survey on ICT in schools (Wastiau et al., 2013), highlighted the importance of digital technologies as teaching tools in primary and secondary education. In contrast, the Horizon report on K-12 contexts (Johnson et al., 2016) emphasized that LA is more widely used in academic settings.
The main potential of LA in pre-university education lies in the wealth of digital data generated by students’ digital activities (Schmid & Petko, 2019). This abundance of data allows for deeper analysis and better use of the data to improve the learning process.
However, because the majority of K-12 students are minors, considerations related to privacy, confidentiality, security, and ethics are important. It is possible to create a safe environment for using student data in an online context while respecting security and ethical standards (Amo, Prinsloo, et al., 2021). In addition, educators must possess a solid understanding of data literacy in order to effectively interpret or convert raw educational data and the analytical outcomes derived from contemporary digital educational tools (Amo-Filva et al., 2022; Donate-Beby et al., 2023).
Some authors argue that emerging LA practices go beyond policy and institutional regulatory frameworks (Willis et al., 2016). They argue that these practices may not provide adequate guidance and protection for students, who are often considered passive data subjects.
To further advance LA, especially in K-12 education, additional efforts are needed to disseminate LA research to teachers (Amo, Alier, et al., 2021). The research hypothesis of this study suggests that LA is an emerging field in K-12 Spanish education. In summary, this Systematic Literature Review (SLR) aims to explore the development of K-12 learning analytics in Spain by extending a Systematic Mapping Review (SMR) (Donate et al., 2021).
Focusing on the Spanish context, this SLR seeks to identify needs, trends and solutions appropriate to the cultural and social context of Spanish Education. Furthermore, it aims to explore knowledge gaps and research needs, providing an overview of the current state of the Learning Analytics field from its inception to the present. The reliability of this Systematic Review is based on transparency, including the definition and sharing of the review process (Marcos-Pablos et al., 2020; Page et al., 2021), ensuring the ability to reproduce the results and promote further developments in this field.
Thus, this paper is organized into the following sections. This first section provides the context and introduction. The second section describes the methods, subject to PRISMA considerations (Page et al., 2021), covering the research questions, the search strategy, the inclusion and exclusion criteria, and the selection process. The third section presents the results and discussion, where the information extracted through the research indicators is reported and reflected upon. The fourth and final section is the conclusion, where the results are synthesized and analyzed, identifying the limitations and future directions of the present work.
This section outlines the methodological approach for conducting a Systematic Review according to the PRISMA Statement, which is an updated and extended version of QUOROM (Quality of Reporting of Meta-analyses Statement) (Urrutia et al., 2005). Similarly, the conduct of the systematic review will be subject to the considerations expressed by Brereton et al. (2007). This systematic review includes both quantitative and qualitative articles as part of its mixed typology. The protocol for this Systematic Literature Review includes establishing various key elements:
•Scope of the review.
•Time frame.
•Inclusion and exclusion criteria.
•Quality criteria.
•Data sources.
•Search terms.
•Canonical search equation.
In this way, the aim is to synthesize the available evidence on aspects derived from the four research questions, quantifying, classifying, and analyzing the concepts found to interpret what has been researched and published so far. Following the conceptual aspects of PRISMA (Page et al., 2021), a systematic review should include research studies of scope and quality. Therefore, the original review protocol will be subject to modification.
This paper aims to investigate the current trends in education and research practice in K-12 learning analytics research in Spain. The formulation of the research questions intends to capture current practice and identify discrepancies between assumptions and reality. To further this objective, the following table presents the research questions, which result from the extension of the work of Donate et al. (2021).
Table 1. Mapping and Research Questions
Research Questions
RQ1: What are the most significant research achievements?
RQ2: What are the main limitations of Learning Analytics research?
RQ3: Does research currently transfer into educational practice?
RQ4: What methods or techniques have been engaged in the use of Learning Analytics?
To define this work’s scope, the PICOC framework has been followed (Petticrew & Roberts, 2008; Richardson et al., 1995).
•Population: corresponds to pupils in the K-12 stages of compulsory schooling. Synonyms and related terms have been defined for these stages (high school, school, classroom, primary and secondary education).
•Intervention or Exposure: this is an extension of a Systematic Mapping Review that allows us to know the state of the field of learning analytics in this population. To further deepen the thematic approach, a Systematic Literature Review will be performed in three phases: planning, implementation, and reporting.
•Comparison: the work outcomes obtained will be compared in terms of their relevance and limitations for their transfer to practice.
•Outcomes: according to the criteria of reproducibility and rigor, the review will follow the proposed indications to evaluate the work developed during and after the process.
•Context: this work covers the Spanish educational context, given that there is a certain homogeneity in educational practice.
The criteria for inclusion and exclusion of articles are decisive for assessing the articles’ suitability. These criteria have been defined to allow articles to be included regardless of the language of publication. The Learning Analytics concept was defined back in 2011 for the first LAK conference (Siemens, 2012).
Therefore, the timeframe has been defined from 2010, a year before the consolidation of the term, to the present (2023). The main reason for this decision has been to ensure that no work has been overlooked in the article selection process.
Table 2. Inclusion and Exclusion Criteria
Inclusion Criteria | Exclusion Criteria
Papers published in journals, conferences, and books | Reports, dissertations, government documents or others
First-order empirical research and second-order works | Non-empirical research
Full text available | Incomplete or unavailable text
Pre-school, primary and secondary schools | Universities and non-formal learning contexts
K-12 investigations in the Spanish context | K-12-oriented research outside school settings in Spain
Publications between 2010 and 2023 | Publications before 2010
This article focuses on the development of learning analytics at K-12 levels in Spain. The search strategy is designed through the definition of search terms, data sources, and the canonical search equation. Furthermore, quality criteria have been established for the final selection of articles. The bibliographic search encompasses articles from scientific journals, conferences, and books, using the Web of Science, Scopus, and Dialnet databases. Keywords such as “Learning Analytics,” “Education,” “Data Processing,” “School,” and “High School,” along with their synonyms, have been combined using Boolean operators (AND, OR). In addition, the definition of search strings aims to capture as many papers as possible, and parentheses are employed in the search strategy to indicate the order of execution. The following table shows how these terms have been combined into a canonical search equation in the databases mentioned above. This procedure aims at promoting the reproducibility of the searches (Page et al., 2021).
Table 3. Search sequences in Databases
Database | Search terms | Results | Total
Scopus | Search 1: TITLE-ABS-KEY (“Learning Analytics” AND (“Primary Education” OR “Secondary Education”)) | 73 | 646
Scopus | Search 2: TITLE-ABS-KEY (“Learning Analytics” OR “Data Processing” OR “Data Visualization”) AND (“Primary Education” OR “Secondary Education”) | 501 |
Scopus | Search 3: TITLE-ABS-KEY (“Learning Analytics” AND “Learning Methods” AND “Education”) | 72 |
Web of Science | Search 1: TS=(“learning analytics” AND secondary AND education) OR TS=(“learning analytics” AND primary AND education) | 213 | 478
Web of Science | Search 2: (TS=(“Learning Analytics” AND “Education” AND “Innovation Behavior” OR “Data Processing” OR “Data Visualization”) AND TS=(“secondary education” OR “primary education”)) | 265 |
Dialnet | Search 1: Contiene palabras=(“Learning analytics”) AND ((secondary AND education) OR (primary AND education)) | 20 | 109
Dialnet | Search 2: Contiene palabras=(“Learning analytics”) AND (“High school” OR School OR “K-12”) | 27 |
Dialnet | Search 3: Contiene palabras=(“Learning Analytics” AND classroom) | 62 |
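As an illustration of how such Boolean strings can be composed, the following minimal sketch (in Python, not part of the review protocol) assembles a query in the style of the Scopus searches above; the function and variable names are assumptions introduced here for clarity.

```python
# Illustrative sketch only: assembling a canonical search equation from
# groups of synonyms, so that parentheses reflect the order of execution.
def or_group(terms):
    """Join synonyms with OR, quoting multi-word terms."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

def and_groups(groups):
    """Combine OR-groups of synonyms with AND."""
    return " AND ".join(or_group(g) for g in groups)

# Example resembling Scopus Search 1 in Table 3
topic = ["Learning Analytics"]
population = ["Primary Education", "Secondary Education"]
print(f"TITLE-ABS-KEY ({and_groups([topic, population])})")
# -> TITLE-ABS-KEY (("Learning Analytics") AND ("Primary Education" OR "Secondary Education"))
```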
Quality criteria have been outlined below to screen out primary sources that fulfil the inclusion criteria but may have limitations and not provide sufficient answers to the research questions. For this purpose, a checklist has been designed to check the relevant aspects of each selected article (Table 4). The checklist follows a trivalued metric scale (Cruz-Benito et al., 2019), where the score depends on the degree of fulfilment: fully, partially, or not fulfilling a criterion (Yes/Partial/No) scores 1, 0.5, or 0 points, respectively. A total of 10 items have been developed, establishing that the works included must reach a score of at least 7.5.
Table 4. Quality Criteria
Question | Score
Are the research aims related to K-12 Learning Analytics clearly specified? | Yes/No/Partial
Was the study designed to achieve these aims? | Yes/No/Partial
Is the learning analytics design process clearly described and justified? | Yes/No/Partial
Is the research process clearly specified and justified? | Yes/No/Partial
Are the work methodology and results transparent and reproducible? | Yes/No/Partial
Is the work's sampling transparent? | Yes/No/Partial
Are the links between data, interpretation and conclusions made clear? | Yes/No/Partial
Do the researchers discuss any problems in the development of learning analytics? | Yes/No/Partial
Are data presented on the evaluation of the proposed solution? | Yes/No/Partial
Are all research questions answered adequately? | Yes/No/Partial
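As an illustration of the scoring rule described above, the following minimal sketch (in Python, not part of the original protocol) computes the trivalued quality score and checks it against the 7.5-point threshold; all names and the example ratings are hypothetical.

```python
# Minimal sketch of the trivalued quality scale: each of the 10 checklist
# items in Table 4 is rated Yes (1 point), Partial (0.5) or No (0), and a
# paper is retained only if the total reaches the 7.5-point threshold.
SCORES = {"yes": 1.0, "partial": 0.5, "no": 0.0}
THRESHOLD = 7.5

def quality_score(ratings):
    """ratings: one rating string per checklist item in Table 4."""
    return sum(SCORES[r.lower()] for r in ratings)

def meets_quality_threshold(ratings):
    return quality_score(ratings) >= THRESHOLD

# Hypothetical paper rated Yes on eight items and Partial on two
example = ["Yes"] * 8 + ["Partial"] * 2
print(quality_score(example), meets_quality_threshold(example))  # 9.0 True
```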
The bibliographic selection process has been carried out following the PRISMA procedure (Page et al., 2021) and the inclusion and exclusion criteria presented in Table 2. Likewise, it is reflected in the flow chart in Figure 1.
Figure 1. PRISMA 2020 Flow Diagram
The bibliographic selection model proposed by PRISMA consists of a series of phases that sequentially present the steps carried out for the articles’ selection. These phases consist of: identification of papers, screening through criteria, assessment of their suitability and final inclusion of articles (Page et al., 2021).
At first, the articles under analysis were identified, obtaining a series of records through the search sequences. In this phase, the searches were refined through an initial screening according to the inclusion and exclusion criteria. Afterwards, duplicates were removed using Zotero.
Subsequently, bibliographic eligibility was assessed following the inclusion and exclusion criteria. With this purpose in mind, a first approach evaluated the abstract, title, and keywords. In a later phase, a more detailed analysis was carried out, delving into the structure and content of the papers. Finally, 16 papers were selected.
A summary table has been set up (Table 5) to establish an overall view of the works found (authors, year of publication, titles of the works).
Table 5. Final Selection
Number | Cite | Title
1 | Serrano-Laguna et al. (2014) | Learning Analytics and Educational Games: Lessons Learned from Practical Experience
2 | Melero et al. (2015) | How was the activity? A visualization support for a case of location-based learning design
3 | Sancho et al. (2016) | Contextualizing learning analytics for secondary schools at micro level
4 | Michos et al. (2018) | Teacher-led inquiry in technology-supported school communities
5 | Rodríguez et al. (2018) | Diseño de un estudio exploratorio para la aplicación de técnicas de analíticas de aprendizaje en la enseñanza de las fracciones en 5° curso de Educación Primaria
6 | Amo-Filvà et al. (2019) | Clickstream for learning analytics to assess students’ behavior with Scratch
7 | Diago et al. (2019) | Fundamentos de diseño de un entorno tecnológico para el estudio de las habilidades en resolución de problemas en primeras edades escolares
8 | Aluja-Banet et al. (2019) | Measuring motivation from the Virtual Learning Environment in secondary education
9 | Fernández-Soriano et al. (2019) | Use of Computing Devices as Sensors to Measure Their Impact on Primary and Secondary Students’ Performance
10 | Nieto-Márquez et al. (2020) | Assessment of the Effects of Digital Educational Material on Executive Function Performance
11 | Michos & Hernández-Leo (2020) | CIDA: A collective inquiry framework to study and support teachers as designers in technological environments
12 | López Camacho et al. (2020) | Data Capture and Multimodal Learning Analytics Focused on Engagement with a New Wearable IoT Approach
13 | Calvo-Morata et al. (2020) | Validation of a Cyberbullying Serious Game Using Game Analytics
14 | Calvo-Morata et al. (2021) | Creating awareness on bullying and cyberbullying among young people: Validating the effectiveness and design of the serious game Conectado
15 | Israel-Fishelson et al. (2021) | The Associations Between Computational Thinking and Creativity: The Role of Personal Characteristics
16 | Rodríguez-Martínez et al. (2023) | Building personalised homework from a learning analytics based formative assessment: Effect on fifth-grade students’ understanding of fractions
To answer and deepen research question RQ1, an analysis of the results of the selected studies is presented. The following text summarizes the abstract, objectives, and main results of each selected work.
1. Learning Analytics in Learning Management System (LMS) platforms highlights interaction potential but acknowledges differences among institutions (Serrano-Laguna et al., 2014).
2.Melero et al. (2015) conducted a case study on location-based learning using the “QuesTInSitu: The Game” app. Visualizing learning data improved teacher actions and student performance.
3. Sancho et al. (2016) present the research project PILARES, which aims to create a learning analytics platform for Spanish secondary schools, primarily targeting the micro level involving students, teachers, tutors, and families.
4.Michos et al. (2018) explored teacher-led research through Technology Enhanced Learning (TEL), emphasizing technology and data visualization for reflective practice.
5.Rodríguez et al. (2018) evaluated individualized feedback via ARS (clickers) and suggested compatibility with data analysis tools.
6.Amo-Filvà et al. (2019) analyzed student behavior in programming activities through data analysis, providing support for teachers.
7.Diago et al. (2019) introduced a technological environment for assessing student problem-solving strategies in math.
8. Aluja-Banet et al. (2019) developed the PILARES system to measure student motivation and adapt teaching strategies.
9.Fernández-Soriano et al. (2019) studied the relationship between student computing devices and performance in accessing the LMS.
10. Nieto-Márquez et al. (2020) used the Smile and Learn platform, applying Game Learning Analytics, to enhance executive functions.
11. Michos & Hernández-Leo (2020) proposed the CIDA framework to support teacher communities in technology-based environments.
12. López Camacho et al. (2020) addressed student engagement through MMLA techniques using the WIoTED system.
13. Calvo-Morata et al. (2020) developed the serious game Conectado to raise awareness of bullying.
14. Calvo-Morata et al. (2021) extended the previous study and examined player characteristics.
15. Israel-Fishelson et al. (2021) explored the relationships between computational thinking, creativity, and LA.
16. Rodríguez-Martínez et al. (2023) showed that LA-based formative assessment (via Audience Response Systems) improved personalized teaching sequences and student motivation in mathematics.
Additionally, Table 6 below summarizes the most relevant results in the selected papers. These refer to the different practices that have shown the usefulness of learning analytics:
(1) Relevant results on the success of predictive models of learning analytics on student performance.
(2) Results pointing to the usefulness of learning analytics for student learning through process and outcome variables.
(3) Improvement of student learning and learning designs through research cycles involving teachers.
(4) Works providing a valuable guide for the development of learning analytics tools in K-12 through experimental/quasi-experimental designs.
(5) Works providing a valuable guide for the development of learning analytics in K-12 through transfer to teaching practice.
Table 6. Main Results Achievements
(Matrix relating each of the 16 selected papers to the result achievement categories (1)–(8) described in the text.)
Concerning the results on instruments that aim to be validated, whose purpose is to increase knowledge about specific variables related to learning or about specific learning types:
(6) Obtaining relevant information on emotional/cognitive variables through Educational Data Mining (EDM) techniques and the development of Learning Analytics (LA).
(7) Significant results in raising awareness of bullying and cyberbullying.
(8) Improvements in teaching and computational learning.
This research question (RQ2) attempts to explore the limitations of each study. For this purpose, the limitations of each paper have been analyzed in Table 7.
Table 7. Main Limitations
Cite | Limitation
Serrano-Laguna et al. (2014) | This research guide does not explicitly state its limitations as research. However, it expresses the need to follow a thoughtful process in designing and executing the experiment.
Melero et al. (2015) | The teacher data visualization developed is represented in a paper document instead of being integrated into the digital environment. The authors also point out the need to include learning analytics visualization techniques and to collect students’ opinions.
Sancho et al. (2016) | The paper points out the need to validate the platform’s integration with a large-scale operational LMS. However, there are no results yet on the validity and limitations of the PILARES project.
Michos et al. (2018) | The study is limited to the specific case studies of schools and the sample of teachers. Also, various types of learning activities, technologies and data sources to support reflection and redesign are needed.
Rodríguez et al. (2018) | The authors do not explicitly point out limitations in their research. However, they note that a greater diversity of instruments is required to facilitate the automatic analysis of the data collected.
Amo-Filvà et al. (2019) | This study highlights two aspects as limitations: the high development costs in time and other technologies involved, and the difficulty of storing Scratch multimedia information.
Diago et al. (2019) | The technical limitations of the initial version of the environment are highlighted. However, the limits arising from the research development remain unexplored.
Aluja-Banet et al. (2019) | The work only addresses the activities conducted through the VLE and not all learning activities in which the students are engaged.
Fernández-Soriano et al. (2019) | The results obtained may be partially conditioned by the use of tablets (iPads) as the most widely used device. The authors also highlight the need to study the data in more schools and educational stages.
Nieto-Márquez et al. (2020) | This study raises limitations on the training and transfer of specific executive functions. Another limitation regarding academic performance is that intelligence quotient was not controlled for as a variable.
López Camacho et al. (2020) | This paper does not include a section specifying the limitations of the research. However, the authors point out the need to incorporate teacher tools, using ML techniques in real time to show behavioural results when using WIoTED.
Calvo-Morata et al. (2020, 2021) | These papers measured participants’ awareness but not the prevalence of bullying. In turn, it is difficult to determine whether the results are caused solely by the game or are related to other activities carried out by each Spanish school.
Israel-Fishelson et al. (2021) | The studied platform does not encourage multiple correct solutions and limits the free use of coding blocks, which may affect and limit creative solutions. Furthermore, the analysis is based on students from a single country (Spain).
Rodríguez-Martínez et al. (2023) | The study limitations encompass a small sample size, a short study duration of four 50-minute sessions, a narrow focus on mathematics (fractions), and a potential technology-related limitation due to the use of ARS (Audience Response Systems), which might impact generalizability and accessibility.
As shown in the summary table, some studies highlight the transfer of results as the main limitation. Calvo-Morata et al. (2020, 2021) stated that there are difficulties in isolating the effect of the game on the dependent variable (awareness of bullying). In this line, it is highlighted how training can influence the measurement of performance in executive functions (Nieto-Márquez et al., 2020). Other authors point out that relying on a single device or instrument for data collection may condition the interpretation of the results (Fernández-Soriano et al., 2019; Michos et al., 2018). Meanwhile, Aluja-Banet et al. (2019), Diago et al. (2019), Amo-Filvà et al. (2019) and Israel-Fishelson et al. (2021) have pointed out limitations of their results arising from the features of the tools used to develop learning analytics.
It should also be noted that the selection of the sample influences the generalization of the results (Fernández-Soriano et al., 2019; Michos et al., 2018). Furthermore, the economic and time costs of the tools may condition their transfer to practice (Amo-Filvà et al., 2019).
Within the papers selected, some authors highlight the need to evolve the instruments presented to allow interventions in real time (Melero et al., 2015; Michos & Hernández-Leo, 2020). In addition, four studies note the need to transfer the proposed tools to other contexts outside Spain.
Other studies do not delve into the limitations of their research (Diago et al., 2019; Michos & Hernández-Leo, 2020; Rodríguez et al., 2018; Sancho et al., 2016; Serrano-Laguna et al., 2014), which is a weakness in scientific work.
Evidence of the transfer of research to educational practice has been examined by analyzing the methodology and objectives of the studies included. In addition, the answer to RQ1 has been used as supporting data to answer this research question (RQ3). The categorization recommended by Calvo-Morata et al. (2021) and developed by Wieringa et al. (2006) has been applied to identify the different types of research. In addition, the search was extended to include literature reviews that meet the inclusion and exclusion criteria. This classification can be observed in Table 8, where 6 of the 16 studies included are validation research.
Table 8. Paper types
Cite | Validation research | Solution proposal | Experience paper | Review paper
Serrano-Laguna et al. (2014) | X | | |
Melero et al. (2015) | | | X |
Sancho et al. (2016) | | | | X
Michos et al. (2018) | | | X |
Rodríguez et al. (2018) | X | | |
Amo-Filvà et al. (2019) | | X | |
Diago et al. (2019) | | X | |
Aluja-Banet et al. (2019) | | X | |
Fernández-Soriano et al. (2019) | X | | |
Nieto-Márquez et al. (2020) | | X | |
Michos & Hernández-Leo (2020) | | | X |
López Camacho et al. (2020) | X | | |
Calvo-Morata et al. (2020) | X | | |
Calvo-Morata et al. (2021) | | X | |
Israel-Fishelson et al. (2021) | X | | |
Rodríguez-Martínez et al. (2023) | | | X |
Among this type of work, there are quasi-experimental studies carried out in the natural context of the school. In addition, there are five solution proposals, consisting of pilot studies, aimed at validating the instruments for the development of learning analytics and establishing possible difficulties in their application. It is relevant to note that some of these studies have not preserved the sample’s representativeness, thus reducing the robustness and validity of the results; their results are therefore to be interpreted with caution.
In addition, four experience papers based on multiple case studies have been included. This research methodology is defined by a systematic analysis of cases reflecting teaching experiences. Finally, a review paper has been included that presents a proposal for a European project in the Spanish educational field for the development of learning analytics.
Considering that 4 out of 16 papers are experience papers, we have analyzed the main objectives and work results (RQ1). Except for these four papers, the remaining ones do not seek to promote transfer to practice in schools.
To understand the current trends of the K-12 stages in Spain, the ways in which learning analytics have been developed have been identified. Table 9 shows a predominance of Game Learning Analytics among the works found (8). The second trend is the development and use of Virtual Learning Environments (VLE) or Learning Management Systems (LMS) (5). Two further studies advocate for learning analytics developed by teachers through Design-Based Research (DBR) and Learning Design (LD) methodologies, while the remaining works rely on ARS technology (clickers) or Multimodal Learning Analytics (MMLA).
Table 9. A: Game Learning Analytics (GLA); B: Design Based Research (DBR) and Learning Design (LD); C: Virtual Learning Environment (VLE) or Learning Management System (LMS); D: Multimodal Learning Analytics (MMLA); E: ARS Technology (clickers)
Cite | A | B | C | D | E
Serrano-Laguna et al. (2014) | X | | | |
Melero et al. (2015) | X | | | |
Sancho et al. (2016) | | | X | |
Michos et al. (2018) | | X | | |
Rodríguez et al. (2018) | | | | | X
Amo-Filvà et al. (2019) | X | | | |
Diago et al. (2019) | X | | X | |
Aluja-Banet et al. (2019) | | | X | |
Fernández-Soriano et al. (2019) | | | X | |
Nieto-Márquez et al. (2020) | X | | X | |
Michos & Hernández-Leo (2020) | | X | | |
López Camacho et al. (2020) | | | | X |
Calvo-Morata et al. (2020) | X | | | |
Calvo-Morata et al. (2021) | X | | | |
Israel-Fishelson et al. (2021) | X | | | |
Rodríguez-Martínez et al. (2023) | | | | | X
The significance of the present work lies in the scientific evidence of the incipient nature of Learning Analytics at these levels. Likewise, there is insufficient continuity and depth in the authors’ research. Furthermore, the emergent nature of the subject is reinforced by the exploratory character of the empirical methods observed in the papers.
Learning analytics in Primary and Secondary Education is still in its first stages in Spain (Donate et al., 2021). However, this can be an opportunity to deepen the results and focus of the work done until now (Amo et al., 2020). In this line, among the works found, those that support teachers’ design of learning activities stand out. These studies demonstrate the feasibility of developing learning analytics at these levels. Other research recognizes the need for a framework and attempts to guide the development of learning analytics in K-12 settings. However, most papers’ approaches do not ensure that their research is appropriate for the school context: such research focuses on the validation of instruments, neglecting the role of teachers.
These learning analytics tools are valuable for the development of research in the field, but there are limitations regarding budget and the transfer of results. In this line, among the works found, there is a lack of grounding in learning theory and there are limitations in evaluation and reporting. On the other hand, LA research in Spain shows a tendency towards Game Learning Analytics models. Although this is an innovative approach, it may lead to mistakes when identifying disengaged learners, as the metrics obtained would also need to be transferred to engagement outside the game environment.
Thus, to increase scientific knowledge while influencing educational practice, it is necessary to follow the work that considers the Spanish socio-educational context.
In this comprehensive analysis of literature consisting of 16 articles, the focus is on evaluating the application and impact of Learning Analytics (LA) in K-12 educational settings in Spain. The emergence of LA has gained momentum recently due to the digitalization brought about by the COVID-19 pandemic. However, it is important to note that despite these advancements, LA is still in its early stages within this context.
The most significant and impactful findings within the selected papers are as follows (RQ1):
•A tendency towards the prediction and improvement of student engagement and performance in different knowledge areas or competencies: computational learning (Diago et al., 2019; Israel-Fishelson et al., 2021); cognitive executive functions (Nieto-Márquez et al., 2020); bullying awareness (Calvo-Morata et al., 2020, 2021). These approaches set aside the feasibility of use in K-12 schools to focus on the reliability and validity of the instruments. Other studies coincide with this approach, giving great relevance to the variables of the learning process and offering information during the development of the activity (Aluja-Banet et al., 2019; Diago et al., 2019; López Camacho et al., 2020; Melero et al., 2015; Rodríguez et al., 2018). Likewise, they point out the transfer of their results as the main difficulty, due to the problem of isolating the study variables and because the resources and methods used are not easily generalizable to other contexts.
•A lack of research in which teachers have maintained an active role in the development of Learning Analytics in the natural educational context. Only Michos et al. (2018), Michos & Hernández-Leo (2020), López Camacho et al. (2020) and Rodríguez-Martínez et al. (2023) focus on obtaining participants’ satisfaction and promising results in the redesign of learning activities. In this line, among the papers there is a practical guide for the development of learning analytics, which recommends understanding the teacher’s perspective and promoting joint work between researchers and teachers.
On the other hand, the main limitations (RQ2) are those associated with the type of research and with the fact that certain studies do not examine their own weaknesses as research. In other words, it is observed that: (1) no approach brings together computational sciences and educational innovation (Aluja-Banet et al., 2019; Amo-Filvà et al., 2019; Calvo-Morata et al., 2020; Calvo-Morata et al., 2021; Fernández-Soriano et al., 2019; Israel-Fishelson et al., 2021; Melero et al., 2015; Nebot et al., 2019; Nieto-Márquez et al., 2020; Rodríguez et al., 2018; Sancho et al., 2016; Serrano-Laguna et al., 2014); (2) the scientific contribution is reduced by not delimiting the weaknesses of the works (López Camacho et al., 2020; Rodríguez et al., 2018; Sancho et al., 2016; Serrano-Laguna et al., 2014).
Regarding RQ3, the scarcity of work from the consolidation of the learning analytics concept to the present day reinforces the incipient state of LA at these educational levels. In this line, the approach of the investigations encountered reinforces the idea that there is no distinct model for developing learning analytics at these stages (Donate et al., 2021). Furthermore, the lack of model development hinders the transfer of pedagogical knowledge in the development of learning analytics.
With respect to RQ4, Game Learning Analytics accounts for the largest number of studies (8 investigations), followed by the development and use of Virtual Learning Environments (VLE) or Learning Management Systems (LMS) (5 papers).
This study has several limitations that should be addressed in future research. The rigid inclusion and exclusion criteria may have limited the geographical representativeness of the analyzed studies. Although the interest of the present study lies in the research and transfer to the practice of learning analytics at Spanish educational levels, it is understood that extending the geographical range to other student samples could provide a more comprehensive vision. However, the context in which it takes place heavily influences the development of learning analytics. Indeed, it is well worth maintaining the restriction of the search area to Spain for further research into the natural field of education. Furthermore, aspects such as the digitization level and data literacy of institutions and teachers could condition the development of learning analytics and, therefore, the social or cultural context would play a relevant role (Raffaghelli, 2018).
Following the approach of some authors, the present work considers it essential to maintain the active role of teachers and institutions (Amo, Prinsloo, et al., 2021; Michos et al., 2018; Michos & Hernández-Leo, 2020; Sancho et al., 2016; Serrano-Laguna et al., 2014). The only way to promote the establishment of this element of analysis and educational support is to provide teachers and educational institutions with sufficient resources to employ them.
In short, research should be encouraged across different schools and levels, promoting learning analytics to improve learning.
It is also recommended to promote transfer to practice beyond the development and validation of the different instruments or technological resources that enable data management (Calvo-Morata et al., 2020, 2021; Israel-Fishelson et al., 2021; Nieto-Márquez et al., 2020).
Moreover, future research should continue to explore the relationship between educational technology, personalized learning environments, and students’ self-reported digital skills and beliefs to further enhance the understanding and application of learning analytics in K-12 education (Schmid & Petko, 2019). Therefore, it is proposed to continue with lines of research that promote joint work between universities and schools, as well as the autonomy of the teaching staff and the educational institution itself, at the micro, meso, and macro levels. In this collaborative work between institutions, teachers and researchers, students should take part actively (Donate et al., 2022).
Aluja-Banet, T., Sancho, M.-R., & Vukic, I. (2019). Measuring motivation from the Virtual Learning Environment in secondary education. Journal of Computational Science, 36, 100629. https://doi.org/10.1016/j.jocs.2017.03.007
Amo, D., Alier, M., Sansaloni, R., Geli, J., Fonseca, D., García-Peñalvo, F. J., & Casañ, M. J. (2021). Learning Analytics Icons for analytics’ transparency, information, and easy comprehension of data treatment of students. In Ninth International Conference on Technological Ecosystems for Enhancing Multiculturality (TEEM’21) (pp. 587–593). Association for Computing Machinery. https://doi.org/10.1145/3486011.3486520
Amo, D., Fox, P., Fonseca, D., & Poyatos, C. (2020). Systematic Review on Which Analytics and Learning Methodologies Are Applied in Primary and Secondary Education in the Learning of Robotics Sensors. Sensors, 21(1), 153. https://doi.org/10.3390/s21010153
Amo, D., Prinsloo, P., Alier, M., Fonseca, D., Torres Kompen, R., Canaleta, X., & Herrero-Martín, J. (2021). Local technology to enhance data privacy and security in educational technology. International Journal of Interactive Multimedia and Artificial Intelligence, 7(2), 262–273. https://doi.org/10.9781/ijimai.2021.11.006
Amo-Filvà, D., Alier Forment, M., García-Peñalvo, F. J., Fonseca Escudero, D., & Casañ, M. J. (2019). Clickstream for learning analytics to assess students’ behavior with Scratch. Future Generation Computer Systems, 93, 673–686. https://doi.org/10.1016/j.future.2018.10.057
Amo-Filva, D., Donate Beby, B., García-Peñalvo, F. J., & Chen, J. (2022). Towards an ethical data literacy proficiency: A Moodle logs analytical tool. 2022 XII International Conference on Virtual Campus (JICV), 1–3. https://doi.org/10.1109/JICV56113.2022.9934790
Brereton, P., Kitchenham, B. A., Budgen, D., Turner, M., & Khalil, M. (2007). Lessons from applying the systematic literature review process within the software engineering domain. Journal of Systems and Software, 80(4), 571–583. https://doi.org/10.1016/j.jss.2006.07.009
Calvo-Morata, A., Alonso-Fernández, C., Freire, M., Martínez-Ortiz, I., & Fernández-Manjón, B. (2021). Creating awareness on bullying and cyberbullying among young people: Validating the effectiveness and design of the serious game Conectado. Telematics and Informatics, 60. https://doi.org/10.1016/j.tele.2021.101568
Calvo-Morata, A., Rotaru, D. C., Alonso-Fernandez, C., Freire-Moran, M., Martinez-Ortiz, I., & Fernandez-Manjon, B. (2020). Validation of a Cyberbullying Serious Game Using Game Analytics. IEEE Transactions on Learning Technologies, 13(1), 186–197. https://doi.org/10.1109/TLT.2018.2879354
Celdrán, A., Ruipérez-Valiente, J. A., Garcia Clemente, F. J., Rodríguez-Triana, M. J., Shankar, S. K., & Martinez Perez, G. (2020). A scalable architecture for the dynamic deployment of multimodal learning analytics applications in smart classrooms. Sensors, 20(10), 2923. https://doi.org/10.3390/s20102923
Cruz-Benito, J., Garcia-Penalvo, F. J., & Theron, R. (2019). Analyzing the software architectures supporting HCI/HMI processes through a systematic review of the literature. Telematics and Informatics, 38, 118–132. https://doi.org/10.1016/j.tele.2018.09.006
Diago, P. D., González-Calero, J. A., & Arnau, D. (2019). Fundamentos de diseño de un entorno tecnológico para el estudio de las habilidades en resolución de problemas en primeras edades escolares. Research in Education and Learning Innovation Archives, 22, 58-76. https://doi.org/10.7203/realia.22.14113
Donate, B., García-Peñalvo, F. J., & Amo, D. (2021). Learning Analytics in K-12 Spanish education: A systematic mapping study. 2021 XI International Conference on Virtual Campus (JICV). https://doi.org/10.1109/JICV53222.2021.9600315
Donate, B., García-Peñalvo, F. J., Amo, D., de Torres, E., & Herrero-Martín, J. (2022). An Instrument for Self-assessment of Data Literacy at the Micro, Meso, and Macro Educational Levels. In P. Zaphiris & A. Ioannou (Eds.), Learning and Collaboration Technologies. Designing the Learner and Teacher Experience. HCII 2022. Lecture Notes in Computer Science, vol 13328 (pp. 228–237). Springer. https://doi.org/10.1007/978-3-031-05657-4_16
Donate-Beby, B., García-Peñalvo, F. J., & Amo-Filvà, D. (2023). Data Literacy for the Development of Learning Analytics in K-12 Environments. In F. J. García-Peñalvo & A. García-Holgado (Eds.), Proceedings TEEM 2022: Tenth International Conference on Technological Ecosystems for Enhancing Multiculturality (pp. 1370–1376). Springer. https://doi.org/10.1007/978-981-99-0942-1_146
Fernández-Soriano, F. L., López, B., Martínez-España, R., Muñoz, A., & Cantabella, M. (2019). Use of Computing Devices as Sensors to Measure Their Impact on Primary and Secondary Students’ Performance. Sensors, 19(14), 3226. https://doi.org/10.3390/s19143226
Figueiredo, J., Lopes, N., & García-Peñalvo, F. J. (2019). Predicting Student Failure in an Introductory Programming Course with Multiple Back-Propagation. TEEM’19: Proceedings of the Seventh International Conference on Technological Ecosystems for Enhancing Multiculturality, 44–49. https://doi.org/10.1145/3362789.3362925
García-Peñalvo, F. J., & Corell, A. (2020). La COVID-19: ¿enzima de la transformación digital de la docencia o reflejo de una crisis metodológica y competencial en la educación superior? Campus Virtuales, 9(2), 83–98. http://uajournals.com/ojs/index.php/campusvirtuales/article/view/740
García-Peñalvo, F. J., Corell, A., Abella-García, V., & Grande-de-Prado, M. (2021). Recommendations for Mandatory Online Assessment in Higher Education During the COVID-19 Pandemic. In D. Burgos, A. Tlili, & A. Tabacco (Eds.), Radical Solutions for Education in a Crisis Context: COVID-19 as an Opportunity for Global Learning (pp. 85–98). Springer. https://doi.org/10.1007/978-981-15-7869-4_6
García-Peñalvo, F. J., Reimann, D., & Maday, C. (2018). Introducing Coding and Computational Thinking in the Schools: The TACCLE 3 – Coding Project Experience. In M. S. Khine (Ed.), Computational Thinking in the STEM Disciplines (pp. 213–226). Springer International Publishing. https://doi.org/10.1007/978-3-319-93566-9_11
Hodges, C. B., Moore, S., Lockee, B. B., Trust, T., & Bond, M. A. (2020, March 27). The difference between emergency remote teaching and online learning. EDUCAUSE Review. https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning
Israel-Fishelson, R., Hershkovitz, A., Eguíluz, A., Garaizar, P., & Guenaga, M. (2021). The Associations Between Computational Thinking and Creativity: The Role of Personal Characteristics. Journal of Educational Computing Research, 58(8), 1415–1447. https://doi.org/10.1177/0735633120940954
Johnson, L., Becker, S. A., Cummins, M., Estrada, V., Freeman, A., & Hall, C. (2016). NMC Horizon Report: 2016 Higher Education Edition. The New Media Consortium. https://www.learntechlib.org/p/171478/
Khalil, M., & Ebner, M. (2015). A STEM MOOC for school children—What does learning analytics tell us? 18th International Conference on Interactive Collaborative Learning (ICL), 1217–1221. http://dx.doi.org/10.1109/ICL.2015.7318212
López Camacho, V., de la Guía, E., Olivares, T., Flores, M.J., & Orozco-Barbosa, L. (2020). Data Capture and Multimodal Learning Analytics Focused on Engagement with a New Wearable IoT Approach. IEEE Transactions on Learning Technologies, 13(4), 704–717. https://doi.org/10.1109/TLT.2020.2999787
López Zambrano, J., Lara Torralbo, J. A., & Romero Morales, C. (2021). Early prediction of student learning performance through data mining: A systematic review. Psicothema, 33(3), 456-465. https://doi.org/10.7334/psicothema2021.62
Marcos-Pablos, S., García-Holgado, A., & García-Peñalvo, F. J. (2020). Guidelines for performing Systematic Research Projects Reviews. International Journal of Interactive Multimedia and Artificial Intelligence, 6(2), 136-144. http://dx.doi.org/10.9781/ijimai.2020.05.005
Melero, J., Hernández-Leo, D., Sun, J., Santos, P., & Blat, J. (2015). How was the activity? A visualization support for a case of location-based learning design. British Journal of Educational Technology, 46(2), 317–329. https://doi.org/10.1111/bjet.12238
Michos, K., & Hernández-Leo, D. (2020). CIDA: A collective inquiry framework to study and support teachers as designers in technological environments. Computers & Education, 143, 103679. https://doi.org/10.1016/j.compedu.2019.103679
Michos, K., Hernández-Leo, D., & Albó, L. (2018). Teacher-led inquiry in technology-supported school communities. British Journal of Educational Technology, 49(6), 1077–1095. https://doi.org/10.1111/bjet.12696
Nebot, P. D. D., Somoza, J. A. G.-C., & Vera, D. A. (2019). Fundamentos de diseño de un entorno tecnológico para el estudio de las habilidades en resolución de problemas en primeras edades escolares. Research in Education and Learning Innovation Archives. REALIA, 22, 58–76. https://dialnet.unirioja.es/servlet/articulo?codigo=6997125
Nieto-Márquez, N. L., Cardeña Martínez, A., Baldominos, A., González Petronila, A., & Pérez Nieto, M. Á. (2020). Assessment of the Effects of Digital Educational Material on Executive Function Performance. Frontiers in Education, 5, 545709. https://doi.org/10.3389/feduc.2020.545709
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. International Journal of Surgery, 88, 105906. https://doi.org/10.1016/J.IJSU.2021.105906
Peña-Ayala, A. (2018). Learning analytics: A glance of evolution, status, and trends according to a proposed taxonomy. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 8(3), e1243. https://doi.org/10.1002/widm.1243
Petticrew, M., & Roberts, H. (2008). Systematic reviews in the social sciences: A practical guide. John Wiley & Sons.
Raffaghelli, J. E. (2018). Educators’ Data Literacy: Supporting critical perspectives in the context of a “datafield” education. In L. Menichetti, M. Ranieri, & M. Kaschny Borges (Eds.), Teacher Education & Training on ICT between Europe and Latin America (pp. 91–109). Aracne. https://doi.org/10.4399/97888255210238
Richardson, W. S., Wilson, M. C., Nishikawa, J., & Hayward, R. S. (1995). The well-built clinical question: A key to evidence-based decisions. ACP J Club, 123(3), A12. https://doi.org/10.7326/ACPJC-1995-123-3-A12
Rodríguez, J. A., González-Calero, J. A., & Cózar, R. (2018). Diseño de un estudio exploratorio para la aplicación de técnicas de analíticas de aprendizaje en la enseñanza de las fracciones en 5o curso de Educación Primaria. Magister, 30(1 y 2), 29–42. https://doi.org/10.17811/msg.30.1.2018.29-42
Rodríguez-Martínez, J. A., González-Calero, J. A., Del Olmo-Muñoz, J., Arnau, D., & Tirado-Olivares, S. (2023). Building personalised homework from a learning analytics based formative assessment: Effect on fifth-grade students’ understanding of fractions. British Journal of Educational Technology, 54(1), 76–97. https://doi.org/10.1111/bjet.13292
Ruipérez-Valiente, J. A., & Kim, Y. J. (2020). Effects of solo vs. collaborative play in a digital learning game on geometry: Results from a K12 experiment. Computers & Education, 159, 104008. https://doi.org/10.1016/j.compedu.2020.104008
Sancho, M.R., Cañabate, A., & Sabate, F. (2016). Contextualizing learning analytics for secondary schools at micro level. 2015 International Conference on Interactive Collaborative and Blended Learning (ICBL), 70–75. https://doi.org/10.1109/ICBL.2015.7387638
Schmid, R., & Petko, D. (2019). Does the use of educational technology in personalized learning environments correlate with self-reported digital skills and beliefs of secondary-school students? Computers & Education, 136, 75–86. https://doi.org/10.1016/j.compedu.2019.03.006
Serrano-Laguna, Á., Torrente, J., Manero, B., del Blanco, Á., Borro-Escribano, B., Martínez-Ortiz, I., Freire, M., & Fernández-Manjón, B. (2014). Learning Analytics and Educational Games: Lessons Learned from Practical Experience. In A. De Gloria (Ed.), Games and Learning Alliance. GALA 2013. Lecture Notes in Computer Science, vol 8605. Springer. https://doi.org/10.1007/978-3-319-12157-4_2
Siemens, G. (2012). Learning analytics: Envisioning a research discipline and a domain of practice. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 4–8. https://doi.org/10.1145/2330601.2330605
Spanish Network Of Learning Analytics (SNOLA) (n.d.). Acerca de SNOLA – About SNOLA. Retrieved 4 April 2022, from https://snola.es/acerca-de-snola-about-snola/
Society for Learning Analytics Research (SoLAR). (n.d.). About SoLAR. Retrieved 13 January 2022, from https://www.solaresearch.org/about/
UNESCO (2020). Education: from school closure to recovery. https://en.unesco.org/covid19/educationresponse
Urrutia, G., Torta, S., & Bonfill, X. (2005). Metaanálisis (QUOROM). Medicina Clínica, 125(1), 32–37. https://doi.org/10.1016/S0025-7753(05)72207-7
Wastiau, P., Blamire, R., Kearney, C., Quittre, V., Van de Gaer, E., & Monseur, C. (2013). The Use of ICT in Education: A survey of schools in Europe. European Journal of Education, 48(1), 11–27. http://doi.org/10.2307/23357043
Wieringa, R., Maiden, N., Mead, N., & Rolland, C. (2006). Requirements engineering paper classification and evaluation criteria: A proposal and a discussion. Requirements Engineering, 11(1), 102–107. https://doi.org/10.1007/s00766-005-0021-6
Willis, J. E., Slade, S., & Prinsloo, P. (2016). Ethical oversight of student data in learning analytics: A typology derived from a cross-continental, cross-institutional perspective. Educational Technology Research and Development, 64, 881–901. https://doi.org/10.1007/s11423-016-9463-4
How to cite: Donate-Beby, B., García-Peñalvo, F. J., & Amo-Filva, D. (2023). Learning Analytics in Spanish K-12 levels: A Systematic Literature Review. UTE Teaching & Technology (Universitas Tarraconensis), (2), e3685. https://doi.org/10.17345/ute.2023.3685