Learning analytics in virtual laboratories: a systematic literature review of empirical research

Abstract

Remote learning has advanced from the theoretical to the practical sciences with the advent of virtual labs. Although virtual labs allow students to conduct their experiments remotely, evaluating student progress and collaboration with learning analytics remains a challenge. So far, no study has systematically synthesized the state of research on virtual laboratories and learning analytics, a gap our study aimed to fill. This study synthesized the empirical research on learning analytics in virtual labs through a systematic review of 21 articles published between 2015 and 2021. The results showed that more than half of the studies (57.1%) were conducted in higher education, with a main focus on the medical field. There is a wide range of virtual lab platforms, and most of the learning analytics used in the reviewed articles were derived from log files of students' actions. Learning analytics was used to measure the performance, activities, perception, and behavior of students in virtual labs. The studies cover a wide variety of research domains, platforms, and analytical approaches. The landscape of platforms and applications is therefore fragmented, small-scale, and exploratory, and has thus far not tapped into the potential of learning analytics to support learning and teaching. Educators may therefore need to agree on common standards, protocols, or platforms in order to build on each other's findings and advance our knowledge.

Introduction

The COVID-19 pandemic created an extremely difficult situation and caused anxiety in the academic field. Practical sessions and experiments, which are essential for students' experience and skill development in laboratory-based disciplines, were suspended in schools and universities (Vasiliadou, 2020). Despite the pandemic conditions, some specialties started to use virtual labs for teaching biology, chemistry, and the natural sciences. Virtual labs have the advantages of unlimited time, immediate feedback, experiment repetition, and safety for students and the subjects of the experiment (Vasiliadou, 2020). Students' experience with virtual and simulated experiments helps prepare them for their physical laboratories and offers a reasonable solution, at least in emergencies (Breakey et al., 2008). Technology affords students several means of communication, allowing them to interact with teachers, ask for help, or provide feedback about their learning. Furthermore, students can conduct virtual experiments in groups, allowing for social engagement and collaboration through teamwork (Manchikanti et al., 2017). Virtual laboratories can generate digital traces that make it possible to monitor students' learning and identify their learning strategies. These traces of students' interactions with virtual labs have revealed enhancements in students' ability to solve problems, engage in critical thinking, develop laboratory skills, and acquire knowledge (Ramadahan & Irwanto, 2018). To take advantage of such data, the field of "learning analytics" was conceptualized to provide insights into learning by analyzing various student-generated data (Hantoobi et al., 2021).

Learning analytics (LA) is commonly defined as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (SoLAR, 2011). LA thus adopts a data-driven strategy in educational settings with the ultimate goal of enhancing and optimizing the educational experience for students and teachers. LA has a broad range of applications in many fields of education, from preschool to postgraduate studies (Adejo & Connolly, 2017). Implementing LA may provide educational institutions and stakeholders with multiple significant benefits (Howell et al., 2018; Ifenthaler, 2017). These include measuring students' collaboration (Saqr, Elmoazen, et al., 2022), predicting grades (Agudo-Peregrina et al., 2014; Strang, 2017), identifying learning gaps (Nyland et al., 2017), predicting failure (Tempelaar et al., 2018), supporting decision making (Vanessa Niet et al., 2016), supporting active learning (Kwong et al., 2017), profiling students (Khalil & Ebner, 2017), and improving assessment (Azevedo et al., 2019).

LA has been implemented in many contexts, such as the early identification of students at risk of underachievement, the tracking of students' online activity, the provision of automated feedback, the facilitation of learning strategies, and the optimization of teamwork in collaborative learning (Kaliisa et al., 2022; Papamitsiou & Economides, 2014). Previous systematic reviews have either narrowed in on the technology and design of virtual laboratories in a single discipline, such as biology (Udin et al., 2020) or chemistry (P. Chan et al., 2021b); covered a wider range of disciplines while focusing on a single technology, such as virtual reality (Rahman et al., 2022); or provided a more broad-based review of the theoretical and practical approaches of virtual labs in various fields (Reeves & Crippen, 2021). However, no systematic review has synthesized research on how learning analytics is used to monitor, support, or assess virtual laboratory work. In this study, we aim to bridge this gap and contribute to the literature with a systematic review encompassing all research about learning analytics and virtual laboratories. We investigate the characteristics, research methods, and findings of learning analytics in virtual labs. Therefore, the main research question for this study is: How has research on virtual laboratories used learning analytics with regard to educational levels, subjects, applications, and methods of analysis?

Background

Virtual labs

Technology-based training is growing across many areas of practice, and education is no exception. Organizations are adopting virtual and simulated applications to improve trainees' working skills, problem-solving strategies, and self-directedness (Dalgarno et al., 2003; Richard et al., 2006). Virtual laboratories offer the opportunity to practice repeatedly, anytime, at any pace. Most importantly, they allow students to practice safely, without fear of harm to themselves, the equipment, or the subjects. Virtual labs have provided students with access to large equipment such as telescopes (Slater et al., 2014), expensive devices such as electron microscopes (Childers & Jones, 2015), risky techniques such as radioactivity measurements (Jona & Vondracek, 2013), and biotic interactions such as cell stimulation (Hossain et al., 2016). Students can access virtual labs via computers and mobile devices, providing a new dimension for students (Lynch & Ghergulescu, 2017). Virtual labs range from simple 2D video games to interactive 3D simulations that provide a more engaging learning environment. Some provide students with instructions and technical directions to complete difficult tasks, whereas others are open-ended (Jones, 2018). Virtual labs have many advantages compared to traditional labs, including lower cost, easy access, time savings, environmental safety, and adaptability (Ali & Ullah, 2020). However, one possible drawback of virtual labs is that, unlike conventional labs, they do not always offer the same learning environment or the same opportunities for student interaction (Lynch & Ghergulescu, 2017).

Various organizations have created a variety of virtual laboratories, many of them available as open-source software. The Go-Lab and LiLa projects are two general-initiative virtual labs that offer both a remote framework and a broader scope (Potkonjak et al., 2016). The Go-Lab project is a large collection of interactive virtual labs that enables teachers to develop inquiry learning spaces by combining online laboratories and applications. A learning space can be shared with teachers and students for creating and testing hypotheses as well as designing educational games (Dziabenko & Budnyk, 2019). The "Library of Labs (LiLa)" project creates an infrastructure for virtual experimentation. It goes beyond gaining scientific knowledge by fostering social communication skills with colleagues and mentors (Richter et al., 2011). Various commercial packages with immersive simulators have recently become available, for example Labster, which offers virtual labs in different disciplines with game-based components that motivate students to learn techniques, solve problems, and run experiments. Labster and comparable programs such as Late Nite Labs use the simulation environment to make students feel as if they are actually in the lab, improving the quality of immersion (Jones, 2018).

Virtual labs provide students with a personalized immersive learning experience through immersive tools such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) (Hauze & Frazee, 2019). Early research suggests that immersive simulation improves students' skills, knowledge, and motivation to learn (Chiu et al., 2015; Freina & Ott, 2015; Salmi et al., 2017; Zhang et al., 2014). VR has been widely used in a variety of educational settings. High school students have used VR in 3D interactive chemistry labs (Ali et al., 2014; Civelek et al., 2014). Many articles focus on higher education; for instance, students in computer science courses have tested VR as an intelligent learning environment (Griol et al., 2014). VR immersive environments have been used to design architectural spatial experiences (Ângulo & Velasco, 2014) and to present neutrino data (Izatt et al., 2014). VR has been widely utilized in medical education, particularly for applications such as nurse education in an interactive virtual environment (Green et al., 2014), simulated hospitals in medical education (Kleven & Prasolova-Førland, 2014), a caries-removal simulation for dental students (Eve et al., 2014), and finger tracking with a head-mounted display to show surgeons how an expert's fingers move during surgery. Furthermore, VR is utilized directly with patients for educational purposes (de Ribaupierre et al., 2014; Rodrigues et al., 2014).

Many other virtual labs have been developed as discipline-based labs. The Open Source Physics (OSP) project improves computational physics education by providing simulators for basic techniques as well as educational materials (Christian et al., 2011). In engineering, the TriLab project, which includes three access modes (hands-on, virtual, and remote lab), teaches students control engineering concepts and loop control using the "Laboratory Virtual Instrument Engineering Workbench (LabVIEW)" (Abdulwahed & Nagy, 2013). In biology, BioInteractive provides classroom resources and enriches biology teachers' content with science-based multimedia resources and stories that motivate students (Beardsley et al., 2022). In chemistry, ChemCollective offers virtual labs, educational materials as alternatives to textbooks, and student- or team-based activities; students can work with hundreds of chemicals and manipulate them without extra cost or possible risk (Yaron et al., 2010). According to a literature review on chemistry virtual labs, current virtual laboratories are limited in their ability to adapt to student level, and the information they provide is static and limited in analytics (Ali & Ullah, 2020).

Learning analytics

Educational technology has evolved in three distinct waves. The first wave started with the development of learning management systems (LMS). Social networks are considered the second wave of educational development that affects learning. Learning analytics, the third wave, is used to improve and optimize education (Fiaidhi, 2014). LA is a multidisciplinary field that draws on diverse scientific fields, including computer science, education science, data mining, statistics, pedagogy, and behavioral science (Chatti et al., 2012).

The main objectives explored in LA research, and its most promising applications in education, are to support instructional strategies, identify at-risk students in order to provide effective interventions, recommend reading materials and learning activities to students, and assess their outcomes (Romero & Ventura, 2020). LA allows for tracking students' activities and providing feedback to improve the learning experience. LA pursues these objectives using various data mining techniques to create analytical models, which give a deep look into the learning process and could lead to more effective learning and pedagogical intervention (Elmoazen et al., 2022; Heikkinen et al., 2022). Among the approaches utilized, improved, or introduced in LA are machine learning, predictive analytics, process and sequence mining, and social network analysis (Romero & Ventura, 2020). The initial work mostly comprised algorithms for predicting students' success and identifying at-risk students (Ifenthaler & Yau, 2020). Some researchers then argued that relying on learning analytics for prediction is not sufficient (Saqr et al.; Tempelaar et al., 2018) and that it is essential to include pedagogical perspectives while studying the learning process (Gašević et al., 2015; Wong et al., 2019). Accordingly, scholars have given more attention to pedagogical practices and feedback in recent LA research (Banihashem et al., 2022; Wise & Jung, 2019).

In virtual labs, LA techniques have been applied in a variety of ways to investigate the impact of using virtual labs to gain the necessary skills and competencies. Govaerts et al. (2012) applied the Student Activity Meter (SAM) to visualize students' performance based on many metrics, which they displayed in a comprehensive dashboard with dimensional filtering. Similarly, in the FORGE European online learning project, a dashboard was used to visualize students' interactions with course materials and with each other, in addition to surveys and questionnaires (Mikroyannidis et al., 2015). The dashboards of virtual labs present a summary of student progress through visualizations using different statistical charts such as histograms and plots (Garcia-Zubia et al., 2019; Tobarra et al., 2014).
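
To make the dashboard idea concrete, here is a minimal sketch, not taken from any of the reviewed systems, of the kind of chart such dashboards present: a histogram of time-on-task per student. The data are randomly generated for illustration.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical per-student time-on-task in a virtual lab, in minutes.
rng = np.random.default_rng(seed=42)
time_on_task = rng.gamma(shape=2.0, scale=20.0, size=120)

fig, ax = plt.subplots(figsize=(6, 4))
ax.hist(time_on_task, bins=20, edgecolor="black")
ax.set_xlabel("Time on task (minutes)")
ax.set_ylabel("Number of students")
ax.set_title("Distribution of time-on-task across students")
plt.tight_layout()
plt.show()
```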

Many research papers use interaction data, including statistical extraction of students' interactions in relation to time spent, the distribution of time-on-task per student, and different user configurations (Elmoazen et al., 2022; Heikkinen et al., 2022; Ifenthaler & Yau, 2020). Another approach is to develop autonomous assessment and recommendation systems that analyze real-time activity results to improve students' performance in virtual labs (Considine et al., 2019; Gonçalves et al., 2018). For instance, for optimal performance in virtual labs, students should spend appropriate amounts of time interacting with tools and resources, and the relationship between students' interactions and their academic progress can be used to study students' behavior. Moreover, clustering methodologies can categorize students by their weaknesses and strengths to study their learning progress (Tulha et al., 2022).
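
The clustering approach can be illustrated with a minimal sketch, assuming simple per-student interaction features (action counts, time in the lab, and erroneous attempts, all invented here); k-means is one common choice, though the reviewed studies do not prescribe a particular algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-student interaction features (invented for illustration):
# number of lab actions, total minutes in the lab, erroneous attempts.
rng = np.random.default_rng(seed=0)
features = np.column_stack([
    rng.poisson(40, 90),
    rng.gamma(2.0, 15.0, 90),
    rng.poisson(5, 90),
]).astype(float)

# Standardize so no single feature dominates the distance metric.
scaler = StandardScaler().fit(features)
X = scaler.transform(features)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# Report cluster centers in the original units to characterize the profiles.
centers = scaler.inverse_transform(kmeans.cluster_centers_)
for i, c in enumerate(centers):
    print(f"Cluster {i}: ~{c[0]:.0f} actions, ~{c[1]:.0f} min, ~{c[2]:.1f} errors")
```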

Methodology

The authors conducted this review according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 guidelines (Page et al., 2021) and the eight fundamental steps of systematic reviews by Okoli (2015). Following these guidelines, the authors identified the purpose of the review, drafted a protocol, defined inclusion and exclusion criteria, and conducted the search process in order to extract data and appraise article quality before writing the review.

First, the authors determined that the purpose of the study was to report on the application of learning analytics in virtual labs and to answer the research question. After assessing the review's scope, the authors convened frequently to draft the protocol, a document that organizes all subsequent actions to reduce the possibility of bias in the selection of publications and in data processing. The protocol ensures reproducibility and consistency by planning the strategy for conducting the review (Fink, 2019). Accordingly, the protocol included the research questions, the literature search strategy, the inclusion criteria, the assessment of the studies, the data extraction, and the planned schedule (Kitchenham & Charters, 2007).

The inclusion and exclusion criteria for study selection were based on the research questions and guided by a previous review (P. Chan et al., 2021b). All articles to be included had to use learning analytics in virtual labs and meet the following criteria:

  1. Publications are written in English.

  2. Journal articles, conference proceedings, and book chapters in their entirety; editorials, conference abstracts, workshop proposals, and posters were excluded.

  3. Empirical studies with empirical data collection and analysis; reviews and incomplete reports (e.g., abstract-only papers or papers without methods and results) were excluded.

Database and literature search

The authors identified three established databases for the search: Scopus, Web of Science (WoS), and ERIC. Both Scopus and WoS databases employ rigorous inclusion criteria for journals and conferences, have a robust meta-data system, and have been established as literature search venues (Kumpulainen & Seppänen, 2022). ERIC is an educational database that covers a wide range of educational literature (Robbins, 2001). Additionally, the same keywords were used to search in the database of the Journal of Learning Analytics, the official publishing outlet for learning analytics.

We performed several iterations of the search using different combinations of keywords; using the keyword "virtual" severely limited our findings and missed several papers. Some authors used other keywords, e.g., "online laboratories", or did not use the keyword "virtual" at all, so their papers were not captured by the initial keywords that included "virtual". A decision was therefore made to cast a wide net: retrieve any article that includes the keyword "lab*" with a wildcard and then qualitatively, by the expert eyes of the researchers, identify which of these articles are about virtual laboratories. After several iterations, the following search formula yielded the best results for capturing all forms of the keywords (Table 1):

( "learning analytics") AND ( lab* OR experiment* OR clinic* OR practical* OR immers*))

Table 1 The used keywords with wildcards to cover all keyword forms
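
As an illustration of how this formula can be applied mechanically before the qualitative expert screening, the following sketch translates the wildcards into a regular expression and filters candidate records by title, abstract, and keywords. This is our own illustrative implementation, not the query syntax of Scopus, WoS, or ERIC.

```python
import re

# Wildcard stems from Table 1 translated into one regular expression:
# "learning analytics" AND (lab* OR experiment* OR clinic* OR practical* OR immers*)
STEMS = re.compile(r"\b(lab|experiment|clinic|practical|immers)\w*", re.IGNORECASE)
LA = re.compile(r"learning analytics", re.IGNORECASE)

def matches(record: dict) -> bool:
    """Return True if a record satisfies the boolean search formula."""
    text = " ".join(record.get(field, "") for field in ("title", "abstract", "keywords"))
    return bool(LA.search(text)) and bool(STEMS.search(text))

# Hypothetical records for illustration.
records = [
    {"title": "Learning analytics in online laboratories", "abstract": "", "keywords": ""},
    {"title": "Dashboards for MOOCs", "abstract": "A learning analytics study.", "keywords": ""},
]
print([matches(r) for r in records])  # [True, False]
```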

This combination of keywords was searched in the title, abstract, and author keywords fields of articles. The search was conducted within two days, on the eighth of November 2021. The search returned 1069 articles from the specified databases: 653 articles from Scopus, 248 from WoS, 120 from the ERIC database, and 48 from the Journal of Learning Analytics. All articles were uploaded to the Covidence web-based system (see Note 1) for analysis. Duplicates (n = 280) were removed, resulting in 789 articles. Two researchers independently scanned and assessed the first 100 papers' abstracts, titles, and keywords. The inter-rater agreement showed strong reliability using Cohen's kappa (κ = 0.92), and any conflicts were discussed and resolved; that is, when the two authors had differing views about the classification of a paper, they discussed it until they reached a consensus.
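
For reference, the reported inter-rater reliability can be computed as in the following minimal sketch, which assumes two screeners' include/exclude decisions on the same abstracts (the decisions below are invented; scikit-learn is one of several libraries offering Cohen's kappa).

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical include/exclude decisions by two independent screeners
# on the same ten abstracts (1 = include, 0 = exclude).
rater_a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
rater_b = [1, 0, 0, 1, 1, 0, 1, 0, 1, 0]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")  # ~0.80 here; values above 0.8 indicate strong agreement
```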

The remaining articles were divided and filtered by both researchers. All authors met after filtration to discuss any uncertainties. Based on the inclusion and exclusion criteria, the title and abstract scan yielded 86 publications that were suitable for full-text review (Fig. 1).

Fig. 1 The study selection process

In order to obtain data from the included articles, the relevant information was first collected in a codebook to reduce individual differences between the reviewers. The following categories of information were extracted from each article: descriptive statistics, educational settings and levels, disciplines, learning analytics approaches, and the primary conclusions of each study. The first ten studies were coded by two different coders, who then met to discuss any conflicts and complete the codebook before continuing to code the remaining articles. Finally, the retrieved papers were checked for quality before the synthesis stage began. At this stage, the authors organized all of the data within the framework of the review questions (Webster & Watson, 2002). The data analysis is then presented comprehensively from a learning analytics perspective.

Results

The included studies are listed in the appendix; each is referenced with a capital S followed by a number (e.g., S1).

Descriptive statistics of the reviewed articles

A total of 21 studies were included in this review. No studies utilizing learning analytics in virtual labs were published before 2015; all included articles were published between 2015 and 2021. The maximum number of studies per year was five articles in 2021, followed by four articles in 2018. The majority of the reviewed articles were presented at conferences (n = 12), whereas the remaining nine were published in journals (Fig. 2).

Fig. 2 Type and year of the reviewed articles

Educational levels

The reviewed studies have populations from various educational levels (Fig. 3). The majority of the research in the reviewed articles (57.1%) was conducted in higher education institutions (n = 12), and two of these studies involved postgraduate students in their analysis (Burbano & Soler, 2020; Considine et al., 2021). Six studies (28.6%) were conducted in secondary education, four of which focused on STEM (science, technology, engineering, and mathematics) subjects (de Jong et al., 2021; Rodríguez-Triana et al., 2021; Sergis et al., 2019; Vozniuk et al., 2015). Only two studies (9.5%) focused on elementary and middle school students (Metcalf et al., 2017; Reilly & Dede, 2019a). Finally, one study was conducted online as a Massive Open Online Course (MOOC) for students of varying education levels (Hossain et al., 2018).

Fig. 3 Educational levels in the reviewed studies

Subjects

The reviewed studies covered different disciplines of science, medicine, and engineering (Fig. 4). Medical and dental virtual practices were used in practical-based physiology courses (King et al., 2016), virtual patient cases (Berman et al., 2018), periodontology and oral pathology (Burbano & Soler, 2020), and prosthodontics courses (Chan et al., 2021a, 2021b). Chemistry virtual labs were used in concentration experiments (Liu et al., 2018) and organic chemistry (Qvist et al., 2015), while biology labs covered interactive live Euglena experiments (Hossain et al., 2018) and molecular biology experiments (Qvist et al., 2015). Virtual labs for science classes were available for school students (Metcalf et al., 2017; Reilly & Dede, 2019b) and for students in science, technology, engineering, and mathematics (STEM) (de Jong et al., 2021; Rodríguez-Triana et al., 2021; Sergis et al., 2019; Vozniuk et al., 2015). Virtual labs were used in different fields of computer science, namely Java programming (Castillo, 2016), cloud applications (Manske & Hoppe, 2016), and network virtual labs (Venant et al., 2017). The engineering virtual labs covered automotive engineering (Goncalves et al., 2018), container-based virtual labs (Robles-Gómez et al., 2019), and building electrical circuits (Considine et al., 2021). Other practices include digital electronic simulation environments (Vahdat et al., 2015) and remote labs in the field of image processing (Schwandt et al., 2021).

Fig. 4 Context of the reviewed studies

Virtual environment

The authors of the reviewed articles used a wide range of virtual environments. Go-Lab was used in STEM education (de Jong et al., 2021; Rodríguez-Triana et al., 2021; Sergis et al., 2019) and was combined with other applications such as the GRAASP platform (Vozniuk et al., 2015) and cloud applications (Manske & Hoppe, 2016). In addition, the EcoXPT system was utilized in science classes (Metcalf et al., 2017; Reilly & Dede, 2019b). In the medical field, the LabTutor platform was used in physiology courses (King et al., 2016), the ASUS virtual patient package in virtual patient cases (Berman et al., 2018), and the M-Health Smilearning application with the TIMONEL platform in the dental field (Burbano & Soler, 2020). Chemistry virtual labs were accessible on two platforms: the ChemVLab+ tutor (Liu et al., 2018) and the LabLife3D platform (Qvist et al., 2015). In biology, virtual labs were available in LabLife3D for molecular biology (Qvist et al., 2015) and in Open edX for Euglena experiments (Hossain et al., 2018). The virtual labs used in computer science were the Magentix 2 platform with virtual hosts (Castillo, 2016) and the Lab4CE (Laboratory for Computer Education) network lab (Venant et al., 2017). Various engineering fields utilized different virtual lab platforms, such as Falstad's Circuit Simulator Applet and Virtual Instrumentation Systems in Reality (VISIR) (Goncalves et al., 2018), Netlab for building electrical circuits (Considine et al., 2021), and a container-based virtual laboratory (CVL) using Linux Docker containers (Robles-Gómez et al., 2019). Other labs included DEEDS (Digital Electronics Education and Design Suite) for digital electronic simulation environments (Vahdat et al., 2015) and the WebLab-Deusto remote lab management system (RLMS) for image processing (Schwandt et al., 2021).

Perception of virtual labs

The findings reported that virtual labs are inexpensive and robust (Hossain et al., 2018) and achieve a very high satisfaction level among students (Castillo, 2016). Students recorded positive feedback and interest in virtual labs, as the labs simplified complex scientific practices (Hossain et al., 2018; Qvist et al., 2015; Robles-Gómez et al., 2019). Similarly, some postgraduate students preferred remote labs after their experience during the COVID-19 pandemic (Considine et al., 2021). Teachers likewise responded positively to learning analytics in virtual labs, as it allows them to monitor students' progress (Qvist et al., 2015; Vozniuk et al., 2015). The teachers also expressed the need for improved displays of students' activities and for technical guidelines to support inquiry-based learning in virtual labs (Rodríguez-Triana et al., 2021). Many authors showed evidence of improvement in students' performance with the use of virtual labs (King et al., 2016; Manske & Hoppe, 2016; Metcalf et al., 2017; Robles-Gómez et al., 2019).

Learning analytics

The reviewed studies mainly covered one or more of these variables: performance, activities, perception, and behavior. Performance was assessed in 11 studies, either to evaluate the impact of virtual labs on learning achievement (King et al., 2016; Metcalf et al., 2017; Reilly & Dede, 2019b; Robles-Gómez et al., 2019; Vahdat et al., 2015), improve knowledge (Burbano & Soler, 2020; Manske & Hoppe, 2016), assess the need for support (Goncalves et al., 2018; Venant et al., 2017), or assess teachers' inquiry-based educational designs (de Jong et al., 2021; Sergis et al., 2019). Ten studies focused on the analysis of students' activities and patterns of virtual lab utilization (Castillo, 2016; King et al., 2016; Hossain et al., 2018; Metcalf et al., 2017; Berman et al., 2018; Liu et al., 2018; Burbano & Soler, 2020; Considine et al., 2021; Schwandt et al., 2021; Chan et al., 2021a, 2021b). Nine studies measured perceptions of virtual labs in one of three forms: self-reported feedback (Berman et al., 2018; Chan et al., 2021a, 2021b; Considine et al., 2021; Hossain et al., 2018), teachers' opinions (Qvist et al., 2015; Rodríguez-Triana et al., 2021; Vozniuk et al., 2015), and students' satisfaction questionnaires (Castillo, 2016; Robles-Gómez et al., 2019). Three studies identified behavior patterns of students in virtual labs (Robles-Gómez et al., 2019; Vahdat et al., 2015; Venant et al., 2017).

The learning analytics in the reviewed articles were mainly based on log data from the virtual lab platforms. The data collected from system log files consist of general data such as user IDs, students' clicks, the start and end of experiments, and users' actions (Schwandt et al., 2021). Authors used log files to analyze patterns of experiments (Metcalf et al., 2017; Qvist et al., 2015), long-term patterns in MOOC courses (Hossain et al., 2018), and interactions between learners (Venant et al., 2017). Some authors used time sequences as part of their analysis to monitor timeline patterns (Qvist et al., 2015), durations of system activities (Burbano & Soler, 2020; Vozniuk et al., 2015), time spent on tasks (King et al., 2016), and sequences of actions (Manske & Hoppe, 2016), or compared more than one academic year to assess the improvement from using virtual labs (Robles-Gómez et al., 2019). Students' performance can also be predicted using engagement metrics of student activity (Berman et al., 2018; Castillo, 2016), complexity metrics (Vahdat et al., 2015), and behavior during practical learning (Venant et al., 2017). Thus, learning analytics helps teachers identify when students are having difficulties and support them when needed (Goncalves et al., 2018; Sergis et al., 2019; Venant et al., 2017).
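
A minimal sketch of this kind of log-file processing is shown below, assuming a generic event log with user IDs, actions, and timestamps; the column names and events are our own illustration, not the schema of any reviewed platform.

```python
import pandas as pd

# Hypothetical virtual-lab event log: one row per user action.
log = pd.DataFrame({
    "user_id":   ["s1", "s1", "s1", "s2", "s2"],
    "action":    ["start_experiment", "adjust_tool", "end_experiment",
                  "start_experiment", "end_experiment"],
    "timestamp": pd.to_datetime([
        "2021-03-01 10:00", "2021-03-01 10:07", "2021-03-01 10:25",
        "2021-03-01 11:00", "2021-03-01 11:12",
    ]),
})

log = log.sort_values(["user_id", "timestamp"])
per_student = log.groupby("user_id").agg(
    n_actions=("action", "size"),
    first_event=("timestamp", "min"),
    last_event=("timestamp", "max"),
)
# Crude time-on-task: elapsed time between first and last logged event.
per_student["time_on_task_min"] = (
    (per_student["last_event"] - per_student["first_event"]).dt.total_seconds() / 60
)
print(per_student[["n_actions", "time_on_task_min"]])
```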

Process mining was used as a temporal method to discover students' hidden strategies for achieving their goals (Castillo, 2016). Similarly, students' learning strategies and practical activity sequences were analyzed using sequential pattern mining to identify behavior variations at different performance levels (Venant et al., 2017). The learning trajectories of students were identified by meta-classification of events with their timestamps (Reilly & Dede, 2019b) and by selecting segments of interest in log data and then coding the video and audio recordings for those segments (Liu et al., 2018). Correlation analysis and multiple linear regression were used to address the relationship between access to learning resources and academic achievement (Chan et al., 2021a, 2021b). Students' performance was also part of the analysis, through monitoring students' exam results (Goncalves et al., 2018) and extracting their mistakes (Considine et al., 2021). In many studies, virtual labs included built-in learning analytics tools, such as the Learning Analytics Data Collector (LADC) (Vahdat et al., 2015), the Inquiry Learning Space (ILS) dashboard, "Teaching and Learning Analytics (TLA)", and algorithm-based measurements to analyze the correlation between students' performance and actions (Schwandt et al., 2021). Finally, virtual patient metrics were used to monitor students (Berman et al., 2018).
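
As a concrete example of the correlation-and-regression approach (in the spirit of, but not reproducing, Chan et al., 2021a), the following sketch relates hypothetical resource-access counts and session counts to grades using Pearson correlation and an ordinary least-squares fit; all numbers are invented.

```python
import numpy as np

# Hypothetical per-student data: resource accesses, lab sessions, final grade.
accesses = np.array([12, 30, 25, 8, 40, 22, 15, 35, 28, 10], dtype=float)
sessions = np.array([3, 7, 6, 2, 9, 5, 4, 8, 6, 3], dtype=float)
grades   = np.array([55, 78, 70, 50, 85, 68, 60, 80, 74, 52], dtype=float)

# Pearson correlation between accesses and grades.
r = np.corrcoef(accesses, grades)[0, 1]
print(f"Pearson r (accesses vs. grades) = {r:.2f}")

# Multiple linear regression via ordinary least squares:
# grade = b0 + b1 * accesses + b2 * sessions
X = np.column_stack([np.ones_like(accesses), accesses, sessions])
coef, *_ = np.linalg.lstsq(X, grades, rcond=None)
print(f"intercept={coef[0]:.1f}, b_accesses={coef[1]:.2f}, b_sessions={coef[2]:.2f}")
```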

Discussions and conclusions

This study aimed to review research at the intersection of learning analytics and virtual laboratories. While learning analytics emerged more than a decade ago, the number of articles that particularly focus on virtual laboratories remains small, and the growth curve is largely flat at three to four articles per year. The included articles (n = 21) were all published in the last six years, pointing to a rather cautious adoption trend among educators. In fact, systematic literature reviews have pointed to a slow adoption trend within scientific education fields, with only a faint presence of articles from these domains, e.g., (Ifenthaler & Yau, 2020; Saqr, 2018). The results came from diverse fields, with a concentration around STEM, health sciences (Burbano & Soler, 2020; Chan et al., 2021a, 2021b), science (chemistry and biology), and engineering (Considine et al., 2021; Goncalves et al., 2018; Robles-Gómez et al., 2019). There was a diverse repertoire of digital platforms; almost every study used a different platform. Such wide diversity across contexts and digital platforms fragments the insights, making it hard to draw a consistent conclusion or build a common narrative. In other words, since most of the experimental findings come from a particular context with a specific platform, we can hardly reach a conclusion that applies to cases that use another platform or come from a different context.

The reviewed papers have studied students' perceptions of virtual laboratories (Berman et al., 2018; Chan et al., 2021a, 2021b; Hossain et al., 2018), performance, and online behavior (e.g., using log data). Admittedly, students' perceptions and performance are only loosely connected to learning analytics, yet they continue to receive researchers' attention. In particular, the issue of improving performance has seen rising adoption in the last decade as stakeholders wish to use data to improve students' learning. Log data formed the basis of most analyses within the reviewed studies and revolved around understanding behavioral patterns of using online laboratories, or how using such platforms can help predict or understand students' performance. Less frequently, studies have tried to map students' temporal behavioral patterns using, e.g., sequence mining; such studies, which used temporal methods, offered valuable insights into students' laboratory learning strategies and the sequence of virtual lab activities. Some of the reviewed studies had built-in analytics solutions in the form of dashboards specific to their platforms. Teachers' perspectives have been investigated in several studies, in which teachers reported a positive perception of the potential of learning analytics, e.g., enabling the monitoring of students, helping support students during laboratory work, and offering ways of scaffolding.

The small number of studies in this review, distributed across different fields, platforms, and methods, makes it hard to draw general conclusions. It is therefore fair to say that studies hitherto remain at an exploratory stage. Several research questions are still unanswered, e.g., what are the effective strategies when using online laboratories; what indicators point to a student needing help; what are the effective supportive strategies; and what indicators best predict that a student is benefiting from online laboratories. What is more, we have little information about interactivity in virtual labs, its patterns, its benefits or lack thereof, and how best to support such strategies. In addition, we stress the findings of Birkeland (2022) regarding the absence of collaborative environments and teamwork incentives, which are common practices in physical labs, pointing to a critical gap in current virtual labs.

Since the studies do not use a common standard, protocol, or shared methods, their findings are not easily shared, ported, or built upon. Authors and researchers need to think of common protocols, standards, or application programming interfaces (APIs). Such common protocols would make efforts more likely to build on each other and results more likely to be shared.

Virtual laboratory dashboards are in the very early stages of development, and little is known about the effective elements of dashboards that could help students or teachers. In fact, how learning analytics can help teachers optimize learning and teaching with virtual laboratories is still an open area of inquiry. In the same vein, how learning analytics can help teachers design, assess, or improve learning tasks is still largely unexplored.

This systematic review has the following limitations. The search was performed using five search terms in the title and keywords, which were generic and resulted in a large number of articles being initially retrieved and then excluded. This complicated the search and filtration processes, but it reduced the risk of excluding articles whose authors did not include "virtual labs" among their keywords. Even so, work whose authors used none of our keywords may have been missed in this review. Although the coding process worked well for most articles, the coders had to make an effort to interpret some of them; to facilitate coding, the authors discussed and determined the primary emphasis of research that was unclear. Construct validity may also be a concern, as we relied on author descriptions and code groupings that sometimes differ from the authors' domains and do not follow a standardized approach or protocol. Finally, the qualitative analysis and the relatively small number of included articles, drawn from a variety of disciplines and research approaches, restrict the ability to make broad generalizations due to a lack of standardization. Nonetheless, this research presents the first systematic overview of learning analytics in virtual labs. Researchers may utilize our work as a framework and lens for further rigorous research, and we believe the results provided here can serve as a new basis for learning analytics in laboratories.

In summary, our review addressed questions pertaining to the use of learning analytics in virtual laboratories, an area that still has significant knowledge gaps that only future research can illuminate.

Availability of data and materials

The data of this systematic review consist of articles published in journals and conference proceedings. Many of these are freely available online; others can be accessed for a fee or through subscription.

Notes

  1. https://www.covidence.org/ (last accessed September 2022).

References

  • Abdulwahed, M., & Nagy, Z. K. (2013). Developing the TriLab, a triple access mode (hands-on, virtual, remote) laboratory, of a process control rig using LabVIEW and Joomla. Computer Applications in Engineering Education, 21(4), 614–626. https://doi.org/10.1002/cae.20506

  • Adejo, O., & Connolly, T. (2017). Learning analytics in a shared-network educational environment: ethical issues and countermeasures. International Journal of Advanced Computer Science and Applications. https://doi.org/10.14569/ijacsa.2017.080404

  • Agudo-Peregrina, Á. F., Iglesias-Pradas, S., Conde-González, M. Á., & Hernández-García, Á. (2014). Can we predict success from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in VLE-supported F2F and online learning. Computers in Human Behavior, 31(1), 542–550. https://doi.org/10.1016/j.chb.2013.05.031

  • Ali, N., & Ullah, S. (2020). Review to analyze and compare virtual chemistry laboratories for their use in education. Journal of Chemical Education, 97(10), 3563–3574. https://doi.org/10.1021/acs.jchemed.0c00185

  • Ali, N., Ullah, S., Alam, A., & Rafique, J. (2014). 3D interactive virtual chemistry laboratory for simulation of high school experiments. Proceedings of Eurasia Graphics 2014, 1–6.

  • Ângulo, A., & Velasco, G. V. de. (2014). Immersive Simulation of Architectural Spatial Experiences. pp. 495–499. https://doi.org/10.5151/despro-sigradi2013-0095

  • Azevedo, J. M., Oliveira, E. P., & Beites, P. D. (2019). Using learning analytics to evaluate the quality of multiple-choice questions: A perspective with classical test theory and item response theory. International Journal of Information and Learning Technology, 36(4), 322–341. https://doi.org/10.1108/IJILT-02-2019-0023

  • Banihashem, S. K., Noroozi, O., van Ginkel, S., Macfadyen, L. P., & Biemans, H. J. A. (2022). A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educational Research Review, 37, 100489. https://doi.org/10.1016/j.edurev.2022.100489

  • Beardsley, P., Csikari, M., Ertzman, A., & Jeffus, M. (2022). BioInteractive’s free online professional learning course on evolution. The American Biology Teacher, 84(2), 68–74. https://doi.org/10.1525/abt.2022.84.2.68

  • Berman, N. B., & Artino, A. R., Jr. (2018). Development and initial validation of an online engagement metric using virtual patients. BMC Medical Education, 18(1), 213. https://doi.org/10.1186/s12909-018-1322-z

  • Birkeland, H. (2022). Understanding collaboration in virtual labs: A learning analytics framework. The University of Bergen.

  • Breakey, K. M., Levin, D., Miller, I., & Hentges, K. E. (2008). The use of scenario-based-learning interactive software to create custom virtual laboratory scenarios for teaching genetics. Genetics, 179(3), 1151–1155. https://doi.org/10.1534/genetics.108.090381

  • Burbano, G. D. C., & Soler, J. A. (2020). Learning analytics in m-learning: Periodontic education. Communications in Computer and Information Science, 1280, 128–139. https://doi.org/10.1007/978-3-030-62554-2_10

  • Castillo, L. (2016). A virtual laboratory for multiagent systems: Joining efficacy, learning analytics and student satisfaction. In International symposium on computers in education (SIIE 2016): learning analytics technologies, 1–6. https://doi.org/10.1109/SIIE.2016.7751820

  • Chan, A. K. M., Botelho, M. G., & Lam, O. L. T. (2021a). The relation of online learning analytics, approaches to learning and academic achievement in a clinical skills course. European Journal of Dental Education: Official Journal of the Association for Dental Education in Europe, 25(3), 442–450. https://doi.org/10.1111/eje.12619

  • Chan, P., Van Gerven, T., Dubois, J.-L., & Bernaerts, K. (2021b). Virtual chemical laboratories: A systematic literature review of research, technologies and instructional design. Computers and Education Open, 2, 100053.

  • Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5/6), 318. https://doi.org/10.1504/ijtel.2012.051815

  • Childers, G., & Jones, M. G. (2015). Students as Virtual scientists: An exploration of students’ and teachers’ perceived realness of a remote electron microscopy investigation. International Journal of Science Education, 37(15), 2433–2452. https://doi.org/10.1080/09500693.2015.1082043

  • Chiu, J. L., Dejaegher, C. J., & Chao, J. (2015). The effects of augmented virtual science laboratories on middle school students’ understanding of gas properties. Computers and Education, 85, 59–73. https://doi.org/10.1016/j.compedu.2015.02.007

  • Christian, W., Esquembre, F., & Barbato, L. (2011). Open source physics. Science, 334(6059), 1077–1078. https://doi.org/10.1126/science.1196984

  • Civelek, T., Ucar, E., Ustunel, H., & Aydin, M. K. (2014). Effects of a haptic augmented simulation on K-12 students’ achievement and their attitudes towards physics. Eurasia Journal of Mathematics, Science and Technology Education, 10(6), 565–574.

  • Considine, H., Nedic, Z., & Nafalski, A. (2019). Automation of basic supervision tasks in a remote laboratory-case study netlab. In Proceedings of the 2019 5th experiment at international conference, exp.at 2019, (pp. 189–192). https://doi.org/10.1109/EXPAT.2019.8876508

  • Considine, H., Nafalski, A., & Milosz, M. (2021). An Automated support system in a remote laboratory in the context of online learning. In M. E. Auer & T. Rüütmann (Eds.), Educating Engineers for future industrial revolutions: proceedings of the 23rd international conference on interactive collaborative learning (ICL2020), Volume 2 (pp. 657–665). Springer International Publishing. https://doi.org/10.1007/978-3-030-68201-9_64

  • Dalgarno, B., Bishop, A. G., & Bedgood, R. (2003). The potential of virtual laboratories for distance education science teaching: reflections from the development and evaluation of a virtual chemistry laboratory. In K. Placing (Ed.), UniServe science improving learning outcomes symposium proceedings (pp. 90–115).

  • de Jong, T., Gillet, D., Rodríguez-Triana, M. J., Hovardas, T., Dikke, D., Doran, R., Dziabenko, O., Koslowsky, J., Korventausta, M., Law, E., Pedaste, M., Tasiopoulou, E., Vidal, G., & Zacharia, Z. C. (2021). Understanding teacher design practices for digital inquiry-based science learning: The case of Go-Lab. Educational Technology Research and Development, 69(2), 417–444. https://doi.org/10.1007/s11423-020-09904-z

  • de Ribaupierre, S., Kapralos, B., Haji, F., Stroulia, E., Dubrowski, A., & Eagleson, R. (2014). Healthcare training enhancement through virtual reality and serious games. Intelligent Systems Reference Library, 68, 9–27. https://doi.org/10.1007/978-3-642-54816-1_2

  • Dziabenko, O., & Budnyk, O. (2019). Go-Lab Ecosystem: Using Online Laboratories in a Primary School. EDULEARN19 Proceedings, 1, 9276–9285. https://doi.org/10.21125/edulearn.2019.2304

  • Elmoazen, R., Saqr, M., Tedre, M., & Hirsto, L. (2022). A systematic literature review of empirical research on epistemic network analysis in education. IEEE Access, 10, 17330–17348. https://doi.org/10.1109/ACCESS.2022.3149812

  • Eve, E. J., Koo, S., Alshihri, A. A., Cormier, J., Kozhenikov, M., Donoff, R. B., & Karimbux, N. Y. (2014). Performance of dental students versus prosthodontics residents on a 3D immersive haptic simulator. Journal of Dental Education, 78(4), 630–637.

  • Fiaidhi, J. (2014). The next step for learning analytics. IT Professional, 16(5), 4–8. https://doi.org/10.1109/MITP.2014.78.

  • Fink, A. (2019). Conducting research literature reviews: From the Internet to Paper. SAGE Publications.

  • Freina, L., & Ott, M. (2015). A literature review on immersive virtual reality in education: State of the art and perspectives. In Proceedings of eLearning and Software for Education (eLSE) (Bucharest, Romania, April 23–24, 2015), 8.

  • Garcia-Zubia, J., Cuadros, J., Serrano, V., Hernandez-Jayo, U., Angulo-Martinez, I., Villar, A., Orduna, P., & Alves, G. (2019). Dashboard for the VISIR remote lab. In Proceedings of the 2019 5th experiment at international conference, Exp.at 2019, (pp. 42–46). https://doi.org/10.1109/EXPAT.2019.8876527

  • Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends: For Leaders in Education & Training, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x

  • Goncalves, A. L., Carlos, L. M., Da Silva, J. B., & Alves, G. R. (2018). Personalized student assessment based on learning analytics and recommender systems. In 3rd International conference of the portuguese society for engineering education (CISPEE 2018). https://doi.org/10.1109/CISPEE.2018.8593493

  • Gonçalves, A. L., Alves, G. R., Carlos, L. M., Da Silva, J. B., & Alves, D. M. (2018). Learning analytics and recommender systems toward remote experimentation. CEUR Workshop Proceedings, 2188, 26–37.

  • Govaerts, S., Verbert, K., Duval, E., & Pardo, A. (2012). The student activity meter for awareness and self-reflection. CHI ’12, Austin, Texas, 869–884. https://doi.org/10.1145/2212776.2212860

  • Green, J., Wyllie, A., & Jackson, D. (2014). Virtual worlds: A new frontier for nurse education? Collegian, 21(2), 135–141. https://doi.org/10.1016/j.colegn.2013.11.004

  • Griol, D., Molina, J. M., & Callejas, Z. (2014). An approach to develop intelligent learning environments by means of immersive virtual worlds. Journal of Ambient Intelligence and Smart Environments, 6(2), 237–255. https://doi.org/10.3233/AIS-140255

  • Hantoobi, S., Wahdan, A., Al-Emran, M., & Shaalan, K. (2021). A review of learning analytics studies. In M. Al-Emran & K. Shaalan (Eds.), Recent advances in technology acceptance models and theories (pp. 119–134). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-64987-6_8

  • Hauze, S., & Frazee, J. (2019). Virtual immersive teaching and learning: How immersive technology is shaping the way students learn. EdMedia+ Innovate Learning, (pp. 1445–1450).

  • Heikkinen, S., Saqr, M., Malmberg, J., & Tedre, M. (2022). Supporting self-regulated learning with learning analytics interventions – a systematic literature review. Education and Information Technologies. https://doi.org/10.1007/s10639-022-11281-4

  • Hossain, Z., Bumbacher, E., Brauneis, A., Diaz, M., Saltarelli, A., Blikstein, P., & Riedel-Kruse, I. H. (2018). Design guidelines and empirical case study for scaling authentic inquiry-based science learning via open online courses and interactive biology cloud labs. International Journal of Artificial Intelligence in Education, 28(4), 478–507. https://doi.org/10.1007/s40593-017-0150-3

  • Hossain, Z., Bumbacher, E. W., Chung, A. M., Kim, H., Litton, C., Walter, A. D., Pradhan, S. N., Jona, K., Blikstein, P., & Riedel-Kruse, I. H. (2016). Interactive and scalable biology cloud experimentation for scientific inquiry and education. Nature Biotechnology, 34(12), 1293–1298. https://doi.org/10.1038/nbt.3747

  • Howell, J. A., Roberts, L. D., Seaman, K., & Gibson, D. C. (2018). Are we on our way to becoming a “Helicopter University”? Academics’ views on learning analytics. Technology, Knowledge and Learning, 23(1), 1–20. https://doi.org/10.1007/s10758-017-9329-9

  • Ifenthaler, D., & Yau, J. Y. K. (2020). Utilising learning analytics to support study success in higher education: A systematic review. Educational Technology Research and Development. https://doi.org/10.1007/s11423-020-09788-z

  • Ifenthaler, D. (2017). Are higher education institutions prepared for learning analytics? TechTrends, 61(4), 366–371. https://doi.org/10.1007/s11528-016-0154-0

  • Izatt, E., Scholberg, K., & Kopper, R. (2014). Neutrino-KAVE: An immersive visualization and fitting tool for neutrino physics education. Proceedings - IEEE Virtual Reality. https://doi.org/10.1109/VR.2014.6802062

  • Jona, K., & Vondracek, M. (2013). A remote radioactivity experiment. Physics Teacher, 51(1), 25–27. https://doi.org/10.1119/1.4772033

  • Jones, N. (2018). Simulated labs are booming. Nature, 562(7725), S5–S7. https://doi.org/10.1038/d41586-018-06831-1

  • Kaliisa, R., Rienties, B., Mørch, A. I., & Kluge, A. (2022). Social learning analytics in computer-supported collaborative learning environments: A systematic review of empirical studies. Computers and Education Open, 3, 100073. https://doi.org/10.1016/j.caeo.2022.100073

  • Khalil, M., & Ebner, M. (2017). Clustering patterns of engagement in Massive Open Online Courses (MOOCs): The use of learning analytics to reveal student categories. Journal of Computing in Higher Education, 29(1), 114–132. https://doi.org/10.1007/s12528-016-9126-9

  • King, D. A., Arnaiz, I. A., Gordon-Thomson, C., Randal, N., & Herkes, S. M. (2016). Evaluation and use of an online data acquisition and content platform for physiology practicals and tutorials. International Journal of Innovation in Science and Mathematics Education, 24(5), 24–34.

  • Kitchenham, B., & Charters, S. (2007). Guidelines for performing Systematic Literature Reviews in Software Engineering.

  • Kleven, N. F., & Prasolova-Førland, E. (2014). Virtual university hospital as an arena for medical training and health education.

  • Kumpulainen, M., & Seppänen, M. (2022). Combining Web of Science and Scopus datasets in citation-based literature study. Scientometrics, 127(10), 5613–5631. https://doi.org/10.1007/s11192-022-04475-7

  • Kwong, T., Wong, E., & Yue, K. (2017). Bringing abstract academic integrity and ethical concepts into real-life situations. Technology, Knowledge and Learning, 22(3), 353–368. https://doi.org/10.1007/s10758-017-9315-2

  • Liu, R., Stamper, J. C., & Davenport, J. (2018). A novel method for the in-depth multimodal analysis of student learning trajectories in intelligent tutoring systems. Journal of Learning Analytics. https://doi.org/10.18608/jla.2018.51.4

  • Lynch, T., & Ghergulescu, I. (2017). Review of virtual labs as the emerging technologies for teaching STEM subjects. In INTED2017 Proceedings, 1, 6082–6091. https://doi.org/10.21125/inted.2017.1422

  • Manchikanti, P., Kumar, B. R., & Singh, V. K. (2017). Role of Virtual Biology Laboratories in Online and Remote Learning. In Proceedings - IEEE 8th International Conference on Technology for Education, T4E 2016, (pp. 136–139). https://doi.org/10.1109/T4E.2016.035

  • Manske, S., & Hoppe, H. U. (2016). The concept cloud: Supporting collaborative knowledge construction based on semantic extraction from learner-generated artefacts. In 16th international conference on advanced learning technologies (ICALT 2016), 302–306. https://doi.org/10.1109/ICALT.2016.123

  • Metcalf, S. J., Kamarainen, A. M., Grotzer, T. A., & Dede, C. J. (2017). Changes in student experimentation strategies within an inquiry-based immersive virtual environment (Annual Meeting of the American Educational Research Association (AERA)).

  • Mikroyannidis, A., Gomez-Goiri, A., Domingue, J., Tranoris, C., Pareit, D., Vanhie-Van Gerwen, J., & Marquez Barja, J. M. (2015). Deploying learning analytics for awareness and reflection in online scientific experimentation. CEUR Workshop Proceedings, 1465, 105–111.

  • Nyland, R., Davies, R. S., Chapman, J., & Allen, G. (2017). Transaction-level learning analytics in online authentic assessments. Journal of Computing in Higher Education, 29(2), 201–217. https://doi.org/10.1007/s12528-016-9122-0

  • Okoli, C. (2015). A guide to conducting a standalone systematic literature review. Communications of the Association for Information Systems, 37(1), 879–910.

  • Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., & Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Systematic Reviews, 10(1), 89. https://doi.org/10.1186/s13643-021-01626-4

  • Papamitsiou, Z., & Economides, A. A. (2014). Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Journal of Educational Technology & Society, 17(4), 49–64.

  • Potkonjak, V., Gardner, M., Callaghan, V., Mattila, P., Guetl, C., Petrović, V. M., & Jovanović, K. (2016). Virtual laboratories for education in science, technology, and engineering: A review. Computers and Education, 95, 309–327. https://doi.org/10.1016/j.compedu.2016.02.002

  • Qvist, P., Kangasniemi, T., Palomäki, S., Seppänen, J., Joensuu, P., Natri, O., Närhi, M., Palomäki, E., Tiitu, H., & Nordström, K. (2015). Design of virtual learning environments: Learning analytics and identification of affordances and barriers. International Journal of Engineering Pedagogy (IJEP), 5(4), 64. https://doi.org/10.3991/ijep.v5i4.4962

  • Rahman, F., Mim, M. S., Baishakhi, F. B., Hasan, M., & Morol, M. K. (2022). A systematic review on interactive virtual reality laboratory. In ACM International Conference Proceeding Series (pp. 491–500). https://doi.org/10.1145/3542954.3543025

  • Ramadahan, M. F., & Irwanto. (2018). Using virtual labs to enhance students’ thinking abilities, skills, and scientific attitudes. In International Conference on Educational Research and Innovation (ICERI 2017) (pp. 494–499).

  • Reeves, S. M., & Crippen, K. J. (2021). Virtual laboratories in undergraduate science and engineering courses: A systematic review, 2009–2019. Journal of Science Education and Technology, 30(1), 16–30. https://doi.org/10.1007/s10956-020-09866-0

  • Reilly, J. M., & Dede, C. (2019). Differences in student trajectories via filtered time series analysis in an immersive virtual world. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK 2019) (pp. 130–134). https://doi.org/10.1145/3303772.3303832

  • Richard, E., Tijou, A., Richard, P., & Ferrier, J. L. (2006). Multi-modal virtual environments for education with haptic and olfactory feedback. Virtual Reality, 10(3–4), 207–225. https://doi.org/10.1007/s10055-006-0040-8

  • Richter, T., Boehringer, D., & Jeschke, S. (2011). LiLa: A European project on networked experiments. In Automation, communication and cybernetics in science and engineering 2009/2010 (pp. 307–317). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-16208-4_27

  • Robbins, J. B. (2001). ERIC: Mission, structure, and resources. Government Information Quarterly, 18(1), 5–17. https://doi.org/10.1016/S0740-624X(00)00062-9

  • Robles-Gómez, A., Tobarra, L., Pastor, R., Hernández, R., Duque, A., & Cano, J. (2019). Analyzing the students’ learning within a container-based virtual laboratory for cybersecurity. In Proceedings of the 7th International Conference on Technological Ecosystems for Enhancing Multiculturality (TEEM 2019) (pp. 275–283). https://doi.org/10.1145/3362789.3362840

  • Rodrigues, H. F., Machado, L. S., & Valença, A. M. G. (2014). Applying haptic systems in serious games: A game for adult’s oral hygiene education. Journal on Interactive Systems, 5(1), 1. https://doi.org/10.5753/jis.2014.639

  • Rodríguez-Triana, M. J., Prieto, L. P., Dimitriadis, Y., de Jong, T., & Gillet, D. (2021). ADA for IBL: Lessons learned in aligning learning design and analytics for inquiry-based learning orchestration. Journal of Learning Analytics, 8(2), 22–50. https://doi.org/10.18608/jla.2021.7357

  • Romero, C., & Ventura, S. (2020). Educational data mining and learning analytics: An updated survey. WIREs Data Mining and Knowledge Discovery. https://doi.org/10.1002/widm.1355

  • Salmi, H., Thuneberg, H., & Vainikainen, M. P. (2017). Making the invisible observable by Augmented Reality in informal science education context. International Journal of Science Education, Part B: Communication and Public Engagement, 7(3), 253–268. https://doi.org/10.1080/21548455.2016.1254358

  • Saqr, M., Jovanovic, J., Viberg, O., & Gašević, D. (2022b). Is there order in the mess? A single paper meta-analysis approach to identification of predictors of success in learning analytics. Studies in Higher Education, 1–22. https://doi.org/10.1080/03075079.2022.2061450

  • Saqr, M. (2018). A literature review of empirical research on learning analytics in medical education. International Journal of Health Sciences, 12(2), 80–85.

  • Saqr, M., Elmoazen, R., Tedre, M., López-Pernas, S., & Hirsto, L. (2022a). How well centrality measures capture student achievement in computer-supported collaborative learning? – A systematic review and meta-analysis. Educational Research Review, 35, 100437. https://doi.org/10.1016/j.edurev.2022.100437

  • Schwandt, A., Winzker, M., & Rohde, M. (2021). Utilizing user activity and system response for learning analytics in a remote lab. In M. E. Auer & D. May (Eds.), Cross reality and data science in engineering: proceedings of the 17th international conference on remote engineering and virtual instrumentation (pp. 63–74). Springer International Publishing. https://doi.org/10.1007/978-3-030-52575-0_5

  • Sergis, S., Sampson, D. G., Rodríguez-Triana, M. J., Gillet, D., Pelliccione, L., & de Jong, T. (2019). Using educational data from teaching and learning to inform teachers’ reflective educational design in inquiry-based STEM education. Computers in Human Behavior, 92, 724–738. https://doi.org/10.1016/j.chb.2017.12.014

  • Slater, T. F., Burrows, A. C., French, D. A., Sanchez, R. A., & Tatge, C. B. (2014). A proposed astronomy learning progression for remote telescope observation. Journal of College Teaching & Learning (TLC). https://doi.org/10.19030/tlc.v11i4.8857

  • SoLAR. (2011). What is Learning Analytics? https://www.solaresearch.org/about/what-is-learning-analytics/

  • Strang, K. D. (2017). Beyond engagement analytics: Which online mixed-data factors predict student learning outcomes? Education and Information Technologies, 22(3), 917–937. https://doi.org/10.1007/s10639-016-9464-2

  • Tempelaar, D., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Human Behavior, 78, 408–420. https://doi.org/10.1016/j.chb.2017.08.010

  • Tobarra, L., Ros, S., Hernández, R., Robles-Gómez, A., Caminero, A. C., & Pastor, R. (2014). Integrated analytic dashboard for virtual evaluation laboratories and collaborative forums. In Proceedings of XI Tecnologías Aplicadas a la Enseñanza de la Electrónica (Technologies Applied to Electronics Teaching), TAEE 2014. https://doi.org/10.1109/TAEE.2014.6900177

  • Tulha, C. N., Carvalho, M. A. G., & De Castro, L. N. (2022). LEDA: A Learning Analytics Based Framework to Analyze Remote Labs Interaction. In Proceedings of the 9th ACM Conference on Learning @ Scale (L@S ’22), (pp. 379–383). https://doi.org/10.1145/3491140.3528324

  • Udin, W. N., Ramli, M., & Muzzazinah. (2020). Virtual laboratory for enhancing students’ understanding on abstract biology concepts and laboratory skills: A systematic review. Journal of Physics: Conference Series, 1521(4), 042025. https://doi.org/10.1088/1742-6596/1521/4/042025

  • Vahdat, M., Oneto, L., Anguita, D., Funk, M., & Rauterberg, M. (2015). A learning analytics approach to correlate the academic achievements of students with interaction data from an educational simulator. In G. Conole, T. Klobučar, C. Rensing, J. Konert, & E. Lavoué (Eds.), Design for teaching and learning in a networked world: 10th European Conference on Technology Enhanced Learning, EC-TEL 2015, Toledo, Spain, September 15–18, 2015, Proceedings (pp. 352–366). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-24258-3_26

  • Vanessa Niet, Y., Diaz, V. G., & Montenegro, C. E. (2016). Academic decision making model for higher education institutions using learning analytics. In 2016 4th International Symposium on Computational and Business Intelligence (ISCBI) (pp. 27–32). https://doi.org/10.1109/ISCBI.2016.7743255

  • Vasiliadou, R. (2020). Virtual laboratories during coronavirus (COVID-19) pandemic. Biochemistry and Molecular Biology Education: A Bimonthly Publication of the International Union of Biochemistry and Molecular Biology, 48(5), 482–483. https://doi.org/10.1002/bmb.21407

  • Venant, R., Sharma, K., Vidal, P., Dillenbourg, P., & Broisin, J. (2017). Using sequential pattern mining to explore learners’ behaviors and evaluate their correlation with performance in inquiry-based learning. In É. Lavoué, H. Drachsler, K. Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), Data driven approaches in digital education (pp. 286–299). Springer International Publishing. https://doi.org/10.1007/978-3-319-66610-5_21

  • Vozniuk, A., Rodriguez-Triana, M. J., Holzer, A., Govaerts, S., Sandoz, D., & Gillet, D. (2015). Contextual learning analytics apps to create awareness in blended inquiry learning. In International Conference on Information Technology Based Higher Education and Training (ITHET 2015), (pp. 1–4). https://doi.org/10.1109/ITHET.2015.7218029

  • Webster, J., & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature review. MIS Quarterly, 26(2), xiii–xxiii.

  • Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53–69.

  • Wong, J., Baars, M., de Koning, B. B., van der Zee, T., Davis, D., Khalil, M., Houben, G.-J., & Paas, F. (2019). Educational theories and learning analytics: From data to knowledge: The whole is greater than the sum of its parts. In D. Ifenthaler, D.-K. Mah, & J.Y.-K. Yau (Eds.), Utilizing learning analytics to support study success (pp. 3–25). Springer International Publishing. https://doi.org/10.1007/978-3-319-64792-0_1

  • Yaron, D., Karabinos, M., Lange, D., Greeno, J. G., & Leinhardt, G. (2010). The ChemCollective—virtual labs for introductory chemistry courses. Science, 328(5978), 584–585. https://doi.org/10.1126/science.1182435

  • Zhang, J., Sung, Y. T., Hou, H. T., & Chang, K. E. (2014). The development and evaluation of an augmented reality-based armillary sphere for astronomical observation instruction. Computers and Education, 73, 178–188. https://doi.org/10.1016/j.compedu.2014.01.003

Acknowledgements

The authors would like to thank Hanna Birkeland for her contribution to the article screening process.

Funding

This study was co-funded by the EU’s Erasmus+ program within the project “European Network for Virtual lab & Interactive SImulated ONline learning (ENVISION_2027)” (2020-1-FI01-KA226-HE-092653). The paper was also co-funded by the Academy of Finland (Suomen Akatemia), Research Council for Natural Sciences and Engineering, under the project “Towards precision education: Idiographic learning analytics” (TOPEILA), Decision Number 350560, awarded to the second author.

Author information

Contributions

RE led the project. RE and MS contributed to the study design, methods, and manuscript writing; RE contributed to the results, and MS contributed to the discussion and conclusions. All authors contributed to the writing, provided critical feedback, helped shape the research and analysis, and read and approved the final manuscript.

Corresponding author

Correspondence to Ramy Elmoazen.

Ethics declarations

Competing interests

The authors have no competing interests to declare that are relevant to the content of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Overview of the reviewed studies (study ID, title, and aim of study/research question):

S1. Qvist 2015 (Qvist et al., 2015)
Title: Design of Virtual Learning Environments: Learning Analytics and Identification of Affordances and Barriers
Aim: To present the design and implementation of virtual laboratories, and to discover student and teacher views on the affordances and barriers to learning in these environments

S2. Vahdat 2015 (Vahdat et al., 2015)
Title: A learning analytics approach to correlate the academic achievements of students with interaction data from an educational simulator
Aim: To understand the learning behavior of students while they interact with Technology Enhanced Learning (TEL) systems

S3. Vozniuk 2015 (Vozniuk et al., 2015)
Title: Contextual learning analytics apps to create awareness in blended inquiry learning
Aim: RQ1. Do such contextual real-time visualisations improve teachers’ awareness? RQ2. Are the apps understandable and easy to use?

S4. Castillo 2016 (Castillo, 2016)
Title: A virtual laboratory for multiagent systems: Joining efficacy, learning analytics and student satisfaction
Aim: To capture the daily activity of students, provide the basis for data-driven assessment, and introduce a distributed virtual laboratory for a multiagent programming course

S5. King 2016 (King et al., 2016)
Title: Evaluation and use of an online data acquisition and content platform for physiology practicals and tutorials
Aim: 1) To examine the usage pattern of students during delivery of one module of the online practical courseware, “Electrophysiology of the Nerve”, over the first two years of its implementation; 2) to gather evidence of the impact of the platform on student engagement and learning outcomes

S6. Manske 2016 (Manske & Hoppe, 2016)
Title: The “Concept cloud”: Supporting collaborative knowledge construction based on semantic extraction from learner-generated artefacts
Aim: To propose the use of computational methods of semantic extraction to better understand and reflect on the activities in the Go-Lab online learning environment

S7. Hossain 2018 (Hossain et al., 2018)
Title: Design Guidelines and Empirical Case Study for Scaling Authentic Inquiry-based Science Learning via Open Online Courses and Interactive Biology Cloud Labs
Aim: To demonstrate that the cloud lab technology in question can support authentic inquiry-based science learning at large scale, and to distill design principles from the core technology, the user interface, and the course for successful deployments of online labs and courses for inquiry-based learning

S8. Metcalf 2017 (Metcalf et al., 2017)
Title: Changes in Student Experimentation Strategies within an Inquiry-Based Immersive Virtual Environment
Aim: 1) How did students use the Mesocosm tool over time? Did their patterns of use change over time, in terms of number of pools, number of measurements collected, and use of a control? 2) How did students interpret Mesocosm experimental results over time? Was there a change in students’ connection of experimental results and their conceptual understanding of the causal relationships affecting the ecosystem?

S9. Venant 2017 (Venant et al., 2017)
Title: Using sequential pattern mining to explore learners’ behaviors and evaluate their correlation with performance in inquiry-based learning
Aim: To identify behavioural patterns in a practical session that lead to better learning outcomes, to predict learners’ performance, and to automatically guide students who might need more support to complete their tasks

S10. Berman 2018 (Berman et al., 2018)
Title: Development and initial validation of an online engagement metric using virtual patients
Aim: Do student actions while completing an online virtual patient case reflect their engagement?

S11. Goncalves 2018 (Goncalves et al., 2018)
Title: Personalized student assessment based on learning analytics and recommender systems
Aim: To analyze student assessment data to provide clues that help teachers scaffold students’ performance

S12. Liu 2018 (Liu et al., 2018)
Title: A Novel Method for the In-Depth Multimodal Analysis of Student Learning Trajectories in Intelligent Tutoring Systems
Aim: To describe a generalizable approach for combining quantitative and qualitative analyses to yield efficient yet rich sensemaking around intelligent tutoring data

S13. Reilly 2019 (Reilly & Dede, 2019)
Title: Differences in student trajectories via filtered time series analysis in an immersive virtual world
Aim: To explore ways in which time-stamped log files of groups’ actions may enable the automatic generation of formative supports

S14. Robles-Gómez 2019 (Robles-Gómez et al., 2019)
Title: Analyzing the students’ learning within a container-based virtual laboratory for cybersecurity
Aim: To propose and analyze a container-based virtual laboratory for a cybersecurity subject, from the point of view of the students’ behavior and their outcomes

S15. Sergis 2019 (Sergis et al., 2019)
Title: Using educational data from teaching and learning to inform teachers’ reflective educational design in inquiry-based STEM education
Aim: To investigate whether Teaching Analytics can be used to assess Inquiry-based Educational Designs (IED) and to relate the analyses to customizable students’ educational data to facilitate the re-design process

S16. Burbano 2020 (Burbano G & Soler, 2020)
Title: Learning analytics in m-learning: Periodontic education
Aim: To understand the transformation of educational and training systems from the perspective of the ubiquitous learning experience of medical and dental students

S17. Chan 2021 (Chan et al., 2021a, 2021b)
Title: The relation of online learning analytics, approaches to learning and academic achievement in a clinical skills course
Aim: 1) The effect of students’ approaches to learning on their access to e-learning resources; 2) the effect of students’ approaches to learning and e-learning access on academic achievement (examination results)

S18. Considine 2021 (Considine et al., 2021)
Title: An Automated Support System in a Remote Laboratory in the Context of Online Learning
Aim: To report on students’ learning habits, backgrounds, and perceptions of online learning before and after the use of the automated tutoring system

S19. DeJong 2021 (de Jong et al., 2021)
Title: Understanding teacher design practices for digital inquiry-based science learning: the case of Go-Lab
Aim: To analyze how teachers design Inquiry Learning Spaces (ILSs) for online learning with STEM-related online laboratories in Go-Lab

S20. Rodríguez-Triana 2021 (Rodríguez-Triana et al., 2021)
Title: ADA for IBL: Lessons learned in aligning learning design and analytics for inquiry-based learning orchestration
Aim: 1) What are the orchestration needs of teachers implementing IBL in their classrooms? 2) To what extent do “aligning design and analytics” (ADA) solutions fulfill such orchestration needs?

S21. Schwandt 2021 (Schwandt et al., 2021)
Title: Utilizing User Activity and System Response for Learning Analytics in a Remote Lab
Aim: To perform learning analytics by recording the user interactions and the system behavior inside the remote lab

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Elmoazen, R., Saqr, M., Khalil, M. et al. Learning analytics in virtual laboratories: a systematic literature review of empirical research. Smart Learn. Environ. 10, 23 (2023). https://doi.org/10.1186/s40561-023-00244-y


Keywords