
An intervention engine design and development based on learning analytics: the intelligent intervention system (In2S)

Abstract

In this study, an intervention engine based on learning analytics was designed and developed. The intervention engine is named the Intelligent Intervention System (In2S). Within the scope of this research, the In2S system and its components are introduced, and the system is evaluated based on learners’ views. In2S includes three types of intervention: instructional, supportive, and motivational. The instructional intervention is structured based on assessment tasks, while the supportive and motivational interventions are structured based on the learners’ learning experiences. As an instructional intervention, signal lights (red, yellow, and green) are presented to the learners for each assessment task. The supportive intervention is presented to the learners via a dashboard. In the context of the motivational intervention, gamification elements such as a leader board, badges, and notifications are used. In order to obtain the learners’ views about In2S, semi-structured interviews were conducted with learners who had previous learning experience with the system, and the learning environment was evaluated based on their views. Learners had a nine-week learning experience in the e-learning environment. Then, the eight students who used the system most actively and the eight who used it most passively were selected for focus group interviews. According to the findings, the learners who used the intervention engine found the system useful and would like to use it in the context of other courses.

Introduction

The use of online learning environments is rapidly increasing in the field of education. In particular, Learning Management Systems (LMS) are widely used in higher education (Brown, Dehoney and Millichap, 2015). It is possible to distinguish three generations of LMS (Fiaidhi, 2014): LMS 1.0 from 1991 to 2004, LMS 2.0 from 2004 to 2011, and LMS 3.0 from 2011 until today. LMS 1.0 were read-only systems (Richardson, 2005; Peraković et al., 2011): the system presented the content, and learners could only read it. LMS 2.0 is structured around web 2.0 tools (wikis, blogs, forums, etc.) (Rubens, Kaplan, & Okamoto, 2012). With LMS 2.0, learners gained the ability to interact with the system; these systems are designed to be personalized according to the needs of the learners, and there is social interaction between learner and learner and between learner and instructor (Pinheiro, 2016). However, because much of the data in these LMSs is not well structured, such educational environments cannot adequately satisfy the needs of today’s learning and teaching processes (Šimić, Gašević, & Devedžić, 2004; Shabani & Eshaghian, 2014). These deficiencies have begun to be addressed by the third generation of LMS (LMS 3.0). LMS 3.0 systems now take their place as e-learning environments and make intensive use of learning analytics and machine learning. Learning analytics provides a crucial means to improve learners’ learning performance and to increase learning efficiency (Dyckhoff et al., 2012). In addition, learning analytics can provide better feedback about the learning process (Kloos et al., 2013). Although learning analytics is one of the most important topics of recent years, it is still an immature area. There is a great deal of research on learning analytics in the literature (Ali et al., 2012; Lonn, Aguilar, & Teasley, 2015; Choi et al., 2018), but learning analytics has not yet been fully associated with learning theories and learning design (Şahin, 2018). One of the open issues regarding LMS 3.0 and learning analytics is that the concept of an intervention engine has not been fully modeled. When intervention studies in the literature are examined, it is observed that they have a 100-year history and are aimed at improving performance (Kluger & DeNisi, 1996). Interventions are conducted through feedback; Narciss et al. (2004) underline that feedback is an important factor in promoting effective learning in all types of computer-based learning environments.

However, in order to develop LMS 3.0, intervention engines need to be modeled and developed. Course Signals (Arnold & Pistilli, 2012), E2Coach (Electronic and Expert Coach) (Mckay, Miller, & Tritz, 2012), and iMoodle (An Intelligent Moodle Based on Learning Analytics) (Tlili et al., 2018) can be given as examples of intervention engines that have been developed. One of the main reasons why such engines remain rare is that intervention is a psycho-educational construct, and this construct has not yet been associated with learning analytics (Wu et al., 2015; Şahin, 2018). In this study, the Intelligent Intervention System (In2S) was developed in order to fill this gap. Within the scope of this research:

  (a) the In2S system and its components are introduced:

    (i) instructional intervention,

    (ii) supportive intervention,

    (iii) motivational intervention;

  (b) the system is then evaluated based on learners’ views.

In the remainder of the paper, educational data mining and learning analytics, the concept of intervention, and intervention systems based on learning analytics are first discussed. Then the In2S system and its components are introduced. Finally, the system is examined with real users, and their views about In2S are presented.

Educational data mining and learning analytics

Learning environments have been influenced by many developments from past to present, one of which is instructional technology. Instructional technology has changed in both definition and content. AECT (2008) defines instructional technology as “the study and ethical practice of facilitating learning and improving performance by creating, using and managing appropriate technological processes and resources.” In this context, learning analytics can play an important role in facilitating learning and increasing learner performance, because the ultimate goal of learning analytics is to improve the learning and teaching process (Elias, 2011). Learning analytics uses various data sources, including interaction data from online learning environments, to increase efficiency and to improve learning environments. The aim of using interaction data is to increase understanding of learning environments and to improve learners’ experience (Pardo & Dawson, 2016). There are many types of data that can be recorded and used in online learning environments. These data types are presented by Tzelepi (2014) as follows:

• Interaction data (from learner and instructor).

• Online discussion (from learner and instructor).

• Instructional design choices and tools presented to the learners (from instructor).

• Use of existing LMSs (from learner and instructor).

In online learning environments, it is also possible to collect self-report data from learners. Collecting and analyzing all these data contributes to the improvement of the learning environment through specific interventions in that environment. Educational data mining methods are utilized in order to make improvements based on the data obtained from learning environments. Educational data mining is defined as the development of methods to reveal meaningful structures and latent patterns in data obtained from educational environments and the purposeful use of these methods (Baker & Siemens, 2014). Educational data mining is used to explore and determine learners’ interaction patterns; the necessary improvements and interventions are then made through learning analytics. In this context, two related concepts appear: analysis and analytics. Analysis refers to the process of determining the pattern, whereas analytics refers to utilizing these patterns in order to improve the learning environment.

Learning analytics is defined as collecting, analyzing, measuring, and reporting data about learners and learning environments in order to understand and improve learning environments and processes (Siemens & Gasevic, 2012). Learning analytics provides powerful means to improve learners’ learning performance and to increase the efficiency of the learning process (Dyckhoff, Zielke, Bültmann, Chatti, & Schroeder, 2012). Besides, learning analytics can provide better feedback about the learning process (Kloos, Pardo, Munoz-Merino, Gutierrez, & Leony, 2013). Based on this feedback, learning environment designs can be reviewed and improved, and more suitable environments can be developed.

The main purpose of learning analytics is to improve the learning environment and the learning process. With the use of learning analytics, it has become possible to intervene with learners in e-learning environments during their learning experiences and to improve the learning environment. The ultimate aim of the interventions is to increase learners’ achievement or to improve learners’ learning experiences (Pardo & Dawson, 2016). Two types of intervention are possible in e-learning environments: one intervenes in the system, and the other intervenes with the individual. This structure is presented in Fig. 1.

Fig. 1 Intervention systems based on learning analytics

As can be seen in Fig. 1, meaningful and latent patterns are obtained from big data through educational data mining, and interventions can be made to the system or to the individual based on these patterns. It is possible to intervene in the system via an adaptive engine and to intervene with the individual via an intervention engine.

Educational intervention in online learning

Intelligent Tutoring Systems, Adaptive Hypermedia Systems, Electronic Performance Support Systems, and Learning Management Systems are some of the systems that intervene with individuals or in systems. However, intervention is an important construct and cannot be considered independently of theory. In the context of classroom management, for example, changing a student’s seating or changing the instructional material are interventions. Moreover, while education has developed several models for students who need additional intervention (e.g., Response to Intervention, RTI), the concept of intervention is used most frequently in the field of psychology, in the context of behavior change. Therefore, in order to design an intervention engine based on learning analytics, the concept of intervention should be well investigated and understood.

In the literature, it is observed that intervention studies have a 100-year history and that interventions have been aimed at improving performance (Kluger & DeNisi, 1996). Narciss, Körndle, Riemann, and Müller (2004) underline that feedback is an important factor in promoting effective learning across all types of computer-based learning environments. In this context, Kluger and DeNisi introduced Feedback Intervention Theory (FIT), in which knowledge about learners’ performance is provided as feedback. However, intervention is not only feedback; it is a broader concept that also includes feed-forward and feed-up. Therefore, limiting the concept of intervention to feedback is not an accurate approach: all feedback is an intervention, but not every intervention is feedback.

Different types of intervention appear in the literature; there are even theories and models of intervention. Intervention is defined by Argyris (1970) as to “interact with individuals in an on-going system in order to help them.” The concept of intervention for e-learning environments has been investigated in accordance with the research context, and the types of intervention proposed by Geller (2005) are adopted here, because these types and their definitions are closest to the educational context. Geller (2005) defines three types of intervention: (a) instructional, (b) supportive, and (c) motivational intervention. The intervention engine developed within the scope of this study includes all three types. Information about these types of intervention is presented in Table 1.

Table 1 Types of Intervention and their definition (Geller, 2005)

These concepts and components were discussed by Geller in the context of behavioral psychology. In the field of education, the concept of intervention has not been treated as a structure to guide educational designs. Therefore, in this study, the dimensions of intervention listed in Table 1 are used as a reference, and the concepts of intervention were transferred to and designed for online learning environments.

The literature shows that instructional interventions are performed frequently in educational contexts. Instructional intervention uses assessment tasks as an activator and is carried out after an assessment task; its benefit is examined according to the result of the next assessment task. Supportive and motivational interventions, in contrast, can be structured based on the learning experiences of the learners. Supportive intervention reinforces the correct behaviors of learners and ensures their continuity, whereas motivational intervention provides external encouragement to the learners. Within the scope of this research, an intervention engine was designed and developed that includes instructional intervention based on assessment tasks, and supportive and motivational interventions based on learning experiences.

Intervention engines based on learning analytics

With the use of learning analytics, it is possible to intervene with learners during their learning experience in online environments. The ultimate aim of the interventions is to increase learner achievement or to improve students’ learning experiences (Pardo & Dawson, 2016). Learners’ formative performance and learning objectives must be taken into consideration when using analytics for intervention (Lonn, Aguilar, & Teasley, 2015); if they are not, learners’ performance and engagement are likely to be affected negatively. Developing a structured intervention model based on learning analytics can improve the learning and teaching experiences of learners (Wu, Huang, & Zou, 2015). Studies have shown that effective interventions have a significant impact on learners’ learning performance (Chen, 2011).

In the literature, it is seen that intervention models have not been investigated extensively enough (Wu et al., 2015). Adaptive engine studies that intervene in the system have been performed, but intervention engine studies that intervene with the individual are very limited, and the existing intervention engine studies are mostly limited to feedback interventions. Developing an intervention engine model to improve learners’ learning performance has an important role in promoting both learning and teaching experiences and in the progress of learning analytics (Wu et al., 2015). In this research, an intervention engine based on learning analytics was designed and developed in order to improve learners’ learning outcomes. The intervention engine includes feed-forward in addition to feedback.

Intelligent intervention system (In2S)

Learning analytics also offers researchers the opportunity to develop intelligent LMSs. Because a lot of data in LMSs is unstructured, LMSs cannot meet today’s requirements in education (Šimić, Gašević, & Devedžić, 2004; Shabani & Eshaghian, 2014). In order to meet these requirements, Intelligent Learning Management Systems (ILMS) have been developed. LMSs store content, present it to students, and handle assessment tasks, grading, and documentation about the students; an ILMS is software that, in addition, collects all information about the learners from the logs and utilizes it for the next step (Parthasarathy, Ananthasayanam, & Ravi, 2011). ILMSs track learning activities, report, recommend, and so on, and they can reduce the time spent on instruction (Fardinpour, Pedram, & Burkle, 2014).

In the scope of this research, the intervention system was developed as an add-on to the Moodle LMS. The developed e-learning environment is also an example of an ILMS, that is, a third-generation LMS, because in this system individual orientations and recommendations are made to the learners, the learners’ interactions and the findings related to the assessment tasks are reported to them, and learners are encouraged to interact with the system. The intervention engine includes instructional, supportive, and motivational intervention components. The position of these intervention components in the e-learning environment is presented in Fig. 2a and b.

Fig. 2 (a) Position of intervention types in the e-learning environment. (b) Screenshot of the intervention types in the e-learning environment

There is a great deal of data about learners’ interactions in unstructured form in the Moodle LMS. In this study, 11 metrics related to learners’ interaction data were selected. For this purpose, feature selection algorithms, which are used in the pre-processing phase of educational data mining, were utilized. The selected 11 metrics and their themes are presented in Table 2.

Table 2 Metrics used in the system as interaction data

As seen in Table 2, there are three metrics for each of the L-C, L-A, and L-L interaction themes and two metrics for the L-I interaction theme. After the metrics used as interaction data are given, the components of the intervention types are presented.
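The paper does not state which feature selection algorithm was applied in the pre-processing phase. As a hedged illustration only, the sketch below ranks candidate log metrics with scikit-learn’s SelectKBest and mutual information; the file name, the column names, and the pass/fail label are assumptions rather than details of In2S.

```python
# Hedged sketch: ranking candidate Moodle log metrics in the pre-processing
# phase. The CSV file, column names, and the choice of mutual information
# are illustrative assumptions; the paper does not name the algorithm used.
import pandas as pd
from sklearn.feature_selection import SelectKBest, mutual_info_classif

candidates = pd.read_csv("candidate_log_metrics.csv")   # one row per learner (hypothetical)
labels = candidates.pop("passed")                       # hypothetical 0/1 achievement label

selector = SelectKBest(score_func=mutual_info_classif, k=11)
selector.fit(candidates, labels)

selected_metrics = candidates.columns[selector.get_support()]
print("Selected 11 metrics:", list(selected_metrics))
```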

Components of the instructional intervention

The instructional intervention is structured based on the learners’ assessment tasks. The results of the learners’ assessment tasks were calculated using the caution index developed by Sato (1980), a well-established diagnostic tool, and the learners were categorized according to their learning performance. The SATO caution index aims to provide feedback to the learners and the instructor about the learning process after the assessment tasks (Acar, 2006) and classifies learners into six levels: A, A′, B, B′, C, and C′. The learners were classified according to the results of the assessment tasks, and their classes were presented with signal lights. In this context, as an instructional intervention, red, yellow, and green signal lights are presented to the learners for each assessment task. If the signal light is red, topic-contingent feedback about the deficient topics is also presented to the learners, in addition to textual feedback. Examples of instructional intervention are presented in Fig. 3.

Fig. 3 Examples of Instructional Intervention

As shown in Fig. 3, different feedback is given both textually and visually based on the learner’s assessment results. The system presents feedback to the learners for each assessment task. In addition, reminder notifications are sent via e-mail and SMS to the learners who have not completed their assessment tasks. A sketch of the classification step behind the signal lights is given below.
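To make the classification step concrete, the sketch below computes Sato’s caution index from a binary learner-by-item response matrix and maps a score ratio plus caution value to a signal light. The caution-index computation follows Sato’s standard formulation; the score and caution thresholds and the light-assignment rule are illustrative assumptions, not the exact In2S rules (which also distinguish the six SATO levels).

```python
# Hedged sketch: Sato's caution index and a signal-light mapping.
# Thresholds and the red/yellow/green rule are assumptions for illustration.
import numpy as np

def sato_caution_index(responses: np.ndarray) -> np.ndarray:
    """responses: binary matrix (learners x items), 1 = correct answer."""
    item_totals = responses.sum(axis=0)        # y_j: number of correct answers per item
    order = np.argsort(-item_totals)           # order items from easiest to hardest
    u = responses[:, order]
    y = item_totals[order]
    y_bar = y.mean()
    caution = np.zeros(responses.shape[0])
    for i, n_i in enumerate(u.sum(axis=1)):    # n_i: learner i's total score
        denom = y[:n_i].sum() - n_i * y_bar    # value for a perfect (Guttman) pattern
        if denom != 0:
            caution[i] = 1 - (u[i] @ y - n_i * y_bar) / denom
    return caution

def signal_light(score_ratio: float, caution: float) -> str:
    # Assumed thresholds, for illustration only.
    if score_ratio >= 0.75 and caution < 0.5:
        return "green"
    if score_ratio >= 0.50:
        return "yellow"
    return "red"

responses = np.array([[1, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0],
                      [0, 0, 1, 0, 1]])
for score, c in zip(responses.sum(axis=1), sato_caution_index(responses)):
    print(signal_light(score / responses.shape[1], c))
```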

Components of the supportive intervention

Supportive intervention is structured based on learners’ experiences, and the results are presented to the learners via the dashboard. Five interaction themes are presented to the learners: Learner-Content (L-C), Learner-Assessment (L-A), Learner-Discussion (L-D), Learner-Instructor (L-I), and Learner-Overall (L-O). A screenshot of these themes is presented in Fig. 4.

Fig. 4 Screenshot of the Supportive Intervention

As a supportive intervention, different interaction data are presented to the learners. These are:

  • Individual interaction performance (1),

  • Comparison of their interaction performance with the group (2),

  • Prediction of the achievement status (3)

Naive Bayes, one of the educational data mining methods, was utilized in order to predict the learners’ achievement.
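The paper does not describe the implementation details of this prediction step. A minimal sketch, assuming a Gaussian Naive Bayes classifier trained on the 11 interaction metrics of a previous cohort with a pass/fail label, could look as follows; the file and column names are hypothetical.

```python
# Hedged sketch: predicting achievement status with Naive Bayes.
# File names, column names, and the pass/fail label are assumptions.
import pandas as pd
from sklearn.naive_bayes import GaussianNB

history = pd.read_csv("previous_cohort_metrics.csv")   # 11 metrics + "passed" (0/1)
current = pd.read_csv("current_cohort_metrics.csv")    # "learner_id" + the same 11 metrics

model = GaussianNB()
model.fit(history.drop(columns=["passed"]), history["passed"])

# Probability of passing, displayed on the dashboard as the predicted status.
features = current.drop(columns=["learner_id"])
current["p_pass"] = model.predict_proba(features)[:, 1]
print(current[["learner_id", "p_pass"]].head())
```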

Components of the motivational intervention

Like the supportive intervention, the motivational intervention is structured based on learners’ experiences. In the context of motivational intervention, elements of gamification have been used. The leader board was created from an interaction score derived from the students’ interactions; in order to obtain a single interaction score, grey relational analysis, one of the multi-criteria decision-making methods, was employed. Weekly badges were also presented to the learners. In addition, various notifications were sent via SMS and e-mail to the students who had not entered the system for a certain period of time. Examples of motivational intervention are presented in Fig. 5.

Fig. 5 Examples of Motivational Intervention

As shown in Fig. 5, the following weekly badges are presented to the learners:

  • Most interaction with content,

  • Highest rating from the assessments

  • Most interaction with discussion environments

  • Most interaction with the instructor

  • Most interaction with the learning environments (overall)

Moreover, a leader board is created and presented showing the top 10 learners according to their interaction with the e-learning environment, as sketched below. In addition, e-mail and SMS notifications were sent by the system to the learners who had not entered the e-learning environment for a certain period of time.
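As an illustration of how a single interaction score and the top-10 leader board could be derived, the sketch below applies grey relational analysis to the interaction metrics: each metric is normalized as larger-is-better, grey relational coefficients are computed against an ideal learner, and their mean gives the grade used for ranking. Equal metric weights, the distinguishing coefficient rho = 0.5, and the input file are assumptions, not documented In2S settings.

```python
# Hedged sketch: grey relational analysis (GRA) to collapse the interaction
# metrics into one score, then the top-10 leader board.
import numpy as np
import pandas as pd

def grey_relational_grade(metrics: pd.DataFrame, rho: float = 0.5) -> pd.Series:
    x = metrics.to_numpy(dtype=float)
    # Larger-is-better normalization of every metric to [0, 1].
    norm = (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0) + 1e-12)
    delta = 1.0 - norm                                           # deviation from the ideal learner
    coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return pd.Series(coef.mean(axis=1), index=metrics.index)     # mean = equal weights

metrics = pd.read_csv("interaction_metrics.csv", index_col="learner_id")  # hypothetical export
scores = grey_relational_grade(metrics)
leader_board = scores.sort_values(ascending=False).head(10)     # top 10 learners
print(leader_board)
```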

Learning environment

There are four types of interaction in the online learning environment: L-C, L-A, L-L, and L-I. Learners interacted with the environment through these themes as follows.

  • Text, SCORM packages and video material (L-C Interaction)

  • Assessment tasks after the unit and unit test (L-A Interaction)

  • In the discussion topics (L-L Interaction)

  • Via message (L-I Interaction)

In addition, learners interacted with the dashboard and the interventions presented by the system. However, the interaction data for these interventions and the dashboard are not included in this study. The types of intervention and their components are presented in Table 3.

Table 3 Types of intervention and their components

The instructional intervention used the traffic-signal lights (inspired by Purdue University’s Course Signals), together with textual feedback and topic-contingent feedback presented to the learners. As supportive intervention, comparison with the group, individual interaction performance, and prediction of achievement status were presented to the learners via the dashboard. In the context of motivational intervention, a leader board, badges, and notifications were presented to the learners.

Data collection

Learners had a nine-week learning experience in the e-learning environment. Then, the eight students who used the system most actively and the eight who used it most passively were selected for focus group interviews in order to evaluate the system. The focus group interviews were conducted as four separate sessions with four students in each, using a semi-structured interview form developed by the researchers. Descriptive information about the learners who participated in the focus group interviews is presented in Table 4.

Table 4 Descriptive information about the learners who participated focus group interview

As seen in Table 4, focus group interviews were conducted with 16 learners. In order to evaluate the system, instead of interviewing all learners, learners with high and low interaction levels were selected. Extreme or deviant case sampling was used; extreme or deviant cases contain more information about the situation (Creswell & Poth, 2018), so this type of sampling better served the purpose of this study. Nine of the learners who participated in the focus group interviews were male and seven were female. Interaction level was determined based on the learners’ interaction scores, which were obtained from the log data. Low-level learners are those who interacted with the system at a low level; high-level learners are the opposite. The data obtained from the interviews were analyzed by content analysis to evaluate the e-learning environment, and the findings are presented in detail in the findings section.
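A minimal sketch of this extreme (deviant) case sampling step, assuming a per-learner interaction score derived from the log data, is shown below; the file and column names are hypothetical.

```python
# Hedged sketch: selecting the eight most active and eight most passive
# learners for the focus groups based on an interaction score.
import pandas as pd

scores = pd.read_csv("interaction_scores.csv", index_col="learner_id")["score"]  # hypothetical

most_active = scores.nlargest(8).index.tolist()    # high-level interaction
most_passive = scores.nsmallest(8).index.tolist()  # low-level interaction
focus_group_sample = most_active + most_passive
print(focus_group_sample)
```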

Case study

This study is designed as a case study. A case study is a research method that examines a real situation or system in detail (Creswell & Poth, 2018). In order to obtain the learners’ views about In2S, the learning environment was presented to the learners and evaluated according to their views. A holistic single-case design was utilized (Yin, 2017), because only one case was examined in its entirety. In the following sections, information about the participants, the e-learning environment, data collection, and the findings is presented.

Participants

Seventy-nine learners signed up to the e-learning environment and had a nine-week learning experience. These learners were enrolled in the “Computer Networks and Communication” course in the spring semester of the 2017–2018 academic year; 46% (n = 36) were female and 54% (n = 43) were male. Participants had previous e-learning experience and were enrolled in the Computer Education and Instructional Technology Department. The study was approved by the Ethical Review Board, and the participants signed informed consent prior to participation.

Data analysis

Content analysis was utilized for data analysis. Content analysis allows the researcher to discover the content in a communication source (Neuman, 2007) and is conducted to cluster related themes with each other (Weber, 1990). In this research, related statements were clustered and themes were created. To do so, the following steps were taken:

  • Coding,

  • Determination of the themes,

  • Preparation of the themes and codes, and

  • Interpretation of findings.

Findings

In this section, descriptive information about the system interaction and the findings of the content analysis are presented. Before the learners’ opinions about the system, log data about the use of the system are presented. Within the e-learning environment, learners can interact under four different themes: learner-content, learner-assessment, learner-learner, and learner-instructor. Eleven different metrics under these four interaction types are recorded by the system. Average, maximum, and minimum values for these metrics are presented in Table 5.

Table 5 Descriptive information about the system interaction

Firstly, descriptive information about the system interaction is presented in Table 5.

As seen in Table 5, learners visited content pages 1520 times on average; the least active learner interacted with the content pages 19 times, and the most active learner 26,173 times. Another metric is learners’ total time spent on content, which was 96,522 s on average; the minimum was 369 s and the maximum 747,922 s. The values for the other themes and metrics are also presented in Table 5. As expected, content interaction ranks first, meaning that the learner-content theme is the one learners interacted with most; L-L interaction ranks second, L-A interaction third, and L-I interaction last. Some learners did not prefer to interact with the assessments, other learners, or the instructor. After this descriptive information, the findings of the content analysis are presented below.

The findings are presented in the following order: (a) effectiveness of the system, (b) findings on instructional interventions, (c) findings on supportive interventions, and (d) findings on motivational interventions. The first finding to be presented is the effectiveness of the system. In this context, learners with both low- and high-level interaction stated that the system was effective and the interventions were beneficial. Some of the learners’ views about the system are as follows:

P1: “...when I see the charts, tables, schemes and increasing, I want to more interact with the system and I want to study more...”

K5: “...Normally when we login to the moodle environment we just get bored. But when it comes to those indicators, we get more and more efficient when the system follows...”

P10: “...If you don’t do your tasks in school, you will be punished if you do it on the site, you will take an award. I am going to attend the course for discontinuity in school but I am login to the system for learning...”

The instructional interventions were structured in order to show the weak and strong aspects of the learners after the assessment tasks, to guide them to the subjects in which they are lacking, and to provide a more effective learning experience. In this context, as an instructional intervention, red, yellow, and green signal lights were presented to the learners for each assessment task. If the signal light was red, topic-contingent feedback about the deficient topics was also presented, together with textual feedback. The learners emphasized that the signal lights were the most remarkable component, encouraged them, and were beneficial for them.

P8: ... I like them, I saw them when I login to the system, it was useful to send me directly to the topic. Just because those red lights bothered me, I went from that connection to the subject and I did the assessment. Just because those lights bother me in red. I login it with a request to correct it...

Learners said that it would be useful to get notifications about assessment tasks and assignments. It was observed that such notifications are necessary for learners who tend to procrastinate.

P10: “...Reminders was good. You haven’t done the assessment, you know, you’re doing a conscience. It’s good for me....”

Another intervention type is the supportive intervention. The supportive intervention presents interaction data about the learning process so that learners can maintain their activities and see their own performance relative to the group. Information about their learning experience is presented to the learners via the dashboard. The learners stated that the supportive intervention improved their self-regulation and planning skills.

P8: “...My daily interaction was effective in my self-interpretation of how much I studied. I thought I had to increase the count of the day when I saw that I was missing this group....”

Some of the learners stated that the graph of their individual daily performance was more effective and that they regulated themselves according to it, while others stated that the graph comparing them with the group was more effective.

P1: “...It’s good, are we bottom or top? If I’m on the top, it motivates me; if I’m on the bottom, it resolves me...”

P8: “...My favorite part of the system because the supporters part of me to see myself. In addition, the system has something special and you can see it according to the group and you can see it individually. I want to login this system thanks to the promoters...”

In addition to the graphs showing their individual status and their status relative to the group, a prediction of achievement status was presented to the learners. For this purpose, machine learning and educational data mining were utilized. This prediction is one of the factors that increased students’ interaction with the system.

P13: “...In the first weeks there was a sentence as you will be failed, it made me ambitious. I made an effort to change its to green. After it returned to green, you continue in the same way …”.

P10: “...As they login to the system, it was very motivating to see that it increased. Sometimes I was just login to the system in order to increase my graph...”

Motivational intervention was included in the system for external encouragement. The leader board, badges, and notifications were used as components of motivational intervention. The learners stated that the leader board and badges increased competition and motivated them.

P10: “...The leader table was my most motivated resource. Once I entered the leader table … because I like to compete. After I saw that I entered the leader table, I started to work better...”

P12: “...The badge is of course that you would get to go upstairs. I’d like to see my ranking. It’s a good thing for a man to know himself...”

It is possible to conclude that the learners who logged in to the system found it useful and effective. They want to see information about their interactions in such systems as graphs and visuals, and this had a positive effect on their learning processes.

In addition to these positive findings, there are some negative findings about the system. First, learners stated that textual feedback attracted less attention than visuals. They also found the number of badges insufficient and stated that it should be increased. Finally, learners stated that they needed a moderator or instructor in the e-learning environment.

P8: “…I’m more impressed by the flashing light than those textual feedbacks…”.

P10: “…It’s good to have a moderator, you want it to be in discussions. Good to have a teacher in discussions...”

P9: “…I think the badges could be improved or something nice. Add different badges…”.

Results and discussion

Within the scope of this research, the components of an intervention engine design based on learning analytics were introduced, and the system was evaluated based on learners’ views. The intervention engine includes instructional, supportive, and motivational intervention components. The instructional intervention is designed based on the learners’ assessment tasks, while the supportive and motivational interventions are structured based on the learners’ learning experiences. Through the interventions, learners can (a) monitor their individual performance, (b) compare themselves with the group, (c) determine missing topics, (d) monitor an individual prediction of achievement, and (e) obtain information about their individual learning process. Educational interventions positively affect learning effectiveness (Choi et al., 2018), and systems that identify dropout learners early and intervene are very important (Casey & Azcona, 2017). With learning analytics, it is also possible to identify dropout learners and make interventions based on these patterns. In this study, an intervention engine was developed based on learning analytics, and In2S can serve as a reference model for intervention engines based on learning analytics. In the context of this study, the system was evaluated based on learners’ views. The learners who used the intervention engine said that the system is useful and that they want to use it in the context of other courses. Likewise, it has been found in the literature that systems developed based on learning analytics were useful and usable, that satisfaction levels were high, and that students wanted to use such systems for other courses (Arnold & Pistilli, 2012; Ali, Hatala, Gasavic, & Jovanovic, 2012; Govaerts, Duval, Verbert, & Pardo, 2012; Hu, Lo, & Shih, 2014; Park & Jo, 2015).

Ma, Han, Yang, and Cheng (2015) pointed out that the guidance and assistance instructors provide to students have a significant impact on the completion of students’ learning tasks. In this context, the instructional intervention was structured based on the learners’ assessment tasks. It was found that the signal lights indicating the strengths and weaknesses of the learners were effective, and that learners made an effort to turn red lights green. Similarly, Arnold and Pistilli (2012) found that signal lights have a positive effect on learners’ motivation. It was also emphasized that topic-contingent feedback is one of the positive aspects of the system. In addition, it was stated that the notification e-mails and SMS related to the assessment tasks were beneficial.

Systems that can predict learners’ achievement and identify students at risk must be developed (Siemens, 2013; Casey & Azcona, 2017). One of the supportive intervention components presented to the students within the scope of this study is the prediction of their achievement status based on their interactions. The learners who used the system stated that the graph predicting achievement was effective and that they worked hard to stay in the green area. It is therefore possible to say that the supportive intervention is effective. Similarly, Hu, Lo, and Shih (2014) developed a system that predicted learners’ performance and found that it was successful.

Providing a learning climate that supports their autonomy is important for learners’ self-regulation skills (Ng, Liu, & Wang, 2015), and designs based on learning analytics develop students’ self-regulation skills (Lu, Huang, Huang, & Yang, 2017). In this context, learners who used the system stated that the supportive interventions contributed positively to their self-regulation skills. In addition, systems must encourage learners to observe and reflect in order to support learner autonomy (Ribbe & Bezenilla, 2013). From this perspective, the system presented learners with their individual interaction performance and a comparison with the group, giving them the opportunity to reflect. Furthermore, the learners saw their deficiencies through the supportive interventions and continued their interactions by determining a starting point for themselves. Based on all these findings, it is possible to say that the supportive intervention supports learner autonomy and contributes positively to self-regulation skills.

According to the findings obtained from the interviews, the learners stated that the leader board was their greatest source of motivation. They also stated that the badges increased their motivation but that the number of badges should be increased. In addition to the leader board and badges, they indicated that the e-mails and SMSs integrated into the system excited them and that they were satisfied with receiving these notifications.

Recommendations

Nowadays, data can be obtained from various sources (databases, data warehouses, sensors). However, much of this data is in unstructured form. Through educational data mining and learning analytics, patterns can be determined in these data and interventions can be developed based on them. The process of learning analytics is cyclical and formative; the output of one study constitutes the input of another. Therefore, it is necessary to identify learners’ patterns and then develop systems that can make interventions based on these patterns. In addition to the learners’ interaction data, sequential patterns can be determined, and interventions can be structured based on them.

The individualization of learners’ learning processes has an important role for the future of learning analytics (Siemens, 2013). It is recommended to add features such as plagiarism detection, cognitive support, strategy training, and customizable learning pathways to systems developed based on learning analytics. Besides, the relationship between learning analytics and learning theory has not yet been sufficiently established. This relationship should be established, and designs should be based on the resulting model or models.

Limitations

The intervention engine developed within the scope of this research is limited to three types of intervention, named instructional, supportive, and motivational. In addition, the developed intervention engine can only be used as a plug-in for the Moodle LMS. Moreover, In2S was used by only 79 learners, and the system was evaluated based on 16 learners’ opinions; one of the limitations of the study is thus its application on a small scale. Furthermore, the evaluation of the system is based on a single data collection source. In order to test the effectiveness of such systems, applying them to a larger group of learners and collecting data from several different sources would provide more robust results.

Availability of data and materials

The authors do not have ethics approval to make the raw student data or the tool available to anyone outside the organisation, in which the experiment was conducted.

References

  • Acar, T. (2006). Sato uyarı indeksleri ile madde ve başarı analizleri. Retrieved April 09, 2018, from http://www.parantezegitim.net/hakkimizda/Sato-TulinACAR.pdf

  • AECT. (2008). Definition. In A. Januszewski & M. Molenda (Eds.), Educational technology: A definition with commentary (pp. 1–14). New York: Lawrence Erlbaum Associates.

  • Ali, L., Hatala, M., Gasavic, D., & Jovanovic, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58, 470–489.

  • Argyris, C. (1970). Intervention theory and method: A behavioral science view. Addison Wesley: Reading, MA.

  • Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 267–270). ACM.

  • Baker, R., & Siemens, G. (2014). Educational data mining and learning analytics. In K. Sawyer (Ed.), Cambridge handbook of the learning sciences (2nd ed., pp. 253–274).

  • Brown, M., Dehoney, J., & Millichap, N. (2015). The next generation digital learning environment. A Report on Research (ELI paper). Louisville, CO: Educause April.

  • Casey, K., & Azcona, D. (2017). Utilizing student activity patterns to predict performance. International Journal of Educational Technology in Higher Education, 14(1), 4.

  • Chen, L. H. (2011). Enhancement of student learning performance using personalized diagnosis and remedial learning system. Computers & Education, 56(1), 289–299.

  • Choi, S. P., Lam, S. S., Li, K. C., & Wong, B. T. (2018). Learning analytics at low cost: At-risk student prediction with clicker data and systematic proactive interventions. Journal of Educational Technology & Society, 21(2), 273–290.

  • Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry and research method: Choosing among five approaches (2nd ed.). Thousand Oaks, CA: Sage.

  • Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and implementation of a learning analytics toolkit for teachers. Educational Technology & Society, 15(3), 58–76p.

  • Elias, T. (2011). Learning analytics: definitions, process and potential. http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf

  • Fardinpour, A., Pedram, M. M., & Burkle, M. (2014). Intelligent learning management systems: Definition, features and measurement of intelligence. International Journal of Distance Education Technologies (IJDET), 12(4), 19–31.

  • Fiaidhi, J. (2014). The next step for learning analytics. IT Professional, 16(5), 4–8.

  • Geller, E. S. (2005). Behavior-based safety and occupational risk management. Behavior Modification, 29(3), 539–561.

  • Govaerts, S., Duval, E., Verbert, K., & Pardo, A. (2012). The student activity meter for awareness and self-reflection. CHI 2012. Austin, TX: ACM.

  • Hu, Y., Lo, C., & Shih, S. (2014). Developing early warning systems to predict students’ online learning performance. Computers in Human Behavior, 36, 469–478.

  • Kloos, C.D., Pardo, A., Muñoz-Merino, P.J., Gutiérrez, I, Leony, D. (2013). Learning analytics @ UC3M. In 2013 IEEE global engineering education conference (EDUCON), Berlin, Germany.

  • Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254–284.

  • Lonn, S., Aguilar, S. J., & Teasley, S. D. (2015). Investigating student motivation in the context of a learning analytics intervention during a summer bridge program. Computers in Human Behavior, 47, 90–97.

  • Lu, O. H., Huang, J. C., Huang, A. Y., & Yang, S. J. (2017). Applying learning analytics for improving students engagement and learning outcomes in an MOOCs enabled collaborative programming course. Interactive Learning Environments, 25(2), 220–234.

  • Ma, J., Han, X., Yang, J., & Cheng, J. (2015). Examining the necessary condition for engagement in an online learning environment based on learning analytics approach: The role of the instructor. Internet and Higher Education, 24, 26–34.

  • McKay, T., Miller, K., & Tritz, J. (2012, April). What to do with actionable intelligence: E2Coach as an intervention engine. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 88–91). ACM.

  • Narciss, S., Körndle, H., Reimann, G., & Müller, C. (2004). Feedback-seeking and feedback efficiency in web-based learning–how do they relate to task and learner characteristics. In instructional design for effective and enjoyable computer-supported learning. Proceedings of the first joint meeting of the EARLI SIGs instructional design and learning and instruction with computers (pp. 377–388).

  • Neuman, L. W. (2007). Social research methods, 6/E. Delhi: Pearson Education.

  • Ng, B. L. L., Liu, W. C., & Wang, J. C. (2015). A preliminary examination of teachers’ and students’ perspectives on autonomy-supportive instructional behaviors. Qualitative Research in Education, 4(2), 192–221p.

  • Pardo, A., & Dawson, S. (2016). Measuring and visualizing learning in the information-rich classroom. In P. Reimann (Ed.), Learning analytics (pp. 41–55). UK: Routledge.

  • Park, Y., & Jo, I. H. (2015). Development of the learning analytics dashboard to support students’ learning performance. Journal of Universal Computer Science, 21(1), 110–133.

  • Parthasarathy, M., Ananthasayanam, R., & Ravi, R. (2011). Intelligent learning management system: A conceptual framework. Indian Streams Research Journal, 1(5). ISSN: 2230-7850.

  • Peraković, D., Remenar, V., & Jovović, I. (2011, January). LMS 2.0: Next generation of E-learning. In Third International Scientific Congress of Informational technology for e-learning–ITeO 2011.

  • Pinheiro, M. M. (Ed.). (2016). Handbook of research on engaging digital natives in higher education settings. Hershey PA: IGI Global.

  • Ribbe, E., & Bezenilla, M. J. (2013). Scaffolding learner autonomy in online university courses. Digital Education Review, 24, 98–113.

  • Richardson, W. (2005). The educator’s guide to the read/write web. Educational Leadership, 63(4), 24.

  • Rubens, N., Kaplan, D., & Okamoto, T. (2012, September). E-learning 3.0: Anyone, anywhere, anytime, and AI. In International conference on web-based learning (pp. 171–180). Berlin, Heidelberg: Springer.

  • Şahin, M. (2018). Design and development of the intervention engine based on learning analytics for e-learning environments (PhD Dissertation). Hacettepe University, Ankara.

  • Sato, T. (1980). The SP chart and the caution index. NEC Educational Information Bulletin, 80(1).

  • Shabani, Z., & Eshaghian, M. (2014). Decision support system using for learning management systems personalization. American Journal of Systems and Software, 2(5), 131–138.

  • Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400.

  • Siemens, G., & Gasevic, D. (2012). Guest editorial – learning and knowledge analytics. Educational Technology & Society, 15(3), 1–2.

  • Šimić, G., Gašević, D., & Devedžić, V. (2004). Semantic web and intelligent learning management systems. In Workshop on Applications of Semantic Web Technologies for e-Learning

  • Tlili, A., Essalmi, F., Jemni, M., Chang, M. & Kinshuk, (2018). iMoodle: An intelligent moodle based on learning analytics.

  • Tzelepi, M. (2014). Personalizing learning analytics to support collaborative learning design and community building. 2014 IEEE 14th international conference on advanced learning technologies, Athens, Greece (pp. 771–773).

  • Weber, R. P. (1990). Basic content analysis (no. 49). Beverly Hills, CA: Sage.

  • Wu, F., Huang, L., & Zou, R. (2015). The design of intervention model and strategy based on the behavior data of learners: A learning analytics perspective. Hybrid Learning: Innovation in Educational Practices, 9167, 294–301.

  • Yin, R. K. (2017). Case study research and applications: Design and methods. Thousand Oaks, CA: Sage.


Acknowledgements

Not applicable.

Funding

This work was partly supported by Hacettepe University Scientific Research Projects Coordination Center as SHD-2017-15640 project ID.

Author information

Authors and Affiliations

Authors

Contributions

MŞ has been studying in the fields of learning analytics and educational data mining. He contributed to all stages of this research. HY has been studying in the fields of assessment, learning analytics, statistics and educational data mining. He contributed to all process of this research. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Muhittin Şahin.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.




Cite this article

Şahin, M., Yurdugül, H. An intervention engine design and development based on learning analytics: the intelligent intervention system (In2S). Smart Learn. Environ. 6, 18 (2019). https://doi.org/10.1186/s40561-019-0100-7
