Open Access
On the introduction of intelligent alerting systems to reduce e-learning dropout: a case study
Smart Learning Environments volume 9, Article number: 29 (2022)
E-learning students tend to become demotivated and drop out of online courses easily. The main drive of this study is to improve learners’ involvement and reduce dropout rates in these e-learning based scenarios. The study also shares the results obtained and compares them with new and emerging commercial solutions. In a first phase, the goal was to frame the study and research (background, state of the art, and a survey and interviews to uncover motives and behavior patterns). In a second stage, an operating prototype of an Intelligent Alerting System was developed, trialed and validated in order to test and evaluate concepts, gather statistical data on its efficiency, and determine whether course completion rates actually improved. The results measure the effectiveness of learning (completion and dropout rates) before and after the application of the proposed solution. Finally, related work is considered and emerging commercial solutions are compared with the proposed one.
Introduction and context
This research work is centered on technologies directed at e-learning based scenarios and intelligent alerting systems (IAS, Luis et al., 2017). The primary objective is to improve learners’ involvement and diminish dropout rates in these e-learning based scenarios. For this, the initial aim was to study the background and state of the art, essentially through the exploration, analysis and comparison of current collaborative platforms and technologies, their pros, cons and requirements.
E-learning based scenarios generally have low completion rates, so we seek to answer the question: How can we improve the learner’s experience in these learning scenarios by reducing dropout rates? This is the main research question of the presented work.
This analysis also aimed to identify, in a first stage, through a survey within a student population, the central causes of dropout when using lectures as a primary e-learning resource. In a second phase, additional insight was obtained by interviewing instructors, counselors and platform administrators. This early research and analysis were used to detect motives and behavior patterns of students considering dropping out.
These measurements, analyses and data about students, their perspectives and backgrounds were fundamental in classifying dropout patterns. Therefore, a supplementary aim of this study was to establish the different stages, levels and patterns among students at risk of dropping out and to suggest appropriate interventions in order to prevent these withdrawal behaviors in advance.
Building on intelligent alerting systems, and with knowledge of abandonment patterns, intelligent computer services can be envisaged that, as a first reaction, predict a dropout profile (Siebra et al., 2020). Although the work has a technological approach, its primary goal is to prevent dropouts and raise completion rates within these scenarios (Luis et al., 2018). Figure 1 describes the motivations, steps and processes of the main plan for this research work.
This study addresses research, analysis and results on improving learners’ experience and decreasing dropout rates in e-learning based scenarios. The following section contextualizes this research with a discussion of the state of the art on e-learning based scenarios, the experiences of learners, and studies about e-learning dropout. The “Objective” section then reviews the goals of this research; the “Methods” section portrays the approaches used; and the “Results” and “Discussion” sections discuss the outcomes achieved. Lastly, the “Conclusions and future work” section draws the main conclusions of this research and describes ongoing and upcoming work.
State of the art and related work
The initial phase of this work aimed to learn more about the background and state of the art of e-learning based scenarios, so a thorough analysis of the literature was performed. More than 140 references were identified, addressing several relevant topics such as “Teaching, tutoring, coaching, training and mentoring” (Shumow et al., 2002), “Online education background” (Valverde-Berrocoso et al., 2020), “Learning from online lectures” (Nguyen, 2019; Tang et al., 2021), “E-learning providers and platforms” (Power, 2020), “Highlighting text, graphic organizers and eye tracking” (Wang et al., 2019), “Effects of different online lecture types” (Seo et al., 2021), “Retrieving information using speech, video and text” (Tarchi et al., 2020), “Generating Open Educational Resources (OER), Collaborative Lecture Annotation System (CLAS, Risko et al., 2013) and Computer Supported Collaborative Learning (CSCL, Hernández-Sellés et al., 2019)”, “E-tutoring approach”, “Parent-teacher conference/interview approach” (Li et al., 2019), “Learning Analytics” (Pierrakeas et al., 2020) and “Motivation and Dropout Studies” (Aziz et al., 2019; Ortigosa et al., 2019). This analysis was instrumental in understanding the current setting and establishing, based on scientific conclusions, what could be improved in current e-learning based scenarios in order to reduce student dropouts.
In an online Google Forms survey conducted by the authors, around 500 secondary and college students participated in this investigation. In addition, 10 interviews with teachers, computer technicians and counselors were performed by the authors. The results of these interviews, along with the related work and reports mentioned above (Aziz et al., 2019; Ortigosa et al., 2019), were fundamental to support the study and refine the survey. The results obtained are shown in the figures below.
As shown in Fig. 2, a first analysis reveals that “boring contents” (47%), “demotivation” (34%), “lack of time” (28%), “bad experiences” (25%) and “can't keep up with tasks and timings” (21%) are the leading reasons to quit or stop using an e-learning based system.
Figure 3 shows that, in terms of avoiding dropping out of an e-learning based course, the main recommendations are: “adapting the tasks and timings” (56%), “enhance the contents” (47%), “extra support (on and offline)” (30%), “address an extracurricular course (language and/or digital skills)” (27%), “alert the teacher” (26%) and “message motivational phrases” (23%) (Luis et al., 2017).
The key purpose of this survey and these interviews was to collect information and detect motives, behaviors and patterns of dropout students. The study also aimed to gather expert opinions and recommendations on avoiding dropping out of an e-learning based program. The categories above were derived from the respondents’ answers to the survey and interviews.
Today, many schools and educational systems handle retention by reacting to students’ behavior. However, by the time a serious problem is identified, it can be too late to keep that student in class. Such models should improve instructors’ use of learning analytics by allowing them to exploit available technologies that support teaching and learning in online and blended learning environments (Atif et al., 2020; McKee, 2015).
Current solutions state that student behavior is an indication of their eventual success (Adnan & Anwar, 2020). For example, Dropout Detective integrates with existing learning management systems (e.g., Moodle LMS, Canvas LMS) and reviews grades, missing assignments, last date of attendance and many other factors. This solution retrieves the needed information, processes it and displays it on a dashboard that is updated daily. Learners are then given a Risk Index and are displayed in a red–yellow–green “traffic light” format (cf. Fig. 4) to draw attention to the most at-risk individuals within the student community. Administrators, advisors and instructors can then investigate individual student reports to determine the risk motives and to develop a pro-active plan of action (AspirEDU Educational Analytics, 2017).
Course Signals is another predictive application; it was highly influential on many of the Early Warning Systems developed after it, becoming one of the most referenced systems in the research community. Course Signals was developed at Purdue University and works with their educational LMS, Blackboard Vista (Bernacki et al., 2020). Its main goal is to increase student success through the use of analytics to alert faculty, students and staff about potential problems. To this end, it uses a formula that takes into account a variety of predictors and current behaviors (e.g., attendance, running scores) and labels student status in a given course according to a green–yellow–red scheme (cf. Fig. 5) that may indicate whether a student is in danger of DWIF (dropping out, withdrawing, getting an incomplete, or failing) (Russell et al., 2020).
The existing internal LMS statistics, such as Canvas Analytics or Moodle Analytics, Activity and Report Logs, are not enough to determine and predict behaviors and avoid future dropouts. The existence of recent commercial products that pursue similar objectives is not necessarily bad; on the contrary, it illustrates that this research topic may be considered hot and relevant, has market value and supports the initial and proposed thesis. The primary comparison done so far distinguishes this solution from the commercial ones as a free, open-source and adjustable system (Luis et al., 2018).
Figure 6 summarizes the main features of the above-mentioned EWS. It is important to understand that this comparison and the chosen platforms were selected according to the following aspects: first, an early application with background experience, well-known in the academic community and that was in use for several years (Course Signals); second, a rising commercial solution with good evaluations (Dropout Detective); third and last, the proposed, developed and tested prototype that emerged from this study (RML Agent).
The first feature that was considered was their Business Model. Course Signals was developed and is the property of Purdue University, Dropout Detective is a commercial solution and RML Agent is a tested free open-source prototype (Fig. 7).
The type of analysis offered by each solution is presented in the table above. The Course Signals analysis is based on points obtained in the course to date, time spent on each task and past performances. It is not as flexible as other approaches, but it gives out signals (red, yellow or green) based on the combination of these results. Dropout Detective, on the other hand, is more flexible and adaptable and can analyze other aspects, but requires expert intervention. The RML Agent is also flexible and adaptable and does not need expert mediation. It is query-based: if you know where and how to connect your data sources, the domain’s structured query language and the parameters needed for the alerting system, no specialized outsourcing is required.
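To make the query-based idea concrete, the sketch below (not the RML Agent’s actual code) runs an inactivity query over a hypothetical, simplified activity-log table; real LMS schemas differ, and the table and column names here are assumptions.

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical, simplified log table; real LMS schemas (e.g., Moodle's
# mdl_logstore_standard_log) are more complex, but the idea is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE activity_log (student TEXT, action TEXT, ts TEXT)")

now = datetime(2022, 5, 20)
rows = [
    ("alice", "view_lecture", (now - timedelta(days=2)).isoformat()),
    ("bob",   "submit_task",  (now - timedelta(days=9)).isoformat()),
    ("carol", "view_lecture", (now - timedelta(days=16)).isoformat()),
]
conn.executemany("INSERT INTO activity_log VALUES (?, ?, ?)", rows)

# Parameterized query: students whose most recent action is older than
# the alerting threshold (here, 7 days of inactivity).
threshold = (now - timedelta(days=7)).isoformat()
inactive = conn.execute(
    """SELECT student, MAX(ts) AS last_seen
       FROM activity_log
       GROUP BY student
       HAVING MAX(ts) < ?
       ORDER BY last_seen""",
    (threshold,),
).fetchall()

print(inactive)  # carol (16 days) and bob (9 days) trip the threshold
```

Only the connection details and the query text would change between data sources, which is what makes such a query-based approach adjustable without specialized outsourcing.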
In terms of data source connection, and from what it was possible to discover, Course Signals works only with the university’s LMS data (Riestra-González et al., 2021; Wang, 2021). The other two solutions can be multi-source; in other words, they can engage, analyze and operate with several distributed sources.
The output results have diverse presentations. Course Signals proposes a “stop light” or “traffic light” approach (red, yellow or green) for each student in each course. Dropout Detective uses a similar approach but with a colored bar in the same gradual colors. The RML Agent method is quite different: it lists students by warning levels so that the associated alerting actions may be triggered.
Finally, the generation of alerts and warnings follows diverse approaches. As indicated above, Course Signals displays alerts to students and makes them known to each instructor. Dropout Detective also alerts students and is visible to instructors; it additionally has an Advisor Alert System that allows instructors to send an alert to advisors, and any user may set an alert on a student. The RML Agent has flexible, parameterized alert levels. Each level gradually escalates the warning actions, and the goal is to automate these alerting actions (e.g., from messaging students to alerting advisers, tutors and course coordinators) (Luis et al., 2018).
All solutions are valid and have their pros and cons but, “without feathering our own nest”, in terms of costs, flexibility and scalability the proposed solution (RML Agent) appears to be the best choice.
As outlined above, the objective of this research was to analyze the impact of intelligent alerting systems on students’ abandonment or dropout rates. To address this objective, the following operational objectives were identified:
Analyze the state of the art in e-learning alerting systems and existing platforms and technologies.
Identify, in a statistically sound way, the most relevant reasons for a student to drop out of an e-learning experience, through a survey within a student community and by interviewing teachers, counselors and platform administrators. This would serve to detect motives and behavior patterns of students considering dropping out.
Apply learning analytics to determine existing stages, levels and patterns among dropout-prone students and suggest appropriate interventions in order to prevent these withdrawal actions in advance.
Develop and test within the student population participating in this study an Intelligent Alerting System, trained by means of the usage patterns identified above, that automatically intervenes to prevent eventual dropouts.
Gather statistical information to determine the effectiveness of intelligent intervention and its benefits.
The methodology was driven by the work plan above. Thus, the most relevant milestones in this research were as follows:
Gather information in popular science and academic media, and identify existing applications and tools in this field. Identify the state of the art in intelligent alerting in e-learning through the analysis, exploration and comparison among existing platforms and technologies, their benefits, drawbacks and specifications.
Establish contacts with public and private educational institutions to learn firsthand, from the experience of professionals in the sector, about their needs. As a first approach, a survey was conducted within a student community, aimed at discovering and classifying the primary motives for dropouts and behavior patterns when using e-learning scenarios as a learning platform. A Google Forms online survey with 9 items/questions was used (Gender?; Age?; Degree you're studying?; Have you ever used an e-learning based scenario?; Nationality?; What are (or would be) the main motivations to use an e-learning based system?; What are (or would be) the main reasons for you to quit or stop using an e-learning based system?; How do you describe yourself?; Suggestions to avoid dropping out of an e-learning based course?). All questions were multiple-choice (MCQ), with the last three combining MCQ and open-ended answers. A second approach sought to extend these findings by interviewing teachers, counselors and platform administrators.
Describe and structure the information collected. Define the locale of work and identify the difficulties and problems that must be addressed.
State the scope of the study, that is, the characteristics of e-learning based systems. Seek an answer to the research question, namely: How can the learners’ experience in e-learning based scenarios be improved to reduce dropout rates?
Propose a technological solution, in theoretical form, that meets the needs identified above. In addition, propose protocols, guidelines and rules for the creation and use of a functional prototype of the proposed solution.
Test and validate the prototype in real educational scenarios, with real users, meeting the defined objectives. Also, evaluate concepts by testing with the developed prototype and collect statistical information on the effectiveness of current e-learning based scenarios. Compare this information with that generated with the proposal. Analyze the results objectively to discover the differences between both models and measure the effectiveness of education (completion and dropout rates) before and after the implementation of the proposed solution, through student evaluation results and accomplishments. Apply this to mandatory and similar courses, lectured both synchronously and asynchronously, by the same teacher and with the same students, who attend secondary (high school equivalent) education. Obtain this data by connecting the IAS (intelligent alerting system) to the school’s e-learning platform (Moodle) and the stated courses. Collect data via the IAS, which analyzes log files and database actions (students’ actions) throughout the courses.
The methodology employed for obtaining opinions and answers from current students was the creation and use of an online survey. The benefits of this technique are speed, a high response rate at low or no cost, automatic data collection, real-time access, design flexibility, quick data analysis, and accessibility and flexibility: participants can fill out the survey whenever they are available and choose to.
The method chosen was to develop a standalone agent that runs independently of the e-learning platform, can be scaled to new features (dropout patterns and corresponding alerts) and reads logs (records of computer or software activity and tasks) from potential e-learning frameworks (Luis et al., 2018).
Therefore, an original, platform-independent build process is more advantageous. It avoids any unnecessary association with, or dependency on, the existing e-learning based platforms, ensuring flexibility and scalability with present, upcoming or updated frameworks.
A survey was conducted among 637 students, mainly college and high school students, namely from the Polytechnic Institute of Viana do Castelo (IPVC) and the ETAP Professional School. Participation in the survey was voluntary and there were no incentives. A total of 494 students submitted their responses (77.55%). This massive participation indicates that students are concerned about these issues and understand the importance of addressing them.
Collecting statistical data was essential to understand the effectiveness of this solution compared with current e-learning based scenarios; a further aim was to compare this information with that generated within the framework of this proposal.
Before the implementation of the proposed solution, and with the same students, the sample of data collected indicated that 67% of the students succeeded with their e-learning course, 23% failed and 10% dropped out.
As mentioned before, the chosen method was to create an independent agent that operates autonomously from the e-learning platform, can be extended with additional features (e.g., dropout patterns and corresponding alerts) and reads logs (records of computer or software activity and tasks) from potential e-learning frameworks.
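As an illustration of such a platform-independent log reader, the following sketch parses a hypothetical plain-text log format and computes each student’s days of inactivity; the log format and function names are assumptions, not the agent’s actual implementation.

```python
from datetime import datetime

# Hypothetical plain-text log lines in a generic "timestamp user action"
# format; the agent only needs one small parser per platform, keeping
# the core analysis independent of any specific LMS.
LOG = """\
2022-03-01T10:00:00 alice course_view
2022-03-10T09:30:00 alice quiz_attempt
2022-03-02T14:00:00 bob course_view
"""

def last_activity(log_text):
    """Return each student's most recent activity timestamp."""
    latest = {}
    for line in log_text.splitlines():
        ts, user, _action = line.split()
        when = datetime.fromisoformat(ts)
        if user not in latest or when > latest[user]:
            latest[user] = when
    return latest

def days_inactive(log_text, today):
    """Days elapsed since each student's last recorded action."""
    return {u: (today - t).days for u, t in last_activity(log_text).items()}

print(days_inactive(LOG, datetime(2022, 3, 15)))  # {'alice': 4, 'bob': 12}
```

Because only the parsing step depends on the log layout, swapping in a different e-learning framework would mean replacing the parser while reusing the inactivity analysis unchanged.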
Over the past year, the agent was enriched with new information, such as results from the Engine Analyzer (cf. Fig. 8) and the Latest Individual Analysis/Activity (cf. Fig. 9). Results (alerts) of the analysis and possible actions are displayed to system administrators and can be shared with guidance counselors and/or teachers. These archives can also be stored in XML format for further processing.
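A minimal sketch of archiving one alert record in XML, as described above; the element and attribute names are assumptions, not the agent’s actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical alert record; a real archive would hold many of these
# plus metadata, but the serialization step looks essentially like this.
alert = ET.Element("alert", student="bob", level="1")
ET.SubElement(alert, "reason").text = "7 days of inactivity"
ET.SubElement(alert, "action").text = "email_motivational_phrase"

xml_text = ET.tostring(alert, encoding="unicode")
print(xml_text)
```

Storing alerts in such a plain XML form keeps them readable by other tools for further processing, independently of the agent itself.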
Once the proposed solution was implemented and the new intelligent alerting system (IAS) was deployed, the effectiveness of education (completion and dropout rates) was measured before and after its implementation through student evaluation results and accomplishments. During the course, four alerts were reported and automatically triggered the same number of actions: three emails with motivational phrases were sent due to one week of student inactivity, and one teacher alert was generated due to two weeks of student inactivity. The subsequent graphics (cf. Figs. 10 and 11) show the outcomes of that effectiveness (S = succeeded; F = failed; D = dropped out).
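The graded alerting behavior reported above (a motivational email after one week of inactivity, a teacher alert after two weeks) can be sketched as a simple escalation policy; the thresholds and action names below are illustrative assumptions, not the IAS’s actual code.

```python
# Escalation policy: longer inactivity maps to a stronger action.
# Thresholds (7 and 14 days) match the behavior described in the text;
# the action identifiers are hypothetical.
def alert_action(days_inactive):
    if days_inactive >= 14:
        return "alert_teacher"
    if days_inactive >= 7:
        return "email_motivational_phrase"
    return None  # no alert needed

# Example run over hypothetical per-student inactivity counts.
actions = {student: alert_action(days)
           for student, days in {"alice": 3, "bob": 8, "carol": 15}.items()}
print(actions)
```

Keeping the thresholds as parameters is what allows each alert level to be re-tuned per course without changing the agent itself.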
The outcome was quite satisfying, as indicated by the above charts. The percentage of successful scores/students increased and fail scores decreased. Another major fact is that there were no dropouts in the second course. Additionally, both courses are similar and mandatory and the students are the same; therefore, they have the same profile and age range (between 16 and 19 years of age). The subjects were lectured both synchronously and asynchronously in a class at a professional/vocational education school (secondary, i.e., high school equivalent), with which more teaching hours than usual were spent. Both courses covered Microsoft Office applications, specifically Microsoft Excel and Microsoft Access, which are similar in data handling, menus and functionalities.
Other statistical information relevant for further conclusions is that 19% of participants were female students; one of them dropped out (she got a job) and all the remaining succeeded.
A good sample size would be around 10% of the total population but, due to the constant rotation of students during this analysis (i.e., new students arrived and others completed their studies), it was only possible to obtain a sample of 21 students, who were still concluding their secondary education, out of a total population of 494 students. The sample size is therefore 4.2%, and the estimated subset has a sampling error of 21.35% (cf. Fig. 12).
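The reported sampling figures can be checked with a standard worst-case margin-of-error calculation for a proportion (normal approximation, p = 0.5, 95% confidence). This simple variant yields roughly 21.4%, close to the reported 21.35%; a finite-population correction would give about 20.9%, so the exact figure depends on the formula variant used.

```python
from math import sqrt

# Worst-case margin of error for a proportion at 95% confidence
# (z = 1.96, p = 0.5), without finite-population correction.
def margin_of_error(n, z=1.96, p=0.5):
    return z * sqrt(p * (1 - p) / n)

n, N = 21, 494  # sample and population sizes from the study
print(f"sample fraction: {n / N:.1%}")               # ~4.3%
print(f"margin of error: {margin_of_error(n):.1%}")  # ~21.4%
```
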
The following table (cf. Fig. 13) compares student scores before and after the usage of the intelligent alerting system, as well as whether or not the students were alerted by the system in their subject. Among the alerted students, most scores increased, and even the non-alerted students improved their scores, probably because they knew they were being supervised by the intelligent alerting system.
There will always be some uncertainty associated with a sample statistic, so the confidence intervals represented in Figs. 14 and 15 were used to describe this uncertainty. As shown, at a 95% confidence level, the margins of error for both scores (before and after the IAS) are low and sustain a good level of confidence in the presented results.
Figure 16 represents the correlation between the two variables (before and after the IAS). This is useful because it can indicate a predictive relationship that can be exploited in practice. In this analysis, the correlation is 0.91, which indicates a strong relationship between both variables.
Further analysis was also performed to assess the statistical significance of the study (cf. Figs. 17 and 18). The regression analysis estimates the relationship between outcome variables and predictors. The obtained R-squared value is a good indicator, and the graph supports the outcomes. Correspondingly, the probability chart indicates good prospects that future outcomes will be similar to those presented.
With the frequent employment of learning analytics tools, there is a need to explore how these technologies can be used to improve teaching and learning. Little research has been done so far on the human processes necessary to enable meaningful adoption of learning analytics (McKee, 2015). The research gap is the absence of evidence-based guidance on how instructors can effectively implement learning analytics to support at-risk students with the purpose of improving learning results (Silvola et al., 2021). Therefore, given this lack of exploration, further and more extensive research must be conducted in this field and area of interest.
To date, the prototype agent meets the initial and mandatory expectations, as well as needs and requirements that are constantly changing. The first results so far raise notable expectations for this adjustable system as a functional and complementary solution, using efficiently tested guidelines, rules and a suitable protocol.
Within the scope of this study and analysis, the results obtained are encouraging and, for a functional prototype, the expectations of similar or better results in further analyses seem high and probably attainable. The prototype, as the name implies, is an early sample, model or release of a product built to test this concept. It can evolve into a new design with additional functionalities; that is the main purpose of the developed IAS. As for the obtained results, as stated before, they are motivating but would be better sustained with a larger and more diverse set of students and courses/subjects.
In conclusion, the findings and results were quite satisfying: the percentage of successful scores/students grew, while failed scores and dropouts decreased. This is an excellent outcome, which justifies the concern with and use of this or similar intelligent alerting systems.
Conclusions and future work
The main goal of this research was to enhance the learners’ experience and reduce dropout rates in e-learning based scenarios. Learning more about the background and state of the art helped in understanding concepts, technologies and methods in this learning context. The large participation in the survey indicates that students are concerned with these issues and understand the importance of this sort of study. The survey and interviews also collected important data, such as the main reasons to drop out and suggestions to avoid that unpleasant action. This information was used in the development of the prototype.
Testing and validating the prototype in real educational scenarios, with real users, met the defined objectives and allowed the evaluation of concepts within these tests.
Collecting statistical data on the effectiveness of this solution compared with current e-learning based scenarios was also an aim, achieved by comparing this information with that generated by the proposal. The results measured the effectiveness of education (completion and dropout rates) before and after the implementation of the proposed solution, with satisfying results.
In conclusion, the contributions, tests, analyses and results obtained indicate that the main objective was achieved and that all the specific objectives were fully accomplished. Research objectives such as studying the state of the art and existing platforms were fulfilled, as was identifying, through a survey, the most relevant reasons for a student to drop out of an e-learning experience. As premised, a prototype of an Intelligent Alerting System was developed to test the hypothesis and collect statistical information to determine the effectiveness of intelligent intervention and its benefits. The final analysis concludes that, within a real school scenario, the completion and dropout rates measured before and after the implementation of the proposed solution fulfilled the main objective: to enhance the learners’ experience and reduce dropout rates in these e-learning based scenarios.
Finally, the existing internal LMS statistics are not enough to determine and predict behaviors and avoid future dropouts. The existence of recent commercial products illustrates that this research topic may be considered hot and relevant, has market value and supports the initial and proposed thesis.
The new pandemic reality forced millions of students worldwide to study at home and interact daily with e-learning based platforms (Alameri et al., 2020; Lu et al., 2020). This new reality highlighted the importance of this theme, making it emerge rapidly because of its universal relevance and high media impact (Coman et al., 2020; Muzaffar et al., 2021; Soni, 2020).
There were also numerous limitations in accessing private and sensitive data because of the General Data Protection Regulation (GDPR) 2016/679 (EU). The GDPR is a European Union (EU) regulation on data protection and privacy in the EU and the European Economic Area (EEA). It imposes obligations on organizations anywhere, as long as they target or collect data related to people in the EU.
Additionally, being a public school teacher who worked at 5 different schools in the last 3 years did not help in gathering a large amount of data samples for this particular analysis.
There were also some specific restraints because of internal policies and conditions imposed by several school boards.
In the future, more research will be done to find out what lies behind the new upcoming commercial solutions: their tools, specifications, implementation, benefits and performance.
Availability of data and materials
Due to the nature of this research and to EU General Data Protection Regulation (GDPR) restrictions, the data are not publicly available. The raw data and material contain information that could compromise the privacy of research participants.
Adnan, M., & Anwar, K. (2020). Online learning amid the COVID-19 pandemic: Students’ perspectives. Journal of Pedagogical Sociology and Psychology, 1(2), 45–51. https://doi.org/10.33902/JPSP.2020261309
Alameri, J., Masadeh, R., Hamadallah, E., Ismail, H. B., & Fakhouri, H. N. (2020). Students’ perceptions of e-learning platforms (Moodle, Microsoft Teams and Zoom platforms) in The University of Jordan Education and its Relation to self-study and Academic Achievement During COVID-19 pandemic. The University of Jordan.
Silvola, A., Näykki, P., Kaveri, A., & Muukkonen, H. (2021). Expectations for supporting student engagement with learning analytics: An academic path perspective. Computers & Education, 168, 104192. https://doi.org/10.1016/j.compedu.2021.104192
AspirEDU Educational Analytics. (2017). Dropout detective—Identify, prioritize and support your at-risk students. Available at: http://aspiredu.com/wp-content/uploads/2017/03/Dropout-Detective-Higher-Ed-Overview.pdf. Accessed 15 May 2021.
Atif, A., Richards, D., Liu, D., & Bilgin, A. (2020). Perceived benefits and barriers of a prototype early alert system to detect engagement and support ‘at-risk’ students: The teacher perspective. Computers & Education, 56, 103954. https://doi.org/10.1016/j.compedu.2020.103954
Aziz, R., Hashim, N., Omar, R., Yusoff, A., Muhammad, N., Simpong, D., Abdullah, T., Zainuddin, S., & Safri, F. (2019). Teaching and learning in higher education: E-learning as a tool. International Journal of Innovative Technology and Exploring Engineering, 9, 458–463. https://doi.org/10.35940/ijitee.A4188.119119
Bernacki, M., Chavez, M., & Uesbeck, P. (2020). Predicting achievement and providing support before STEM majors begin to fail. Computers & Education, 158, 103999. https://doi.org/10.1016/j.compedu.2020.103999
Coman, C., Tîru, L., Mesesan-Schmitz, L., Stanciu, C., & Bularca, M. (2020). Online teaching and learning in higher education during the coronavirus pandemic: Students’ perspective. Sustainability, 12(24), 10367. https://doi.org/10.3390/su122410367
Hernández-Sellés, N., Muñoz-Carril, P. C., & González-Sanmamed, M. (2019). Computer-supported collaborative learning: An analysis of the relationship between interaction, emotional support and online collaborative tools. Computers & Education, 138, 1–12. https://doi.org/10.1016/j.compedu.2019.04.012
Li, G., Lin, M., Liu, C., Johnson, A., Li, Y., & Loyalka, P. (2019). The prevalence of parent–teacher interaction in developing countries and its effect on student outcomes. Teaching and Teacher Education, 86, 102878. https://doi.org/10.1016/j.tate.2019.102878
Lu, O., Huang, A., & Yang, S. (2020). Impact of teachers’ grading policy on the identification of at-risk students in learning analytics. Computers & Education, 163, 104109. https://doi.org/10.1016/j.compedu.2020.104109
Luis, R., Llamas-Nistal, M., & Iglesias, M. (2017). Enhancing learners’ experience in e-learning based scenarios using intelligent tutoring systems and learning analytics: First results from a perception survey. In 12th Iberian conference on information systems and technologies (CISTI) (pp. 1–4). https://doi.org/10.23919/CISTI.2017.7975976
Luis, R., Llamas-Nistal, M., & Iglesias, M. (2018). Analyzing learners’ experience in e-learning based scenarios using intelligent alerting systems: Awakening of new and improved solutions. In 13th Iberian conference on information systems and technologies (CISTI) (pp. 1–3). https://doi.org/10.23919/CISTI.2018.83993
McKee, H. M. (2015). The construction and validation of an instructor learning analytics implementation model to support at-risk students. Doctoral dissertation. Nova Southeastern University. Retrieved from NSUWorks, College of Engineering and Computing.
Muzaffar, A., Tahir, M., Anwar, M., Chaudry, Q., Mir, S., & Rasheed, Y. (2021). A systematic review of online exams solutions in e-learning: Techniques, tools, and global adoption. IEEE Access, 9, 32689–32712. https://doi.org/10.1109/ACCESS.2021.3060192
Nguyen, P. T. (2019). Roles of E-learning in higher education. Ho Chi Minh City Open University.
Ortigosa, A., Carro, R., Bravo-Agapito, J., Lizcano, D., Alcolea, J., & Blanco, O. (2019). From lab to production: Lessons learnt and real-life challenges of an early student-dropout prevention system. IEEE Transactions on Learning Technologies, 12(2), 264–277. https://doi.org/10.1109/TLT.2019.2911608
Pierrakeas, C., Koutsonikos, G., Lipitakis, A., Kotsiantis, S., Xenos, M., & Gravvanis, G. (2020). The variability of the reasons for student dropout in distance learning and the prediction of dropout-prone students. In Machine learning paradigms: Advances in learning analytics (Intelligent Systems Reference Library, Vol. 158, pp. 91–111). Springer. https://doi.org/10.1007/978-3-030-13743-4_6
Power, R. (2020). E-learning essentials 2020. Power Learning Solutions.
Riestra-González, M., Paule-Ruíz, M., & Ortin, F. (2021). Massive LMS log data analysis for the early prediction of course-agnostic student performance. Computers & Education, 163, 104108. https://doi.org/10.1016/j.compedu.2020.104108
Risko, E. F., Foulsham, T., Dawson, S., & Kingstone, A. (2013). The collaborative lecture annotation system (CLAS): A new tool for distributed learning. IEEE Transactions on Learning Technologies, 6(1), 4–13. https://doi.org/10.1109/TLT.2012.15
Russell, J., Smith, A., & Larsen, R. (2020). Elements of success: Supporting at-risk student resilience through learning analytics. Computers & Education, 152, 103890. https://doi.org/10.1016/j.compedu.2020.103890
Seo, K., Dodson, S., Harandi, N., Roberson, N., Fels, S., & Roll, I. (2021). Active learning with online video: The impact of learning context on engagement. Computers & Education, 165, 104132. https://doi.org/10.1016/j.compedu.2021.104132
Shumow, L., Farlowe, A., & Bray, M. (2002). Tutoring. Retrieved from Encyclopedia: http://www.encyclopedia.com/doc/1G2-3403200630.html. Accessed 15 May 2021.
Siebra, C., Santos, R., & Lino, N. (2020). A self-adjusting approach for temporal dropout prediction of e-learning students. International Journal of Distance Education Technologies, 18(2), 19–33. https://doi.org/10.4018/IJDET.2020040102
Soni, V. D. (2020). Global impact of E-learning during COVID 19. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3630073
Tang, Y., Chen, P., Law, K., Wu, C. H., Lau, Y., Guan, J., He, D., & Ho, G. T. S. (2021). Comparative analysis of Student’s live online learning readiness during the coronavirus (COVID-19) pandemic in the higher education sector. Computers & Education, 168, 104211. https://doi.org/10.1016/j.compedu.2021.104211
Tarchi, C., Zaccoletti, S., & Mason, L. (2020). Learning from text, video, or subtitles: A comparative analysis. Computers & Education, 160, 104034. https://doi.org/10.1016/j.compedu.2020.104034
Valverde-Berrocoso, J., Garrido-Arroyo, M., Burgos-Videla, C., & Morales-Cevallos, M. (2020). Trends in educational research about e-learning: A systematic literature review (2009–2018). Sustainability, 12(12), 5153. https://doi.org/10.3390/su12125153
Wang, F. (2021). Interpreting log data through the lens of learning design: Second-order predictors and their relations with learning outcomes in flipped classrooms. Computers & Education, 168, 104209. https://doi.org/10.1016/j.compedu.2021.104209
Wang, J., Antonenko, P., & Dawson, K. (2019). Does visual attention to the instructor in online video affect learning and learner perceptions? An eye-tracking analysis. Computers & Education, 146, 103779. https://doi.org/10.1016/j.compedu.2019.103779
Special thanks to the University of Vigo (UVigo), the Polytechnic Institute of Viana do Castelo (IPVC), and the ETAP Professional School, and to their directors, teachers and students, for all their availability, patience and understanding throughout this study.
Ricardo Manuel Meira Ferrão Luis has an initial technical background in Computer Management (1994) and holds a Bachelor's degree and a degree in Computer Engineering from the Polytechnic Institute of Porto (ISEP), obtained in 1997 and 2000 respectively, as well as a Master's in IT with a specialty in Distributed Systems, Computer Communications and Computer Architecture from Minho University (2007). He was a computer technician at Cronograma Inc. between 1994 and 1997, and a Junior and Senior Programmer at Sage Portugal from 1998 to 2000. In 2000 he started teaching in High Schools and Professional/Vocational Schools, which he continues to do today. Between 2004 and 2006 he was a Company Partner and CEO at Kaminho Digital - Information Systems. He is currently also a Faculty Member at the Polytechnic Institute of Viana do Castelo (IPVC) and is finishing his PhD at the University of Vigo.
Martín Llamas-Nistal (Senior Member, IEEE) received the Eng. and Ph.D. degrees in telecommunication from the Polytechnic University of Madrid, Spain, in 1986 and 1994, respectively. He has been a Faculty Member with the Higher Technical School of Telecommunication Engineers, University of Vigo, Spain, since March 1987. He is the author or a co-author of more than 300 articles in peer-reviewed international refereed journals and conference proceedings, and he has directed several national and international research projects in telematics and technology-enhanced learning. He was a member of the Steering Committee of the IEEE Transactions on Learning Technologies from its founding in 2008 until 2013, and has been an Associate Editor since 2014. He has received several awards, including Highlight Paper at WWW 2001 and Education Track Best Paper and Conference Best Paper Finalist at WWW 2002 from the W3C; the 2007 Chapter Achievement Award for the Spanish Chapter as an outstanding model of technical activities, membership services, and professional development in Spain and Latin America; the 2010 Distinguished Chapter Leadership Award; the 2011 IEEE Education Society Chapter Achievement Award; and the IEEE EDUCON 2015 and 2018 Meritorious Service Awards from the IEEE. He was the General Co-Chair of IEEE EDUCON 2012, 2013, 2014, and 2018. He was the Co-Founder of the IEEE Latin-American Learning Technologies Journal (IEEE-RITA) in 2006 and has been its Editor-in-Chief since its founding. He has served in different positions in the IEEE Education Society: member of the Board of Governors since 2008 and of the Strategic Planning Committee since 2009, Chair of the Publications Committee from 2010 to 2018, Vice-President for Publications from 2011 to 2018, and Vice-President for Member and Geographic Activity since 2019.
Manuel J. Fernández Iglesias has a PhD in telecommunication engineering (1997) from the University of Vigo. He began his teaching and research activities as a member of the Dept. of Telematics Engineering at the School of Telecommunications Engineering of the University of Vigo in October 1990. His academic activity was initially aimed at courses in the field of informatics and computer architectures, later expanding to courses in the field of network services and applications. His research work was initially targeted at formal specification techniques, and was extended from 1997 to include the development of advanced network services and applications. During his post-doctoral career, he has continuously lectured in doctorate and master's programs, being the promoter, organizer and coordinator of the first distance-only doctoral program of the Galician university system in 2001. In addition, during 2004 and 2005 he was part of the Group of Bologna Promoters in Spain, sponsored by the European Union and the Spanish General Directorate of Universities through an institutional project financed by the Socrates program of the EU. In 2005, he temporarily suspended his dedication to the University of Vigo to serve in the Galician administration as General Director of Audiovisual Communication; he also joined the Galician public telecommunications operator Retegal S.A. as Chief Executive Officer, the Sociedad Anónima de Gestión del Centro de Computación de Galicia (Galician Supercomputing Center, Inc.) as a member of its board of directors, and the Audiovisual Consortium of Galicia as First Vice-Chairman of its board of directors. In 2009 he resumed his teaching and research activities at the University of Vigo, focusing on the development of technological solutions, services and applications aimed at people with disabilities or the elderly.
In 2010, he also assumed the responsibilities of Vice-Rector for International Relations at the University of Vigo.
This research is partially funded by project TIN2016-80515-R “Service platform based on multimodal analysis for self-regulated learning” from the Spanish Ministry of Economy and Competitiveness.
The authors Ricardo Luis, Martin Llamas-Nistal and Manuel Iglesias declare that there are no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Luis, R.M.M.F., Llamas-Nistal, M. & Iglesias, M.J.F. On the introduction of intelligent alerting systems to reduce e-learning dropout: a case study. Smart Learn. Environ. 9, 29 (2022). https://doi.org/10.1186/s40561-022-00210-0
- Behavioural patterns
- At-risk students
- Intelligent alerting system
- Learning analytics