
Role of interactive presentation platform ASQ in delivering web design course

Abstract

Contemporary technology-enhanced learning, together with various innovative learning methodologies, is driving significant progress in educational ecosystems. Educational systems and tools that invoke active participation of learners are excellent facilitators of modern education. One such system is ASQ, an interactive presentation platform that allows teachers to incorporate interactive questions into their presentations. Learners then answer these questions on site on their own digital devices. In this way teachers receive immediate feedback from learners, allowing them to adjust the course of the presentation. In this paper we try to determine to what extent ASQ is beneficial for learners. For that purpose we analyzed data collected from a Web Design course in which ASQ was utilized during two school years. The results of the analysis suggest that ASQ has a positive influence on learners’ acquired knowledge.

Introduction

Contemporary trends in technology-enhanced education and research increasingly require the introduction and application of innovative, “intelligent”, personalized systems adjusted to learners’ needs. One of the main directions in this area is the promotion of active participation of learners. Immediate learning feedback is important for learners, but also for teachers, in order to properly support and increase the effects of teaching and learning (Triglianos et al. 2017; Freeman et al. 2014; Brusilovsky et al. 2015; Lin et al. 2019). Pervasive technologies have been incorporated into modern educational environments to allow learners and teachers to fully benefit from learning ecosystems (Hung et al. 2018; Denden et al. 2019; Klašnja-Milićević et al. 2018). Real advances in improving learning and teaching activities can be achieved by capturing and analyzing the various data generated during educational processes. A wide variety of useful educational data can be collected from diverse learning spaces and activities: clickstream data (Triglianos et al. 2016), grades, assessment data, and even physiological data.

Additionally, it is necessary to increase the quality of theoretical research and of the practical implementation of educational tools, systems, and environments that offer teachers a multifunctional, technology-enhanced educational milieu able to collect, integrate, and harmonize educational data to achieve higher potential (Manzoor et al. 2019; Lin et al. 2019; Gašević et al. 2019) and increase learners’ performance and achievements.

Recently, educational data mining, learning analytics (Klašnja-Milicevic and Ivanovic 2018), and the corresponding tools have raised awareness of the necessity to use innovative kinds of assessment, whether within a single virtual learning environment or a single learning management system. Enhancing learners’ performance through active learning, viewed from a constructivist standpoint (Freeman et al. 2014), is becoming increasingly important. Active learning and learners’ reactions to the presented topics increase learners’ participation in the classroom and provide teachers with adequate feedback.

Our main intention was to change the predominantly passive traditional teaching style by applying a new lecture pattern in the Web Design course: teaching short pieces of new material and immediately afterwards asking learners related questions. Based on their answers the teacher can gauge the level of understanding of the taught material. This particular pattern of lecturing is vital for students, but even more so for teachers. Learners can check their understanding of the taught concepts on the spot. Based on learners’ immediate feedback, teachers can adapt their teaching method in real time and adjust it to the learners’ results, expectations, and needs.

ASQ (Triglianos and Pautasso 2014) was developed to support active learning educational scenarios including the aforementioned lecture pattern. Its core feature is the ability to seamlessly embed various types of questions within presentation slides and stream them to students’ devices. Students’ answers are aggregated in real-time and presented to the teacher to help them adjust the lecture to attain desired educational goals.

In this paper we report on our experiences of using ASQ, discuss lessons learned, identify some of the problems that arose, and propose ways to address them. The primary contribution of this paper is a real-world classroom case study showing how a group of students taught in a novel lecture pattern, which relies on a directed use of technology to foster active learning, can learn better than a control group following the traditional lecture paradigm.

The rest of the paper is organized as follows. In the second section related work is briefly presented. Section three is devoted to a concise presentation of the role of ASQ in the educational process. In the fourth section the experimental setup is presented, describing the organization of the Web Design course. The fifth and sixth sections present, respectively, the results of and a discussion about the statistical analysis performed on data collected during the course activities in two school years. The last section draws some concluding remarks.

Background and related work

In this paper we are primarily oriented towards considering, analyzing, and drawing useful conclusions about the higher-education classroom experience of using digital devices with the aim of adopting an active learning style in a programming-oriented course. In conventional, traditional classrooms teachers often ask learners to express their opinion and answer orally posed questions by speaking out loud, drawing on a whiteboard, or discussing with their peers. However, usually only a small number of students participate in these activities while the majority remain passive.

One strategy to scale these interactions to multiple students is to utilize digital devices in the classroom. It should be noted, though, that the undirected use of technology may produce negative effects on learning achievements and outcomes. Some authors performed experiments and studied the effects of multi-tasking with digital devices during lectures on learners’ achievements. In the experiment presented in (Wood et al. 2012), learners were randomly assigned to one of several different conditions and uses of technology instruments. The authors concluded that learners’ achievements were in fact negatively impacted.

In order to minimize such possible negative effects, many innovative online learning platforms expect learners to answer different types of questions, as well as rather short questions or programming tasks. Going a step further, many widely available online learning platforms now support multiple ways of presenting teaching materials and obtaining student responses, such as in-video quizzes for flipped classrooms and MOOCs. For example, in (Kim et al. 2015) the authors explore an advanced method using a specially developed system. The RIMES system facilitates “easy authoring, recording, and reviewing interactive multimedia exercises embedded in lecture videos”. With RIMES, learners can make video or audio recordings of their responses to the videos presented in the classroom. Teachers can immediately review the responses in an aggregated gallery and react to them. After experiments with 25 students, teachers found the system highly useful for identifying a variety of misconceptions, which helped them improve the teaching content.

An interesting study was also presented in (Ravizza et al. 2017). During class, learners’ activities on the Internet were observed. Online activities were classified into two groups: related or not related to the class topic. The final conclusion was that Internet use not related to the class topic was predominant among learners who used their laptops in class. As expected, such behavior negatively affected learning performance. More surprisingly, Internet use related to the class topic was not beneficial for learners’ performance either.

It is clear that modern education based on technology-enhanced learning requires innovative approaches to the use of digital devices, making learning more attractive but also more productive and leading to better learner performance. Accordingly, the designers and developers of the distributed event collection and analysis architecture of ASQ produced a modern, attractive, and very useful educational environment that enables practice questions during lectures (Triglianos et al. 2017; Triglianos and Pautasso 2014). Learners can easily and relatively unobtrusively work with the system in a wide variety of courses. Learners’ interactions with the web application generate different types of events that are captured and stored in a database. Collected data about particular tasks/questions is processed immediately and the results are shown visually to the teacher, who can react appropriately and change the style and dynamics of teaching, giving additional explanations or skipping elements that are already well understood. The ASQ system has been used intensively in teaching activities at various universities and interesting results have been published (Triglianos et al. 2017; Triglianos and Pautasso 2014).

The developers of ASQ did not explicitly mention the social dimension of their platform. However, group/summary results of all students for a particular question are visually presented to all of them in a privacy-preserving way, after all collected answers are processed. Each student can thus personally assess his/her position and learning performance and compare them with the results of peers. This approach is not as comprehensive as the one offered by the MasteryGrids OSSM (Open Social Student Modeling) interface (Brusilovsky et al. 2015), but it has some similar basic useful elements. MasteryGrids uses a parallel social visualization approach. It shows the progress of student knowledge across a wide range of sequentially organized educational topics. Personal performance and progress are compared with, and visually presented against, the progress of the class. In the majority of cases this motivates students with lower learning performance to try more tasks and questions with non-mandatory learning content in order to achieve better results.

As part of a joint SCOPES project, colleagues from the Technical University in Bratislava also used ASQ for interaction with students in their classroom. In (Triglianos et al. 2017), experiences of using ASQ in a Functional and Logic Programming course were presented. The authors collected and analyzed data about students’ activities and achievements. Based on students’ immediate feedback on the complex types of questions used during classes, the authors managed to aggregate students’ answers in real time. The paper identified several requirements and useful guidelines for the successful adoption of ASQ. The essential one, highly important for both students and teachers, is to find the proper moment to stop collecting the students’ answers and proceed to their presentation and evaluation. After that moment, it is important to discuss the findings together with the class and adjust further topics to the current students’ performance.

Our experiments with ASQ went in a similar direction, i.e., to determine whether ASQ helps students to gain deeper knowledge and achieve better learning performance in a friendly, non-stressful technology-enhanced learning environment. The applied methodology and a discussion of the obtained results are presented in the rest of the paper.

The role of ASQ in the educational process

Taking tests on studied material to promote active learning, as opposed to using tests only as an evaluation tool, is beneficial to retention and leads to better scores in final tests (Bartlett 1977; Darley and Murdock 1971; Hanawalt and Tarr 1961; Hogan and Kintsch 1971; McDaniel et al. 1989; McDaniel and Masson 1985; Whitten II and Bjork 1977). There is also strong evidence (Glover 1989; Morris et al. 1977; McDaniel et al. 2007; Kang et al. 2007) suggesting that the more effortful free recall format for quizzes (such as short text quizzes or programming tasks) performs better in terms of learning outcomes than cued recall or recognition formats (such as “fill the gaps” or multiple-choice quiz types). Mixing theoretical explanations with interactive exercise activities benefits long-term recall, as it helps students reflect on the taught material (which is still in their working memory) and prevents misconceptions from taking root (as teachers can give early feedback).

ASQ is a Web application designed to enable practice questions during lectures. Questions of various types can be embedded in the slides of a presentation which are streamed to the students’ devices (Fig. 1). All students can participate and their answers are aggregated in real-time allowing presenters to gauge the level of understanding in the classroom, catch misconceptions early and give timely feedback to their audience.

Fig. 1
figure1

ASQ in the classroom: most students’ laptops are connected and focused on a slide that contains the results of an interactive question

The goals of the ASQ application are to: a) turn student devices from distractions into affordances for learning; b) use technology to scale active learning to large audiences; c) raise teachers’ awareness of each and all of their students’ level of comprehension of the taught material, during and after the lecture.

ASQ offers the opportunity to present complex question types in class to hundreds of students and to receive real-time feedback about the students’ focus, understanding and their performance on the tasks.

Questions are not limited to closed types of questions, such as multiple choice, but include many general and informatics-related domain-specific question types that promote active learning and experimentation with the taught material. Example question types include live JavaScript programming; programming Java code and testing it against unit tests; creating Web pages with HTML, CSS and JavaScript (Fig. 2); formulating database queries; creating and editing visual diagrams; using CSS selectors; highlighting text and classifying and rating items. With these question types, students are asked to demonstrate their understanding and apply their knowledge in concrete scenarios. This way they realize very quickly whether they are capable of solving the given challenges. At the same time, aggregating all students’ answers allows presenters to assess the level of comprehension of the whole audience, provide timely feedback while students are attempting to construct solutions and detect and address misconceptions before they take root.

Fig. 2
figure2

Example of an <asq-fiddle-q> question type element that allows students to build a webpage with HTML, CSS and JavaScript

Experimental setup

Web design is an elective course at the Department of Mathematics and Informatics, University of Novi Sad, Serbia. It is intended for first-year students in the second semester of two study programs: computer science and information technology. However, it is also possible to choose this course in the second or even the third year. The first-year students obtain the necessary Java programming prior knowledge in the obligatory first-semester course Introduction to Programming.

Besides some basics about the Internet and the Web (including history, client-server architecture, web browsers, web servers, the definition of a URL, DNS, etc.), the content of the course is divided into three main topics:

  • HTML: history, tags, elements of web page, attributes, text formatting, links, images, tables, lists, frames, forms.

  • CSS: positioning, CSS rules, selectors, declaration blocks, box model (padding, border, margin), page flow, floating.

  • JavaScript: variables, statements, operators, functions, arrays, objects, events, Document Object Model (DOM), Browser Object Model (BOM), form validation, HTML5 canvas.
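As a hedged illustration of the kind of material covered in the JavaScript topic above, a typical lab exercise combines functions, objects, and form validation logic. The function name and validation rules below are hypothetical, not taken from the actual course material, and the DOM wiring is omitted so the snippet is self-contained:

```javascript
// Hypothetical lab-style exercise: validate form field values.
// In a real page this would run on the form's "submit" event and call
// event.preventDefault() when validation fails.
function validateForm(fields) {
  const errors = [];
  if (!fields.name || fields.name.trim() === "") {
    errors.push("name is required");
  }
  // A deliberately simple e-mail check, sufficient for a lab exercise.
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(fields.email || "")) {
    errors.push("email is invalid");
  }
  return { valid: errors.length === 0, errors };
}

console.log(validateForm({ name: "Ana", email: "ana@example.com" }));
```

Exercises of this kind also touch the DOM and events topics, since the same logic is typically attached to a form's submit handler.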

The course consists of one hour of lectures and three hours of lab exercises per week. During the lectures only general, theoretical concepts are presented. The lectures are held in large classrooms with one big group of students (usually around 60) and without individual access to digital devices. In such circumstances, the utilization of ASQ during lectures was not possible.

On the other hand, during lab exercises students are divided into smaller groups (fewer than 15), where each student has access to a computer. The main goal during labs is to support the practical work of students. However, there are parts of the exercises where the teaching assistant introduces and explains new concepts. These introductory parts of the exercises are ideal for the utilization of ASQ.

The main goal of our research was to determine whether ASQ helps students to gain deeper knowledge of the topic. In order to verify that, we performed an experiment with ASQ in the Web Design course during two school years: 2016/2017 and 2017/2018. In each year, students were divided into two groups:

  • Control group (NA group): The introductory parts of exercises are presented to students in this group using traditional PowerPoint slides and HTML/CSS/JavaScript examples.

  • Test group (A group): The introductory parts of exercises are presented to students in this group using ASQ.

We tried to balance these two groups in terms of the number of students and their prior knowledge as much as possible. Prior knowledge was measured by students’ scores on the practical tests of the Introduction to Programming course. The characteristics of the students’ distribution are given in Table 1. This distribution could not be ideal for two main reasons:

  • Some students could not be assigned to an arbitrary group due to time constraints in their schedule.

  • Some students skipped the Introduction to Programming course, so we did not have data about their programming skills.

Table 1 Properties of NA and A groups for two school years

The final step in the experimental setup was to define an objective evaluation of students’ performance. The simplest way would be to use students’ final grades in the course. The grade is formed by summing the results of three individual projects which correspond to the three main topics: HTML, CSS and JavaScript. However, different groups of students are assessed by different teaching assistants, so this final grade might be biased. In order to avoid subjective bias, we introduced a unified final test at the end of the semester. It contained thirty questions (ten per course topic), each question being worth one point. With this test we were able to obtain an unbiased evaluation of students’ performance. It was used only for the purpose of ASQ evaluation and did not influence students’ final grades. The detailed structure of the test can be found in the Appendix.

Our final data sample includes 107 students: 51 from the school year 2016/2017 and 56 from the school year 2017/2018. The students were mostly first-year students, hence without much programming experience. For each student we recorded three scores obtained on the final unified test: one for HTML, one for CSS, and one for JavaScript. Besides that, for each student we also recorded their group, i.e., whether they belong to the A or NA group. The sample contains 54 students in group A and 53 students in group NA.

The goal of this research was to determine how ASQ influences students’ learning performance in the Web Design course. To meet this goal, we compared the results of the students who belonged to group A with the results of the students who belonged to group NA. Besides that, we also wanted to see how the influence of ASQ changes across different question types and different topics, i.e., we wanted to determine in which cases ASQ is most suitable. For that purpose, we analyzed the results of individual questions and tried to find correlations between question type and students’ performance. We also analyzed and compared the results of each separate topic (HTML, CSS and JavaScript) in order to see whether ASQ affects learning performance differently across topics.

Results

In this section we present the results of statistical analysis that was performed on the collected data.

Firstly, we wanted to see whether the distribution of the scores obtained by the NA group is the same as the distribution of the scores obtained by the A group. If the distributions were identical, that would mean that ASQ did not introduce any changes. For that purpose we conducted a Mann-Whitney U test on two sets of data: one containing the NA group’s scores and the other the A group’s scores. A p-value of 0.0232 was obtained, meaning that we could reject the null hypothesis at the 0.05 significance level and conclude that the two sets have significantly different distributions.
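Such a comparison can be sketched in code. The snippet below is a minimal illustration, not the analysis script actually used in this study, of a two-sided Mann-Whitney U test with the normal approximation; ties receive average ranks, and the tie correction to the variance is omitted for brevity:

```javascript
// Minimal sketch of a two-sided Mann-Whitney U test (normal approximation).
function mannWhitneyU(a, b) {
  // Pool both samples, tagging each value with its group (0 = a, 1 = b).
  const all = a.map(v => [v, 0]).concat(b.map(v => [v, 1]))
               .sort((x, y) => x[0] - y[0]);
  // Assign ranks; tied values get the average of their rank positions.
  const ranks = new Array(all.length);
  for (let i = 0; i < all.length; ) {
    let j = i;
    while (j < all.length && all[j][0] === all[i][0]) j++;
    const avg = (i + j + 1) / 2;
    for (let k = i; k < j; k++) ranks[k] = avg;
    i = j;
  }
  let r1 = 0;
  all.forEach(([, g], idx) => { if (g === 0) r1 += ranks[idx]; });
  const n1 = a.length, n2 = b.length;
  const u1 = r1 - n1 * (n1 + 1) / 2;
  const u = Math.min(u1, n1 * n2 - u1);
  const mu = n1 * n2 / 2;
  const sigma = Math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12);
  const z = (u - mu) / sigma;     // z <= 0 because u is the smaller statistic
  const p = Math.min(1, 2 * stdNormCdf(z));
  return { U: u, p };
}

function stdNormCdf(z) {
  // P(Z <= z) via the Abramowitz-Stegun 7.1.26 approximation of erf.
  const x = Math.abs(z) / Math.SQRT2;
  const t = 1 / (1 + 0.3275911 * x);
  const poly = ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
               - 0.284496736) * t + 0.254829592) * t;
  const erf = 1 - poly * Math.exp(-x * x);
  return z < 0 ? (1 - erf) / 2 : (1 + erf) / 2;
}
```

For the sample sizes in this study (around 50 per group) the normal approximation is standard; for very small samples an exact test would be preferred.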

In order to compare the performance of the NA and A groups on individual questions, we calculated the mean score for each question. Mean values are calculated separately for the NA and A groups, and also separately for the years 2016/2017 and 2017/2018 and for both years together. Let us now introduce the question ratio value, denoted R(q,g), shown in Eq. 1. The function M(q,g) returns the mean score for question q and group g, while the function I(g) returns the inverse of group g, i.e., if group g is A, function I returns NA, and vice versa. Finally, for a given question q and group g, the value R(q,g) tells us how well group g performs compared to group I(g) (a value closer to 0 indicates worse and a value closer to 1 indicates better performance). Throughout this section, for a question q and group g, we will say that group g outperformed group I(g) iff R(q,g)>0.55; similarly, group g underperformed group I(g) iff R(q,g)<0.45; otherwise, if 0.45≤R(q,g)≤0.55, the performance of the two groups is said to be similar.

$$ R(q,g) = \frac{M(q,g)}{M(q,g) + M(q,I(g))} $$
(1)
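Eq. 1 and the outperform/underperform thresholds can be computed directly from the per-question means. A minimal sketch follows; the function names are ours for illustration, not taken from the paper's tooling:

```javascript
// Question ratio R(q,g) from Eq. 1: meanG is M(q,g), meanInv is M(q,I(g)).
function questionRatio(meanG, meanInv) {
  return meanG / (meanG + meanInv);
}

// Classification used in this section: > 0.55 outperforms, < 0.45
// underperforms, otherwise the two groups are considered similar.
function classify(meanG, meanInv) {
  const r = questionRatio(meanG, meanInv);
  if (r > 0.55) return "outperformed";
  if (r < 0.45) return "underperformed";
  return "similar";
}
```

By construction R(q,g) + R(q,I(g)) = 1, so a group can outperform on a question exactly when the other group underperforms on it.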

Table 2 contains the aforementioned mean values, while Fig. 3 visualizes them. Distributions of question grades for NA and A groups over each separate topic are visualized with violin plots in Fig. 4.

Fig. 3
figure3

Overview of ratios between scores of the A and NA groups on each individual question from the final test. The three charts, from top to bottom, represent the following data: data for the year 2016/2017, data for the year 2017/2018, and averaged data for both years. In each chart the top part of a stacked bar represents the NA group, while the bottom part represents the A group. A yellow bar indicates that the performance of the two groups is similar, a red bar indicates that the given group underperforms the other group, and a green bar indicates that the given group outperforms the other group

Fig. 4
figure4

Distribution of question grades over the whole examination period (i.e., data of both years is included)

Table 2 Mean values of scores for each individual question in the final test

Discussion

In this section an analysis of the obtained results is given. First an overview of the limitations of our research is presented, and then the scores on individual questions and the scores on the three main topics are analyzed.

Even though we put a lot of effort into avoiding noise and biases in the data, our research still has certain limitations. The first and major limitation is that the A and NA groups might not be ideally balanced. Instead of randomly dividing students into the two groups, we used students’ grades from the Introduction to Programming course, assuming that students’ performance in the two courses would be similar. However, this assumption might not hold for a non-negligible number of students and, unfortunately, we could not do much about this issue. Besides this, we had an additional difficulty during the balancing: dealing with students’ schedules. Consequently, some students could not be assigned to the preferred groups. For that reason the groups in school year 2017/2018 were not as balanced as the groups in school year 2016/2017.

The second limitation concerns the final unified test. Namely, the test was introduced not for the purpose of grading but only for the purpose of our research. Therefore, we had to conduct this test anonymously, which might have caused a lack of motivation among students to do the test as well as they could.

The data (see Table 2 and Fig. 3) show that in the year 2016/2017 group A outperformed group NA on 13 questions, while group NA outperformed group A on 3 questions; on the rest of the questions the groups performed equally well. In the year 2017/2018 group A outperformed group NA on 7 questions, while group NA outperformed group A on 3 questions. If we look at the means over the whole period, which includes both years, group A outperformed group NA on 14 questions, while group NA outperformed group A on only 1 question. The first important thing to note is that group A indeed has a significant advantage over group NA in terms of the number of outperforming questions. This result suggests that ASQ did influence students’ performance positively. The second thing to note is that the relative performance of group A over group NA dropped slightly in the year 2017/2018. One possible reason could be the structure of the groups. Namely, in the year 2017/2018 group NA had non-negligibly better students in terms of the number of points from the Introduction to Programming course (see Table 1). This slight imbalance increased the possibility of good NA group performance. In that circumstance, even equal performance of the NA and A groups could mean that ASQ influenced students’ performance positively, since it could mean that students with less potential who used ASQ performed as well as students with more potential who did not.

In order to understand in which cases ASQ is beneficial to students, we analyzed the scores of concrete, individual questions on which group A drastically outperformed group NA, and vice versa, questions on which group NA drastically outperformed group A. Some examples are given below:

  • In question HTML2 in the year 2016/2017, group A drastically outperformed group NA. In order to answer the question correctly, students needed to know how to distinguish between the following HTML concepts: element, tag, and attribute. During the traditional labs with group NA, the teaching assistant explained each of these HTML concepts, while during the ASQ labs with group A, after each explanation students had an interactive ASQ slide in which they had to distinguish between the three concepts.

  • In question CSS3, group A also outperformed group NA. The students’ task here was to write a CSS selector. During the traditional labs, examples of CSS selectors were presented on the blackboard, while during the ASQ labs, students had an interactive slide where they could write selectors themselves and immediately see the outcome.

  • In question CSS6, group NA outperformed group A. The focus of this question was CSS syntax. Namely, students needed to say what the third number of the padding shorthand value means. Students in the traditional labs had more time to code, since the ASQ labs lasted longer, leaving less time for pure coding. As a consequence, students from group NA might have remembered syntax-related concepts better.

  • In question CSS7, group A outperformed group NA. Students needed to explain the difference between block and inline elements. Both the traditional and the ASQ labs covered this topic in the same way.

  • In question JS2, group A outperformed group NA. As part of this question, students needed to distinguish between local and global JavaScript variables. In the traditional labs, local and global variables were pointed out in pre-prepared JavaScript code, while in the ASQ labs students had an interactive slide where they had to distinguish between the two variable types by themselves.

  • In question JS7, during the year 2017/2018, group NA outperformed group A. In this question, students needed to write JavaScript code that performs DOM manipulations. Both groups had similar programming tasks related to this question during the labs, but group NA had more time to do them.

  • In question JS10, group A drastically outperformed group NA. In this question, students needed to explain how they would implement an animation using the HTML5 canvas. Both the traditional and the ASQ labs covered this topic in the same way.
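The local-versus-global distinction tested in question JS2 can be illustrated with the kind of snippet students were asked to classify. This is a hypothetical example, not taken from the actual course material:

```javascript
// Which variables here are global and which are local?
var counter = 0;              // global: declared at the top level

function increment(step) {    // "step" is a parameter, local to increment
  var delta = step || 1;      // local: declared with var inside a function
  counter += delta;           // refers to the global "counter"
  return counter;
}

increment(2);
increment();
console.log(counter);         // → 3
```

On an interactive slide, students would mark each identifier as local or global, and the aggregated answers would immediately show the teacher which scoping rules were still unclear.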

As can be seen, most questions on which group A outperformed group NA are those that had a relevant interactive ASQ slide which prompted students to answer simple questions during the lectures. On the other hand, poor performance of group A is detected in situations where they had to deal with pure syntax. The reason might be that group A had less time for programming during the labs, since the presentation part lasted longer with ASQ. Nevertheless, it is much more important that students acquire deep knowledge of the topic rather than just syntax rules. From that perspective, ASQ proved to be really beneficial.

The next part of the analysis concerns the influence of ASQ on adopting concepts from the three main course topics: HTML, CSS and JavaScript. Namely, we wanted to see for which topics ASQ introduced the most benefits. The violin plot in Fig. 4 shows the distributions of question grades for the NA and A groups, with each topic shown separately. For the HTML part, ASQ managed to decrease the number of students who scored less than six. Still, unlike group NA, group A has some extreme cases where students performed really badly. The reason for this is not completely clear, but it might be that these few students did not attend the corresponding classes or lacked motivation. The number of students who scored more than six also increased for the CSS part. However, the strongest influence of ASQ is detected for the JavaScript part, which might be a sign that ASQ is best suited for this part of the course.

Conclusions and future work

Technology-enhanced learning offers enormous opportunities to increase the quality of teaching and learning in modern educational ecosystems. In this paper we focused on the assessment of students’ achievements and specifically considered the effects of active learning on students’ learning performance and outcomes. In particular, we concentrated on the possibilities that emerging information and communication technologies offer for delivering programming-oriented courses and on the use of a modern web-based dynamic educational architecture.

Thanks to a joint project with the colleagues who developed ASQ (Triglianos et al. 2017; Triglianos and Pautasso 2014), members of the project have had the opportunity to test it in their educational environments and programming courses (Triglianos et al. 2017). ASQ offers a balanced way of mixing theoretical explanations with interactive exercise activities. Such combined lecturing helps students to absorb the taught material and prevents misconceptions through the teacher’s feedback. ASQ was positively accepted by both the students and the teachers during the Web design course at the University of Novi Sad.

Studies in the past have shown increased learning benefits when using personal response systems (i.e. clickers) (for example (Mayer et al. 2009; Gauci et al. 2009)), through the use of recognition format question types. While these systems are not integrated with presentation functionality, thus imposing high context switching costs, their most important limitation is the lack of support for free recall format questions. With ASQ we hope to shift lecturing to a paradigm where the majority of students can answer free recall questions without overwhelming the teacher and benefit from the associated gains in retention.

In this paper the results of experimenting with several types of interactive, constructive, and domain-specific questions within the Web Design course are presented. During two school years, within the lab exercises, we observed two groups of students: group NA, which did not use ASQ, and group A, which used ASQ for testing learning achievements and performance. The collected data was analyzed and the results show that ASQ influenced students’ performance very positively. Despite the fact that for a few types of questions group NA showed better results, in the majority of cases the group that used ASQ showed significantly better learning performance. The questions on which ASQ showed lower performance are the ones that deal with pure syntax. The lower performance is probably caused by the fact that ASQ presentations last longer, leaving less time for coding. Nevertheless, the concepts that were improved by ASQ seem to be more valuable: deep understanding of the main concepts is more important than memorizing syntax rules.

We can conclude that ASQ has great potential to help teachers quickly identify common student mistakes and misconceptions. Based on the immediately available answers, the teacher can comment on them in a timely manner, giving students the chance, within the same class, to understand the concepts better and be more successful.

ASQ offers numerous resources and high potential to methodologically and technically improve the delivery of teaching material, especially in programming-oriented courses, and to improve students' learning performance. Accordingly, we plan to continue using ASQ in other similar programming-oriented courses and to motivate teachers to prepare material in a more attractive way that will additionally motivate students and strengthen their active role during lectures.

Table 3 Overview of the unified final test that was used for students’ performance evaluation

Appendix

Student performance assessment

In order to perform an unbiased assessment of students' performance, we created a unified test with thirty questions covering all of the course's topics. Each question is worth one point, so the whole test is worth thirty points. At the end of the semester each student had to solve the test, which allowed us to collect every student's score. The evaluation of ASQ was then performed by statistical analysis of the collected scores. A deeper insight into the test is given in Table 3.
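The group comparison behind this evaluation can be sketched as follows. This is a minimal illustration only: the score lists are hypothetical (the study's actual data are available from the corresponding author), and Welch's t statistic stands in for whichever statistical test the analysis used.

```python
from statistics import mean, variance
from math import sqrt

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n - 1)
    se = sqrt(va / na + vb / nb)                     # standard error of the mean difference
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical scores on the 30-point unified test (one point per question).
group_a = [24, 27, 22, 26, 25, 28, 23, 26]    # group A: used ASQ
group_na = [20, 22, 19, 24, 21, 18, 23, 20]   # group NA: did not use ASQ

t = welch_t(group_a, group_na)
print(f"mean A = {mean(group_a):.2f}, mean NA = {mean(group_na):.2f}, t = {t:.2f}")
```

A positive t with a sufficiently small p-value (looked up against the Welch–Satterthwaite degrees of freedom, or computed with a statistics library) would indicate that group A's advantage is unlikely to be due to chance.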

Availability of data and materials

The datasets used and analysed during the current study are available from the corresponding author on reasonable request.

References

  1. Bartlett, J.C. (1977). Effects of immediate testing on delayed retrieval: Search and recovery operations with four types of cue. Journal of Experimental Psychology: Human Learning and Memory, 3(6), 719.

  2. Darley, C.F., & Murdock, B.B. (1971). Effects of prior free recall testing on final recall and recognition. Journal of Experimental Psychology, 91(1), 66.

  3. Brusilovsky, P., Somyürek, S., Guerra, J., Hosseini, R., Zadorozhny, V. (2015). The value of social: Comparing open student modeling and open social student modeling. In International Conference on User Modeling, Adaptation, and Personalization. Springer, Cham, (pp. 44–55).

  4. Denden, M., Tlili, A., Essalmi, F., Jemni, M., Chang, M., Huang, R., et al (2019). imoodle: An intelligent gamified moodle to predict “at-risk” students using learning analytics approaches. In Data Analytics Approaches in Educational Games and Gamification Systems. Springer, Singapore, (pp. 113–126).

  5. Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H., Wenderoth, M.P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.

  6. Gašević, D., Joksimović, S., Eagan, B.R., Shaffer, D.W. (2019). Sens: Network analytics to combine social and cognitive perspectives of collaborative learning. Computers in Human Behavior, 92, 562–577.

  7. Gauci, S.A., Dantas, A.M., Williams, D.A., Kemm, R.E. (2009). Promoting student-centered active learning in lectures with a personal response system. Advances in Physiology Education, 33(1), 60–71.

  8. Glover, J.A. (1989). The “testing” phenomenon: Not gone but nearly forgotten. Journal of Educational Psychology, 81(3), 392.

  9. Hanawalt, N.G., & Tarr, A.G. (1961). The effect of recall upon recognition. Journal of Experimental Psychology, 62(4), 361.

  10. Hogan, R.M., & Kintsch, W. (1971). Differential effects of study and test trials on long-term recognition and recall. Journal of Verbal Learning and Verbal Behavior, 10(5), 562–567.

  11. Hung, I.-C., Chen, N.-S., et al (2018). Embodied interactive video lectures for improving learning comprehension and retention. Computers & Education, 117, 116–131.

  12. Kang, S.H., McDermott, K.B., Roediger III, H.L. (2007). Test format and corrective feedback modify the effect of testing on long-term retention. European Journal of Cognitive Psychology, 19(4-5), 528–558.

  13. Kim, J., Glassman, E.L., Monroy-Hernández, A., Morris, M.R. (2015). Rimes: Embedding interactive multimedia exercises in lecture videos. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, New York, (pp. 1535–1544).

  14. Klašnja-Milićević, A., & Ivanović, M. (2018). Learning analytics–new flavor and benefits for educational environments. Informatics in Education, 17(2), 285–300.

  15. Klašnja-Milićević, A., Ivanović, M., Vesin, B., Budimac, Z. (2018). Enhancing e-learning systems with personalized recommendation based on collaborative tagging techniques. Applied Intelligence, 48(6), 1519–1535.

  16. Lin, Y.-L., Parra, D., Trattner, C., Brusilovsky, P. (2019). Tag-based information access in image collections: insights from log and eye-gaze analyses. Knowledge and Information Systems, 61, 1715–1742.

  17. Manzoor, H., Akhuseyinoglu, K., Wonderly, J., Brusilovsky, P., Shaffer, C.A. (2019). Crossing the borders: Re-use of smart learning objects in advanced content access systems. Future Internet, 11(7), 160.

  18. Mayer, R.E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., Bulger, M., Campbell, J., Knight, A., Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, 34(1), 51–57.

  19. McDaniel, M.A., Anderson, J.L., Derbish, M.H., Morrisette, N. (2007). Testing the testing effect in the classroom. European Journal of Cognitive Psychology, 19(4-5), 494–513.

  20. McDaniel, M.A., Kowitz, M.D., Dunay, P.K. (1989). Altering memory through recall: The effects of cue-guided retrieval processing. Memory & Cognition, 17(4), 423–434.

  21. McDaniel, M.A., & Masson, M.E. (1985). Altering memory representations through retrieval. Journal of Experimental Psychology: Learning, Memory, and Cognition, 11(2), 371.

  22. Morris, C.D., Bransford, J.D., Franks, J.J. (1977). Levels of processing versus transfer appropriate processing. Journal of Verbal Learning and Verbal Behavior, 16(5), 519–533.

  23. Ravizza, S.M., Uitvlugt, M.G., Fenn, K.M. (2017). Logged in and zoned out: How laptop internet use relates to classroom learning. Psychological Science, 28(2), 171–180.

  24. Triglianos, V., Labaj, M., Moro, R., Simko, J., Hucko, M., Tvarozek, J., Pautasso, C., Bielikova, M. (2017). Experiences using an interactive presentation platform in a functional and logic programming course. In Adjunct Publication of the 25th Conference on User Modeling, Adaptation and Personalization. ACM, New York, (pp. 311–316).

  25. Triglianos, V., & Pautasso, C. (2014). Interactive scalable lectures with ASQ. In International Conference on Web Engineering (ICWE). Springer, Cham, (pp. 515–518).

  26. Triglianos, V., Pautasso, C., Bozzon, A., Hauff, C. (2016). Inferring student attention with ASQ. In 11th European Conference on Technology Enhanced Learning (EC-TEL). Springer, Lyon.

  27. Triglianos, V., Praharaj, S., Pautasso, C., Bozzon, A., Hauff, C. (2017). Measuring student behaviour dynamics in a large interactive classroom setting. In Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization. ACM, New York, (pp. 212–220).

  28. Whitten II, W.B., & Bjork, R.A. (1977). Learning from tests: Effects of spacing. Journal of Verbal Learning and Verbal Behavior, 16(4), 465–478.

  29. Wood, E., Zivcakova, L., Gentile, P., Archer, K., De Pasquale, D., Nosko, A. (2012). Examining the impact of off-task multi-tasking with technology on real-time classroom learning. Computers & Education, 58(1), 365–374.

Acknowledgments

This work is a partial result of the collaboration within the SNF SCOPES JRP/IP project No. 160480/2015 between Switzerland, Slovakia, and Serbia.

Funding

This work was funded by the SNF SCOPES JRP/IP project No. 160480/2015 between Switzerland, Slovakia, and Serbia.

Author information

Contributions

All authors contributed to the study conception, design, material preparation, data collection and analysis. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Brankica Bratić.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Cite this article

Bratić, B., Triglianos, V., Kurbalija, V. et al. Role of interactive presentation platform ASQ in delivering web design course. Smart Learn. Environ. 7, 15 (2020). https://doi.org/10.1186/s40561-020-00123-w

Keywords

  • ASQ
  • Web design
  • Active learning
  • Interactive presentations