Student-content interactions: Exploring behavioural engagement with self-regulated inquiry-based online learning modules
Smart Learning Environments volume 10, Article number: 1 (2023)
Technological innovations and changing learning environments are influencing student engagement more than ever before. These changing environments are reshaping the constructs of student behavioural engagement online and require scrutiny to determine how to facilitate better student learning outcomes. In particular, recent literature offers few insights into how students engage and interact with online content in a self-regulated environment, given the absence of direct teacher support. This paper investigates how instructional design, informed by factors relating to behavioural engagement, can influence the student-content interaction process within the fabric of inquiry-based learning activities. Two online learning modules on introductory science topics were developed to facilitate students’ independent study in an asynchronous online environment. The study revealed that students showed a high commitment to engaging with and completing the tasks that required less manipulative and proactive effort during the learning process. The findings also revealed that instructional guidance significantly improved behavioural engagement for student groups with prior learning experience in science simulations and technology skills. This study highlights several issues concerning student engagement in a self-regulated online learning environment and offers possible suggestions for improvement. The findings may inform the practice of teachers and educators in developing online science modules applicable to inquiry-based learning.
Student engagement is a prerequisite for learning and central to any successful educational experience. Contemporary research relating to online learning environments (Garrison & Cleveland-Innes, 2005; Meyer, 2014) highlights the key role of engagement in effective learning. Researchers have endeavoured to define and understand various dimensions of student engagement that apply across various contexts (Bond et al., 2020). Some have defined student engagement as a ‘psychological process’ implicated in learning (Marks, 2000); others have conceptualised it by considering what behaviours count as engagement (Harris, 2008) and what constructs need to be considered to define them (Sinatra et al., 2015). Nonetheless, commonly identified and investigated dimensions of engagement found in the literature focus on the behavioural, cognitive, and emotional aspects of this phenomenon (Fredricks et al., 2004). Behavioural, cognitive, and emotional engagement often include multidimensional constructs and are highly influenced by context and defined by a given conceptual framework (Reeve et al., 2019; Schmidt et al., 2018). Whether it is the construct or context, it has been argued that a detailed level of specificity is required to measure and conceptualise student engagement (Sinatra et al., 2015).
Within an online learning context, student engagement and interactivity are difficult to capture in precise detail (Rojas et al., 2016). One of the reasons for this difficulty resides in the complex nature of the online environment and the nature of the task involved. The online environment may involve multiple dimensions (Anderson, 2008; Mayer, 2019) as variables (Fig. 1) and their combination requires careful consideration during instructional design.
A traditional didactic lecture might be defined by combining the far left-hand conditions of the continua in Fig. 1, whereas an online, open, inquiry-based learning (IBL) environment involving individual students might be described by a combination of the far right-hand conditions. Mayer (2004) presents a strong case for avoiding unstructured, unguided inquiry environments, where high cognitive load and lack of direction may result in negative outcomes for student learning.
Online students in remote, asynchronous, individual environments are likely to experience different interactions to those in face-to-face, teacher facilitated, synchronous contexts (such as the traditional classroom), and immediate individual feedback is easier to deliver in the latter. Also, an online environment offers a novel teaching and learning context which is highly influenced by the digital interface, available technologies, and the underpinning pedagogical design. Mayer (2019) proposes, after 30 years of research on online learning, that the instructional methods are central to student learning and are informed by a combination of behaviourist, cognitivist, and constructivist conceptions of learning. It is not the instructional media on its own that enables learning.
Key questions that educators might pose regarding students’ engagement in online contexts include: What do students engage with? When do students engage? And how do students use educational technology in their learning? (Ding et al., 2017; Dixson et al., 2017; Sheeran & Cummings, 2018). To answer these questions, educational institutions are primarily dependent on data from learning management system (LMS) analytics. LMS analytics readily capture quantitative engagement data such as the number of clicks, login times, submissions, or reads made by each student, as well as the total time spent on activities and the total number of completed tasks. However, while such analytics are conceptualised as indicators of student behavioural engagement, they are insufficient to define student engagement in detail, specifically the quality of the engagement related to learning. Researchers are keen to understand the nature of students’ behavioural engagement with technology resources while they study independently, and how the underlying pedagogical design influences students’ independent interactions during tasks. To address this issue, the following research question has been investigated within an inquiry-based learning context to enable further understanding of the nature of student exploration and interaction with the learning content:
To what extent does prior experience with interactive simulations influence student behavioural engagement during student-content interactions in the self-regulated inquiry-based online learning modules?
Inquiry-based learning pedagogies: disciplinary and contextual versatility
IBL has been described as a flexible pedagogical approach for active, student-centred forms of instruction in higher education, and its adoption is evident across all levels of education (Aditomo et al., 2013). Many consider IBL a pedagogy particularly relevant to the science, technology, engineering, and mathematics (STEM) and science education disciplines through its focus on laboratory learning (Abd-El-Khalick et al., 2004). It is also evident in practice across multiple other disciplines, such as psychology (MacKinnon, 2017), arts, humanities, and social sciences (Ahmad et al., 2014; Shih et al., 2010), vocational education (van der Graaf et al., 2020), and nursing and medical education (Rodríguez et al., 2019; Theobald & Ramsbotham, 2019). IBL has been used across a wide range of educational levels and contexts, from K-12 classrooms (Aditomo & Klieme, 2020; Kubicek, 2005) to undergraduate and graduate education (Chan & Pow, 2020; Lewis et al., 2021).
IBL approaches have been particularly prevalent in STEM and health education disciplines. IBL methods are praised for fostering authentic learning experiences in practice-based disciplines and are well suited to addressing the cognitive difficulties encountered in clinical practice (Levett-Jones et al., 2010; Tang & Sung, 2012). Research shows that IBL strategies promote group interaction and reflection on authentic practices (Horne et al., 2007) and provide an enjoyable learning experience (Kirwan & Adams, 2009). Recently, Theobald and Ramsbotham (2019) employed an IBL approach using a clinical reasoning framework with scaffolding elements to examine undergraduate nursing students' interactions and teachers’ teaching behaviours. They found that clinical reasoning scaffolds embedded within the IBL approach promote high levels of student engagement, and that the teacher plays a key role in creating a favourable IBL environment for students.
Sotiriou et al. (2020) showed that even in large-scale implementations at the school level, teachers can create individual inquiry scenarios and monitor students’ achievement when an IBL approach has been effectively integrated within the programme. Findings from this research showed that individual inquiry scenarios help high achievers more than other students in complex problem-solving scenarios. Spronken-Smith and Walker (2010) recommend that teachers carefully consider learning outcomes in relation to the level of instructional guidance provided during IBL: the teaching–research nexus can be strengthened through open, discovery-oriented inquiry, whereas highly structured activities scaffold the development of inquiry skills.
IBL pedagogies can facilitate multiple aspects of blended learning according to the instructional aims for student learning outcomes; integrating collaborative learning tools within IBL can create more effective teaching and learning processes involving student–student (S–S) interactions in higher education (Chan & Pow, 2020; Kopeinik et al., 2017).
Student diversity is an important consideration in instructional design. Laursen et al. (2014) studied the implementation of IBL in undergraduate mathematics courses and found that deep engagement and collaboration around ideas are the two key components contributing to students' active learning (Laursen et al., 2011). They also found that gender became an important variable in non-IBL courses, in which women gained lower mastery than men; these differences disappeared in IBL courses. This indicates that IBL approaches can potentially address inequities in courses that have historically limited access to learning for women. Archer-Kuhn (2020) examined the ways in which IBL has been utilised in higher education and how IBL approaches might be compatible with values that promote social justice, further arguing that IBL can uphold various social work principles and supports the linking of theory to practice during service-learning.
Research has incorporated modern technology tools and devices to bring the IBL approach into the online learning context. For example, in the vocational education context, van der Graaf et al. (2020) used eye-tracking to examine the integration of informational texts and virtual labs during inquiry-based learning in science. Results showed a higher learning gain in domain knowledge when students frequently integrated the informational texts and virtual labs in their virtual experiments, suggesting that such integration could compensate for the negative effects of lower prior knowledge. Becker et al. (2020) showed that mobile devices such as tablets, used for multimedia learning during physical experiments, enhance IBL processes. They further provided evidence that an IBL approach with multimedia integration leads to a significant reduction in extraneous cognitive load and greater conceptual knowledge of the subjects (Becker et al., 2020).
With all the disciplinary and contextual flexibility that inquiry-based pedagogy offers, there is ample evidence that when scaffolding support is given, students become actively involved in their learning. The research described above, however, is predominantly situated within the social constructivist paradigm, where interaction between peers and teachers is regarded as central. Student–teacher (S–T) and student–student (S–S) interaction, two facets of interaction theory and essential tenets of social constructivism, have received the majority of attention in the distance education literature (Bernard et al., 2009; Xiao, 2017). Student-content (S-C) interaction has received far less attention than it deserves in the interaction theory literature, especially when it comes to building a self-regulated learning environment in the absence of immediate human support (e.g., teachers, peers). Additional research is necessary to better understand student behavioural engagement in an independent online study environment. In this study, therefore, the S-C interaction process is explored further to understand students’ behavioural engagement while learning science concepts.
The lack of suitable pedagogical approaches has meant that researchers face significant challenges in developing an effective online learning environment for science inquiry (Lai et al., 2018). For instance, online environments are unable to deliver effective interaction or increase learning engagement without carefully planned learning tactics (Chen & Hwang, 2019). In order to create a productive atmosphere for the S-C interaction process, there is still a significant challenge in integrating technology to facilitate self-regulated learning processes (Lai & Hwang, 2021). Through incorporating an instructional scaffolding technique into the design of the intervention, this current study aims to overcome this problem and facilitate students’ self-regulation and behavioural engagement during science inquiry (Fig. 3).
In this study, we applied different levels of scaffolding support to examine student engagement with the learning content explicitly. The scaffolding framework is unique in that it gives researchers a focused lens through which to view how students actively engage with learning materials that emphasise inquiry and science learning. It represents an emerging pedagogical approach that helps researchers understand how teachers can design learning activities to encourage student self-regulation and engagement in an online environment.
The POEE scaffolding strategy is demonstrated in practice through two online learning modules on introductory science concepts that include simulation-based science inquiry. It also provides an outline for instructors to create a student-driven, independent online learning environment, and focuses on how guided inquiry facilitated by technology can support student interactions and engagement with learning content.
Behavioural engagement in the online context
Moore (1989) proposed three important interactions for online learning environments: student-content (S-C), student–teacher (S–T), and student–student (S–S) interaction. Moore's categorization has become a widely accepted framework for the study of the interrelationships between teacher, student, and content in an online environment. Student behavioural engagement inherently plays a key role in the effectiveness of these relationships.
In a traditional environment, the study of behavioural engagement relies on observation of student responses to physical and verbal cues provided by the teacher; however, these cues become less valuable in the online environment, where students do not necessarily engage directly with their teachers and peers as part of the learning process (Lei et al., 2019). In an asynchronous online context, S-C interactions become the key indicator of student behavioural engagement. While visual indicators of physical engagement are not as evident in the online learning process as in face-to-face learning (Lei et al., 2019), Vytasek et al. (2020) infer that tracking students’ digital artefacts can be used to indirectly understand their behavioural patterns. However, these analytics data often provide insufficient information to understand how students intrinsically regulate their behaviour or why they behave in a particular way during the S-C interaction.
In the online context, student behavioural engagement can be transacted either in an individual study space or in one that is socially oriented. Self-contained online modules or courses designed for self-directed study are common examples of learning activities in which students must engage individually. On the other hand, students might use the feedback and forum features of a learning management system to interact more socially with their teachers and peers (Baragash & Al-Samarraie, 2018). Within technologically mediated situations, this kind of engagement fosters social presence. Hong et al. (2019) argued that social presence demands active participation from the people involved in the online community. Research indicates that during collaborative tasks, students display interdependency and essentially synchronise their work through some level of time commitment (Romero & Lambropoulos, 2011; Yoo & Alavi, 2001). Furthermore, Yoo and Alavi (2001) found that group cohesion promoted students’ drive to be involved in collaborative tasks; however, this commitment is only possible when collaborative options are included in the online environment. In contrast, it is much more difficult to facilitate student engagement in an independent study space where no social interaction or collaborative tasks are available.
To better understand student behavioural engagement in the context of an online study environment without synchronous teacher (S–T) or peer (S–S) interactions, it is important to explore the nature of the student-content (S-C) interactions. Two primary aspects of online S-C interactions that have been explored in research are: (a) total time spent (time-on-task) on the activity, and (b) quality time spent (nature of student participation) in the learning process (Christenson et al., 2008; Ding et al., 2017). In their study, Brenner et al. (2017) considered both participation (such as the productive moves, clicks, and total tries) as well as time on task (such as total elapsed time) to determine the students’ behavioural engagement. Also, Romero and Barberà (2011) argued that both time-on-task and the quality of time spent could influence students’ academic performance. Therefore, in this study, we combine both time-on-task and quality time spent (or participation) on the tasks to conceptualize students’ behavioural engagement (see Fig. 2) during S-C interaction.
Previous studies have argued that several key behavioural engagement constructs need to be considered to understand students’ quality time spent in an online activity. Fredricks et al. (2004, 2016) concentrated on effort, persistence, attention, good conduct, and the absence of disruptive conduct to measure student behavioural engagement. Young (2010) argued that students with high effort and persistence generally exhibit high levels of behavioural engagement. However, it is undoubtedly more challenging to quantify students' good and disruptive behaviour in a remote learning environment. Fredricks et al. (2004) reported that students’ completion of a designated task is a sign of behavioural engagement. Additionally, a systematic and organised interaction process essentially provides a qualitative dimension to student engagement (Garrison & Cleveland-Innes, 2005). Therefore, in this study, students’ systematic efforts in the inquiry process are conceptualised as ‘systematic investigation’ and considered one of the important constructs for measuring students’ quality time spent on a task. In brief, the three constructs that define quality time spent by a student on a task are: persistence, systematic investigation, and task accomplishment (Fig. 2).
Instructional method design
Critically, it has been found previously that students have demonstrated poor participation when scaffolding or guidance has been absent during online learning (Tallent-Runnels et al., 2006). Therefore, educators are continually seeking a viable solution to delivering an effective, guided inquiry-based, online learning environment. In recent times, sophisticated technology has offered educators the opportunity to explore and create more sophisticated guided learning environments (Hong et al., 2019). However, Meyer (2014) recommended that a strong pedagogical design is required to create and structure the learning environment that makes what they need to do and achieve transparent for students.
The inquiry-based learning environment in science education is exploratory by nature; it requires active participation and self-regulation by students in the process of their knowledge construction (Sharples et al., 2015). Students are therefore encouraged to engage in a series of inquiry cycles, formulating their reasoning about the problem under investigation during the process (Pedaste et al., 2015). In creating an effective pedagogical design, educators often categorise the student learning process according to the phases of the inquiry cycle. One popular, long-standing pedagogical strategy employed within science education is the predict-observe-explain (POE) framework (White & Gunstone, 1992). The POE pedagogical framework supports instructional methods that enable students to work in phases: students predict a phenomenon, perform an observation, and then explain the observed findings in relation to their initial prediction (Bilen et al., 2016). Other studies have also reported that the POE framework can be used to change students’ initial misconceptions into correct ones (Ayvacı, 2013; Karamustafaoğlu & Mamlok-Naaman, 2015; Samsudin & Efendi, 2019), while supporting self-regulation (Al Mamun et al., 2020, 2022) in the inquiry process.
Consequently, the predict, observe, explain, and evaluate (POEE) pedagogical design, an extended version of POE, has been utilised in this study to provide a series of inquiry phases for student learning in an asynchronous, self-directed, online environment. The details of the development of this pedagogical design have been reported elsewhere (Al Mamun, 2018; Al Mamun et al., 2020). Figure 3 shows the schematic representation of the POEE pedagogical framework.
Under the POEE pedagogical design, emerging technologies such as interactive multimedia have been employed to promote higher quality S-C interactivity in terms of elicitation, exploration, explanation, and clarification of the concepts. Such multimodal technology, including dynamic and interactive representations, may help students to understand more complex science concepts (Bernard et al., 2009) and support increased student performance (Mayer et al., 2001).
In this study, two learning modules that cover the introductory science topics of Phase change and Heat have been used to illustrate how the POEE framework can be used to guide instructional design for online inquiry-based environments. Several POEE activities have been employed in each of the learning modules and examples are shared (Al Mamun et al., 2020; Al Mamun, 2022).
Multiple media in the form of videos and animations, including audio narration, sound effects, and sometimes music, were utilised to introduce dynamic representations of concepts linked to the text and embedded images. Interactive simulations were also a core learning object in the modules; they provided purely visual interactive experiences without embedded auditory media such as narration, sound effects, or music. The interactive simulations selected for inclusion in the modules were sourced from two popular websites that freely share science simulations, namely Physics Education Technology (PhET) interactive simulations (PhET, n.d.) and Molecular Workbench science simulations (Molecular Workbench, n.d.). Both platforms provide students with interactive and flexible experiences of science concepts at the molecular level. Such forms of multimedia technology integration in online environments can facilitate proximity between learners, teachers, and learning content, and can influence student engagement (Dyer et al., 2018). In addition, Miles et al. (2018) argue that delivering educational materials in multiple forms can facilitate student engagement and support effective navigation and utilisation of the materials.
Study context and participants
This study aimed to explore S-C interactions in a self-directed online environment and employed a mixed-methods research design. A group of 30 science students enrolled in first-year chemistry at a large Australian university was selected as the sample for this study. In general, sample sizes of 30 are considered adequate for a qualitative-data-dominant study and can achieve data saturation (Creswell, 2007). Small sample sizes in a qualitative study help researchers obtain detailed, in-depth experiential accounts of the phenomenon under study (Ryan et al., 2009); indeed, researchers often do not consider sample size in qualitative research (Onwuegbuzie & Leech, 2005). However, this study also quantified the qualitative data in order to conduct t-test and chi-square analyses. A small sample size can still satisfy the assumptions of t-test and chi-square analysis (Kim & Park, 2019; Poncet et al., 2016).
Due to ease of access, this study employed a convenience sampling technique to secure this cohort. All enrolled students received an invitation to participate in the study via the LMS (Blackboard), and only those who responded positively to the invitation were chosen to participate. Students had to give informed consent in order to participate. Two student groups were formed based on their self-reported prior learning experience with online simulations: experienced and non-experienced. An experienced learner, in this study, was conceptualised as a student who had worked with a science simulation in an online environment during their previous science learning. Figure 4 summarises the details of the participants and study context.
The two learning modules were offered to students in parallel with their formal coursework; that is, the activities were not required for their courses. Students participated voluntarily in learning from the modules, and they were aware that their performance would not be assessed: no marks counted towards their course grades upon completion of a module.
Data on the S-C interactions were collected through video recording, observation, and stimulated recall interviews, using a variety of tools. Students were required to participate in only one of the two available learning modules (either Phase Change or Heat), and participant IDs were formulated to indicate which module they had completed: an ID beginning ‘pxxx’ indicates the Phase Change module, and one beginning ‘hxxx’ indicates completion of the Heat module. At the beginning of the module, a short orientation showed the students the different components of the web-based learning module, such as the simulation models, videos, and other important elements. Each student was then left to work independently in a dedicated room. The student’s on-screen computer activities were recorded with the Echo360 software. Additionally, the researcher made observation notes on a student’s written responses and on-screen interactions from a remote location using Virtual Network Computing (VNC). Once a student finished a module, a stimulated recall interview was conducted to record the student’s immediate reflection on their experiences with the module (O’Brien, 1993). The video recording of the students’ on-screen activities and the researcher’s notes in combination provided the basis for this post-module interview. These data collection techniques focussed on exploring the different constructs of behavioural engagement: persistence, systematic investigation, and task accomplishment.
This study used both an inductive and theory driven thematic analysis approach to formulate themes from the data (Boyatzis, 1998; Braun & Clarke, 2006). The constructs of behavioural engagement originated in the relevant theories (described above in Fig. 2) while various sources documented in the literature review provided the basis of a rationale for formulating the construction of the themes. Thereafter, the students’ behavioural efforts, related to the identified themes, were quantified, and codified to measure the relative degree of influence those factors exerted on the interaction process.
Persistence is defined in the literature as a student’s continuous effort to overcome various challenges faced in the process of learning (Parker, 2003). Likewise, student persistence, in this study, refers to the student’s prolonged exploration of the simulation task in pursuit of understanding the science concepts, even though the consequences of this exploration might not contribute to their anticipated learning. Thus, student persistence was measured in this study as the combination of students’ time-on-task and their efforts to interact with the simulation activities. In contrast, systematic investigation denotes a strategic and organised investigation of a concept, contributing directly to achieving the anticipated learning. Finally, in combining the results of both the persistence and systematic investigation, students’ task accomplishment was assigned as either complete or incomplete. The codified data were then triangulated to explore how they impacted students’ behavioural engagement.
For each activity, a threshold time was defined in order to determine time-on-task (Al Mamun et al., 2022). Two distinct measures were combined to determine this threshold. First, the first author engaged in each activity themselves to determine how much time was needed to fully comprehend the intended concept from the interaction. Second, the researcher examined how long each participant spent on each activity and noted how long it typically took a student to understand the target concept during each encounter. The researcher's judgement was then merged with these observations of the students' engagement times to define the threshold time-on-task for a particular activity. We also took into account the students' attempts to make use of the virtual tools built into the simulation model during the inquiry process (Al Mamun et al., 2022). According to Vytasek et al. (2020), tracking students' digital artefacts can be utilised to deduce their behavioural patterns and interaction processes.
Systematic investigation, in turn, refers to the organised study of the concepts, i.e., a student tries to comprehend a topic by thoroughly exploring it while taking into account the available stimuli from the simulation environment. Research shows that students generally grasp a particular concept through this kind of investigation (Al Mamun et al., 2022). The details of the data analysis coding technique have been reported in other studies, in which the four key constructs of behavioural engagement shown in Fig. 2 were conceptualised (Al Mamun, 2018; Al Mamun et al., 2022); these studies, along with the current study, are part of a larger project. The two authors iteratively discussed and cross-checked the coding for reliability.
After completing the thematic analysis and quantification of the themes, statistical analyses were conducted to compare the data between the two groups of students. An independent-samples t-test was conducted to determine whether any observed difference in mean engagement time between the experienced and inexperienced student groups was significant. Pearson’s chi-square test of independence was conducted to gain further insight into any significant association between two categorical variables. A cross-tabulation of the data was formulated based on the observed values; the expected values come from the null hypothesis, i.e., that the distribution is independent of each categorical variable. Research suggests that the chi-square test can be conducted when the expected values of the contingency table cells are greater than 5 (Franke et al., 2012). For any significant association in a chi-square test on a contingency table larger than 2 × 2, Cramer’s V has been reported to indicate the strength of the association (Kline, 2013); a Cramer’s V below 0.26 is considered to indicate weak association (McHugh, 2012). Also, for a contingency table larger than 2 × 2, the source of a statistically significant result can be unclear, so a post hoc test is required to reveal where the significant result lies among the contingency table cells (Sharpe, 2015). For this, adjusted residuals, a procedure recommended over other post hoc alternatives, were used (MacDonald & Gardner, 2000). MacDonald and Gardner (2000) also suggest a Bonferroni correction in this process to reduce the chance of committing a Type I error; therefore, this study used the Bonferroni correction when reporting the adjusted p-value, to identify the cells that differ significantly from the expectation under the null hypothesis.
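The study ran these analyses in SPSS; the steps described above (chi-square test, Cramer’s V, adjusted residuals per cell, and a Bonferroni correction across cells) can be sketched equivalently in Python with SciPy. The contingency table below is purely hypothetical and only illustrates the shape of the computation:

```python
import numpy as np
from scipy import stats

def chi_square_with_posthoc(table):
    """Chi-square test of independence, with Cramer's V and a
    Bonferroni-corrected adjusted-residual post hoc test per cell."""
    table = np.asarray(table, dtype=float)
    chi2, p, dof, expected = stats.chi2_contingency(table)
    n = table.sum()

    # Cramer's V: association strength for tables larger than 2 x 2
    v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))

    # Adjusted (standardised) residuals for each cell
    row_p = table.sum(axis=1, keepdims=True) / n
    col_p = table.sum(axis=0, keepdims=True) / n
    adj_res = (table - expected) / np.sqrt(expected * (1 - row_p) * (1 - col_p))

    # Two-sided p-value per cell, Bonferroni-corrected over all cells
    cell_p = 2 * stats.norm.sf(np.abs(adj_res))
    adj_p = np.minimum(cell_p * table.size, 1.0)
    return chi2, p, v, adj_res, adj_p

# Hypothetical 2 x 3 table: experienced vs inexperienced students
# crossed with three engagement categories (illustration only).
table = [[12, 5, 3],
         [4, 8, 10]]
chi2, p, v, adj_res, adj_p = chi_square_with_posthoc(table)
```

Cells whose Bonferroni-corrected `adj_p` falls below 0.05 would be reported as the source of a significant overall chi-square result.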
Furthermore, when the number of observations was small and the expected frequency in any cell of the contingency table was less than 5, a more appropriate form of analysis, Fisher's Exact test, was used (Cochran, 1952). Research has shown that Fisher's Exact test is particularly useful for small samples (Bower, 2003). For this test, the categories were combined to form a 2 × 2 contingency table. For the 2 × 2 table, the Phi value was reported to indicate the strength of the association between the categories (Franke et al., 2012). All statistical analyses were performed using the Statistical Package for the Social Sciences (SPSS) software, with the significance threshold set at p = 0.05.
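For a 2 × 2 table with small expected counts, the analysis reduces to Fisher's Exact test plus the Phi coefficient. A minimal Python/SciPy sketch of this step, again on illustrative counts rather than the study's data, is:

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: instructional setting (guided / unguided)
# by persistence (high / low); counts are illustrative only.
table = np.array([[18,  4],
                  [ 7, 15]])

# Fisher's Exact test: an exact p-value, suitable for small samples
odds_ratio, p_exact = stats.fisher_exact(table, alternative="two-sided")

# Phi coefficient for a 2x2 table: sqrt(chi-square / N),
# with the chi-square computed without Yates' continuity correction
chi2, _, _, _ = stats.chi2_contingency(table, correction=False)
phi = float(np.sqrt(chi2 / table.sum()))
```

Because the p-value is exact rather than based on a chi-square approximation, no minimum-expected-count condition applies.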
Engagement time with the learning tasks
It was estimated by the researchers that the typical time for a student to complete each module would be 50 min. Despite the absence of direct or personal guidance, student engagement time with the learning modules was found to be satisfactory. The average engagement time ranged from 44 to 52 min for each learning module for both the experienced and inexperienced student groups. Table 1 displays the statistics of student engagement time obtained from the video records.
Table 1 indicates that the mean engagement time of the experienced group (M = 46.90, SD = 15.96) was lower than that of the inexperienced cohort (M = 50.50, SD = 21.64). However, the engagement times of the inexperienced group were more spread out than those of the experienced group, and the inexperienced group took longer initially to become familiar with the online environment. The observation and video-record data showed that inexperienced students generally spent an extended period (between 2 and 5 min) at the start of a module orienting themselves to the simulation environment. This prolonged initial familiarisation left less engagement time for exploring the target concepts. For example, one student initially struggled with a simulation activity that was intended to provide an opportunity to learn basic ideas relating to the states of matter, i.e., the solid, liquid and gaseous phases of a substance (see Fig. 5). During the interview, this student explained the reason for their initial difficulty:
I think I am trying to move it (the lid of the container) up. Whenever I moved it up, I saw the cursor goes away, oh! and I lost it. Also, it took me a little while to realize how the pump works (in the simulation model). [p207]
This confirms that the student had faced initial difficulties in understanding the functions of the simulation parameters (e.g., the use of the container lid to change the pressure, and the pump to increase the volume of the substance). Another interview example reveals a different student’s reasons for their initial difficulty.
It took me a bit of time to figure out how to work with the play (button) and then press the heat (button) up for a long time to get the temperature up. [p103]
This observation suggests that inexperienced students had trouble initially navigating the simulations and therefore they took longer to engage with the activity than the experienced group.
An independent-samples t-test suggested that there was no significant difference between the mean engagement times of the experienced and inexperienced student groups, t(28) = 0.486, p > 0.05. As shown in Table 2, both groups satisfied the assumption of homogeneity of variances (F = 0.498, p = 0.486).
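The group comparison reported here pairs Levene's test for homogeneity of variances with an independent-samples t-test. A minimal Python/SciPy sketch of that procedure, using illustrative engagement times rather than the study's raw data, is:

```python
from scipy import stats

# Illustrative engagement times in minutes (not the study's raw data)
experienced   = [44, 38, 52, 61, 30, 47, 55, 40, 49, 53, 36, 58, 45, 41, 50]
inexperienced = [50, 62, 35, 70, 28, 55, 48, 66, 42, 58, 33, 72, 46, 60, 52]

# Levene's test checks the equal-variances assumption first
lev_stat, lev_p = stats.levene(experienced, inexperienced)

# Pooled-variance t-test if variances are homogeneous,
# otherwise Welch's t-test (equal_var=False)
t_stat, t_p = stats.ttest_ind(experienced, inexperienced,
                              equal_var=lev_p > 0.05)
```

With 15 observations per group this gives the t(28) degrees of freedom reported above for the pooled-variance case.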
Student engagement time with the individual activities across the learning modules was explored further: a chi-square test of independence was conducted to ascertain whether there was any significant association between engagement time and the type of activity. A range of scaffolding strategies and activities was included in each module, described in depth elsewhere (Al Mamun et al., 2020; Al Mamun, 2022).
The chi-square test of independence, shown in Table 3, revealed a significant association between engagement time and the type of activity, χ2(4, N = 150) = 27.551, p < 0.05. Post hoc analysis revealed that, among the activity types, engagement time for open responses, feedback and videos differed significantly from the counts expected under the null hypothesis: videos and feedback attracted significantly higher engagement time, while open-response entries attracted significantly lower engagement time.
It should be noted that the simulations were presented as the central activities in each learning module, so it was hypothesised that they would attract the longest engagement time, but the data suggest otherwise. In interviews, students explained why they preferred the videos that were also included in the modules and why they engaged longer with the videos than with the simulations.
I love the videos because it does not require so much input on your part. You can just sit back and take it all visually. [h102]
I prefer video to simulation because it explains things in a very short way. [h204]
I think naturally anyone is happy to see the videos. It explained well, and it helped my understanding of the structures of the water molecules in different phases. [p206]
The data suggest that the videos were perceived as easier to understand and required no physical interaction from the students, i.e., no active S-C interaction. Students appeared happy, and probably intrinsically motivated, to engage with the videos because they could act receptively during the activities. The interviews also revealed that students spent time engaging with feedback because they were intrinsically motivated to find out whether their answers were correct or incorrect.
I like feedback. I think it makes understanding clear. [p207]
It was good to have that feedback and the little video afterwards. Now I know why I got it wrong, and I will not get it wrong again. [h101]
If I did not get the feedback and if I did not know the answer, I would just carry on without really understanding the concept. But because it allows you to answer and then give feedback on it, yeah, I think that is really helpful. [p103]
The above comments support the effectiveness of the feedback mechanism as scaffolding to engage students more deeply in activities, an outcome similar to that noted in a previous study (Mount et al., 2009).
Student effort applied to the task in different instructional settings
Persistence and systematic investigation were examined to identify students’ behavioural efforts during the S-C interaction process in three different instructional settings.
In Table 4, the chi-square result shows a statistically significant association between instructional settings and student persistence, χ2(2, N = 68) = 15.579, p < 0.05. Post hoc analysis confirmed that students showed significantly higher persistence in moderately guided activities and significantly lower persistence in minimal or open-ended instructional settings. Similarly, students tended to demonstrate more systematic investigation in guided activities than in unguided activities. However, the chi-square test showed that the association between instructional settings and students' systematic investigation was not statistically significant, χ2(2, N = 68) = 5.608, p > 0.05. Thus, students' systematic investigation was not directly influenced by the instructional guidance.
In brief, activities without instructional guidance were perceived as less effective by students. The original intention of the open and minimally guided activities was to support students' independent exploration and learning. Observation of behaviour in this study found that this strategy did not work well for students, a finding further supported by the data from the student interviews below:
It is not clear about the objective of this simulation. There should be clear instructions for the activities in the simulation (activity). [h206]
There are some parts (in simulation), need to do some activities but there are not enough instructions for me. So, I am struggling there. [h204]
The simulation was pretty hard to understand. Because I had to play around with the things myself. It will be better if somebody was voicing over or explaining it to me. [p205]
Additional specific insights into why the open exploration of simulations might have hindered students are provided in the more extended example of a student’s open exploration process below.
The simulation activity considered here was taken from the Heat topic module in which minimal guidance was strategically and deliberately offered. It represents the concept of thermal expansion at the molecular level (Fig. 6). The simulation has two important interactive tabs (functions) labelled ‘Heat’ and ‘Cool’ that enable the student to change the heat in the system. A student can initiate their independent exploration by clicking on either of these tabs.
One student [H103], during the interaction, was observed to continually attempt to increase the system heat by clicking on the ‘Heat’ tab, disregarding the ‘Cool’ tab which could have been used to reduce the system heat for comparison. In the interview, the student explained their behaviour:
I just heated it all the way to see how to get it to overflow (with the system heat). Because that was my intention. I did not think to cool down the system heat. [h103].
Students' exploration of the simulation model proved to be both beneficial and unproductive. For example, the student above sought to find out what might happen to an object when extreme heat was applied. Intuitively, the freedom to explore a simulation seems appealing, and this autonomy led the student to a new experience with the simulation model, perhaps supporting the construction of new knowledge about molecular behaviour. On the other hand, such freedom in exploration can reap unproductive results. In particular, overlooking the 'Cool' tab deprived the student of experiencing molecular behaviour at low temperature, probably leaving them with an incomplete understanding of the thermal expansion process; that is, the student missed the opportunity to observe the effect of low temperature on the behaviour of molecules.
This study also found that, despite the known benefits of guided activities, some students preferred the open nature of the activity. There was evidence of a belief that the simulation and its affordances were enough to support their self-exploration. A student in this category clarified their view in the post-module interview:
I think simulation itself can guide. The whole idea is kind of like making your way through … and playing around with all the concepts. Manipulate all these things and answer the questions, do what you want... you can do most things you like, kind of get yourself involved and learn at a deep level sometimes. [p207]
The ability to 'do what you want' was captivating for this type of student, who appeared keen to embark on self-exploration. This suggests that the implicit guidance arising from the learning environment, coupled with the consequences of exploration, met their requirements adequately.
The influence of prior simulation experience
The dichotomy in simulation experience described above was investigated further to establish whether the association between instructional settings and student persistence was influenced by prior simulation experience. Prior simulation experience was added as a control variable in the statistical analysis to ascertain its effect on students' levels of persistence and systematic investigation in the different instructional settings. Fisher's Exact test was appropriate here, as the expected frequency was below 5 in some cells of the contingency table required for a chi-square test. Therefore, a 2 × 2 contingency table was formed by combining moderate and strong guidance into a 'guided' category and open/minimal guidance into an 'unguided' category.
Table 5 indicates that, for the experienced student group, Fisher's Exact test showed statistically significant associations between instructional settings and student persistence (Exact Sig., 2-sided = 0.000, p < 0.05) and between instructional settings and systematic investigation (Exact Sig., 2-sided = 0.023, p < 0.05). The strength of these associations, measured by the Phi value, was strong for both persistence (0.589) and systematic investigation (0.389). In contrast, for the inexperienced student group, Fisher's Exact test showed that instructional settings were not significantly associated with persistence or systematic investigation. This result indicates that experienced students are more capable of utilising instructional guidance to engage meaningfully with the learning content in a self-directed environment. Overall, guided activities tended (as the percentage values indicate) to support higher student persistence and systematic investigation than activities that provided minimal support.
Students’ task completion rate
Based on the number of S-C interactions, the student task completion rate was found to be higher for videos (93.6%) compared to simulations (55.9%) and open response questions (51.3%), as illustrated in Table 6.
Table 6 shows that students exhibited reluctance to respond to open-ended questions, with a response rate of 51.3% (around half) for the inquiry questions in the learning modules. Interview data indicated that, for several students, an incomplete response could be attributed to their still-developing understanding and the resulting inability to explain the concept.
I was tweaking my mind (about the ideas) and sometimes it took longer time to do things. Obviously, the concepts were not concrete in my mind and so obviously the understanding was. [h102].
I guess that I kind of knew the concept, but I did not really know how to word them. I had some sort of idea in my head but actually articulating them scientifically was what I had difficulty with’ [h205]
This suggests that students struggled to interpret and reformulate their thoughts and ideas into precise explanations and therefore left these answers incomplete. The findings in Table 6, also supported by the interview data, further confirm those observed in the "Study context and participants" and "Data collection" sections, where students generally revealed a positive attitude towards the video activities (completion rate 93.6%). Altogether, these data suggest that the video format attracted higher student engagement, albeit receptively. The simpler and less technically demanding videos required lower manipulative effort, which enabled students to participate visually and, perhaps, supported their receptive understanding of the concepts (Al Mamun et al., 2020). As the simulation activities were the central component of the learning modules, students' task completion rates in the simulation activities across the three instructional settings were explored further.
The chi-square test of independence in Table 7 reveals a statistically significant association between instructional settings and students' task accomplishment, χ2(2, N = 68) = 11.274, p < 0.05. The post hoc analysis confirmed that the open-ended/minimally guided activities drove the significantly lower task accomplishment rate. In contrast, the analysis clearly suggests that the guided activities provided support and motivation for students to complete the tasks. This finding supports the earlier finding, discussed in detail in the "Data collection" section, that students' degree of effort was lower in open-ended exploratory tasks. Further, students' prior simulation experience was added as a control variable and Fisher's Exact test was conducted to understand how prior simulation experience affected students' task accomplishment rates.
In Table 8, Fisher's Exact test reports a statistically significant association (Sig., 2-sided = 0.000, p < 0.05) between instructional settings and task completion rate for the experienced student group. This again indicates that experienced students can best utilise the instructional settings in a self-directed environment.
Behavioural, cognitive, and emotional engagement are all important multidimensional constructs that are highly influenced by the learning context (Reeve et al., 2019; Schmidt et al., 2018). In this study, we have focussed on the behavioural engagement of students as they interacted autonomously with guided-IBL in science modules that were designed through the application of a POEE instructional model. The instructional design (Al Mamun et al., 2020) and findings related to student cognitive and emotional engagement as a function of the design have been described elsewhere (Al Mamun, 2022; Al Mamun et al., 2022).
Several factors affecting student behavioural engagement, focusing on S-C interactions, have been explored in this study in the context of an online learning environment. Another study reported elsewhere (Al Mamun et al., 2022) also explores behavioural components such as task completion and persistence in relation to student cognitive engagement and learning approaches. Based on the measures of different behavioural constructs reported in the literature (such as time on task and quality time spent) and the factors derived from the current study relating to students and content, a relationship model is proposed (Fig. 7).
In this model, student behavioural engagement was conceptualised based on the relationship between different engagement measures and engagement factors linked to the S-C interaction process. Measures of different behavioural engagement were distilled from research literature while engagement factors were conceptualised from the data originating in the S-C interaction process. The underlying factors relating to both students and content are illustrated in Fig. 7.
Factors affecting students’ engagement time and task completion rate
Previous studies report that in an online learning context, students may lack the motivation to engage with the content in the absence of teacher guidance (Fryer & Bovee, 2016). In this study, the students’ total engagement time with the two online learning modules was found to be satisfactory. This is likely due to the underlying POEE instructional design supporting students to regulate their learning through a series of inquiry phases (Al Mamun et al., 2020). Also, as students worked independently in the absence of direct teacher support, a sense of autonomy during their interactions might facilitate intrinsic motivation (Deci & Ryan, 1987). Higher engagement time has been shown to improve student performance in a range of learning environments, including online learning environments (Baragash & Al-Samarraie, 2018), blended learning environments (Raspopovic et al., 2014), and traditional classroom settings (Gromada & Shewbridge, 2016).
The mean engagement time of the more experienced student group was lower than that of the student group with no prior simulation experience. This observation appears to contradict previously published results in which experienced students tended to engage longer in utilising the available technology resources and were therefore able to engage more meaningfully in the learning process (Bates & Khasawneh, 2007). However, in the current study, the interactive simulations were provided as dynamic, interactive representations of science concepts, and it was observed that inexperienced students initially invested more time in becoming familiar with the functions and orienting themselves in the online environment before engaging cognitively with the activities. Experienced learners, in contrast, spent less time familiarising themselves with the environment and more time actively processing their understanding of the intended science concepts. It has been reported previously that when students expend most of their cognitive capacity on something extraneous, they often fail to engage meaningfully with the intended learning concepts using the capacity that remains (Mayer, 2019).
This study contributes further evidence that students who are inexperienced with simulations demonstrated lower behavioural effort in persistence and task accomplishment, likely because part of their cognitive capacity was consumed by learning the functions of the representations. They also failed to utilise effectively the instructional guidance provided in the self-directed environment in other activities. It may therefore initially appear that the strategy of providing multimedia representations is flawed; however, recent evidence suggests that the provision of multiple representations can succeed in reducing extraneous cognitive load while supporting conceptual knowledge gains (Becker et al., 2020). Therefore, future modifications of the instructional guidance should aim to reduce the extraneous processing involved in familiarisation with the environment by increasing signalling and applying the contiguity principle (Mayer, 2017). One strategy is to provide a brief narrated 'tour' of highlighted interactive functions, modelling how to notice changes in the simulations; further research is required to evaluate this form of intervention.
When considering individual forms of activity within the S-C interaction process, videos and feedback activities secured the highest time on task compared to the interactive simulations and open response activities. This aligns with a recent finding set in an open online course, undertaken by a large cohort, that a major proportion of the students (67%) focused almost exclusively on video lectures amongst all of the courses' components and activities (Kovanović et al., 2019). The findings in the current study similarly provided an explicit understanding that students were more engaged with video activities and self-reported that they did not need to engage in manipulative effort and active participation compared to the simulation activities and open responses where greater effort was perceived to be required. Thus, when students are engaged in video activities as part of the learning process, it might increase student satisfaction (Bhadani et al., 2017) and reap improved learning performance (Shen, 2014).
The greater task completion observed for videos in comparison to the simulations and open-response activities can be explained by the nature of the interactions required: videos typically engage students receptively rather than interactively. Previous studies support the notion that a key reason students are willing to dedicate time to a task and persist in completing it is the level of motivation aroused (Dev, 1997). In the online context, the psychological motivation factors accord with learners' interests, motivation, and positive attitudes toward learning (J. Lee et al., 2019). According to Mayer's dual-processing theory, watching videos can reduce cognitive load through the simultaneous use of auditory and visual channels (Mayer, 2005, 2017).
In contrast to video as a mode of content interaction, the simulation models used in this study engage only the visual channel to process information. Research shows that attention can be increased, and cognition promoted, if auditory media are employed successfully (Hughes et al., 2019). Thus, the lack of narration, sounds or music in the simulation models might hinder students from completing the task. However, some studies also raise concerns about potential cognitive overload when a variety of media types is used in instruction. Limited capacity theory cautions that information-processing channels have a limited capacity, and overloading these channels can impair cognition (Chandler & Sweller, 1991; Mayer & Moreno, 2003). This would suggest that learning content employing a variety of media could lead to cognitive overload (Hughes et al., 2019).
The simulation format already requires manipulative interactions and demands active engagement with the activity. This 'high element interactivity' can cause working-memory overload (Kehrwald & Bentley, 2020), demotivating students from engaging to complete the task (Lee et al., 2019). This form of intrinsic load is inherent in simulations because of their complexity; research confirms that increased complexity creates increased intrinsic load (Sweller, 1999). Thus, this area requires ongoing investigation to understand whether integrating auditory media would negatively affect student learning or promote student cognition.
The current study did not offer any extrinsic motivation in the form of summative marks or certification, so the absence of external motivators might also have contributed to students' low task accomplishment rate when a higher cognitive load was involved. In combination with intrinsic motivation, the rewards anticipated from task completion may stimulate a desire in students to engage deeply with a task. Research shows that extrinsic motivation alone, no matter how powerful, cannot ensure maximal learning (Payne, 2019). In fact, attempting to maximise learning outcomes directly through extrinsic rewards often leads to lower-quality motivation and performance (Ryan & Deci, 2000).
One strategy to reduce extraneous cognitive load is to introduce explicit instruction that improves the value of the simulation, such as a narrated interactive video orienting students to the simulation functions (Mayer, 2017). This is supported by the temporal contiguity principle, which pairs graphical movement with simultaneous narration describing it (Mayer, 2019). However, a balance needs to be struck between the freedom to explore, which makes students cognitively active, and the guidance required to channel that cognitive activity into meaningful construction of knowledge (Mayer, 2004). In his review of thirty years of research in online learning, Mayer (2019) favours guided activities and passive media, arguing that they can help keep students cognitively active during the learning process.
Further findings in this study reveal that students sustained their engagement for longer because immediate feedback was provided after they responded to concept questions. The feedback system helped students link the discrete knowledge they had constructed of a concept into a more comprehensive understanding. Indeed, in interviews most students were in favour of receiving immediate feedback while studying online. Studies show that when students are motivated, they spend quality time on online learning tasks (Romero & Barberà, 2011). Feedback can therefore contribute significantly to motivating students: they could ascertain whether their responses were right or wrong, adjust their understanding and continue. As a result, student engagement time was high for the feedback activities.
In contrast, students were observed to engage less in activities that required them to submit an open-response explanation of a concept. This activity required students to process their understanding cognitively and translate it into words when entering a response. They needed to use their working memory to synchronise the manipulative and cognitive processes involved in writing their responses. This might create high cognitive stress through the imposition of a higher cognitive workload, eventually leading to low engagement time with the open-response activities and low task accomplishment. Apart from the demand for physical input, a few other factors militated against students completing their answers, for example, a shallow understanding of the concepts and a cognitive inability to respond to the questions correctly. Research shows that cognitive ability is an important element in the completion of a learning task (Sweller, 1988; Sweller et al., 1998). As the findings of this study revealed, students were presumed to know the associated science concepts but failed to respond with adequate explanations; as a result, they most frequently left the answer incomplete. Module designers therefore need to tailor open-response activities by providing 'hints' that help students translate their ideas into scientifically correct explanations.
The role of affective factors in behavioural engagement in guided IBL online is attracting increasing attention. A recent quantitative study applying a predict, observe, explain inquiry-based model within an online learning environment (Hong et al., 2021) reports that student self-confidence increased, as did their critical-thinking attitude. The affordances of a guided IBL approach appear to outweigh its limitations; the latter can be addressed to some extent by careful scaffolding and orientation in the learning environment. This emphasises the multidimensionality of engagement constructs, which requires further exploration.
Student persistence and systematic investigation in the guided activities
The other important factors affecting students’ quality of time are persistence and systematic investigation. Students were more likely to demonstrate high persistence and systematic investigation in guided activities than they were in minimally guided or open-ended instructional settings. Previous studies support the notion that guided activities attract higher student engagement (Fisher, 2010; Mason, 2011). Significantly, a recent study in inquiry-based STEM education confirmed that the higher the provision of guidance in an online environment, the higher the commitment students demonstrated in engaging with an activity (Sergis et al., 2019).
This study found that, to some extent, open exploration often reaps positive results in the long run, as illustrated in the example described in the "Data collection" section. In such a space, an independent learner is intrinsically motivated to explore a simulation (Deci & Ryan, 1987). Students might find open exploration appealing because it allows a satisfying experiential learning experience, and when such an open environment is created, many students engage in productive exploration (Podolefsky et al., 2009).
Nonetheless, in the open exploration context, students were often observed to be unsuccessful in learning the underlying science concepts. In the example provided in the "Data collection" section, the student only raised the heat to observe a change; they could have lowered the temperature to zero to experience how molecules stop vibrating and completely freeze, an opportunity impossible to view in the real world. Herein resides a pedagogical conundrum: open exploration can lead students to acquire new information and construct new knowledge, yet they may not achieve the intended learning if they miss an opportunity. In offering a degree of latitude, only partial success may be realised. In fact, most previous studies reveal that inquiry learning without guidance is less successful (Alfieri et al., 2011; Clark et al., 2012; Kirschner et al., 2006; Lazonder, 2014; Luo, 2015). Additionally, open exploration in a technology-rich environment can create a high cognitive load, which can disadvantage the learner (Paas et al., 2003; Sweller, 1999). Moreover, students are often led to incorrect conclusions when left on their own to explore and use technology resources (Podolefsky et al., 2009). Therefore, a guided, scaffolded design is recommended to support students' effective learning in the IBL environment.
While set in a STEM discipline context, the findings of this study can be translated to a wider range of disciplinary contexts. Guided online IBL, informed by the POEE framework, involves a sequence of inquiry phases that can apply to any stimulus context in learning. For example, case studies offer authentic inquiry contexts and are popular in nursing and clinical education, the social sciences, business, law, pre-service teacher education and languages. Instructors should tailor the level of guidance and the scaffolding tools to their learning contexts; for example, the role of scaffolding in reducing cognitive load has been addressed in inquiry-based mobile learning in the context of a 5th grade social science field trip (Shih et al., 2010).
Prior experience influences student persistence and systematic investigation
The findings of this qualitative study revealed that prior simulation experience significantly improved students' level of persistence and systematic investigation in guided instructional settings. Students' observed behaviours in demonstrating high persistence and systematic investigation support the idea that, in the guided environment, students with prior experience can utilise educational resources better than their inexperienced peers. Previous studies show that experienced students are more successful in their use of a technology-mediated IBL environment (Lee et al., 2010; Pallant & Tinker, 2004). Moos and Azevedo (2008) further added that experienced students can engage in exploration meaningfully through a more discriminating selection of new resources from the technology-mediated environment. It is therefore unsurprising that experienced students demonstrate higher self-efficacy in a technology-rich environment (Cheng & Tsai, 2011) and commit to spending more time with the learning content (Bates & Khasawneh, 2007).
In contrast, Meyer (2014) argued that inexperienced learners are prone to a lack of engagement due to insufficient skills in this environment. In a technology environment, inexperienced students' cognitive capacity becomes depleted because a significant portion of their working memory is already consumed by familiarising themselves with the rich content prevalent in that environment (Kehrwald & Bentley, 2020).
Limitations and further research implications
The conceptual and empirical work cited above did not consider the multiple dimensions of student engagement; rather, it focused only on students' behavioural aspects. Studies show there are situations in which a student demonstrates high cognitive engagement yet commits affectively and behaviourally at lower levels. Similarly, a student can find a task important for learning, yet fail to capitalise on this understanding during interaction because the activity itself is not personally enjoyable or interesting (Schmidt et al., 2018). Likewise, a student can demonstrate strong behavioural engagement but invest less cognitive and affective effort, suggesting that the student completed the task but very likely did not learn much from the exercise. Studies also show that students with low cognitive engagement usually struggle to understand the concept and therefore adopt a surface-level approach, focusing on completing the task as a means to end the activity rather than striving to understand the concept at a deep level (Fredricks et al., 2004). Many such scenarios are possible, but this study did not consider the multidimensional engagement context, which would allow a more coordinated account of student engagement and learning. A future study may investigate students' emotional attributes, either separately or in combination with other engagement dimensions, to examine how their interests affect the interaction process.
This study used the POEE design framework to encourage students to become independent learners in an inquiry-based, self-directed learning environment. As revealed, the absence of any guidance likely resulted in less productive learning for some students. Nevertheless, strong guidance does not necessarily ensure the best learning experience either. A possible disadvantage of strongly guided support is that it might limit a student's autonomy in the learning process and reduce the chance of their becoming independent learners, a phenomenon explored in this study. This dilemma of balancing no guidance against over-guidance needs further exploration, so future studies might experiment with various pedagogical designs to address this issue.
This study focused only on elements of intrinsic motivation to examine students' engagement with learning tasks. Research shows that when extrinsic motivational components are appropriately combined with the learning process in parallel with intrinsic motivation, student engagement and learning achievement improve (Ryan & Deci, 2020). Thus, future studies could integrate extrinsic motivational factors to examine student engagement further during the S-C interaction process.
Previous studies show that sound (i.e., audio narration, music, etc.) can increase student attention to, and cognitive processing of, learning materials when it is effectively incorporated within an activity (Hughes et al., 2019). However, the science simulations used in this study lacked auditory media altogether; thus, the effect of auditory media during the S-C interaction process was not examined. We recommend that future studies incorporate auditory media into the science simulation models to examine its effect on the S-C interaction process.
Another potential direction for future studies resides in using the POEE design to gradually fade the degree of guidance in the learning activities, encouraging students to adopt more responsibility in the process of becoming independent learners. This design could provide novice learners with greater continuity in learning and lead them to develop coping mechanisms for interacting more productively with more complex online learning environments (Arbaugh, 2014). Research shows that when students experience similar activities repeatedly, they become more familiar with the technology resources and achieve a degree of control over the environment, thereby becoming more independent of instructional guidance (Li et al., 2019).
Finally, this study applied design principles with a small sample of undergraduate participants in a single context; hence, the findings contribute to the collective body of research evidence that informs practice, rather than being claimed as generalisable outcomes. Mayer (2017) proposes a research agenda that supports the inclusion of studies exploring the application of design principles to advance understanding of student engagement, behaviour and learning achievement using multimedia. Thus, the exploration of learning achievement and consideration of prior academic ability could form the basis of a larger-scale quantitative study applying the framework and modules to formal courses. This would help examine student learning achievement (high or low) while controlling for the effects of gender and ESCS (economic, social, and cultural status).
The underlying POEE scaffolding strategy implemented in the multimedia learning modules highlights the student-content interaction process within the paradigm of individual cognitive constructivism. As no teacher or peer support was included in this study context, the findings revealed several salient factors implicated in understanding the student-content interaction process in the self-directed, inquiry-based learning context. These factors are conceptualised to explain student behavioural engagement in this novel context and can support educators in creating learning environments conducive to students becoming independent learners. The relationship between the different measurable engagement criteria and student-content factors can further support educators in designing instructional strategies applicable to an effective self-directed learning environment.
Availability of data and materials
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Predict, observe, and explain
Predict, observe, explain, and evaluate
Learning management system
Science, technology, engineering, and mathematics
Physics education technology
Virtual networking computing
Statistical package for the social sciences
Abd-El-Khalick, F., Boujaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., & Tuan, H. L. (2004). Inquiry in science education: International perspectives. Science Education, 88(3), 397–419.
Aditomo, A., Goodyear, P., Bliuc, A. M., & Ellis, R. A. (2013). Inquiry-based learning in higher education: Principal forms, educational objectives, and disciplinary variations. Studies in Higher Education, 38(9), 1239–1258.
Aditomo, A., & Klieme, E. (2020). Forms of inquiry-based science instruction and their relations with learning outcomes: Evidence from high and low-performing education systems. International Journal of Science Education, 42(4), 504–525. https://doi.org/10.1080/09500693.2020.1716093.
Ahmad, A., et al. (2014). Inquiry-based learning for the arts, humanities, and social sciences: A conceptual and practical resource for educators. Emerald Group Publishing Limited. https://doi.org/10.1108/S2055-364120142.
Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1–18.
Al Mamun, M. A. (2018). The role of scaffolding in the instructional design of online, self-directed, inquiry-based learning environments: student engagement and learning approaches. PhD Thesis, The University of Queensland. https://doi.org/10.14264/uql.2018.607.
Al Mamun, M. A. (2022). Fostering self-regulation and engaged exploration during the learner-content interaction process: the role of scaffolding in the online inquiry-based learning environment. Interactive Technology and Smart Education, 19(4), 482–509. https://doi.org/10.1108/ITSE-11-2021-0195.
Al Mamun, M. A., Lawrie, G., & Wright, T. (2020). Instructional design of scaffolded online learning modules for self-directed and inquiry-based learning environments. Computers & Education, 144, 103695. https://doi.org/10.1016/j.compedu.2019.103695.
Al Mamun, M. A., Lawrie, G., & Wright, T. (2022). Exploration of learner-content interactions and learning approaches: The role of guided inquiry in the self-directed online environments. Computers & Education, 178, 104398. https://doi.org/10.1016/j.compedu.2021.104398.
Anderson, T. (2008). The theory and practice of online learning. AU Press.
Arbaugh, J. B. B. (2014). System, scholar or students? Which most influences online MBA course effectiveness? Journal of Computer Assisted Learning, 30(4), 349–362. https://doi.org/10.1111/jcal.12048.
Archer-Kuhn, B. (2020). Putting social justice in social work education with inquiry-based learning. Journal of Teaching in Social Work, 40(5), 431–448. https://doi.org/10.1080/08841233.2020.1821864.
Ayvacı, H. Ş. (2013). Investigating the effectiveness of predict-observe-explain strategy on teaching photo electricity topic. Journal of Baltic Science Education, 12(5), 548–564.
Baragash, R. S., & Al-Samarraie, H. (2018). Blended learning: Investigating the influence of engagement in multiple learning delivery modes on students’ performance. Telematics and Informatics, 35(7), 2082–2098. https://doi.org/10.1016/j.tele.2018.07.010.
Bates, R., & Khasawneh, S. (2007). Self-efficacy and college students’ perceptions and use of online learning systems. Computers in Human Behavior, 23(1), 175–191.
Becker, S., Klein, P., Gößling, A., & Kuhn, J. (2020). Using mobile devices to enhance inquiry-based learning processes. Learning and Instruction, 69, 101350. https://doi.org/10.1016/j.learninstruc.2020.101350.
Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. https://doi.org/10.3102/0034654309333844.
Bhadani, K., Stöhr, C., Hulthén, E., Quist, J., Bengtsson, M., Evertsson, M., & Malmqvist, J. (2017). Student perspectives on video-based learning in CDIO-based project courses. In The 13th International CDIO Conference Proceedings, Calgary, Canada (pp. 689–704). https://research.chalmers.se/publication/250948.
Bilen, K., Özel, M., & Köse, S. (2016). Using action research based on the predict-observe-explain strategy for teaching enzymes. Turkish Journal of Education (TURJE), 5(2), 72–81. https://doi.org/10.19128/turje.70576.
Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: A systematic evidence map. International Journal of Educational Technology in Higher Education, 17(2), 1–30. https://doi.org/10.1186/s41239-019-0176-8.
Bower, K. M. (2003). When to use Fisher’s Exact Test. American Society for Quality, 2(4), 35–37.
Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Sage.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa.
Brenner, D. G., Matlen, B. J., Timms, M. J., Gochyyev, P., Grillo-Hill, A., Luttgen, K., & Varfolomeeva, M. (2017). Modeling student learning behavior patterns in an online science inquiry environment. Technology, Knowledge and Learning, 22(3), 405–425.
Chan, J. W. W., & Pow, J. W. C. (2020). The role of social annotation in facilitating collaborative inquiry-based learning. Computers & Education, 147, 103787. https://doi.org/10.1016/j.compedu.2019.103787.
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the format of instruction. Cognition and Instruction, 8(4), 293–332. https://doi.org/10.1207/s1532690xci0804_2
Chen, P.-Y., & Hwang, G.-J. (2019). An IRS-facilitated collective issue-quest approach to enhancing students’ learning achievement, self-regulation and collective efficacy in flipped classrooms. British Journal of Educational Technology, 50(4), 1996–2013. https://doi.org/10.1111/BJET.12690.
Cheng, K.-H.H., & Tsai, C.-C.C. (2011). An investigation of Taiwan University students’ perceptions of online academic help seeking, and their web-based learning self-efficacy. The Internet and Higher Education, 14(3), 150–157.
Christenson, S. L., Reschly, A. L., Appleton, J. J., Berman-Young, S., Spanjer, D. M., & Varro, P. (2008). Best practices in fostering student engagement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (Vol. 5, pp. 1099–1105). National Association of School Psychologists.
Clark, R., Kirschner, P. A., & Sweller, J. (2012). Putting students on the path to learning: The case for fully guided instruction. American Educator, 36(1), 6–11.
Cochran, W. G. (1952). The χ2 test of goodness of fit. The Annals of Mathematical Statistics, 23(3), 315–345. https://doi.org/10.1214/aoms/1177729380.
Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five approaches (2nd ed.). Sage Publications Inc.
Deci, E. L., & Ryan, R. M. (1987). The support of autonomy and the control of behavior. Journal of Personality and Social Psychology, 53(6), 1024–1037.
Dev, P. C. (1997). Intrinsic motivation and academic achievement: What does their relationship imply for the classroom teacher? Remedial and Special Education, 18(1), 12–19. https://doi.org/10.1177/074193259701800104.
Ding, L., Kim, C. M., & Orey, M. (2017). Studies of student engagement in gamified online discussions. Computers and Education, 115, 126–142. https://doi.org/10.1016/j.compedu.2017.06.016.
Dixson, M. D., Greenwell, M. R., Rogers-Stacy, C., Weister, T., & Lauer, S. (2017). Nonverbal immediacy behaviors and online student engagement: Bringing past instructional research into the present virtual classroom. Communication Education, 66(1), 37–53. https://doi.org/10.1080/03634523.2016.1209222.
Dyer, T., Aroz, J., & Larson, E. (2018). Proximity in the online classroom: engagement, relationships, and personalization. Journal of Instructional Research, 7(1), 108–118. https://doi.org/10.9743/jir.2018.10.
Fisher, K. (2010). Online student engagement: CCSSE finds enrollment status and online experience are key. Community College Week, 22(20), 7–9.
Franke, T. M., Ho, T., & Christie, C. A. (2012). The Chi-square test: Often used and more often misinterpreted. American Journal of Evaluation, 33(3), 448–458. https://doi.org/10.1177/1098214011426594.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059.
Fredricks, J. A., Wang, M. T., Schall Linn, J., Hofkens, T. L., Sung, H., Parr, A., & Allerton, J. (2016). Using qualitative methods to develop a survey measure of math and science engagement. Learning and Instruction, 43, 5–15. https://doi.org/10.1016/j.learninstruc.2016.01.009.
Fryer, L. K., & Bovee, H. N. (2016). Supporting students’ motivation for e-learning: Teachers matter on and offline. Internet and Higher Education, 30, 21–29. https://doi.org/10.1016/j.iheduc.2016.03.003.
Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133–148. https://doi.org/10.1207/s15389286ajde1903.
Gromada, A., & Shewbridge, C. (2016). Student learning time. OECD Education Working Papers, August (pp. 1–66). https://doi.org/10.1787/5JM409KQQKJH-EN.
Harris, L. R. (2008). A phenomenographic investigation of teacher conceptions of student engagement in learning. The Australian Educational Researcher, 35(1), 57–79. https://doi.org/10.1007/bf03216875.
Hong, J. C., Hsiao, H. S., Chen, P. H., Lu, C. C., Tai, K. H., & Tsai, C. R. (2021). Critical attitude and ability associated with students’ self-confidence and attitude toward “predict-observe-explain” online science inquiry learning. Computers & Education, 166, 104172.
Hong, J. C., Tsai, C. R., Hsiao, H. S., Chen, P. H., Chu, K. C., Gu, J., & Sitthiworachart, J. (2019). The effect of the “Prediction-observation-quiz-explanation” inquiry-based e-learning model on flow experience in green energy learning. Computers and Education, 133, 127–138. https://doi.org/10.1016/j.compedu.2019.01.009.
Horne, M., Woodhead, K., Morgan, L., Smithies, L., Megson, D., & Lyte, G. (2007). Using enquiry in learning: From vision to reality in higher education. Nurse Education Today, 27(2), 103–112. https://doi.org/10.1016/j.nedt.2006.03.004.
Hughes, C., Costley, J., & Lange, C. (2019). The effects of multimedia video lectures on extraneous load. Distance Education, 40(1), 54–75. https://doi.org/10.1080/01587919.2018.1553559.
Karamustafaoğlu, S., & Mamlok-Naaman, R. (2015). Understanding electrochemistry concepts using the predict-observe-explain strategy. Eurasia Journal of Mathematics, Science and Technology Education, 11(5), 923–936.
Kehrwald, B. A., & Bentley, B. P. (2020). Understanding and identifying cognitive load in networked learning. In N. B. Dohn, P. Jandrić, T. Ryberg, & M. de Laat (Eds.), Mobility, data and learner agency in networked learning, research in networked learning (pp. 103–115). Springer. https://doi.org/10.1007/978-3-030-36911-8_7.
Kim, T., & Park, J. (2019). More about the basic assumptions of t-test: normality and sample size. Korean Journal of Anesthesiology, 72(4), 331–335. https://doi.org/10.4097/kja.d.18.00292.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential and inquiry-based teaching. Educational Psychologist, 41(2), 75–86. https://doi.org/10.1207/s15326985ep4102.
Kirwan, A., & Adams, J. (2009). Students’ views of enquiry-based learning in a continuing professional development module. Nurse Education Today, 29(4), 448–455. https://doi.org/10.1016/j.nedt.2008.09.003.
Kline, R. B. (2013). Beyond significance testing: Statistics reform in the behavioral sciences (2nd ed.). American Psychological Association. https://doi.org/10.1037/14136-000.
Kopeinik, S., Lex, E., Seitlinger, P., Albert, D., & Ley, T. (2017). Supporting collaborative learning with tag recommendations. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 409–418). https://doi.org/10.1145/3027385.3027421.
Kovanović, V., Joksimović, S., Poquet, O., Hennis, T., de Vries, P., Hatala, M., Dawson, S., Siemens, G., & Gašević, D. (2019). Examining communities of inquiry in Massive Open Online Courses: The role of study strategies. Internet and Higher Education, 40(2019), 20–43. https://doi.org/10.1016/j.iheduc.2018.09.001.
Kubicek, J. (2005). Inquiry-based learning, the nature of science, and computer technology: New possibilities in science education. Canadian Journal of Learning and Technology, 31(1), 1–13.
Lai, C.-L., & Hwang, G.-J. (2021). Strategies for enhancing self-regulation in e-learning: A review of selected journal publications from 2010 to 2020. Interactive Learning Environments. https://doi.org/10.1080/10494820.2021.1943455.
Lai, C.-L., Hwang, G.-J., & Tu, Y.-H. (2018). The effects of computer-supported self-regulation in science inquiry on learning outcomes, learning processes, and self-efficacy. Educational Technology Research and Development, 66(4), 863–892. https://doi.org/10.1007/s11423-018-9585-y.
Laursen, S. L., Hassi, M.-L., Kogan, M., Hunter, A.-B., & Weston, T. J. (2011). Evaluation of the IBL mathematics project: Student and instructor outcomes of inquiry-based learning in college mathematics. https://www.colorado.edu/eer/sites/default/files/attached-files/iblmathreportall_050211.pdf.
Laursen, S. L., Hassi, M.-L., Kogan, M., & Weston, T. J. (2014). Benefits for women and men of inquiry-based learning in college mathematics: A multi-institution study. Journal for Research in Mathematics Education, 45(4), 406–418. https://doi.org/10.5951/jresematheduc.45.4.0406.
Lazonder, A. W. (2014). Inquiry learning. In M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 453–464). Springer. https://doi.org/10.1007/978-1-4614-3185-5_36.
Lee, H. S., Linn, M. C., Varma, K., & Liu, O. L. (2010). How do technology-enhanced inquiry science units impact classroom learning? Journal of Research in Science Teaching, 47(1), 71–90. https://doi.org/10.1002/tea.20304.
Lee, J., Song, H. D., & Hong, A. J. (2019). Exploring factors, and indicators for measuring students’ sustainable engagement in e-learning. Sustainability (switzerland), 11(4), 985. https://doi.org/10.3390/su11040985.
Lei, M., Clemente, I. M., & Hu, Y. (2019). Student in the shell: The robotic body and student engagement. Computers and Education, 130, 59–80. https://doi.org/10.1016/j.compedu.2018.11.008.
Levett-Jones, T., Hoffman, K., Dempsey, J., Jeong, S.Y.-S., Noble, D., Norton, C. A., Roche, J., & Hickey, N. (2010). The ‘five rights’ of clinical reasoning: An educational model to enhance nursing students’ ability to identify and manage clinically ‘at risk’ patients. Nurse Education Today, 30(6), 515–520. https://doi.org/10.1016/j.nedt.2009.10.020.
Lewis, C., Wolff, K., & Bekker, B. (2021). Supporting project-based education through a community of practice: A case of postgraduate renewable energy students. World Transactions on Engineering and Technology Education, 19(1), 35–40.
Li, H., Gobert, J., & Dickler, R. (2019). Testing the robustness of inquiry practices once scaffolding is removed. In Lecture notes in computer science (including subseries lecture notes in artificial intelligence and lecture notes in bioinformatics), 11528 LNCS (pp. 204–213). https://doi.org/10.1007/978-3-030-22244-4_25.
Luo, T. (2015). Instructional guidance in microblogging-supported learning: Insights from a multiple case study. Journal of Computing in Higher Education, 27(3), 173–194. https://doi.org/10.1007/s12528-015-9097-2.
MacDonald, P. L., & Gardner, R. C. (2000). Type I error rate comparisons of post hoc procedures for I × J chi-square tables. Educational and Psychological Measurement, 60(5), 735–754. https://doi.org/10.1177/00131640021970871.
MacKinnon, S. L. (2017). “The Curiosity Project”: Re-igniting the desire to inquire and transformation through intrinsically-motivated learning and mentorship. Journal of Transformative Learning, 4(1), 4–21.
Marks, H. M. (2000). Student engagement in instructional activity: Patterns in the elementary, middle, and high school years. American Educational Research Journal, 37(1), 153–184. https://doi.org/10.3102/00028312037001153.
Mason, R. B. (2011). Student engagement with, and participation in, an e-Forum. Educational Technology & Society, 14(2), 258–268.
Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14–19. https://doi.org/10.1037/0003-066X.59.1.14.
Mayer, R. E. (2005). Cognitive theory of multimedia learning. In R. E. Mayer (Ed.), The Cambridge Handbook of Multimedia Learning (2nd ed.). Cambridge University Press.
Mayer, R. E. (2017). Using multimedia for e-learning. Journal of Computer Assisted Learning, 33(5), 403–423. https://doi.org/10.1111/jcal.12197.
Mayer, R. E. (2019). Thirty years of research on online learning. Applied Cognitive Psychology, 33(2), 152–159. https://doi.org/10.1002/acp.3482.
Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93(1), 187–198. https://doi.org/10.1037/0022-0663.93.1.187.
Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational Psychologist, 38(1), 43–52. https://doi.org/10.1207/S15326985EP3801_6.
McHugh, M. L. (2012). The Chi-square test of independence. Biochemia Medica, 23(2), 143–149. https://doi.org/10.11613/BM.2013.018.
Meyer, K. A. (2014). Student engagement in online learning: What works and why. ASHE Higher Education Report, 40(6), 1–114. https://doi.org/10.1002/aehe.20018.
Miles, D., Mensinga, J., & Zuchowski, I. (2018). Harnessing opportunities to enhance the distance learning experience of MSW students: An appreciative inquiry process. Social Work Education, 37(6), 705–717.
Molecular Workbench. (n.d.). Next-generation molecular workbench. Visual, interactive simulations for teaching & learning science (Vol. 2014). The Concord Consortium. http://mw.concord.org/nextgen/.
Moore, M. G. (1989). Editorial: Three types of interaction. American Journal of Distance Education, 3(2), 1–7. https://doi.org/10.1080/08923648909526659.
Moos, D. C., & Azevedo, R. (2008). Self-regulated learning with hypermedia: The role of prior domain knowledge. Contemporary Educational Psychology, 33(2), 270–298. https://doi.org/10.1016/j.cedpsych.2007.03.001.
Mount, N. J., Chambers, C., Weaver, D., & Priestnall, G. (2009). Learner immersion engagement in the 3D virtual world: Principles emerging from the DELVE project. Innovation in Teaching and Learning in Information and Computer Sciences, 8(3), 40–55. https://doi.org/10.11120/ital.2009.08030040.
O’Brien, J. (1993). Action research through stimulated recall. Research in Science Education, 23(1), 214–221. https://doi.org/10.1007/BF02357063.
Onwuegbuzie, A., & Leech, N. (2005). On becoming a pragmatic researcher: The importance of combining quantitative and qualitative research methodologies. International Journal of Social Research Methodology: Theory and Practice, 8(5), 375–387. https://doi.org/10.1080/13645570500402447.
Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1–4.
Pallant, A., & Tinker, R. F. (2004). Reasoning with atomic-scale molecular dynamic models. Journal of Science Education and Technology, 13(1), 51–66. https://doi.org/10.1023/B:JOST.0000019638.01800.d0.
Parker, A. (2003). Identifying predictors of academic persistence in distance education. United States Distance Learning Association Journal, 17(1), 55–62.
Payne, L. (2019). Student engagement: Three models for its investigation. Journal of Further and Higher Education, 43(5), 641–657. https://doi.org/10.1080/0309877X.2017.1391186.
Pedaste, M., Mäeots, M., Siiman, L. A., de Jong, T., van Riesen, S. A. N., Kamp, E. T., Manoli, C. C., Zacharia, Z. C., & Tsourlidaki, E. (2015). Phases of inquiry-based learning: Definitions and the inquiry cycle. Educational Research Review, 14, 47–61. https://doi.org/10.1016/j.edurev.2015.02.003.
PhET. (n.d.). States of matter: Basics. In PhET interactive simulations. University of Colorado. https://phet.colorado.edu/en/simulation/states-of-matter-basics.
Podolefsky, N. S., Adams, W. K., & Wieman, C. E. (2009). Student choices when learning with computer simulations. AIP Conference Proceedings, 1179(2009), 229–232. https://doi.org/10.1063/1.3266722.
Poncet, A., Courvoisier, D. S., Combescure, C., & Perneger, T. V. (2016). Normality and sample size do not matter for the selection of an appropriate statistical test for two-group comparisons. Methodology, 12(2), 61–71. https://doi.org/10.1027/1614-2241/a000110.
Raspopovic, M., Jankulovic, A., Runic, J., & Lucic, V. (2014). Success factors for e-Learning in a developing country: A case study of Serbia. International Review of Research in Open and Distance Learning, 15(3), 1–23. https://doi.org/10.19173/irrodl.v15i3.1586.
Reeve, J., Cheon, S. H., & Jang, H.-R. (2019). A teacher-focused intervention to enhance students’ classroom engagement. In Handbook of student engagement interventions (pp. 87–102). Elsevier Inc. https://doi.org/10.1016/b978-0-12-813413-9.00007-3.
Rodríguez, G., Pérez, N., Núñez, G., Baños, J. E., & Carrió, M. (2019). Developing creative and research skills through an open and interprofessional inquiry-based learning course. BMC Medical Education. https://doi.org/10.1186/S12909-019-1563-5.
Rojas, D., Kapralos, B., & Dubrowski, A. (2016). The role of game elements in online learning within health professions education. In Studies in health technology and informatics (Vol. 220, pp. 329–334). IOS Press. https://doi.org/10.3233/978-1-61499-625-5-329.
Romero, M., & Barberà, E. (2011). Quality of learners’ time and learning performance beyond quantitative time-on-task. The International Review of Research in Open and Distance Learning, 12(5), 125–137.
Romero, M., & Lambropoulos, N. (2011). Internal and external regulation to support knowledge construction and convergence in computer supported collaborative learning (CSCL). Electronic Journal of Research in Education Psychology, 9(1), 309–330. https://doi.org/10.25115/ejrep.v9i23.1439.
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55(1), 68–78. https://doi.org/10.1037/0003-066X.55.1.68.
Ryan, R. M., & Deci, E. L. (2020). Intrinsic and extrinsic motivation from a self-determination theory perspective: Definitions, theory, practices, and future directions. Contemporary Educational Psychology, 61, 101860. https://doi.org/10.1016/j.cedpsych.2020.101860.
Ryan, F., Coughlan, M., & Cronin, P. (2009). Interviewing in qualitative research: The one-to-one interview. International Journal of Therapy & Rehabilitation, 16(6), 309–314.
Samsudin, A., & Efendi, R. (2019). Teaching solar system topic through Predict-Observe-Explain-Apply (POEA) strategy: A path to students’ conceptual change. Journal of Education and Teacher Training, 4(1), 1–15. https://doi.org/10.24042/tadris.v4i1.3658.
Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science: Examining learning activities and choice. Journal of Research in Science Teaching, 55(1), 19–43.
Sergis, S., Sampson, D. G., Rodríguez-Triana, M. J., Gillet, D., Pelliccione, L., & de Jong, T. (2019). Using educational data from teaching and learning to inform teachers’ reflective educational design in inquiry-based STEM education. Computers in Human Behavior, 92, 724–738. https://doi.org/10.1016/j.chb.2017.12.014.
Sharpe, D. (2015). Chi-square test is statistically significant: Now what? Practical Assessment, Research, and Evaluation, 20(8). https://doi.org/10.7275/tbfa-x148.
Sharples, M., Scanlon, E., Ainsworth, S., Anastopoulou, S., Collins, T., Crook, C., Jones, A., Kerawalla, L., Littleton, K., Mulholland, P., & O'Malley, C. (2015). Personal inquiry: Orchestrating science investigations within and beyond the classroom. Journal of the Learning Sciences, 24(2), 308–341. https://doi.org/10.1080/10508406.2014.944642.
Sheeran, N., & Cummings, D. J. (2018). An examination of the relationship between Facebook groups attached to university courses and student engagement. Higher Education, 76, 937–955. https://doi.org/10.1007/s10734-018-0253-2.
Shen, W. (2014). Using video recording system to improve student performance in high-fidelity simulation. Lecture Notes in Electrical Engineering LNEE, 269, 1753–1757. https://doi.org/10.1007/978-94-007-7618-0_203.
Shih, J. L., Chuang, C. W., & Hwang, G. J. (2010). An inquiry-based mobile learning approach to enhancing social science learning effectiveness. Educational Technology & Society, 13(4), 50–62.
Sinatra, G. M., Heddy, B. C., & Lombardi, D. (2015). The challenges of defining and measuring student engagement in science. Educational Psychologist. https://doi.org/10.1080/00461520.2014.1002924.
Sotiriou, S. A., Lazoudis, A., & Bogner, F. X. (2020). Inquiry-based learning and E-learning: How to serve high and low achievers. Smart Learning Environments, 7(1), 29. https://doi.org/10.1186/s40561-020-00130-x.
Spronken-Smith, R., & Walker, R. (2010). Can inquiry-based learning strengthen the links between teaching and disciplinary research? Studies in Higher Education, 35(6), 723–740. https://doi.org/10.1080/03075070903315502.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285. https://doi.org/10.1207/s15516709cog1202_4.
Sweller, J. (1999). Instructional design in technical areas. ACER Press.
Sweller, J., Van Merrienboer, J. J. G., & Paas, F. G. W. C. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296.
Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S. M., & Liu, X. (2006). Teaching courses online: A review of the research. Review of Educational Research, 76(1), 93–135.
Tang, L.-C., & Sung, H.-C. (2012). The effectiveness of problem-based learning on nursing students' critical thinking: A systematic review. JBI Database of Systematic Reviews and Implementation Reports, 10(57), 3907–3916. https://doi.org/10.11124/01938924-201210570-00005.
Theobald, K. A., & Ramsbotham, J. (2019). Inquiry-based learning and clinical reasoning scaffolds: An action research project to support undergraduate students’ learning to ‘think like a nurse.’ Nurse Education in Practice, 38, 59–65. https://doi.org/10.1016/j.nepr.2019.05.018.
van der Graaf, J., Segers, E., & de Jong, T. (2020). Fostering integration of informational texts and virtual labs during inquiry-based learning. Contemporary Educational Psychology, 62, 101890. https://doi.org/10.1016/j.cedpsych.2020.101890.
Vytasek, J. M., Patzak, A., & Winne, P. H. (2020). Analytics for student engagement. In M. Virvou, E. Alepis, G. Tsihrintzis, & L. Jain (Eds.), Machine learning paradigms intelligent systems reference library (Vol. 158, pp. 23–48). Springer. https://doi.org/10.1007/978-3-030-13743-4_3.
White, R., & Gunstone, R. (1992). Probing understanding. The Falmer Press.
Xiao, J. (2017). Learner-content interaction in distance education: The weakest link in interaction research. Distance Education, 38(1), 123–135. https://doi.org/10.1080/01587919.2017.1298982.
Yoo, Y., & Alavi, M. (2001). Media and group cohesion: Relative influences on social presence, task participation, and group consensus. MIS Quarterly: Management Information Systems, 25(3), 371–390. https://doi.org/10.2307/3250922.
Young, M. R. (2010). The art and science of fostering engaged learning. Academy of Educational Leadership Journal, 14(S1), 1–18.
The corresponding author acknowledges the supervisory support of Dr Tony Wright and the editorial support of Dr Michael Boyle in writing the initial draft of the manuscript.
This research is funded by the Australian Government Research Training Programme Scholarship at the University of Queensland, Australia.
The study was approved by the UQSE Research Ethics Committee at the University of Queensland (approval number 14-025).
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Al Mamun, M.A., Lawrie, G. Student-content interactions: Exploring behavioural engagement with self-regulated inquiry-based online learning modules. Smart Learn. Environ. 10, 1 (2023). https://doi.org/10.1186/s40561-022-00221-x