
The community of inquiry as a tool for measuring student engagement in blended massive open online courses (MOOCs): a case study of university students in a developing country


While massive open online courses (MOOCs) promise to democratise access to education, the literature reveals a nuanced understanding of engagement in these settings, especially in resource-constrained environments. Blended MOOCs combine MOOC content and instruction with physical classroom settings. This study extends this discourse by focusing on blended MOOCs, which remain under-explored in the context of developing countries. The blended MOOC at the University of Cape Coast (UCC), Ghana, uses third-party MOOCs as open educational resources (OERs) integrated with campus-based courses. UCC students have been using such blended MOOCs since 2016, when all level 100 students were mandated to enrol in a course entitled Information Technology Skills (ITS101). ITS101 is aligned to courses on a MOOC platform called Alison, used as an OER. Students' engagement is key to their continued use of and satisfaction with online learning modes such as MOOCs. However, among all the e-learning modes, students' engagement is lowest in MOOCs, leading to high dropout rates. Blended MOOCs are one of the techniques recommended to reverse the undesirable outcomes of MOOCs, including low engagement. However, few studies have examined students' engagement in blended MOOCs, especially among university students in sub-Saharan Africa who use MOOCs as OERs. Thus, this paper aims to measure student engagement in blended MOOCs using the revised Community of Inquiry for university students in a developing country. The rationale is to determine whether the framework's factors affect engagement positively or negatively. A two-stage cluster sampling technique was used to select participants. A list of blended MOOC classes offered at UCC was obtained from the staff mailing list. In the first stage, academic levels (100, 200, 300 and 800) were randomly selected from the strata using a lottery sampling technique.
In the second stage, blended MOOC courses or classes were randomly selected within each chosen academic level. All students in the selected classes were then included in the study. Partial Least Squares Structural Equation Modelling was used to validate the model of the predictive relationships among the four presences (cognitive, learning, social and teaching) and engagement. Results from the structural model analysis revealed statistically significant predictive relationships among the constructs within the model. Learning presence had the most significant effect on student engagement. Thus, it should be included as one of the presences in the community of inquiry.
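The two-stage cluster sampling procedure described above (random levels first, then random classes within each level, then every student in the selected classes) can be sketched in Python. The roster, course names and function below are hypothetical illustrations only, not the study's actual data:

```python
import random

# Hypothetical roster: academic levels -> blended-MOOC classes -> enrolled students.
# All names here are illustrative, not the study's real classes or students.
roster = {
    100: {"ITS101-A": ["s1", "s2"], "ITS101-B": ["s3", "s4"]},
    200: {"MOOC201": ["s5", "s6"]},
    300: {"MOOC301": ["s7"]},
    800: {"MOOC801": ["s8", "s9"]},
}

def two_stage_cluster_sample(roster, n_levels, n_classes_per_level, seed=42):
    """Stage 1: lottery-select academic levels.
    Stage 2: randomly select classes within each chosen level.
    All students in the selected classes (whole clusters) are included."""
    rng = random.Random(seed)
    levels = rng.sample(sorted(roster), k=n_levels)  # stage 1
    participants = []
    for level in levels:
        classes = rng.sample(
            sorted(roster[level]),
            k=min(n_classes_per_level, len(roster[level])),  # stage 2
        )
        for cls in classes:
            participants.extend(roster[level][cls])  # include whole cluster
    return participants

print(two_stage_cluster_sample(roster, n_levels=2, n_classes_per_level=1))
```

Because whole classes are taken once a cluster is drawn, the sample size is determined by class enrolments rather than fixed in advance, which matches the procedure reported above.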


Envision an educational setting wherein students are actively engaged in their learning process, collaborating with peers and interacting with their instructors locally and, across various timeframes, globally on a massive scale. This scenario is the promise of blending massive open online courses (MOOCs) with campus-based courses, which offers the potential to integrate the advantages of MOOCs, such as affordability, flexibility and accessibility, with the interactive and social components inherent in traditional campus-based face-to-face (F2F) instruction (Almutairi & White, 2018; Edumadze et al., 2022). For MOOCs, affordability concerns the fee component, which is typically free or very low-cost for certificates or specific course materials; flexibility concerns self-pacing, allowing learners to progress at their own speed within enrolment windows or deadlines; and accessibility concerns reaching MOOC materials from any location with a stable internet connection, using a wide range of devices such as phones, tablets and computers. This blended learning can transform the educational landscape of developing nations, especially at the tertiary level (Kruse & Pongratz, 2017), though it is also applicable at the secondary level (Koutsakas et al., 2020). Thus, MOOCs' scalability and adaptability across different educational levels are emphasised. The capacity of MOOCs to simultaneously give the same experience to tens of thousands of students breaks the pattern of conventional university education, with the potential to expand access to education and decrease educational costs (Groves, 2012). Increasing access has been the goal of all universities (Vieira et al., 2020), and any EdTech solution that "rides on the wings" of the internet can attain global reach (UNICEF Office of Innovation, 2022). Moreover, the cost of "bricks and mortar" education keeps rising, and student debt has a 'swallowing effect'.
As such, any mode of education that combines high-quality instructional delivery at the barest minimum cost has the potential to be embraced by education seekers. That is why MOOCs should occupy the attention of all learners, educators, administrators and policy-makers. McNutt (2013) quotes journalist Fareed Zakaria's conversation with the prime minister of a large developing nation. This prime minister said delivering wireless internet to every region of his country would make higher education accessible. He would then tell students to attend free online courses from American colleges, such as MOOCs, so that more people could acquire higher education. Though the prime minister oversimplified the solution, using MOOCs to increase access to tertiary education is an option that all must explore. Introducing teachers and students to the continuous use of innovative technology such as MOOCs is a way of leveraging technology in education, especially in cash-strapped institutions such as those in the global South, including Ghana. As a platform for the online dissemination of academic content, MOOCs afford students unprecedented access to high-quality, engaging educational materials.

A MOOC is a method for distributing educational content via the Internet to anyone who desires to participate, with no restriction on the number of participants (Educause Learning Initiative, 2011). MOOCs are classes intended for many students (often hundreds of thousands) and made available anytime, anywhere, on any internet-driven device. They provide a free online educational experience, regardless of learners' prior entry qualifications, training or education (OpenupEd, 2015). MOOCs open a world of educational possibilities for learners and lifelong learners alike, particularly in developing countries. Because of MOOCs' unlimited open access, a new breed of open educational resources (OERs) has emerged (Qaffas et al., 2020). Higher education institutions (HEIs) are embracing this new breed in their on-campus courses (Kloos et al., 2015; Zhang, 2013). The embrace has resulted in blended or hybrid learning, which includes any learning activity that integrates MOOC content or learning objects into a traditional campus-based curriculum. This educational technology integration strategy has opened avenues for HEIs, especially those from developing nations with no online learning components, to experience online learning through such blends. Through this means, students can access free but high-quality academic materials in the form of OERs or open textbooks, which were previously very expensive or difficult to acquire. Through this integration, MOOCs represent an excellent opportunity to revolutionise blended learning across all levels of education, including organisations providing continuous professional development (Ulrich & Nedelcu, 2015). MOOCs also help in the classroom when licensed as the next generation of textbooks, becoming one of the tools a teacher uses to teach the course (TED, 2014, 13:09). The name for the combination of a MOOC and an on-campus course is currently contentious.
Terms such as blended MOOC (bMOOC/ B-MOOC), hybrid MOOC (H-MOOC), wrapped MOOC and distributed flip, among others, have popped up (Almutairi & White, 2018; Bruff et al., 2013; Koller et al., 2013; Yousef et al., 2015).

Blended MOOCs come in several models, as explained by Edumadze et al. (2022), which include university-created MOOCs utilised for credit transfer and integrating other MOOCs within the curriculum. The latter can entail establishing official collaborations with MOOC providers or informal adoption where faculty and students engage autonomously. At UCC, the method follows the second informal model where external MOOCs are included in classroom instruction, either partially or wholly, to supplement or improve the standard course content. In this situation, MOOCs function as supplementary open educational resources (OERs), offering versatility in instructional approaches such as flipped classes and reference materials. This integration showcases a dynamic utilisation of MOOCs to enhance the teaching environment.

Walters (2014) suggested that blended MOOCs create the following possibilities:

  • Unbundling content by moving away from a single source (the professor) to multiple sources (online materials, students, co-lecturing professors and industry experts),

  • Making it possible for students to take some courses locally at their enrolled university and others as MOOCs with content from different providers (institutions and universities),

  • Changing classroom spaces from large lecture halls to small learning spaces.

Universities and tutors adopt blended MOOCs for various reasons, among which are:

  • Making students aware of the MOOC phenomenon and trends (Holotescu et al., 2014)

  • Enlarging the knowledge/topics of the course in which they are enrolled locally (Holotescu et al., 2014)

  • Creating awareness of MOOC among lecturers.

  • Serving to introduce e-learning to stakeholders of HEIs that do not officially use such technology.

  • Introducing lifelong learning practice among students.

  • Serving as a way of participating in open education (i.e., open access, OER) among stakeholders.

  • Exposing students to teaching materials and pedagogies from other HEIs in different countries.

  • Optimising student engagement, satisfaction and learning (Bruff et al., 2013).

Notwithstanding the benefits of MOOCs, studies such as Onah et al. (2022) and Yusof et al. (2017) have identified learners' lack of engagement as a significant drawback leading to high dropout rates. Monitoring how much time students spend on learning activities promotes high-quality learning and reduces course dropouts (Hussain et al., 2018; Rahimi, 2024). Learners' engagement has been seen as both the reason for creating blended MOOCs and a key to making MOOCs work. Contact North (2016) declared that educators, educationists, instructional designers, instructional technologists and others with a stake in education should be concerned about how engaged students are in online activities. Furthermore, Contact North (2016) made the following points:

  • Not all courses have high levels of student engagement, and some give students few chances to build their learning (Contact North, 2016, p. 2).

  • Student engagement is the best way to predict how well students learn (Contact North, 2016, p. 7).

  • Institutions should pay more attention to their understanding of instructional design, student engagement, and assessment (Contact North, 2016, p. 7).

  • If we want more students to succeed, we should emphasise student engagement and learning design, and faculty should be very involved in this work (Contact North, 2016, p. 7).

Other researchers have come to a similar conclusion on student engagement, among which are:

  • Student engagement is one of the best learning and personal growth predictors (Sukor et al., 2021, p. 640).

  • Student engagement, learner interaction, and teacher presence explained many differences in student satisfaction and how much they thought they were learning in online learning environments (Gray & DiLoreto, 2016, p. 1).

  • Different things inside and outside the classroom can affect students' engagement (Vezne et al., 2023, p. 1866).

  • Students actively engaged in learning tend to do better in school (Deci & Ryan, 2000, as cited in Vezne et al., 2023, p. 1866). This is because they enjoy and see the value in their actions.

  • Student engagement strongly predicts educational results (Wang & Degol, 2014, p. 3).

  • Student engagement predicts learning and school performance (Haw et al., 2022, p. 226).

Student engagement represents a significant challenge in online learning, especially in blended MOOC environments. This challenge is not limited to specific geographies; rather, it is a global concern impacting students, teachers and academic institutions. Contact North (2016) underscores that educational institutions must critically reassess instructional design, student engagement and assessment strategies to foster meaningful learning experiences. Garrison and Vaughan (2008) further highlight the importance of re-evaluating course design in blended learning settings to optimise student engagement. These observations set the stage for understanding the universal relevance of student engagement to educational success.

However, the literature reveals a notable gap in the research concerning the impact of blended MOOCs on student engagement, particularly within the context of Ghana in Sub-Saharan Africa. Despite the proliferation of MOOCs globally, there is a marked scarcity of research on their effectiveness in developing countries, as several studies point out (Almutairi & White, 2018; Maphosa & Maphosa, 2023; Mutisya & Thiong’o, 2021; Montgomery et al., 2015; Yunusa et al., 2021; Zakaria et al., 2019). Sub-Saharan Africa's unique challenges, such as limited internet access and diverse educational needs, present a compelling case for a tailored approach to educational technology (Maphosa & Maphosa, 2023). This study aims to bridge this gap by exploring how blended MOOCs can enhance student engagement and learning outcomes in such contexts. The specific focus on Ghana allows for a detailed examination of these dynamics, offering insights that are both locally relevant and globally applicable (Yunusa et al., 2021).

Building on the theoretical framework of the revised Community of Inquiry (RCoI), this research investigates the collective impact of its presences on student engagement in blended MOOCs. Empirically, this study addresses a critical gap by employing a robust methodological approach to explore the relationship between these presences and student engagement. The research validated a standardised scale to measure student engagement in blended MOOCs by combining instruments from Almutairi and White (2018) and Wertz (2022), addressing the lack of such a standardised instrument for MOOCs identified by Deng et al. (2020). Studies of this nature enhance the instrument's reliability and validity, supporting its standardisation.

Thus, this study aims to explore the impact of the revised Community of Inquiry (RCoI) framework elements—cognitive, learning, social and teaching presence—on student engagement in blended MOOCs at the University of Cape Coast, Ghana. Grounded in the RCoI framework, this research investigates the synergistic effect of its four presences on enhancing student engagement. The RCoI framework provides a robust theoretical lens through which the dynamics of engagement in blended learning environments can be examined (Garrison et al., 2000; Shea & Bidjerano, 2010). In achieving the stated aim, the following objectives will be pursued:

  • Analyse the Community of Inquiry (CoI) framework to identify indicators of student engagement in blended MOOCs.

  • Investigate the impact of the four CoI presences (teaching presence, cognitive presence, social presence, and learning presence) on student engagement in blended MOOCs.

  • Examine the specific contribution of learning presence within the CoI framework to student engagement in blended MOOCs.

In pursuance of the second objective, the following hypotheses will be investigated:

  1. H1: No significant relationship exists between cognitive presence and students' engagement with blended MOOCs.

  2. H2: No significant relationship exists between learning presence and students' engagement with blended MOOCs.

  3. H3: No significant relationship exists between social presence and students' engagement with blended MOOCs.

  4. H4: No significant relationship exists between teaching presence and students' engagement with blended MOOCs.
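In PLS-SEM, the significance of each hypothesised path is commonly assessed by bootstrapping the path coefficient and checking whether its confidence interval excludes zero. The sketch below is a deliberately simplified, single-predictor stand-in using synthetic data, not the study's data or its actual PLS estimation; the variable names and the 0.5 effect size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scores for one presence (e.g. learning presence) and engagement,
# generated with a built-in positive relationship for illustration only.
n = 200
presence = rng.normal(size=n)
engagement = 0.5 * presence + rng.normal(scale=1.0, size=n)

def path_coefficient(x, y):
    """Standardised OLS slope: a simple stand-in for a PLS path coefficient."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float((x * y).mean())  # equals the Pearson correlation

# Bootstrap the coefficient to obtain a 95% percentile confidence interval.
boots = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)          # resample cases with replacement
    boots.append(path_coefficient(presence[idx], engagement[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])

# The null hypothesis (no relationship) is rejected when the CI excludes zero.
reject_null = not (lo <= 0.0 <= hi)
print(f"beta={path_coefficient(presence, engagement):.2f}, "
      f"95% CI=({lo:.2f}, {hi:.2f}), reject H0: {reject_null}")
```

Full PLS-SEM additionally estimates latent variables from multiple indicators before bootstrapping the structural paths, but the decision rule for each hypothesis, a bootstrap interval around the path coefficient, follows the same logic.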

Figure 1 shows the research framework for the study.

Fig. 1

The research framework

This research addresses a vacuum in geographical research on blended MOOCs in Sub-Saharan Africa, notably Ghana, making a significant addition to the educational technology literature. Examining student participation in this underrepresented region illuminates its unique educational problems and potential (Almutairi & White, 2018). This study's regional focus is also relevant since Maphosa and Maphosa (2023) emphasise the necessity for MOOC initiatives in Sub-Saharan Africa. The study explores student engagement in blended MOOCs using the RCoI paradigm, enriching the theoretical landscape. The study confirms the theoretical validity of this approach, enhancing our understanding of how the different presences affect student engagement synergistically. This research also validated a standardised scale to measure student engagement in blended MOOCs, filling a gap identified by Deng et al. (2020). A rigorous engagement assessment tool improves empirical research on blended learning by enabling more precise and replicable results across studies.

This research has significant implications for policymaking and instructional design, providing educators and designers with actionable insights to boost student engagement in blended learning. The study helps create more engaging blended learning experiences by identifying engagement tactics. Additionally, educational policies that promote equity and access are crucial, especially in regions facing infrastructural and resource limitations (King, 2015; Yunusa et al., 2021). Finally, this study adds to a more inclusive global educational technology debate by focusing on an underrepresented region with distinct educational needs and concerns. It ensures that Sub-Saharan African students' opinions are heard in educational technology transformation talks. This inclusive approach shows the study's dedication to a more equal and accessible learning landscape worldwide, promising to inform future educational policies and build a more inclusive educational environment for all.

The community of inquiry

Garrison et al. (1999) initially suggested the Community of Inquiry (CoI) framework as a social constructivist-collaborative paradigm. The framework was built around three different ideas called "presences": cognitive presence (CP), social presence (SP) and teaching presence (TP). A "presence" describes the way students interact with one another while taking an online course; each node in the CoI framework is a conversation happening over the internet. The three presences shape learners' inquiry-based learning experiences within a learning community. Vaughan and Garrison (2008) said that CoI is founded on two fundamental notions for significant higher education in an era powered by internet technologies, web 2.0 and social media: 'community' and 'inquiry'. Community recognises the social dimension of education, emphasising the need for interaction, collaboration and conversation in knowledge construction (Arbaugh, 2007; Ranjan, 2020). Inquiry reflects the process of generating meaning through personal responsibility and choice (Arbaugh, 2007); similarly, Ranjan (2020) defined inquiry as how students develop meaning through their own initiative and selection. Thus, CoI is a theoretical description of what makes an online learning environment engaging and cognitively developing. It is a robust and comprehensive approach to teaching, learning and assessment.

Many authors recommend expanding the CoI framework's three presences to encompass several additional presences: autonomy, distributed teaching, emotional, instructor, engagement, vicarious and learning (Kozan & Caskurlu, 2018). These presences strengthen the CoI framework (Kozan & Caskurlu, 2018) and better describe the online educational experience (Anderson, 2017; Moore & Miller, 2022). Learning presence is the only proposed addition that has been extensively embraced. Garrison and Anderson, the CoI framework's creators, hold differing opinions about learning presence: Anderson supports its inclusion, while Garrison rejects it. According to Anderson (2017), learning presence introduces self-directed learning and transforms CoI's "teaching model" into a "teaching and learning paradigm", expanding its use beyond traditional schools and educational settings. Though Anderson (2017) acknowledged that the three presences provide a parsimony advantage, he argues that including learning presence in the CoI framework has the following benefits:

  • It addresses the limitation that the current CoI is only helpful for building and defining an efficient teaching framework.

  • It brings CoI closer to the principles of autonomous learning advocated by constructivist learning and heutagogical approaches to education.

  • It expands the scope of the CoI from purely pedagogical to one that incorporates learning, making it applicable in settings beyond the classroom, wherever students demonstrate their learning potential.

Garrison (2017) argues against including learning presence in the CoI framework:

  • It contradicts the concept of a collaborative inquiry community. CoI members are supposed to exhibit varied degrees of presence across the three areas, so the teacher and learner would become one in this scenario, whereas individuals have distinct responsibilities as teachers and students (Garrison, 2022).

  • Teaching and social presence were most related to learning presence, and cognitive presence emerged at the junction of these presences.

  • Adding more presences may increase framework complexity and violate parsimony principles.

The suggested presences can be incorporated into the model by expanding the scope of the three presences' definitions or the interrelationships between and among the three presences (Kozan & Caskurlu, 2018).

Shea (2010) suggested adding a "learning presence" to the CoI architecture. The suggestion came after observing that learner discourse (what students did) did not fit the CoI model of the time; thus, Shea could not correctly code the identified discourse as social, cognitive or teaching presence (Shea, 2010). Since online learning is electronic, social and "self-directed", it was vital to study how students self- and co-regulate in such environments (Shea & Bidjerano, 2010). To do this, Shea and Bidjerano (2010) investigated factors such as the metacognitive abilities, motivational states and behavioural management strategies employed by successful online students. The lack of a learning presence addressing the online learner's role was observed as one of the limits of the CoI model (Shea & Bidjerano, 2010). The four presences of the RCoI framework are discussed in the subsequent sections.

Cognitive presence

Cognitive presence refers to the degree to which MOOC participants actively take part in substantive discussions and engage in critical thinking. Its defining features entail learners exchanging ideas and perspectives, expanding upon one another's contributions, and critically examining their own and others' thought processes. With cognitive presence, blended MOOC participants are expected to take part in substantive discussions and engage actively in critical thinking. Cognitive presence significantly promotes learners' active participation and directly impacts their academic performance and overall satisfaction (Kang et al., 2007a, 2007b).

Social presence

Social presence refers to the degree to which learners experience a sense of connection and belonging with their peers and the instructor within blended MOOCs. It is also marked by learners perceiving support from their peers and believing they can contribute actively to the learning process. Social presence is widely recognised as crucial to establishing a conducive learning environment that fosters support and collaboration. A positive correlation exists between social presence and students' level of engagement in their academic pursuits and within the educational institution (Alabbasi, 2022). Social interactions among students contribute positively to their social integration and attitude towards the subject and foster a competitive learning environment (Damm, 2016). Finally, Miao and Ma (2022) observed that social presence directly impacts engagement and mediates the association between self-regulation and engagement.

Teaching presence

Teaching presence refers to the degree to which the instructor establishes a nurturing and interactive educational setting within a MOOC. Teaching presence encompasses the deliberate planning, facilitation and guidance of cognitive and social activities to achieve meaningful and valuable learning outcomes. Empirical research has supported the positive impact of teaching presence on various learning outcomes, such as perceived learning, learner satisfaction and behavioural engagement (Caskurlu et al., 2020). Effective teaching plays a crucial role in fostering student engagement and influencing the outcomes of students' learning experiences (Zhang et al., 2016).

Learning presence

The blended MOOC on which this research is based assigns the traditional roles of teachers and students. However, it subscribes to the student-centredness of MOOCs, hence the inclusion of learning presence in this study. Learning presence refers to the degree to which learners actively acquire knowledge and skills and assume accountability for their learning. It is distinguished by learners' active involvement in establishing objectives, monitoring their progress, and seeking out educational resources to facilitate their learning. In short, learning presence concerns active participation in the educational process and the achievement of intended educational objectives.

At first, "learning presence" represented academic self-efficacy and other cognitive, behavioural and motivational factors contributing to online students' capacity to self-regulate their learning (Shea & Bidjerano, 2010). Researchers have found that self-efficacy and self-regulated learning are strongly linked to other factors important for learning success. Shea and Bidjerano (2010) regarded self-efficacy and self-regulation as important parts of learning presence. Self-efficacy and self-regulatory skills both affect e-learning success; they also affect each other within a learning environment. Doo and Bonk (2020) found that self-efficacy did not affect learning independently but supported self-regulated learning. Later, "learning presence" came to indicate self-regulated learning within a community of inquiry (Jimoyiannis & Tsiotakis, 2017; Shea & Bidjerano, 2012). Statistics show strong relationships between self-efficacy and self-regulation scores in online and traditional learning environments, such that high self-efficacy and positive self-regulation are reliable predictors of academic success in online courses (Bradley et al., 2017).

Zimmerman (2000) defines SRL as "self-generated thoughts, feelings, and actions that are planned and cyclically adapted to personal goals" (p. 14). SRL theorists consider metacognition, conduct and motivation (Zimmerman, 1986). SRL strategies are "actions and processes directed at acquiring information or skill that involve agency, purpose, and instrumentality perceptions by learners" (Zimmerman, 1989, p. 329). Examples are goal setting, time management, organisation, self-monitoring and strategy adjustment. E-learning requires these skills because students study independently and at their own speed. These variables help learners use e-learning materials and can affect learners' motivation, engagement, persistence and success. Self-regulated learning involves planning, monitoring, controlling and regulating learning. Cho and Shen (2013) have shown that SRL is essential for academic success in online learning environments: it directly contributes to students achieving their learning goals, especially in contexts with high student autonomy and minimal instructor presence.

Beyond self-regulation, co-regulated learning recognises that learning is often social and involves contact with others (Andrade et al., 2021). Assessment today involves co-regulation of learning through interactions with students, teachers, peers, and technology. Co-regulation occurs dynamically in online collaborative learning and may be detected and evaluated to improve learning design (Andrade et al., 2021). A study conducted by Liao et al. (2023) found that learning presence, specifically self-regulated learning (SRL) and co-regulated learning (CoRL), is a significant predictor of student engagement in blended learning. The researchers observed that CoRL is more strongly associated with emotional engagement, while SRL is more closely linked to cognitive engagement.

Students' self-efficacy is their confidence in their ability to succeed in certain situations or tasks. Self-efficacy is "people's beliefs about their abilities to produce designated levels of performance that influence events that affect their lives" (Bandura, 1994, p. 71). Through it, individuals cognitively appraise their capability to perform effectively and achieve desired outcomes in particular circumstances or tasks. Students with positive and comparatively high self-efficacy beliefs are more inclined to exhibit engagement in the classroom, as evidenced by their behaviour, cognition and motivation (Linnenbrink & Pintrich, 2003a, 2003b). Self-efficacy is a significant cognitive factor influencing motivation and engagement (Schunk & Mullen, 2012). According to Zhang (2022), self-efficacy significantly enhances learner engagement; indeed, it emerged as the sole significant variable influencing learner engagement in that study. Azila-Gbettor et al. (2021) found that self-efficacy and autonomous motivation positively impact peer and intellectual engagement. Since self-efficacy, self-regulated learning and co-regulated learning are significant predictors of student engagement, learning presence predicts student engagement.

Students’ engagement

One of the most important goals of e-learning environments in higher education is to get students more engaged in learning. This engagement occurs through continuous interaction that builds the cognitive and non-cognitive skills needed for success in school (Ituma, 2011). However, online learning modes such as MOOCs have well-recognised problems with retention and engagement (Lambert & Fisher, 2013). Thus, student engagement is critical for student retention and satisfaction in online courses. Fredricks et al. (2004) argue that the research community still lacks consensus on the definitions, frameworks and constructs of engagement. Researchers have employed a variety of indicators to predict learning outcomes and evaluate student engagement in various contexts. Engagement indicators are often analysed from a single standpoint, operationalised through students' participation in diverse activities. As an illustration, Wang et al. (2015) quantitatively analysed engagement levels within online discussion forums, and Whitehill et al. (2015) employed measurements of the extent of video lecture viewership.

Furthermore, researchers have made numerous attempts to evaluate student engagement in addition to the abovementioned measures. For example, certain research investigations have employed self-report instruments, such as surveys or questionnaires, to collect data on students' subjective evaluations of their levels of engagement (Joksimović et al., 2018). Other studies have employed observational methodologies to directly assess students' behaviour and engagement levels in a classroom setting. However, some researchers have blended these approaches to acquire a more holistic comprehension of student engagement.

Student engagement in online courses refers to the degree to which students actively participate in critical thinking, verbal communication, and interaction with course materials, fellow students, and the instructor. Student participation in the learning process encompasses their engagement and collaboration with the instructor and their peers, indicating their level of involvement (Dixson, 2015a, 2015b). Hodges (2018) defined engagement as a measure of one's level of involvement, enthusiasm, and commitment to a company. In academic settings, engagement is described as energy in action: the connection between a person and the activity they perform for a stated goal. Thus, the student's active participation in a task or activity is vital for engagement. Gallup's 2018 study, 'School Engagement Is More Than Just Talk,' found that:

Engaged children are 2.5 times more likely to report receiving outstanding grades and performing well. They are 4.5 times more likely than their actively disengaged peers to be optimistic about the future (Hodges, 2018).

Students' engagement in the community of inquiry

Students' engagement is key to the success of the CoI model. One of the originators of the CoI framework and a colleague (Garrison & Vaughan, 2008) made the following statements, which attest to that fact: the efficacy of the inquiry technique relies on the presence of engagement, and engagement is crucial for a community of inquiry and the overall higher educational experience (p. 16); the educational process within the community of inquiry entails both the public and private domains, and engagement within a community of inquiry refers to the convergence of these public and private realms (p. 16); and the inquiry approach in education encourages students to engage actively in responsible learning activities (p. 112).

The CoI framework offers a distinct paradigm for identifying student engagement. It extends beyond mere assessment of engagement and explores the calibre of interactions, the extent of analytical reasoning, and the whole educational experience for students. The CoI framework assesses the instructor's ability to lead the course effectively (teaching presence), the students' interaction and contribution to the community (social presence), the students' active thinking and comprehension of the material (cognitive presence), and the students' proactive approach to managing their learning and supporting their peers (learning presence). The simultaneous operation of these four presences signifies a notable degree of student engagement. Where any of these elements is lacking, the framework can identify areas where instructors can enhance their teaching tactics or the course design to cultivate a more engaging learning environment.

CoI for online learning can facilitate students' engagement (Oyarzun & Morrison, 2013). In online education, a community of inquiry promotes "epistemic engagement" (Shea & Bidjerano, 2010). Epistemic engagement involves actively engaging with knowledge to deepen comprehension, thus making online learning environments more successful. The implication is that students are actively engaged in acquiring knowledge, employing critical thinking skills to analyse the subject matter, and participating in discussions with their peers and instructors to foster a comprehensive understanding of the material. Students' engagement is vital for discovering knowledge, making learners active and instilling lifelong learning capabilities in a digital era characterised by abundant information and learning initiatives, including MOOCs.

Choo et al. (2020) support this notion, highlighting the framework's usefulness in measuring engagement. Damm (2016) also confirms the effectiveness of the CoI survey in measuring engagement within MOOCs; nevertheless, the author acknowledges a limitation: the survey cannot definitively pinpoint the cause of low engagement (e.g., is it due to a lack of strong peer interaction?). Despite such limitations, a study by Das and Madhusudan (2023) assessed the CoI model's ability to promote collaborative learning and enhance engagement. The study further indicated that the CoI model significantly contributes to learner engagement, fosters collaborative learning, and improves learner performance across cognitive, emotional, and behavioural domains. Building on these findings, Ginting (2021) suggests further optimising the presences to enhance student engagement on online platforms.

Research methodology

The research methodology refers to the systematic approach and techniques employed in conducting a study. It encompasses the overall design and data collection. The present study used a quantitative research methodology to examine the RCoI presences that impact student engagement within blended MOOCs serving as an open educational resource (OER).


Most respondents said they first participated in blended MOOCs as first-year students at UCC. They enrolled on the Alison MOOC platform in Microsoft Office 2010 – Revised 2018 as an OER for a campus-based course entitled ITS 101, which is mandatory for all level 100 (first-year) students. They watched the videos on the platform, participated in the discussion forums and answered the quizzes to obtain a certificate. The marks on their certificate constituted part of their continuous assessment for the campus course in which they enrolled. Continuing students at the upper levels (200–800) registered on other MOOC platforms, such as Saylor.

The research sample comprised 2875 students at the University of Cape Coast (UCC), Ghana, actively engaged in blended MOOCs centred on multiple courses. The total student population, from which the target population was drawn, was 60,243 as of 2022–2023 (see Table 1), comprising 54,236 undergraduate and 6007 postgraduate students.

Table 1 2022 Student population

Only the regular group is involved with blended MOOCs, making them the target population for the study, which is 26,527 regular students. Table 2 shows the distribution of the students in the regular group.

Table 2 2022 Regular students by level

Furthermore, the 25,239 students from Table 2 constitute the research population. Except for students at levels 50, 250, 850, 900 and 950, most of these students have completed the ITS 101: Information Technology Skills course at UCC. This semester-long course is based on blended MOOC instructional delivery and is mandatory for all level 100 students at UCC. These students have also completed additional blended MOOC courses at levels 200–800. The blended MOOCs were open educational resources (OERs), enabling unrestricted access and completion without associated costs. Lecturers selected MOOCs that fit their on-campus courses and instructed students to enrol. Upon completion, the marks students obtained from the MOOCs became part of the continuous assessment for their on-campus courses. Lecturers further organised supervised formative assessments on campus.

Sampling method

The sampling method selects a subset of individuals or items from a larger population for research. The researchers employed a two-stage cluster sampling technique to determine the participants for this study. A list of blended MOOC classes offered at UCC was obtained from the staff's mailing list, and the population was stratified by academic year level (100–800). Levels 50, 250, 850, 900 and 950 were excluded because none of their courses use blended MOOCs as an instructional format. In the first stage, academic levels were randomly selected from the strata using a lottery sampling technique: each level of the population was assigned a level number beginning with 100, and a simple random sampling technique was used to select the participating levels. Levels 100, 200, 300 and 800 were thus picked. In the second stage, a further simple random selection of blended MOOC courses or classes was made within each selected academic level. All students in the selected classes were then included in the study.
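The two-stage procedure described above can be sketched in a few lines of Python. The sampling frame, level numbers, and class codes below are purely hypothetical stand-ins, not the study's actual frame; the point is only the stage-1 lottery draw of levels followed by the stage-2 random draw of classes within each selected level.

```python
import random

random.seed(42)  # fixed seed so this illustration is reproducible

# Hypothetical sampling frame: academic levels mapped to blended MOOC classes
frame = {
    100: ["ITS101-A", "ITS101-B", "ITS101-C"],
    200: ["EDU201-A", "EDU201-B"],
    300: ["SCI301-A", "SCI301-B"],
    800: ["GRA801-A"],
}

# Stage 1: lottery draw of academic levels from the strata
selected_levels = random.sample(sorted(frame), k=3)

# Stage 2: within each selected level, randomly select one class;
# all students in the selected classes then enter the study
selected_classes = {lvl: random.choice(frame[lvl]) for lvl in selected_levels}
```

Because whole classes (clusters) are drawn at stage 2, every student in a selected class is surveyed rather than sampling individuals directly.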

Data collection

A Google Forms survey questionnaire was designed and sent to all students within the stated categories (see Table 3) through their institutional email addresses, with an estimated completion time of 20 min. Tables 3 and 4 show the level and class size of the selected courses in which students completed a blended MOOC at UCC. Students from this population qualified to participate in the research study; thus, the sample was drawn from the accessible population. The benefit of the Google Forms approach is that all students have institutional mail based on the Google platform. The email invitation contained a link to the survey, together with an introductory letter stating the purpose of the study and guaranteeing the students' anonymity. It further explained that the data would be used for academic purposes only and that students could decline to participate if they wished. Students were also informed verbally and encouraged to respond to the questionnaire during classroom sessions. Reminder emails were sent bi-weekly to non-responders, starting two weeks after the initial survey invitation. The survey was open for the whole semester, and all responses were kept confidential.

Table 3 Level students participating in the survey at UCC
Table 4 Bio-demographic data of respondents

Data analysis

After completing the data collection process, the responses to each research question were scrutinised and pruned to ensure that they aligned with the stated instructions. For each research question, the researcher assigned the responses numerical codes, which were uploaded into SPSS version 28. A missing values analysis was then performed to ensure that missing data did not render the rest invalid. A descriptive analysis was subsequently used to assess the normality of the data distribution; the rationale was to decide between a parametric and a non-parametric analysis approach based on the data's skewness. The model addresses four research questions. The researcher analysed it by converting the data to a comma-separated values (CSV) file and uploading it into SmartPLS 4 software for partial least squares (PLS) analysis. The approach involved two key steps: firstly, the variables were condensed into a reduced set of components to enhance manageability; secondly, the analysis was performed on these condensed components instead of conducting a least-squares regression analysis on the original data. The PLS algorithm employs a methodology akin to principal components analysis to reduce the number of variables; this reduction is achieved by extracting components that capture the strongest correlations among the determinants (Hair et al., 2019). The methodology involves utilising these components as variables and, with the aid of cross-validation, determining the smaller set of components that exhibits the highest predictive capability (Helland et al., 2018). Data were analysed using SPSS version 28 and SmartPLS version 4. Descriptive statistics were employed to describe the sample and the Community of Inquiry (CoI) dimensions comprehensively. The researchers employed structural equation modelling (SEM) to examine the postulated associations between the CoI dimensions and student engagement.
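The "condense, then regress" logic of PLS described above can be illustrated with a minimal one-component sketch on simulated data. This is not the study's SmartPLS analysis or the full PLS-SEM algorithm; the toy indicators, coefficients, and noise level are all assumptions chosen only to show how a component that maximises covariance with the outcome replaces the original predictors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for six survey indicators (columns) and an outcome score
X = rng.normal(size=(100, 6))
y = X @ np.array([0.8, 0.5, 0.3, 0.0, 0.0, 0.0]) + rng.normal(scale=0.1, size=100)

# PLS works on mean-centred variables
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# One PLS component: weight vector chosen to maximise covariance with y
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                      # component scores (the condensed variable)

# Regress y on the single component instead of the six original predictors
b = (t @ yc) / (t @ t)
y_hat = t * b

# Share of outcome variance captured by the single component
r2 = 1 - np.sum((yc - y_hat) ** 2) / np.sum(yc ** 2)
```

Because the weights are driven by covariance with the outcome (unlike principal components, which ignore the outcome), one component can capture most of the predictive signal here.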


The present study is subject to several limitations. It was conducted within a single university (UCC) in Ghana, so the generalisability of the findings to other settings may be limited. Furthermore, the investigation relied on a self-report questionnaire, which introduces a potential for social desirability bias. Moreover, the study employed a cross-sectional design (i.e., collecting data from multiple subjects at a single point in time), limiting its ability to establish causal relationships between the elements of the CoI and student engagement. The study also did not investigate additional variables that could impact student engagement in blended MOOCs, such as pre-existing knowledge or motivation.

Ethical considerations

Ethical considerations are of paramount importance in academic research, and the ethical implications of a study must be carefully examined and addressed. Since this paper is extracted from the PhD dissertation of the lead author, the research was approved by the Institutional Review Board (IRB) of the University of KwaZulu-Natal, South Africa. The participants were provided with information regarding the study and gave their explicit consent to partake in the research. The data were gathered in a manner that ensured anonymity and were stored securely. The participants were duly notified that their involvement in the study was entirely voluntary and that they were free to withdraw at any time without seeking permission from the researcher.

Response rate

The average response rate for research surveys varies considerably depending on the type of survey and the audience being surveyed. A meta-analysis of online surveys from various fields showed an average response rate of 39.6% with a standard deviation of 19.6% (Wu et al., 2022). That is, on average, 39.6% of the people asked to participate in these online surveys did so, while the standard deviation of 19.6% shows that response rates differed from one study to another, ranging from as low as 20% to as high as 60% in some cases. Saunders (2014) also noted that a response rate of around 30% in a random sample is considered good. As indicated in Tables 3 and 4, the researcher sent the questionnaire to all 3506 students identified in the accessible population; a link to the same questionnaire was also added to the Moodle LMS pages of courses that use blended MOOCs for teaching. The mailing list comprised 3506 students who had used blended MOOCs at UCC for at least one semester. Of the 3138 who completed the questionnaire, 2875 students filled it out successfully without missing data. The response rate was 82%, which is adequate for the study.
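The 82% figure follows directly from the counts reported above: valid, complete responses divided by the accessible population contacted. A two-line check:

```python
invited = 3506     # accessible population contacted
completed = 3138   # returned the questionnaire
valid = 2875       # complete responses without missing data

response_rate = round(valid / invited * 100)  # percentage of invitees with valid responses
```

Note that the rate is computed against all 3506 invitees, not only those who returned the form.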

Demographic characteristics

Table 4 shows the demographic and biographical data of the respondents. EdTech research depends on such bio-demographic data, which are crucial for designing, implementing and evaluating technology interventions. They can help identify subpopulations that need more nuanced treatments, eliminate confounding factors and inform policy decisions about using technology in the classroom. In a blended MOOC study, including demographic data such as sex may assist researchers in determining whether sex affects learning. This information helps ensure that instructional technology is sex-neutral and easy to use. Studies suggest sex inequalities in educational technology usage and acceptance; several indicate that men are more tech-savvy than women (Irene, 2019).

In educational research, grouping is often used to distinguish perceptions and use of EdTech between students in STEM programmes and those in non-STEM programmes. The grouping is based on the idea that STEM and non-STEM students have different educational needs and experiences. Lin et al. (2021) reported that the factors affecting students' intentions to continue using the platforms differed between STEM and non-STEM courses, with perceived usefulness being more important in STEM courses and perceived ease of use being more critical in non-STEM courses. Another study, by Alkhalaf and Nguyen (2020), showed that EdTech positively affects student learning outcomes in both STEM and non-STEM courses; however, the effect was more significant in STEM courses.

It is essential to consider the academic year of the respondents when analysing the data on satisfaction, academic performance and other variables related to blended MOOCs, as it may impact their experiences and perceptions of the technology. Additionally, understanding how experience with blended MOOCs varies by academic year can inform future design and implementation of such technologies. Based on the data, most respondents (85%) enrolled in blended MOOCs at level 100, indicating they were in their first year of study when they registered. This result suggests that most respondents were relatively new to blended MOOCs and may have needed more experience with them than more advanced students. Additionally, the small number of respondents at higher levels (300, 500, and 800) suggests that blended MOOCs may be less commonly used or required at higher levels of study.

Measurement model analyses

Two stages are involved in the analysis of PLS-SEM. The first is to assess the validity and reliability of the measurement model; after these tests are passed, the subsequent step is the analysis of the structural model, as Hair Jr et al. (2021a) outlined. The data acquired from the study were analysed using the Partial Least Squares Structural Equation Modelling (PLS-SEM) technique in SmartPLS version 4. A path model is a graphical representation that illustrates the hypotheses and relationships between variables in a structural equation modelling (SEM) analysis, as Bollen (2002, as cited in Sarstedt et al., 2021) described. A path model includes structural and measurement models: in PLS-SEM, the outer models are the measurement models, while the inner model is the structural model. The measurement model deals with the individual survey items and the respective latent variables (constructs) they measure. The structural model shows the cause-and-effect relationships among the latent variables. Hair et al. (2021a) suggested evaluating the elements discussed in this section to assess the study's measurement model.

Internal consistency

Internal consistency is usually judged by how items on the same construct relate to each other. It checks whether there is a link between the scores on different items meant to measure the same construct. It is also called internal reliability or internal consistency reliability.

Cronbach's alpha

Researchers can use internal consistency measures such as Cronbach's alpha, split-half reliability, or test–retest reliability. These measures assess the consistency of responses to items or indicators over time or across different forms or versions of the same test. Cronbach's alpha has long been used to check the reliability or consistency of each construct within a model. Cronbach's alpha (α) is a simple way to determine a score's reliability and is used when more than one item measures the same underlying concept. It measures internal consistency and shows how closely related questions are as a group for a construct or latent variable (Ravinder & Saraswathi, 2020). The alpha value depends on the number of indicator items, how similar they are, and how many dimensions they have. Cronbach's alpha should lie between 0 and 1, although negative values can occur. According to Tavakol and Dennick (2011), Cronbach's alpha has certain limitations: scores from many items with lower reliability are generally associated with decreased accuracy. The Cronbach's alpha estimates for the constructs are presented in the fourth column of Table 5. They varied between Student Engagement (0.861) and Learning Presence (0.943), all above 0.722 and thus passing the minimum threshold, making the items appropriate for each construct.
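The standard computation of Cronbach's alpha, α = k/(k−1) × (1 − Σ item variances / variance of the total score), can be sketched as follows. The five-item scale below is simulated (shared construct signal plus item noise), not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated 5-item scale: each item = shared construct signal + item-specific noise
n, k = 200, 5
construct = rng.normal(size=(n, 1))
items = construct + 0.6 * rng.normal(size=(n, k))

def cronbach_alpha(data):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = data.shape[1]
    item_var = data.var(axis=0, ddof=1).sum()
    total_var = data.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

alpha = cronbach_alpha(items)
```

Because the five simulated items share most of their variance, alpha comes out well above the conventional 0.70 threshold; with unrelated items it would fall towards zero.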

Table 5 Outer loadings, construct validity and reliability of factors for the measurement model
Composite reliability

Jöreskog and Sörbom's (1995) composite reliability (rho_C) is a statistical measure used to assess the internal consistency of scale items. It serves a similar purpose to Cronbach's alpha, as Netemeyer et al. (2003) discussed, and considers the reliability of a set of items loading on a latent construct. Composite reliability thresholds remain a matter of contention, with varying recommendations from researchers (Nunnally & Bernstein, 1994; Shivdas et al., 2020). A scale's reliability also varies with its number of items: fewer items generally imply poorer reliability, while more items produce improved reliability. Within exploratory research, composite reliability ratings in the range of 0.60 to 0.70 are generally deemed acceptable, and ratings falling between 0.70 and 0.90 are considered indicative of an adequate level of reliability. Composite reliability scores above 0.90 (and especially above 0.95) indicate that some indicators are redundant, hurting construct validity (Diamantopoulos et al., 2012). The composite reliability estimates for the constructs are presented in the sixth column of Table 5 and varied between Student Engagement (0.894) and Learning Presence (0.95). The estimated values for each construct exceeded the recommended minimum threshold of 0.70 (Fornell & Larcker, 1981; Hair et al., 2021b).
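For standardised indicators, composite reliability is typically computed as rho_C = (Σλ)² / ((Σλ)² + Σ(1 − λ²)), where λ are the outer loadings. A short sketch with hypothetical loadings (not values from Table 5):

```python
import numpy as np

# Hypothetical standardised outer loadings for one construct's indicators
loadings = np.array([0.72, 0.78, 0.81, 0.75])

def composite_reliability(lam):
    """rho_C = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    num = lam.sum() ** 2
    err = (1 - lam ** 2).sum()   # error variance of each standardised indicator
    return num / (num + err)

rho_c = composite_reliability(loadings)
```

Unlike Cronbach's alpha, rho_C weights each indicator by its own loading rather than assuming all items are equally reliable.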

Reliability coefficient

Cronbach's alpha tends to underestimate the reliability of latent variable scores, while composite reliability tends to overestimate it (Dijkstra & Henseler, 2015). According to Hair et al. (2021b), Cronbach's alpha yields conservative reliability estimates, while composite reliability produces more liberal ones; the actual reliability of a construct typically falls between these two extremes. Hence, the reliability coefficient (rho_A) generally falls within the spectrum bounded by Cronbach's alpha and the composite reliability. The rho_A value can range between 0 and 1, and higher rho_A values indicate more reliable item scales. A rho_A value of 0.7 is the lower limit of adequacy (Ahmad & Hussain, 2019; Prasetyo et al., 2022). The reliability coefficients of each construct are shown in the fifth column of Table 5. By inspection, rho_A values range from Student Engagement (0.862) to Learning Presence (0.943). All the constructs met the recommended threshold, indicating that the reliability values were significant and acceptable.

Construct validity

Construct validity is the extent to which each item indicator assesses its intended idea or construct. Assessing construct validity is especially important when researching latent constructs, which cannot be directly measured or observed and therefore require measurable indicators. Convergent and discriminant validity are used to examine construct validity, and when both prerequisites are met, a test has construct validity. Convergent validity measures the extent to which indicators relating to the same construct agree; such indicators should be strongly correlated. Discriminant validity tests whether theoretically unrelated constructs have unrelated indicators; such indicators should have no or only weak correlation.

Convergent Validity

Convergent validity refers to the extent to which the indicators of a particular concept exhibit consistent alignment in their measurements, i.e., the proportion of variance they share. Examining the outer loadings of the different items and calculating the average variance extracted (AVE) allowed us to check the convergent validity of the constructs.

Outer loadings

When the outer loadings of the indicator items that measure a construct are high, the items that make up that construct share a great deal in common (higher communalities). This property is termed indicator reliability. According to the recommendation by Hair et al. (2017), loadings of 0.708 or above are considered statistically significant: when an indicator's loading is above 0.708, the construct accounts for more than 50% of the indicator's variance, signalling sufficient item reliability (Sarstedt et al., 2021). Though numerous studies have suggested that outer loadings of 0.50 indicate low but significant reliability (Afthanorhan, 2013; Hair et al., 2019; Hulland, 1999), the researcher eliminated all items with outer loadings lower than 0.70 during the initial analysis because they were less significant, as per Hair et al. (2017). The removed items were coded as CEBE 1, 2, 7, 8, 9, 10, 11, 12, 13, 14, 15; CPTE1, 2; CPR3, LPER4, LPMSR10, LPSEL1, ME1, 5, 6, and 7. The remaining items showed loadings ranging from 0.701 (TPDO1) to 0.798 (CPI3), as presented in the third column of Table 5. These values indicate that the remaining items had significant indicator reliability and were retained in the main study.

Average variance extracted

The average variance extracted (AVE) shows the variance of a construct's indicator items that is explained by the construct. AVE assesses the extent to which the variance observed in a construct's indicators can be attributed to the construct rather than to measurement error. To calculate the AVE for a construct, square each indicator's standardised loading and take the average of those squared loadings. An acceptable value for AVE is 0.50 or higher (Fornell & Larcker, 1981). The seventh column of Table 5 shows that the measurement model's AVE values range from 0.546 to 0.609; all exceed the minimum value and are therefore acceptable. The indicators' outer loadings and the AVE both show that the remaining components of the measurement model possessed substantial convergent validity.
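The AVE computation just described (mean of the squared standardised loadings) is a one-liner. The loadings below are the same hypothetical values used for illustration, not figures from Table 5:

```python
import numpy as np

# Hypothetical standardised outer loadings for one construct's indicators
loadings = np.array([0.72, 0.78, 0.81, 0.75])

def average_variance_extracted(lam):
    """AVE = mean of the squared standardised loadings."""
    return (lam ** 2).mean()

ave = average_variance_extracted(loadings)
```

An AVE above 0.50 means the construct explains, on average, more of its indicators' variance than measurement error does.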

Discriminant validity

Discriminant validity pertains to the degree to which a latent variable is distinct from other latent variables and thus represents a phenomenon that the other latent variables do not (Yeboah, 2020). Discriminant validity, sometimes called "divergent validity," tests whether constructs that are theoretically unrelated are, in fact, unrelated; it determines how distinct the constructs under examination are. The correlation between two constructs that should not be related is expected to be weaker than the correlation between two constructs that are related (Nikolopoulou, 2022). In SmartPLS, there are three ways to assess discriminant validity: a) cross-loadings, b) the Fornell and Larcker criterion, and c) the heterotrait-monotrait (HTMT) ratio of correlations.


Cross-loading occurs when an item loads significantly not only on its own factor but also on others. Items that load on two (or more) factors, or on a different factor than intended, are said to have cross-loadings. The cross-loading principle posits that an item's loading on its parent construct should be greater in magnitude than its loadings on any other construct in the study. If an item loads more on a different construct than on its parent construct, the construct lacks discriminant validity and the instrument must be revised to meet the standard. According to Yeboah (2020) and Yeboah and Nyagorme (2020), measurement items have significant convergent validity when their loadings are at least 0.708 and are higher on their respective constructs than on other constructs. As shown in Table 6, the items retained after removing those with lower outer loadings had their highest loadings (bolded) on their respective constructs, and all were greater than the minimum recommended value. Thus, the instrument was found to have significant discriminant validity and was suitable for the study.

Table 6 Discriminant validity using indicator items' cross loadings
Criteria of Fornell and Larcker

This criterion requires that the correlation between a construct and other constructs be smaller than the square root of the average variance extracted (AVE) by that construct. The Pearson correlation coefficient quantifies the relationship between the relevant constructs in this measurement. According to the Fornell and Larcker criterion, establishing discriminant validity is contingent upon fulfilling this condition. The researcher evaluated the instrument's discriminant validity using the Fornell–Larcker criterion presented in Table 7. According to Yeboah (2020), the square root of the AVE for each construct, shown on the main diagonal, should exceed the corresponding correlation values below it in the same column. The data presented in Table 7 suggest that the measurement model successfully meets the Fornell–Larcker criterion.
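The Fornell–Larcker check reduces to a comparison between each construct's √AVE and its correlations with every other construct. A sketch on a hypothetical three-construct example (the AVE values and correlation matrix below are invented for illustration, not taken from Table 7):

```python
import numpy as np

# Hypothetical AVE values and inter-construct correlation matrix for three constructs
ave = np.array([0.58, 0.61, 0.55])
corr = np.array([
    [1.00, 0.42, 0.37],
    [0.42, 1.00, 0.45],
    [0.37, 0.45, 1.00],
])

sqrt_ave = np.sqrt(ave)

# Criterion: each construct's sqrt(AVE) must exceed its correlations
# with all other constructs
off_diag = corr - np.eye(3)   # zero out the diagonal, keep the correlations
passes = all(sqrt_ave[i] > off_diag[i].max() for i in range(3))
```

In a Fornell–Larcker table, `sqrt_ave` would sit on the main diagonal with the correlations below it, which is exactly the comparison made in Table 7.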

Table 7 Discriminant validity using Fornell–Larcker criterion for constructs
Heterotrait-Monotrait (HTMT) ratio of correlation

The Heterotrait-Monotrait Ratio of Correlations (HTMT) is a metric utilised to evaluate the discriminant validity within the context of structural equation modelling (SEM) employing the partial least squares (PLS) approach (Henseler et al., 2015). The HTMT is grounded in the multitrait-multimethod matrix framework, which involves the examination of correlations between indicators of distinct constructs (heterotrait) and correlations between indicators of the same construct (monotrait) (Henseler et al., 2015). The HTMT is computed by dividing the mean of the heterotrait correlations and dividing it by the mean of the monotrait correlations, as Henseler et al. (2015) described.

The HTMT method offers certain benefits compared to alternative ways of evaluating discriminant validity, such as the Fornell–Larcker criterion and cross-loading analysis (Henseler et al., 2015). The HTMT method does not assume tau-equivalent measurement models, which are improbable in most empirical research (Henseler et al., 2015). The HTMT measure also offers a more straightforward and intuitive approach to assessing discriminant validity: it compares the magnitude of relationships between constructs with the magnitude of relationships within constructs. According to Ringle et al. (2022), an HTMT value below 0.90 indicates the presence of discriminant validity between two constructs, while a value exceeding 0.90 suggests a deficiency in discriminant validity. However, researchers are advised to employ a stricter threshold of 0.85 for the HTMT when the path model's constructs are conceptually similar (Henseler et al., 2015). Judged against the cut-off value of 0.90 defined by Henseler et al. (2015), the values shown in Table 8 satisfy the criterion. The HTMT values show that each construct in the model differed sufficiently from the others and measured distinct characteristics, so the measurement model could discriminate between the constructs. Consequently, the measurement model successfully demonstrated discriminant validity.
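The HTMT computation can be sketched for a pair of constructs on simulated indicator data (two distinct latent factors with three items each; all names and parameters are illustrative). The sketch divides the mean heterotrait (between-construct) correlation by the geometric mean of the two blocks' average monotrait (within-construct) correlations, following the Henseler et al. (2015) formula:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated indicators: three items each for two distinct (independent) constructs
n = 300
f1, f2 = rng.normal(size=(n, 1)), rng.normal(size=(n, 1))
items_a = f1 + 0.5 * rng.normal(size=(n, 3))
items_b = f2 + 0.5 * rng.normal(size=(n, 3))

def htmt(a, b):
    """Mean heterotrait correlation over the geometric mean of each block's
    mean monotrait (within-construct) correlation."""
    r = np.corrcoef(np.hstack([a, b]), rowvar=False)
    ka, kb = a.shape[1], b.shape[1]
    hetero = r[:ka, ka:].mean()                                  # between-construct
    mono_a = r[:ka, :ka][np.triu_indices(ka, 1)].mean()          # within construct A
    mono_b = r[ka:, ka:][np.triu_indices(kb, 1)].mean()          # within construct B
    return hetero / np.sqrt(mono_a * mono_b)

value = htmt(items_a, items_b)
```

Because the two simulated constructs are independent, the HTMT value lands near zero, far below the 0.90 (or stricter 0.85) cut-off; overlapping constructs would push it towards 1.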

Table 8 Discriminant validity using HTMT criterion for constructs


When two independent predictors are highly correlated, we have a problem known as collinearity: the two predictors are linearly related. Multicollinearity arises in multiple linear regression models when two or more independent variables (predictors) are highly correlated, meaning that one predictor can be accurately predicted linearly from the others. Condition indices and variance inflation factors (VIFs) help detect multicollinearity (Lindner et al., 2020). There are several approaches to dealing with multicollinearity, such as a) deleting one or more independent variables from the fit, b) performing a principal components regression, and c) removing variables having strong partial correlations with other variables (Lindner et al., 2020). Feldman (2018) suggested these general rules of thumb: a) there is no multicollinearity among the factors if VIF = 1, b) there is moderate multicollinearity if 1 < VIF < 5, and c) there is high multicollinearity, indicating much overlap, if VIF ≥ 5.

Thus, a high VIF value means a construct is strongly linked to other constructs, making it hard to estimate and understand the coefficients. The VIF values of the model are shown in Table 9.

Table 9 Variance inflation values

The highest VIF value in the table is 2.252, which is below the standard threshold of 5 for detecting high multicollinearity. This shows that the model's constructs do not overlap to a problematic degree. The lowest VIF value in the table is 1.000, meaning no multicollinearity exists between E and SP; that is, E and SP are two distinct constructs with no shared variance. In general, whereas social presence seems unrelated to the other factors, cognitive presence, learning presence, and teaching presence show a modest level of multicollinearity. Kock (2015) recommends a stricter criterion, holding that VIF values should not exceed 3.3. All VIF values in Table 9 were below even Kock's (2015) suggested threshold, indicating that the estimated model was free from multicollinearity.

Discussion of the analysis of the structural model

A structural model analysis is necessary to evaluate the statistical significance of the hypothesised paths within the estimated model. The methodology employed in this study involved a bootstrapping sequence of 5000 resamples conducted using the Partial Least Squares Structural Equation Modelling (PLS-SEM) software. Figure 2 and Table 10 present the results of the bootstrapping procedure.
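The bootstrapping idea can be sketched generically: resample the data with replacement, recompute the statistic each time, and take percentile cut-offs of the resulting distribution as the confidence interval. The sketch below bootstraps a sample mean over hypothetical scores; the actual analysis bootstrapped path coefficients in SmartPLS.

```python
import random

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=1):
    # Percentile bootstrap: resample with replacement, recompute the
    # statistic, and read the CI off the sorted bootstrap distribution
    # (mirroring the 5000-resample logic of PLS-SEM significance tests)
    rng = random.Random(seed)
    n = len(data)
    estimates = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    lower = estimates[int((alpha / 2) * n_boot)]
    upper = estimates[int((1 - alpha / 2) * n_boot) - 1]
    return lower, upper

# 95% CI for the mean of a hypothetical sample of engagement scores
scores = [2, 3, 3, 4, 4, 4, 5, 5, 5, 5]
low, high = bootstrap_ci(scores, lambda xs: sum(xs) / len(xs))
```

If the resulting interval excludes zero (as the reported CIs in Table 10 do for each path), the estimate is judged significant at the chosen α.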

Fig. 2
figure 2

Diagram of the path analysis (path diagram) using SmartPLS

Table 10 PLS-SEM paths’ significance results

The community of inquiry (CoI) model by Garrison et al. (2000) and its later extension by Shea and Bidjerano (2010), known as the revised community of inquiry (RCoI), emphasise the importance of teaching, social, cognitive and learning presences to a) promote meaningful learning engagement in online environments and b) understand and design effective online learning environments, including blended MOOCs. Students' engagement has many dimensions. Since the study deals with blended MOOCs, the item indicators for engagement were adapted from the blended MOOC engagement model (Almutairi & White, 2018). In this part, the researchers discuss how the four presences of RCoI are linked to student engagement with the help of the path diagram depicted in Fig. 2.

Teaching presence and students' engagement

H1: Teaching presence (TP) will positively impact students' engagement (SE) in the blended MOOC system.

The relationship between teaching presence (TP) and student engagement (SE) in blended MOOCs is complex. Although TP has been shown to improve SE, the magnitude of this effect depends on context and demographics. This hypothesis is significant because it helps us understand how TP affects SE in blended MOOCs.

The statistics for the relationship between TP and SE are β = 0.109, t = 5.574 > 1.96 for α = 0.05, p = 0.000, CI (0.072, 0.149), f2 = 0.015.

Cohen's effect size (f2) is conventionally classified as: a) f2 < 0.02 as negligible; b) 0.02 ≤ f2 < 0.15 as small; c) 0.15 ≤ f2 < 0.35 as medium and d) f2 ≥ 0.35 as large (Cohen, 1988; Hair et al., 2017). Little is known about the effect of teaching presence on student engagement in blended MOOCs. However, the empirical evidence from this study found a positive relationship between TP and SE, with a path coefficient of β = 0.109 and statistical significance at p = 0.000, supporting the idea that TP is a determinant of SE in blended MOOCs (Littler, 2024). Although statistically significant, the effect size (f2 = 0.015) is negligible (Hair et al., 2017). This result suggests investigating other variables that may moderate or mediate the relationship between TP and SE.
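As a quick reference for the effect sizes reported in this and the following subsections, the conventional f2 benchmarks for PLS-SEM (0.02 small, 0.15 medium, 0.35 large; Cohen, 1988; Hair et al., 2017) can be encoded in a small, illustrative helper; this is a sketch for the reader, not part of the original analysis.

```python
def f2_effect(f2):
    # Conventional Cohen (1988) f² benchmarks as used in PLS-SEM
    # reporting (Hair et al., 2017)
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "medium"
    if f2 >= 0.02:
        return "small"
    return "negligible"

# The TP → SE effect size reported in this subsection
print(f2_effect(0.015))  # → negligible
```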

This result aligns with the community of inquiry framework, which emphasises the importance of TP, alongside social and cognitive presence, in fostering a rich educational experience (Garrison et al., 2000). The implications of these findings are significant for the design and delivery of blended MOOCs. They suggest that TP, which encompasses the design, facilitation, and direction of cognitive and social processes to support learning (Anderson et al., 2001), is a key factor in promoting student engagement. This is consistent with research suggesting that TP influences learning persistence in MOOCs (Jung & Lee, 2018). Depending on the blended MOOC setting, teaching presence can affect student engagement differently. Course material, teaching tactics, and platform features can all affect TP's effectiveness, and TP may matter most in courses requiring greater cognitive presence (Cui et al., 2024; Pakula, 2024; Su, 2023). TP may also affect SE differently depending on the MOOC's discipline: STEM and non-STEM courses differ in material and modes of student engagement, so TP may be perceived differently. TP's effectiveness can further be shaped by instructional design and by student variables such as age, level, culture, and online learning experience. These differences suggest that TP may not predict SE equally in all learning circumstances, and such factors must be considered when evaluating data and exploring how TP can meet varied learner needs (Agarwal, 2021). The clarity of instructional objectives, the arrangement of content, and the interactive components of the MOOC can increase or decrease TP's influence on SE (Agarwal, 2021; He et al., 2023). Well-designed, TP-aligned courses may have a more significant impact on SE (Agarwal, 2021). Accordingly, various characteristics may mediate or moderate the TP–SE relationship.
Future research should examine mediators like cognitive and social presence and moderators such as student motivation and self-regulation (He et al., 2023; Su et al., 2023). Even though the effect size is minimal, the positive link between TP and SE emphasises its importance in blended MOOCs. To strengthen SE, educators and instructional designers could bolster TP by giving clear guidance, providing timely feedback, and promoting dialogue (He et al., 2023; Pakula, 2024; Su et al., 2023).

Cognitive presence and students' engagement

H2: The students' cognitive presence (CP) will positively impact their engagement (SE) in the blended MOOC system.

H2 suggests that students' cognitive presence (CP) will boost their engagement (SE) in blended MOOCs. Deep online learning relies on cognitive presence, the extent to which learners can construct meaning through sustained reflection and discourse (Garrison & Akyol, 2013). CP is crucial to meaningful learning in blended MOOCs, as Garrison et al. (2003) noted, and this study confirms it.

The statistics for the relationship between CP and SE are β = 0.194, t = 7.140 > 1.96 for α = 0.05, p = 0.000, f2 = 0.036, CI (0.141, 0.247).

The analysis shows a significant positive relationship between CP and SE, with a path coefficient of β = 0.194. This significant result (p < 0.001) supports the idea that CP is critical in fostering student engagement in blended MOOCs. The result demonstrates that CP significantly impacts SE, supporting Lee's (2014) finding that higher CP correlates with greater student engagement in online discussions. Although significant, the effect size (f2 = 0.036) is small (Hair et al., 2017). This outcome suggests that while CP is undoubtedly important, its overall impact on SE may be influenced by other factors or presences within the blended MOOC context.

In the context of blended MOOC systems, cognitive presence is crucial as it reflects the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse (Garrison et al., 2001). A higher cognitive presence is associated with deeper levels of learning and understanding, which can lead to increased student engagement, a key indicator of successful learning outcomes (Chi, 2023). This assertion shows that deeper knowledge construction promotes engagement in the learning environment. This relationship may not operate in isolation. Blended MOOCs with collaborative tools and discussion forums can boost CP's impact on SE (Akyol & Garrison, 2013). CP affects SE together with teaching, social, and learning presence (Littler, 2024; Maranna et al., 2022). These interactions can produce a more holistic learning environment that fosters SE (Garrison et al., 2010). While our data show that CP directly affects SE, they also suggest that its position in the CoI framework needs to be better understood. In future studies, these intricate interrelationships could inform blended learning instructional design and pedagogy. CP may mediate the relationship between TP and SE, improving student satisfaction and learning (Shea et al., 2003). More research on how CP interacts with SP and LP may yield further insights.

Social presence and students' engagement

H3: The students' social presence (SP) will positively influence their engagement (SE) in the blended MOOC system.

Testing whether social presence (SP) affects students' engagement (SE) in blended MOOCs yields intriguing results (Bergdahl et al., 2020; Ma et al., 2022). Online and blended learning settings depend on social presence, the feeling of belonging and mutual support, to build a sense of community.

The statistics for the relationship between SP and SE are β = 0.082, t = 3.595 > 1.96 for α = 0.05, p = 0.000, CI (0.039, 0.128), f2 = 0.007.

This study found a positive relationship between SP and SE, with a path coefficient of β = 0.082, showing that an increase in SP leads to an increase in SE. A significant connection (p < 0.001) supports the claim that SP positively impacts student engagement in blended MOOCs. Although statistically significant, the effect size (f2 = 0.007) indicates a low practical influence. This negligible effect size (Hair et al., 2017) prompts us to examine SP's complex role in student engagement and its connection with the Community of Inquiry (CoI) framework.

The statistical results support the theory that social presence (social interactions and a sense of belonging in online learning environments) increases student engagement (Garrison et al., 2000; Littler, 2024). The positive β coefficient indicates that social presence promotes student engagement, although the effect size (f2) is considered negligible by Cohen's (1988) standards (Hair et al., 2017). This negligible effect size raises questions regarding social presence's multifaceted significance in student engagement. Nevertheless, social presence helps create a compelling learning environment (Littler, 2024). Furthermore, a strong social presence in online courses can make learning more inclusive and engaging, increasing satisfaction and retention (Garrison et al., 2000; Littler, 2024; Shea & Bidjerano, 2009).

SP is crucial to SE, yet it is only one of several factors as engagement is multidimensional, including behavioural, emotional, and cognitive aspects (Fredricks et al., 2004). SP may mediate CP and SE by improving cognitive engagement with course content in a supportive social environment (Shea et al., 2003). Gupta et al. (2024) highlight how social media and digital platforms may inform instructors about student participation and build a sense of belonging in virtual learning environments. Blended MOOCs with collaborative tools and discussion forums can boost SP's impact on SE (Akyol & Garrison, 2013; Gupta et al., 2024). Backgrounds and experiences can affect students' SP perception and participation (Almasi & Zhu, 2023). However, the overarching theme of individual differences in engagement and the role of digital platforms might imply the need to consider these factors. Educational practitioners and instructional designers could promote online discussions and group activities to boost social presence while also addressing other engagement factors.

Learning presence and students' engagement

H4: The students' learning presence (LP) will positively impact their engagement (SE) in the blended MOOC system.

Learning presence (LP), a composite of self-efficacy and other cognitive, behavioural and emotional characteristics that help online learners self- and co-regulate, significantly affects their engagement (SE) in blended MOOCs.

The statistics for the relationship between LP and SE are β = 0.350, t = 13.04 > 1.96 for α = 0.05, p = 0.000, f2 = 0.136, CI (0.299, 0.403).

The association between LP and SE is substantially supported by the empirical evidence, with a significant path coefficient of β = 0.350 (p < 0.001). With f2 = 0.136, LP is the strongest predictor of SE among the RCoI presences in this study, although the effect size remains small by Cohen's conventions. This result indicates that as students' learning presence increases, their engagement in the blended MOOC system also tends to increase, in line with studies showing that LP is crucial to SE in blended settings (Angelaina & Jimoyiannis, 2012; Popescu & Badea, 2020; Richardson & Swan, 2003; Wicks et al., 2015). The impact of LP on blended learning environments' instructional design and pedagogy goes beyond SE (Shea & Bidjerano, 2010). In blended learning, contextual factors like teaching presence and individual factors like self-regulated learning (SRL) and co-regulated learning (CoRL) affect student engagement (Liao et al., 2023). Academic self-efficacy (ASE) has also been shown to improve academic success in online learning (Wolverton et al., 2020). Self-efficacy affects students' motivation, learning strategies, and online learning engagement and achievement (Bedi, 2023; Saefudin & Yusoff, 2021; She et al., 2021).
This study shows that LP is essential in blended MOOCs. Educators can boost student engagement and help ensure blended learning success by creating an LP-friendly learning environment. By prioritising LP and its components, educators may create successful and engaging blended learning experiences for students. Given LP's significant impact on SE, the findings have important implications for blended learning instructional design and instructor practices.

Previous studies have shown that the CoI framework is essential for increasing engagement in online and blended learning environments, which these results support (Garrison & Cleveland-Innes, 2005; Garrison et al., 2010; Shea et al., 2010; Veletsianos & Kimmons, 2012). The CoI framework stresses creating a supportive, interactive learning community where students feel involved and driven to participate (Garrison et al., 2010). The results of the hypotheses show that the four presences can make students more engaged and interested in blended MOOCs. Therefore, the instructional design of blended learning courses must consider these elements to create an environment conducive to student engagement and learning.

Coefficient of determination

The coefficient of determination (R2 or R-square) is a statistical metric that quantifies the extent to which one or more independent variables account for the variability in a dependent variable in a regression model. The coefficient of determination is bounded within the range of 0 to 1. A value of 0 signifies that the model does not account for any variation observed in the response variable relative to its mean, whereas a value of 1 indicates that the model accounts for all of it. The coefficient of determination is significant because it helps assess the degree to which a regression model fits the observed data. When the value of R2 approaches 0, it indicates a weak relationship between the independent and dependent variables; when R2 approaches 1, it indicates a strong relationship.

Falk and Miller (1992) proposed a minimum threshold of 0.10 for the coefficient of determination (R2). Cohen (1988) proposed R2 values to assess the strength of endogenous latent variables, categorising them as substantial (0.26), moderate (0.13), and weak (0.02). Additionally, Chin (1998) proposed that the R2 values for endogenous latent variables can be categorised as follows: 0.67 (substantial), 0.33 (moderate), and 0.19 (weak). The R-square score for the model is 0.568, which shows a moderate positive link between the four presences of CoI and students' engagement in blended MOOCs (Chin, 1998). This result indicates that when the four presences of CoI are present, students are more likely to be engaged in the course. The R2 of 0.568 indicates that the four presences of CoI explain 56.8% of the variance in how engaged students are in blended MOOCs, meaning they are essential factors affecting engagement. The path coefficient of SP (0.082) suggests a weaker relationship than the other three presences, while the highest path coefficient belongs to LP (0.350), indicating a strong relationship. The result suggests that LP is the most influential of the four presences in promoting engagement in blended MOOCs. The high path coefficient for LP shows that a supportive environment that makes it easy for learners to self-regulate their learning is critical for getting students involved in blended MOOCs. CP and TP, in turn, can be cultivated by supporting learners through instruction and facilitation.
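Chin's (1998) R2 benchmarks can be written as a tiny, illustrative classifier; applied to this model's R2 of 0.568 it returns the moderate label used above. This helper is a sketch for the reader, not part of the original analysis.

```python
def r2_strength(r2):
    # Chin's (1998) benchmarks for endogenous latent variables:
    # 0.67 substantial, 0.33 moderate, 0.19 weak
    if r2 >= 0.67:
        return "substantial"
    if r2 >= 0.33:
        return "moderate"
    if r2 >= 0.19:
        return "weak"
    return "below the weak threshold"

print(r2_strength(0.568))  # → moderate
```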

Q-squared validation

Predictive relevance, denoted by Q2 or Q-squared, is another relevant indicator of the validity of the significant paths in the model. Predictive relevance scores of 0.02, 0.15, and 0.35 indicate low, medium, and high relevance, respectively, as stated by Hair et al. (2021a, 2021b, 2021c). A Q2 value greater than zero indicates that the model has predictive relevance for the corresponding endogenous construct, which in this study is students' engagement. The present study employs the Q-squared (Q2) validation technique to ascertain the significance of paths within the proposed model. Table 11 shows that the Q2 value for learning presence is 0.347, indicating that the model has predictive relevance for this construct. The Q2 values for TP, CP, and SP are all 0, indicating that the model does not have predictive relevance for these constructs. This implies that the model is better at predicting learning presence than teaching, cognitive, or social presence.

Table 11 Q-squared values

This interpretation of the Q2 values suggests that the model is better at predicting learning presence than teaching presence, cognitive presence, or social presence; that is, the model has more predictive relevance for learning presence than for the other three constructs. In practical terms, this could mean that interventions aimed at improving learning presence may be more effective at increasing student engagement than interventions aimed at enhancing teaching, cognitive, or social presence. However, it is important to note that this is just one interpretation of the results, and further analysis may be needed to draw more definitive conclusions.
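For intuition, the logic behind Q2 can be sketched as one minus the ratio of the model's squared prediction errors to those of a mean-only benchmark; a positive value signals predictive relevance. The numbers below are hypothetical, not drawn from Table 11.

```python
def q_squared(observed, predicted, baseline_mean):
    # Q² = 1 - SSE/SSO: compares the model's squared prediction errors
    # against those of simply predicting the mean; Q² > 0 implies
    # predictive relevance for the endogenous construct
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sso = sum((o - baseline_mean) ** 2 for o in observed)
    return 1.0 - sse / sso

# Hypothetical observed vs. model-predicted scores
q2 = q_squared([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8], baseline_mean=2.5)
print(q2 > 0)  # → True: the model beats the mean-only benchmark
```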

Importance-performance map analysis

The Importance-Performance Map Analysis (IPMA) is a valuable tool for decision-making because it provides a clear and concise representation of complex data and helps decision-makers identify underperforming and overperforming factors. This makes it easy to identify trade-offs between competing priorities. The IPMA produced by SmartPLS is a two-dimensional graph: the horizontal axis shows each factor's importance, and the vertical axis shows how well each factor performs.

  • "Importance" is how respondents or stakeholders value a construct or factor. It shows how significant each construct is thought to be. Scores range from 0 to 1, with higher scores indicating that respondents value the construct more (Ringle & Sarstedt, 2016).

  • "Performance" is how well each construct performs compared to what respondents expected. It shows how well or effectively each construct works. Ratings range from 0 to 100, with lower ratings indicating that the construct is not performing as well as planned (Ringle & Sarstedt, 2016).

Each factor is shown as a dot on the map, and the dot's position indicates the factor's importance and performance. To use the IPMA for decision-making, we can split the constructs into four quadrants based on their importance and performance. Following Ringle and Sarstedt (2016), we interpret the four quadrants as indicated below:

  • The constructs in the upper-right quadrant are highly important and perform well, meaning they work effectively and should be maintained.

  • The constructs in the lower-right quadrant are important but have low performance ratings, so they need improvement.

  • The constructs in the upper-left quadrant are not very important but perform highly, which means resources devoted to them could be moved elsewhere; they represent possible overkill.

  • The lower-left quadrant holds constructs that are neither important nor high-performing, meaning they can be deprioritised or dropped.

When assigning priorities, constructs that fall into the upper-right quadrant are seen as having a high priority, whilst those in the lower-left quadrant are regarded as having a low priority.

As described in Table 12, the IPMA aims to identify direct and indirect exogenous variables within the model that exhibit strong performance or significant importance with respect to the endogenous variable, students' engagement (Hair Jr et al., 2021a; Ringle & Sarstedt, 2016). The performance measure is based on how well the direct variables have contributed to engagement using the available resources within a given time frame. The importance index quantifies how important each construct is in forecasting student engagement, while the performance index quantifies the average score of each construct.

Table 12 Performance index and importance index values for Student Engagement

The findings indicate that learning presence holds the highest importance, as evidenced by its index value of 0.225. It is closely followed by cognitive presence, which has an importance index of 0.125. Teaching presence ranks third in importance, with an index value of 0.07. Lastly, social presence demonstrates the lowest level of importance, as indicated by its importance index of 0.053. This finding implies that the primary determinant of student engagement is learning presence, with cognitive, teaching, and social presence as secondary factors. Concerning performance, each of the four constructs demonstrates relatively high-performance indices. Teaching presence exhibits the highest score (81.728), followed by cognitive presence (81.553), learning presence (81.126), and social presence (79.684). This observation implies that all four constructs exhibit satisfactory performance regarding their influence on student engagement.
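The quadrant logic can be illustrated with the Table 12 indices; the cut-offs here are set at the mean importance and mean performance, which is one common convention and an assumption of this sketch rather than a choice made in the study.

```python
def ipma_quadrant(importance, performance, imp_cut, perf_cut):
    # Quadrant labels following Ringle and Sarstedt (2016)
    if importance >= imp_cut:
        return "keep up" if performance >= perf_cut else "improve"
    return "possible overkill" if performance >= perf_cut else "low priority"

# Importance/performance indices reported in Table 12
constructs = {"LP": (0.225, 81.126), "CP": (0.125, 81.553),
              "TP": (0.070, 81.728), "SP": (0.053, 79.684)}
imp_cut = sum(i for i, _ in constructs.values()) / len(constructs)
perf_cut = sum(p for _, p in constructs.values()) / len(constructs)

for name, (imp, perf) in constructs.items():
    print(name, ipma_quadrant(imp, perf, imp_cut, perf_cut))
```

With these mean-based cut-offs, LP and CP land in the keep-up quadrant, TP in possible overkill, and SP in low priority; a different cut-off convention would shift the picture, so the labels are illustrative only.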

Implications relating to the effects of students’ engagement within blended MOOCs

The findings suggest that the four presences of the revised Community of Inquiry (RCoI) framework, namely teaching, cognitive, social, and learning presence, significantly influence student engagement in blended MOOCs. Institutions offering blended MOOCs should therefore focus on creating learning environments, and designing courses, that support and promote all four presences.

The result shows that learning presence affects students' engagement more than cognitive, teaching, and social presence, in that order. Institutions that offer blended MOOCs should prioritise creating a good learning environment that encourages peer-to-peer learning, teamwork, idea sharing, and feedback from instructors and fellow students. The results also show that the four presences of RCoI explain 56.8% of the variance in student engagement levels in blended MOOCs, as demonstrated by the R-square score of 0.568. This suggests that other factors, such as students' attentiveness, the level of academic challenge, and the intellectual work they undertake, also affect how engaged they are (Ginting, 2021). As student engagement is a complex concept, academic institutions that offer blended MOOCs should promote it broadly.

Although statistical research forms a solid basis for comprehending these patterns, it is crucial to convert these findings into practical methods for educators, instructional designers, and policymakers who seek to optimise blended MOOCs to improve student engagement. Below are some specific ways to use these results in their blended MOOCs.

For educators and professionals in instructional design

Fostering teaching presence: To cultivate teaching presence, educators should prioritise establishing effective communication channels, providing timely and constructive feedback, and creating a nurturing online learning environment. Training programmes can be developed to improve instructors' digital pedagogical abilities, ensuring they are well prepared to engage students in blended MOOCs effectively.

Enhancing cognitive presence: To enhance cognitive presence, instructional designers should integrate activities that foster critical thinking and facilitate knowledge creation. These activities may include case studies, problem-solving tasks, and assignments that undergo peer review. Course structures can promote active discourse and introspection, fostering a more profound comprehension and involvement with the subject matter.

Developing social presence: Facilitating student engagement through platforms such as discussion boards, collaborative projects, and peer feedback sessions can foster a strong feeling of community inside blended MOOCs. By integrating synchronous components, such as real-time webinars or virtual office hours, the immediacy of social interactions can be further intensified.

Supporting learning presence: Embedding strategies to cultivate students' self-regulation and self-efficacy inside the course design is crucial for fostering learning presence. These may encompass activities for setting goals, quizzes for self-assessment, and materials on efficient study techniques. Promoting student autonomy in their learning process is crucial for cultivating a strong learning presence.

For policymakers

Resource allocation: Policymakers should prioritise allocating resources towards developing blended MOOCs that facilitate interactive and captivating learning experiences. This strategy encompasses allocating resources towards digital infrastructure, providing opportunities for professional growth among educators, and researching the most effective online teaching and learning methods.

Accessibility and inclusion: Ensuring the accessibility and inclusion of blended MOOCs for a diverse student body is of utmost importance. Policy measures could be established to tackle obstacles to access. Educational institutions should explore more affordable options for internet connectivity, e.g. providing zero-rated access to MOOCs and negotiating better student internet deals with internet service providers (ISPs). Nations in the global south could consider adapting United States programmes that provide discounted internet access for educational purposes, such as the Affordable Connectivity Program (ACP) and the Federal Communications Commission's E-Rate programme. Furthermore, consideration should be given to creating culturally and linguistically inclusive material and locally relevant content. Lastly, improving the accessibility of MOOCs also means making the courses themselves more accessible through the features required by the Web Content Accessibility Guidelines (WCAG), such as captions, image descriptions, and multiple modes of learning, to meet the needs of many learners, including those with disabilities.

Quality assurance: Establishing standards for online course quality, which includes defining criteria for instructional presence, cognitive engagement, and social interaction, can guide the creation of high-quality blended MOOCs. It is necessary to develop continuous evaluation and improvement processes to sustain the effectiveness of online learning environments.

By taking into account these wider educational consequences, the results of our study can guide the creation of blended MOOCs that are both intellectually demanding and highly captivating while being easily accessible to a diverse group of learners. The objective is to utilise the distinctive capabilities of online learning to offer significant and all-encompassing educational experiences that cater to the requirements of the current varied student population.


The study contributes to the discussion on blended MOOCs by exploring students' engagement. By including the notion of learning presence in the revised Community of Inquiry (RCoI) framework, we have highlighted its crucial significance in enhancing the learning experience, with the four presences together accounting for 56.8% of the variation in student engagement. Our analysis highlights the importance of learning presence, demonstrating its greater influence on student engagement than the conventional triad of cognitive, social, and teaching presences. This finding supports a fundamental shift in perspective towards recognising learning presence as a central aspect of the CoI framework, strengthening its crucial function in promoting a comprehensive and engaging online learning environment.

The empirical results from this study emphasise the necessity for educators and instructional designers to create learning environments that facilitate individual learning, promote meaningful engagement, and offer well-organised pedagogical support (Garrison et al., 2000). Strategies such as collaborative projects, peer reviews, and interactive multimedia are crucial instruments for effectively engaging students. When applied successfully, these strategies can enhance student satisfaction and performance in blended MOOCs, thus enabling more personalised and powerful learning experiences. We support the idea, advanced by other scholars, of including learning presence within the CoI framework (Anderson, 2017; Shea & Bidjerano, 2010). By adopting this integrated perspective, educators, instructional designers, and policymakers are equipped to navigate the complexities of online education, creating experiences that align with learners' varied needs and goals (Fink, 2013). Exploring the RCoI inside blended MOOCs is not merely an academic pursuit but a collective dedication to transforming online education into a realm where any student, irrespective of their background, can flourish and attain their maximum potential (Tang, 2018).

While each presence affects engagement, their interaction implies a complex ecosystem worth further study. Future research should examine the synergistic effects of these presences across educational environments to inform the design of online and blended learning experiences. Future studies should also investigate each presence's mediating or moderating role in relation to student engagement.

Availability of data and materials

Though the paper is part of an ongoing PhD dissertation, the datasets are available from the corresponding author on reasonable request.


  • Afthanorhan, W. M. A. B. W. (2013). A comparison of partial least square structural equation modeling (PLS-SEM) and covariance based structural equation modeling (CB-SEM) for confirmatory factor analysis. International Journal of Engineering Science and Innovative Technology, 2(5), 198–205.


  • Agarwal, R. K. (2021). MOOCS: Challenges & prospects in Indian higher education. In R. Chheda & S. N. Mehta (Eds.), Management practices in digital world. London: Empyreal Publishing House.


  • Ahmad, S., & Hussain, A. (2019). Authentication of psychosomatic capability and workplace life of teachers scales by structural equation modeling. Journal of Educational Research, 22(2), 68–81.


  • Akyol, Z., & Garrison, D. R. (2013). Educational communities of inquiry: Theoretical framework, research and practice (pp. 1–347).

  • Alabbasi, D. (2022, April). Factors influencing student engagement in virtual classrooms and their impact on satisfaction. In Society for information technology & teacher education international conference (pp. 142–151). Association for the Advancement of Computing in Education (AACE).

  • Alkhalaf, S., & Nguyen, T. (2020). Exploring the factors influencing the adoption of blended learning at higher education institutions: A study of instructors’ perspectives. Education and Information Technologies, 25(2), 1157–1178.


  • Almutairi, F., & White, S. (2018). How to measure student engagement in the context of blended-MOOC. Interactive Technology and Smart Education, 15(3), 262–278.


  • Anderson, T. (2017). How communities of inquiry drive teaching and learning in the digital age. Contact North, 1–16.

  • Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–17.


  • Andrade, H. L., Brookhart, S. M., & Yu, E. C. (2021, December). Classroom assessment as co-regulated learning: A systematic review. In Frontiers in education (Vol. 6, p. 751168). Frontiers.

  • Angelaina, S., & Jimoyiannis, A. (2012). Analysing students’ engagement and learning presence in an educational blog community. Educational Media International, 49(3), 183–200.


  • Arbaugh, J. B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Networks, 11(1), 73–85.


  • Azila-Gbettor, E. M., Mensah, C., Abiemo, M. K., & Bokor, M. (2021). Predicting student engagement from self-efficacy and autonomous motivation: A cross-sectional study. Cogent Education, 8(1), 1942638.


  • Bedi, A. (2023). Keep learning: Student engagement in an online environment. Online Learning, 27(2), 119–136.


  • Bergdahl, N., Nouri, J., & Fors, U. (2020). Disengagement, engagement and digital skills in technology-enhanced learning. Education and Information Technologies, 25, 957–983.


  • Bradley, R. L., Browne, B. L., & Kelley, H. M. (2017). Examining the influence of self-efficacy and self-regulation in online learning. College Student Journal, 51(4), 518–530.


  • Bruff, D. O., Fisher, D. H., McEwen, K. E., & Smith, B. E. (2013). Wrapping a MOOC: Student perceptions of an experiment in blended learning. Journal of Online Learning and Teaching, 9(2), 187–199.


  • Caskurlu, S., Maeda, Y., Richardson, J. C., & Lv, J. (2020). A meta-analysis addressing the relationship between teaching presence and students’ satisfaction and learning. Computers & Education, 157, 103966.


  • Chi, X. (2023). The influence of presence types on learning engagement in a MOOC: The role of autonomous motivation and grit. Psychology Research and Behavior Management, 5169–5181.

  • Chin, W. W. (1998). The partial least squares approach to structural equation modeling. Modern Methods for Business Research, 295(2), 295–336.


  • Cho, M. H., & Shen, D. (2013). Self-regulation in online learning. Distance Education, 34(3), 290–301.


  • Choo, J., Bakir, N., Scagnoli, N. I., Ju, B., & Tong, X. (2020). Using the Community of Inquiry framework to understand students’ learning experience in online undergraduate business courses. TechTrends, 64, 172–181.


  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Lawrence Erlbaum Associates.


  • Contact North. (2016, March). Five ways MOOCs are influencing teaching and learning. Ontario's Distance Education and Training Network.

  • Cui, X., Qian, J., Garshasbi, S., Zhang, S., Sun, G., Wang, J., et al. (2024). Enhancing learning effectiveness in livestream teaching: Investigating the impact of teaching, social, and cognitive presences through a community of inquiry lens. STEM Education, 4(2), 82–105.


  • Damm, C. A. (2016). Applying a community of inquiry instrument to measure student engagement in large online courses. Current Issues in Emerging eLearning, 3(1), 9.


  • Deci, E. L., & Ryan, R. M. (2000). The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11(4), 227–268.


  • Edumadze, J. K. E., Otchere Darko, S., Mensah, S., Bentil, D., & Edumadze, G. E. (2022). SWOT analysis of blended MOOC from Ghanaian university instructors' perspectives. Shanlax International Journal of Arts, Science and Humanities, 10(1), 67–79.


  • De Freitas, S. I., Morgan, J., & Gibson, D. (2015). Will MOOCs transform learning and teaching in higher education? Engagement and course retention in online learning provision. British Journal of Educational Technology, 46(3), 455–471.


  • Diamantopoulos, A., Sarstedt, M., Fuchs, C., Wilczynski, P., & Kaiser, S. (2012). Guidelines for choosing between multi-item and single-item scales for construct measurement: A predictive validity perspective. Journal of the Academy of Marketing Science, 40, 434–449.


  • Dijkstra, T. K., & Henseler, J. (2015). Consistent partial least squares path modeling. MIS Quarterly, 39(2), 297–316.


  • Dixson, M. (2015a). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning, 19(4), 143–158.


  • Dixson, M. D. (2015b). Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning, 19(4), n4.


  • Doo, M. Y., & Bonk, C. J. (2020). The effects of self-efficacy, self-regulation and social presence on learning engagement in a large university class using flipped Learning. Journal of Computer Assisted Learning, 36(6), 997–1010.


  • Falk, R. F., & Miller, N. B. (1992). A primer for soft modeling. University of Akron Press.


  • Feldman, K. (2018, November 7). Variance Inflation Factor (VIF). Isixsigma.

  • Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to designing college courses. Wiley.


  • Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.


  • Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109.


  • Garrison, D. R. (2015). Thinking collaboratively: Learning in a community of inquiry. Routledge.


  • Garrison, D. R. (2017). E-Learning in the 21st century: A community of inquiry framework for research and practice (3rd ed.). Routledge/Taylor and Francis.


  • Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. The American Journal of Distance Education, 19(3), 133–148.


  • Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines. John Wiley & Sons.


  • Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical Inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87–105.


  • Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23.


  • Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education, 1(4), 113–127.


  • Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1–2), 5–9.


  • Garrison, D. R. (2022, August 8). Shared metacognition and regulation response. The Community of Inquiry: Editorials

  • Garrison, D. R., & Akyol, Z. (2013). The community of inquiry theoretical framework. In Handbook of distance education (pp. 122–138). Routledge.

  • Ginting, D. (2021). Student engagement and factors affecting active learning in English language teaching. VELES (Voices of English Language Education Society), 5(2), 215–228.


  • Gray, J., & Diloreto, M. (2016). The effects of student engagement, student satisfaction, and perceived learning in online learning environments. NCPEA International Journal of Educational Leadership Preparation, 11(1), 98–119.


  • Groves, R. (2012, September 21). Our moment in time. Georgetown University.

  • Gupta, D., Khan, A. A., Kumar, A., Baghel, M. S., & Tiwar, A. (2024). Socially connected learning harnessing digital platforms for educational engagement. In Navigating innovative technologies and intelligent systems in modern education (pp. 210–228). IGI Global.

  • Hair, J. F., Jr., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2021a). A primer on partial least squares structural equation modeling (PLS-SEM). Sage publications.


  • Hair, J. F., Jr., Hult, G. T. M., Ringle, C. M., Sarstedt, M., Danks, N. P., & Ray, S. (2021b). Partial Least Squares Structural Equation Modeling (PLS-SEM) using R, classroom companion: Business. Springer Nature.


  • Hair, J. F., Jr., Matthews, L. M., Matthews, R. L., & Sarstedt, M. (2017). PLS-SEM or CB-SEM: Updated guidelines on which method to use. International Journal of Multivariate Data Analysis, 1(2), 107–123.


  • Hair, J. F., Jr., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2021c). A primer on partial least squares structural equation modelling (PLS-SEM). Sage Publications.


  • Hair, J. F., Risher, J. J., Sarstedt, M., & Ringle, C. M. (2019). When to use and how to report the results of PLS-SEM. European Business Review, 31(1), 2–24.


  • Haw, L. H., Sharif, S. B., & Han, C. G. K. (2022). Predictors of student engagement in science learning: The role of science laboratory learning environment and science learning motivation. Asia Pacific Journal of Educators and Education.


  • He, J., Liu, Z., & Kong, X. (2023, September). A novel link prediction approach for MOOC forum thread recommendation using personalized pagerank and machine learning. In 2023 3rd international conference on educational technology (ICET) (pp. 37–41). IEEE.

  • Helland, I. S., Sæbø, S., Almøy, T., & Rimal, R. (2018). Model and estimators for partial least squares regression. Journal of Chemometrics, 32(9), e3044.


  • Henseler, J., Ringle, C. M., & Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. Journal of the Academy of Marketing Science, 43, 115–135.


  • Hodges, T. (2018, October 25). School engagement is more than just talk. Gallup

  • Holotescu, C., Grosseck, G., Crețu, V., & Naaji, A. (2014). Integrating MOOCs in blended courses. In 10th international scientific conference eLearning and software for education. Bucharest.

  • Hulland, J. (1999). Use of partial least squares (PLS) in strategic management research: A review of four recent studies. Strategic Management Journal, 20(2), 195–204.


  • Hussain, M., Zhu, W., Zhang, W., & Abidi, S. M. R. (2018). Student engagement predictions in an e-learning system and their impact on student course assessment scores. Computational Intelligence and Neuroscience, 2018.

  • Irene, B. N. O. (2019). Technopreneurship: A discursive analysis of the impact of technology on the success of women entrepreneurs in South Africa. In Digital entrepreneurship in Sub-Saharan Africa: Challenges, opportunities and prospects (pp. 147–173).

  • Ituma, A. (2011). An evaluation of students’ perceptions and engagement with e-learning components in a campus-based university. Active Learning in Higher Education, 12(1), 57–68.


  • Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learning analytics to unveil learning strategies in a flipped classroom. The Internet and Higher Education, 33, 74–85.


  • Jimoyiannis, A., & Tsiotakis, P. (2017). Beyond students’ perceptions: Investigating learning presence in an educational blogging community. Journal of Applied Research in Higher Education., 9(1), 129–146.


  • Joksimović, S., Poquet, O., Kovanović, V., Dowell, N., Mills, C., Gašević, D., Dawson, S., Graesser, A. C., & Brooks, C. (2018). How do we model learning at scale? A systematic review of research on MOOCs. Review of Educational Research, 88(1), 43–86.


  • Jöreskog, K. G., & Sörbom, D. (1995). LISREL 8: Structural equation modeling with the SIMPLIS command language. Scientific Software International.


  • Jung, Y., & Lee, J. (2018). Learning engagement and persistence in massive open online courses (MOOCS). Computers & Education, 122, 9–22.


  • Kang, M. H., Park, J. U., & Shin, S. Y. (2007). Developing a cognitive presence scale for measuring students' involvement during e-learning process. In C. Montgomerie & J. Seale (Eds.), Proceedings of world conference on educational multimedia, hypermedia and telecommunications 2007 (pp. 2823–2828). Association for the Advancement of Computing in Education (AACE).

  • Kang, M., Park, J. U., & Shin, S. (2007, June). Developing a cognitive presence scale for measuring students' involvement during e-learning process. In EdMedia+ innovate learning (pp. 2823–2828). Association for the Advancement of Computing in Education (AACE).

  • King, R. B. (2015). Sense of relatedness boosts engagement, performance, and well-being: A latent growth model study. Contemporary Educational Psychology, 42, 26–38.


  • Kloos, C. D., Muñoz-Merino, P. J., Alario-Hoyos, C., Ayres, I. E., & Fernández-Panadero, C. (2015). Mixing and blending MOOC technologies with face-to-face pedagogies. In Proceedings of the IEEE global engineering education conference (EDUCON), Tallin, Estonia (pp. 967–971).

  • Kock, N. (2015). Common method bias in PLS-SEM: A full collinearity assessment approach. International Journal of e-Collaboration (ijec), 11(4), 1–10.


  • Koller, D., Ng, A., Do, C., & Chen, Z. (2013). Retention and intention in massive open online courses: In depth. Educause Review, 48(3), 62–63.


  • Koutsakas, P., Karagiannidis, C., Politis, P., & Karasavvidis, I. (2020). A computer programming hybrid MOOC for Greek secondary education. Smart Learning Environments, 7, 7.


  • Kozan, K., & Caskurlu, S. (2018). On the Nth presence for the Community of Inquiry framework. Computers & Education, 122, 104–118.


  • Kruse, A., & Pongratz, H. (2017). Digital change: How MOOCs transform the educational landscape. In H. Ellermann, P. Kreutter, & W. Messner (Eds.), The Palgrave handbook of managing continuous business transformation (pp. 353–373). Springer.


  • Lambert, J. L., & Fisher, J. L. (2013). Community of inquiry framework: Establishing community in an online course. Journal of Interactive Online Learning, 12(1), 1–16.


  • Lee, S. M. (2014). The relationships between higher order thinking skills, cognitive density, and social presence in online learning. The Internet and Higher Education, 21, 41–52.


  • Liao, H., Zhang, Q., Yang, L., & Fei, Y. (2023). Investigating relationships among regulated learning, teaching presence and student engagement in blended learning: An experience sampling analysis. Education and Information Technologies, 28(10), 12997–13025.


  • Lin, K. Y., Wu, Y. T., Hsu, Y. T., & Williams, P. J. (2021). Effects of infusing the engineering design process into STEM project-based learning to develop preservice technology teachers’ engineering design thinking. International Journal of STEM Education, 8(1), 1–15.


  • Lindner, T., Puck, J., & Verbeke, A. (2020). Misconceptions about multicollinearity in international business research: Identification, consequences, and remedies. Journal of International Business Studies, 51(3), 283–298.


  • Linnenbrink, E. A., & Pintrich, P. R. (2003a). The role of self-efficacy beliefs in student engagement and learning in the classroom. Reading & Writing Quarterly, 19(2), 119–137.


  • Linnenbrink, E. A., & Pintrich, P. R. (2003b). The role of self-efficacy beliefs in student engagement and learning in the classroom. Reading & Writing Quarterly, 19(2), 119–137.


  • Littler, M. (2024). Social, Cognitive, and Teaching Presence as Predictors of Online Student Engagement Among MSN Students [Ph.D. thesis, Walden University]. Walden Dissertations and Doctoral Studies Collection.

  • Ma, Y., Zuo, M., Yan, Y., Wang, K., & Luo, H. (2022). How do K-12 students’ perceptions of online learning environments affect their online learning engagement? Evidence from China’s COVID-19 school closure period. Sustainability, 14(23), 15691.


  • Maphosa, V., & Maphosa, M. (2023). Opportunities and challenges of adopting MOOCs in Africa: A systematic literature review. In S. Goundar (Ed.). Massive open online courses-current practice and future trends. IntechOpen.

  • Maranna, S., Willison, J., Joksimovic, S., Parange, N., & Costabile, M. (2022). Factors that influence cognitive presence: A scoping review. Australasian Journal of Educational Technology, 38(4), 95–111.


  • McNutt, M. (2013). Bricks and MOOCs. Science, 342(6157), 402.


  • Miao, J., & Ma, L. (2022). Students’ online interaction, self-regulation, and learning engagement in higher education: The importance of social presence to online learning. Frontiers in Psychology, 13, 815220.


  • Montgomery, A. P., Hayward, D. V., Dunn, W., Carbonaro, M., & Amrhein, C. G. (2015). Blending for student engagement: Lessons learned for MOOCs and beyond. Australasian Journal of Educational Technology, 31(6), 657.


  • Moore, R. L., & Miller, C. N. (2022). Fostering cognitive presence in online courses: A systematic review (2008–2020). Online Learning, 26(1), 130–149.


  • Netemeyer, R. G., Bearden, W. O., & Sharma, S. (2003). Scaling procedures: Issues and applications. London: SAGE.


  • Nikolopoulou, K. (2022, September 2). What is discriminant validity? Definition & example. Scribbr.

  • Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). McGraw-Hill.


  • Onah, D. F., Pang, E. L., & Sinclair, J. E. (2022). An investigation of self-regulated learning in a novel MOOC platform. Journal of Computing in Higher Education, 1–34.

  • OpenupEd. (2015). Definition massive open online courses. Heerlen: EADTU.

  • Oyarzun, B., & Morrison, G. (2013). Cooperative learning effects on achievement and community of inquiry in online education. The Quarterly Review of Distance Education, 14(4), 181–194.


  • Pakula, A. (2024). The role of tutor in massive social language learning: A case study of an academic Italian MOOC. In 18th international technology, education and development conference (pp. 2195–2203). Valencia, Spain.

  • Popescu, E., & Badea, G. (2020). Exploring a community of inquiry supported by a social media-based learning environment. Educational Technology & Society, 23(2), 61–76.


  • Prasetyo, A., Tamrin, A. G., & Estriyanto, Y. (2022). A successful model of Microsoft Teams online learning platform in vocational high school. FWU Journal of Social Sciences, 16(2).

  • Qaffas, A. A., Kaabi, K., Shadiev, R., & Essalmi, F. (2020). Towards an optimal personalization strategy in MOOCs. Smart Learning Environments, 7, 14.


  • Rahimi, A. R. (2024). A tri-phenomenon perspective to mitigate MOOCs’ high dropout rates: the role of technical, pedagogical, and contextual factors on language learners’ L2 motivational selves, and learning approaches to MOOC. Smart Learning Environments.


  • Ranjan, P. (2020). Exploring the Models of Designing Blended & Online Learning Courses for Adoption in Regular Teacher Education Course. In Voices of teachers and teacher educators IX. National Council of Educational Research and Training (NCERT).

  • Ravinder, E. B., & Saraswathi, A. B. (2020). Literature review of Cronbach's alpha coefficient (α) and McDonald's omega coefficient (ω). European Journal of Molecular & Clinical Medicine, 7(6), 2943–2949.


  • Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68–88.


  • Ringle, C. M., & Sarstedt, M. (2016). Gain more insight from your PLS-SEM results: The importance-performance map analysis. Industrial Management & Data Systems, 116(9), 1865–1886.


  • Ringle, C. M., Wende, S., & Becker, J. M. (2022). SmartPLS 4. Oststeinbek: SmartPLS GmbH.

  • Saefudin, W., & Yusoff, S. H. M. (2021). Self-efficacy and student engagement in online learning during pandemic. Global Journal of Educational Research and Management, 1(4), 219–231.


  • Sarstedt, M., Ringle, C. M., & Hair, J. F. (2021). Partial least squares structural equation modeling. In Handbook of market research (pp. 587–632). Springer International Publishing.

  • Saunders, M. (2014). Research methods for business students (6th ed., Greek language edition). Pearson Education.

  • Schunk, D. H., & Mullen, C. A. (2012). Self-efficacy as an engaged learner. In Handbook of research on student engagement (pp. 219–235). Springer US.

  • She, L., Ma, L., Jan, A., Sharif Nia, H., & Rahmatpour, P. (2021). Online learning satisfaction during COVID-19 pandemic among Chinese university students: The serial mediation model. Frontiers in Psychology, 12, 743936.


  • Shea, P. (2010). Online learning presence. In Proceeding of the European Distance and e-learning network (EDEN) annual conference. Valencia, Spain.

  • Shea, P., & Bidjerano, T. (2009). Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education. Computers & Education, 52.

  • Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a community of inquiry in online and blended learning environments. Computers & Education, 55(4), 1721–1731.


  • Shea, P., & Bidjerano, T. (2012). Learning presence as a moderator in the community of inquiry model. Computers & Education, 59(2), 316–326.


  • Shea, P., Fredericksen, E. E., Pickett, A. M., & Pelz, W. (2003). Student satisfaction and Reported learning in the SUNY Learning Network. In T. Duffy & J. Kirkley (Eds.), Learner-centred theory and practice in distance education. Lawrence Erlbaum.


  • Shea, P., Hayes, S., & Vickers, J. (2010). Online instructional effort measured through the lens of teaching presence in the community of inquiry framework: A re-examination of measures and approach. International Review of Research in Open and Distributed Learning, 11(3), 127–154.


  • Shivdas, A., Menon, D. G., & Nair, C. S. (2020). Antecedents of acceptance and use of a digital library system: Experience from a Tier 3 Indian city. The Electronic Library, 38(1), 170–185.


  • Sukor, R., Ayub, A. F. M., Ab, N. K. M. A. R., & Halim, F. A. (2021). Relationship between students’ engagement with academic performance among non-food science students enrolled in food science course. Journal of Turkish Science Education, 18(4), 638–648.


  • Tang, H. (2018). Exploring self-regulated learner profiles in MOOCs: A comparative study. The Pennsylvania State University.


  • Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach’s alpha. International Journal of Medical Education, 2, 53.


  • TED. (2014, January 27). Anant Agarwal: Why massively open online courses (still) matter [Video]. YouTube.

  • Ulrich, C., & Nedelcu, A. (2015). MOOCs in our university: Hopes and worries. Procedia-Social and Behavioral Sciences, 180, 1541–1547.


  • UNICEF Office of Innovation. (2022, May 10). Can tech solve the global education crisis? UNICEF.

  • Veletsianos, G., & Kimmons, R. (2012). Networked participatory scholarship: Emergent techno-cultural pressures toward open and digital scholarship in online networks. Computers & Education, 58(2), 766–774.


  • Vezne, R., Yildiz Durak, H., & Atman Uslu, N. (2023). Online learning in higher education: Examining the predictors of students’ online engagement. Education and Information Technologies, 28(2), 1865–1889.


  • Vieira, D., Mutize, T., & Jaime Roser Chinchilla, J. B. (2020, December 21). Understanding access to higher education in the last two decades. UNESCO.

  • Walters, H. (2014, January 27). We need to change everything on campus. Ideas.TED.com.

  • Wang, M. T., & Degol, J. (2014). Staying engaged: Knowledge and research needs in student engagement. Child Development Perspectives, 8(3), 137–143.


  • Wang, X., Yang, D., Wen, M., Koedinger, K., & Rosé, C. P. (2015). Investigating how student's cognitive behavior in MOOC discussion forums affect learning gains. In Proceedings of the 8th international conference on educational data mining (EDM 2015), June 26–29, 2015 (pp. 226–233). International Educational Data Mining Society (IEDMS).

  • Wertz, R. E. (2022). Learning presence within the Community of Inquiry framework: An alternative measurement survey for a four-factor model. The Internet and Higher Education, 52, 100832.


  • Whitehill J., Williams J. J., Lopez G., Coleman C. A., Reich J. (2015). Beyond prediction: First steps toward automatic intervention in MOOC student stopout. In Proceedings of the 8th international conference on educational data mining (EDM’15), June 26–29, 2015 (pp. 171–178). International Educational Data Mining Society (IEDMS).

  • Wicks, D., Craft, B. B., Lee, D., Lumpe, A., Henrikson, R., Baliram, N., Bian, X., Mehlberg, S., & Wicks, K. (2015). An evaluation of low versus high collaboration in online learning. Online Learning, 19(4), n4.


  • Wolverton, C. C., Hollier, B. N. G., & Lanier, P. A. (2020). The impact of computer self efficacy on student engagement and group satisfaction in online business courses. Electronic Journal of e-Learning, 18(2), 175–188.


  • Wu, M. J., Zhao, K., & Fils-Aime, F. (2022). Response rates of online surveys in published research: A meta-analysis. Computers in Human Behavior Reports, 7, 100206


  • Yeboah, D. (2020). Predicting acceptance of WhatsApp as learning-support tool by higher distance education students in Ghana. Unpublished Ph.D. Thesis. Texila American University, Guyana.

  • Yeboah, D., & Nyagorme, P. (2020). Validation of non-linear relationships-based UTAUT model on higher distance education students’ acceptance of Whatsapp for supporting learning. Texila International Journal of Academic Research, 7(2), 27–39.


  • Yousef, A. M., Chatti, M., Schroeder, U., & Wosnitza, M. (2015). A usability evaluation of a blended MOOC environment: An experimental case study. International Review of Research in Open and Distributed Learning, 16(2), 69–93.


  • Yunusa, A. A., Umar, I. N., & Bervell, B. (2021). Massive open online courses (MOOCs) in the Sub-Saharan African higher education landscape: A bibliometric review. MOOC (Massive Open Online Courses), 1–25.

  • Yusof, A., Atan, N. A., Harun, J., & Doulatabadi, M. (2017). Understanding learners’ persistence and engagement in Massive Open Online Courses: A critical review for Universiti Teknologi Malaysia. Man in India, 97(12), 147–157.


  • Zakaria, M., Awang, S., & Rahman, R. A. (2019). Are MOOCs in blended learning more effective than traditional classrooms for undergraduate learners. Universal Journal of Educational Research, 7(11), 2417–2424.


  • Zhang, H., Lin, L., Zhan, Y., & Ren, Y. (2016). The impact of teaching presence on online engagement behaviors. Journal of Educational Computing Research, 54(7), 887–900.


  • Zhang, Y. (2013). Benefiting from MOOC. In World conference on educational multimedia, hypermedia and telecommunications (pp. 1372–1377).

  • Zhang, Y. (2022). The effect of educational technology on EFL learners’ self-efficacy. Frontiers in Psychology, 13, 881301.


  • Zimmerman, B. J. (1986). Becoming a self-regulated learner: Which are the key subprocesses? Contemporary Educational Psychology, 11(4), 307–313.


  • Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning. Journal of Educational Psychology, 81(3), 329.


  • Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In Handbook of self-regulation (pp. 13–39). Academic Press.

Acknowledgements


I express my appreciation to (1) Dr. Fadiyah Almutairi for the question items on the Blended MOOCs Engagement Model and (2) Prof. Ruth E. H. Wertz for the question items on the community of inquiry. These items were used as part of the survey instrument for the study.


Availability of data and materials

Not available.

Author information

Authors and Affiliations



Contributions

JKEE wrote the PhD thesis from which this paper was extracted. DWG supervised the thesis and approved the final manuscript.

Corresponding author

Correspondence to John Kwame Eduafo Edumadze.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Edumadze, J.K.E., Govender, D.W. The community of inquiry as a tool for measuring student engagement in blended massive open online courses (MOOCs): a case study of university students in a developing country. Smart Learn. Environ. 11, 19 (2024).

