Designing and psychometric analysis of an instrument to assess learning process in a virtual environment

Abstract

Background

Today, methods that enable students to benefit fully from online programs and to learn independently and in a self-directed manner are of critical importance. Many scales have been developed to measure self-directed learning in the physical classroom. This study was conducted to design and assess the psychometric properties of an instrument to assess the learning process in a virtual environment.

Materials and methods

A questionnaire for assessing the learning process in a virtual environment was developed in six steps. The process began with a systematic search for related articles. A qualitative study was then conducted to identify self-directed learning strategies and processes in virtual environments. The identified strategies were then compared with those from the literature review, and the scale items were developed accordingly. Expert validation, exploratory factor analysis, and reliability analysis were conducted to ensure questionnaire validity and reliability. This study included online postgraduate students from Iranian medical science universities in 2019.

Results

The scale consisted of 5 factors and 44 items. In the exploratory factor analysis, the five subscales explained 80% of the total variance. Cronbach’s alpha was 0.91 for the total scale. The intraclass correlation coefficient between the test and retest was 0.77.

Conclusion

The questionnaire designed to assess the learning process in a virtual environment for postgraduate virtual students has acceptable psychometric properties, including internal reliability and construct validity.

Introduction

Online courses permit learning anytime and anywhere at a personalized pace (Anderson et al., 2020). Typically, learners registered in an online course must make decisions about their own learning activities in order to succeed academically. Supporting self-directed learning (SDL) strategies is therefore considered essential (Safikani et al., 2021).

SDL is considered the main element of lifelong learning and is a critical competency in the curriculum (Charokar & Dulloo, 2022). Several definitions of SDL have been proposed in the literature (Mocker & Spear, 1982; Van Woezik et al., 2019; Zeb et al., 2018). Knowles defined SDL as “a process in which a learner takes initiative, diagnoses their learning needs, creates learning goals, identifies resources for learning, applies appropriate learning strategies, and evaluates their learning outcomes” (Knowles, 1975).

SDL theory is an established framework that can help identify the components of a personalized system that supports learners in developing the abilities to manage their learning activities and control their achievement (Giuseffi, 2021). SDL concerns students' intellectual effort, in which they intentionally manage themselves to acquire knowledge and solve problems (Curran et al., 2019).

Numerous previous studies on factors affecting SDL have demonstrated that SDL does not work without help and is not just an individual experience. SDL is a dynamic process that has functional relationships with a wide variety of other educational aspects, such as the social settings in which learning takes place, the metacognitive behavior of students, and the didactical aspect of communication between teaching and learning (Giddings, 2015; Jennett, 1992). The distinctive characteristics of online learning, such as flexibility, personalization, access to resources, and interactive learning, influence students’ choices of SDL practices by allowing them to access a wide range of information resources, discover and appraise information, pursue their interests, and interact with teachers and other trainers (Song & Hill, 2007; Yildirim et al., 2023).

Moreover, the flexible structure of these environments allows for the fulfilment of students' critical needs and creates an opportunity for them to have more control over their learning paths (Bosch & Pool, 2019; Gerard et al., 2022). Accordingly, a properly designed online learning environment that provides a flexible structure, opportunities for collaboration, and authority over learning activities can give students the chance to develop SDL skills effectively (Teng et al., 2019). Self-directed online learners are typically more actively involved in learning exercises, in particular completing classroom assignments, engaging with the online learning material, planning and evaluating the main stages of learning, accessing online content, and navigating the online learning platforms (Caravello et al., 2015; Fitzgerald et al., 2022). Nevertheless, many learners who engage in online courses encounter frustration and failure because they are ill-equipped for the challenging and isolated learning process (Baker & Moyer, 2019).

There are many studies on SDL and several scale-development studies on assessing SDL skills such as Guglielmino’s Self-Directed Learning Readiness Scale (SDLRS) (Guglielmino, 1977), Oddi Continuing Learning Inventory (OCLI) (Oddi et al., 1990), SDL Readiness Scale for Nursing Education developed by Fisher et al. (2001), and Self-Rating Scale of SDL (SRSSDL) proposed by Williamson (2007).

Guglielmino’s scale has been used in several studies to measure SDL readiness. As a result of its high validity, it is widely accepted. This scale is one of the most popular instruments for measuring SDL in educational research. The Oddi Continuing Learning Inventory (OCLI) was designed in 1990. This inventory uses 24 questions to assess self-directed students’ attributes. Fisher et al. (2001) introduced the SDLRS for nursing education as an alternative to Guglielmino’s SDLRS. This scale assesses SDL attitude, skills, and personal attributes. Williamson introduced a scale for measuring self-direction in nursing education. The validity and reliability of this scale in nursing education have been confirmed.

Defining and measuring SDL can be challenging, and there is no consensus as to what it entails. Researchers must clarify their SDL definitions. Putting theory into practice and implementing self-directed learning can also be challenging for students who have been taught in a more traditional, teacher-centered manner. There is no agreed-upon standard measure or assessment tool for self-directed learning because it is a complex and multifaceted construct. The use of technology to support SDL also presents challenges, including information overload and the need to develop digital literacy in learners (Abd-El-Fattah, 2010; Garrison, 2003; Song & Hill, 2007).

The measurement of SDL in online learning is not well understood. SDL processes in virtual learning environments will be better measured if an appropriate scale is developed. Furthermore, research should be conducted to determine how SDL practices can be adapted to online learning. This study was conducted to design and perform a psychometric analysis of an instrument to assess SDL in a virtual environment.

Methods

Participants

Participants of this study were postgraduate virtual students of medical science universities in Iran in 2019.

Design

A comprehensive and rigorous process was used to develop the SDL in a Virtual Environment Questionnaire, employing widely recognized methods as outlined in 'Developing Questionnaires for Educational Research: AMEE Guide No. 87'. This modified process involved conducting a literature review and a qualitative study, synthesizing the findings of both, developing a set of questionnaire items, obtaining expert validation, and conducting pilot testing.

Questionnaire development

1. Literature review

A systematic search was conducted across four databases (Google Scholar, Scopus, PsycINFO, and MEDLINE) for articles published from 1989 to 2019, using the keywords “self-directed learning” and “virtual environment.” Two independent researchers screened the articles for potential inclusion based on pre-determined inclusion criteria, including English language, publication type, relevance to self-directed learning, and the use of an instrument to evaluate it. The search strategy and screening process were rigorous and aimed to identify comprehensive literature on the topic.

2. Qualitative study

In this study, we employed a conventional content analysis approach to investigate self-directed learning in a virtual environment. Our participants were 14 virtual students from medical sciences universities in Iran, selected using a purposeful sampling method. Data were collected through semi-structured interviews, which continued until data saturation was achieved.

During the interviews, the virtual students were asked a series of questions related to their experiences with self-directed learning in the virtual environment. These questions included, “What factors influenced your learning in the virtual environment?” “Could you please describe your experiences with independent learning in e-learning?” and, “What kinds of activities did you undertake during your independent learning in a virtual educational setting?” This qualitative study used the conventional content analysis approach proposed by Graneheim and Lundman for data analysis. Themes and categories were derived from the participants' text data, without reliance on pre-existing theoretical frameworks. The researchers transcribed the interviews verbatim and analyzed them line by line, identifying meaningful units throughout the process. Each meaningful unit was assigned a code, and the codes and data were continuously compared and grouped based on their similarities to form initial categories. These initial categories were then further analyzed and classified into more abstract categories. In sum, this approach enabled the researchers to gain a comprehensive understanding of the participants' self-directed learning experiences in a virtual environment. To enhance the trustworthiness of the data collected and the findings obtained, we applied four criteria: credibility, confirmability, dependability, and transferability.

3. Synthesizing the literature review and interviews

The strategies explained in the qualitative study were compared with the self-directed learning strategies identified in a literature review.

4. Development of items

To develop the items for the Self-directed Learning Process in a Virtual Environment Questionnaire, we recruited a panel of three experts in the field of medical education to serve as advisors and guide the drafting process.

5. Expert validation

Face validity

To assess the face validity of the proposed self-directed learning (SDL) scale for virtual environments, 10 virtual students were asked to rate the importance of each item on a five-point Likert scale, and an “Item Impact Score” was calculated by multiplying the frequency (%) and importance of each item. After that, the same students were asked to provide feedback on the “relevance”, “ambiguity”, and “difficulty” of the items. Based on their comments, some minor modifications were made to the preliminary questionnaire to enhance its face validity.
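
The item impact score described above can be illustrated with a short sketch. This is not the authors' code: the ratings matrix, the cutoff of 4 used to define endorsement frequency, and the simulated data are assumptions for illustration only.

```python
import numpy as np

def item_impact_scores(ratings: np.ndarray, cutoff: int = 4) -> np.ndarray:
    """Item impact score = frequency x importance.

    ratings: (n_students, n_items) matrix of 1-5 Likert importance ratings.
    frequency: proportion of students rating an item at or above `cutoff`
               (the cutoff of 4 is an assumption for illustration).
    importance: mean rating of the item.
    """
    frequency = (ratings >= cutoff).mean(axis=0)
    importance = ratings.mean(axis=0)
    return frequency * importance

# Example with simulated ratings from 10 students on 3 items.
rng = np.random.default_rng(0)
demo_ratings = rng.integers(1, 6, size=(10, 3))
print(item_impact_scores(demo_ratings))
```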

Content validity

To assess the content validity of the questionnaire, a group of eight experts in the field of medical education was selected and briefed on the study's objectives and the content validity assessment process. The questionnaire and content validity assessment form were sent to them via email, and the experts were asked to rate the relevance and clarity of each item on a four-point Likert scale ranging from 1 (not relevant) to 4 (very relevant) to calculate the content validity index (CVI). Items with a CVI value above 0.79 were considered acceptable (Waltz & Bausell, 1981). Additionally, the experts rated each item on a three-point Likert scale (necessary, useful but not important, and not necessary) to calculate the content validity ratio (CVR) using the Lawshe formula. Items with a CVR value above 0.75 were retained (Lawshe, 1975).
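
The CVI and CVR described above follow the standard Waltz and Bausell (1981) and Lawshe (1975) formulas. The sketch below only illustrates those formulas; the example ratings for the eight-expert panel are hypothetical.

```python
def item_cvi(relevance_ratings):
    """Item-level content validity index: the proportion of experts who
    rate the item 3 or 4 on the four-point relevance scale."""
    return sum(r >= 3 for r in relevance_ratings) / len(relevance_ratings)

def item_cvr(n_essential, n_experts):
    """Lawshe's content validity ratio: CVR = (n_e - N/2) / (N/2),
    where n_e is the number of experts rating the item 'necessary'
    and N is the total number of experts."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

# Example with the eight-expert panel described above (ratings are hypothetical).
print(item_cvi([4, 4, 3, 4, 3, 4, 4, 2]))      # 0.875, above the 0.79 cut-off
print(item_cvr(n_essential=7, n_experts=8))    # 0.75
```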

6. Pilot testing

To gather evidence on the construct validity (exploratory factor analysis) and reliability of the scale, the SDL scale was developed as an online tool, and the link was distributed to virtual students of medical sciences universities in Iran.

An exploratory factor analysis (EFA) was performed to extract the latent factors. The Kaiser–Meyer–Olkin (KMO) measure was calculated to assess the adequacy of the sample size, and Bartlett’s test of sphericity was used to assess the fitness of the factor analysis model. The latent factors were then extracted by principal factor analysis with Varimax rotation and examined with the scree plot. The analyses were performed in SPSS V.22. Participants were recruited using a convenience sampling method. Since the questionnaire had 50 items at this phase (12 items having been excluded during expert validation), the sample size was calculated to be 215 virtual students (5 subjects per item).
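
The authors ran these analyses in SPSS V.22. As a rough open-source illustration of the same pipeline (KMO, Bartlett's test, five-factor extraction with Varimax rotation, and eigenvalues for the scree plot), the sketch below uses the Python factor_analyzer package with simulated placeholder data; it is not the analysis actually performed.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

# Placeholder data standing in for the real pilot responses:
# 215 respondents x 50 Likert items (replace with the actual data set).
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 6, size=(215, 50)),
                         columns=[f"item_{i}" for i in range(1, 51)])

# Sampling adequacy and sphericity, as described in the Methods.
_, kmo_total = calculate_kmo(responses)
chi_square, p_value = calculate_bartlett_sphericity(responses)
print(f"KMO = {kmo_total:.2f}, Bartlett chi-square = {chi_square:.2f}, p = {p_value:.4f}")

# Extract five factors with Varimax rotation; eigenvalues feed the scree plot.
fa = FactorAnalyzer(n_factors=5, rotation="varimax")
fa.fit(responses)
eigenvalues, _ = fa.get_eigenvalues()
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(eigenvalues.round(2))
print(loadings.round(2))
```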

Cronbach’s alpha coefficient was estimated to assess the internal consistency of the SDL scale in a virtual environment. Moreover, the stability of the scale over time (test–retest) was assessed using the intraclass correlation coefficient (ICC). The ICC was estimated using a two-way mixed-effects model with a 95% confidence interval. The Pearson correlation coefficient was also used to measure the stability of the scale over time.
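
The reliability statistics described above can be sketched as follows. Cronbach's alpha is computed from its definition, and the ICC with the pingouin package, whose ICC3 estimates correspond to two-way mixed-effects models; the data frames and column names are hypothetical.

```python
import pandas as pd
import pingouin as pg

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical item responses (rows = students, columns = items).
items = pd.DataFrame({"q1": [3, 4, 5, 2, 4], "q2": [4, 4, 5, 3, 4], "q3": [3, 5, 4, 2, 5]})
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")

# Test-retest stability: one total score per student per administration (hypothetical values).
retest = pd.DataFrame({
    "student":  [1, 2, 3, 4, 1, 2, 3, 4],
    "occasion": ["test"] * 4 + ["retest"] * 4,
    "score":    [160, 142, 175, 151, 158, 145, 172, 155],
})
icc = pg.intraclass_corr(data=retest, targets="student", raters="occasion", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])  # the ICC3 rows correspond to two-way mixed-effects models
```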

Ethical considerations

The study was approved by the Ethics Committee of Tehran University of Medical Sciences (Code: IR.TUMS.MEDICINE.REC.1395.713). The students were informed of the objectives of the study and their participation in the study was voluntary. Informed consent was obtained from the students.

Results

Literature review

During the literature search for this review, a total of 2321 articles were initially identified. After screening the titles and abstracts and removing any duplicates, 120 articles were left. However, 42 of these articles did not meet the inclusion criteria and were excluded from the study. Ultimately, a total of 78 articles were included in the final review.

Qualitative study

This qualitative study involved the analysis of 1222 phrases from primary codes, which were subsequently categorized into 80 subcategories, 15 categories, and 5 themes that related to self-directed learning strategies in virtual environments. The identified themes were readiness to learn, directing towards the goal, purposeful effort, interest in learning environments, and excellence and progress. Additional information on the study's methodology and findings can be found in a separate manuscript.

Development of items

After comparing the self-directed learning strategies in the virtual environment identified in the qualitative study with the findings from the literature review, it was observed that there was a high similarity between the concepts. For each of the similar strategies, five items with a Likert scale were designed. In total, 62 initial items were developed.

Expert validation

The evaluation of the content validity index (CVI) revealed that the scale scored higher than 0.79 in terms of relevance, clarity, and simplicity. However, in the content validity ratio (CVR) analysis, 12 items scored less than 0.49 and were excluded from the questionnaire, leaving a total of 50 items for pilot testing.

Pilot testing

The demographic characteristics of the 215 participants are presented in Table 1. The students were between 24 and 43 years of age. The majority of them were female (n = 131, 60.9%) and MSc students (n = 192, 89.4%).

Table 1 Summary of demographic characteristics of virtual students

Exploratory factor analysis

EFA was conducted with Equamax rotation on the items of the questionnaire. The KMO was 0.96, which indicated the sufficiency of the sample for factor analysis. Bartlett’s test showed a significant relationship between the items (Chi-square = 12,615.29, P < 0.001), indicating the appropriateness of the factor analysis model. According to the scree plot shown in Fig. 1, five factors together explained 80% of the total variance. The five-factor structure of the self-directed learning (SDL) scale in the virtual environment is shown in Table 3, and the items' frequency distributions are displayed in Table 2.

Fig. 1
figure 1

Scree plot

Table 2 Frequency distribution of 250 respondents regarding assessing self-directed learning in the virtual environment

The five factors of the questionnaire, along with their items, are shown in Table 2. The factors were named according to the content of the items. The first (17 items), second (9 items), third (6 items), fourth (5 items), and fifth (7 items) factors were named “Self-directed learning prerequisites”, “Flexible and supportive environment”, “Deep learning”, “Intelligent teaching”, and “Learning outcomes”, respectively (Table 3). The questionnaire takes twenty minutes to complete.

Table 3 The five-factor structure of the self-directed learning (SDL) scale in a virtual environment
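
For readers who want to score the instrument, the sketch below sums item responses into the five subscale scores and a total score. The item-to-factor assignment shown here is purely hypothetical; the actual mapping is given in Table 3.

```python
import pandas as pd

# Hypothetical item-to-factor assignment; the actual mapping is given in Table 3.
FACTORS = {
    "SDL prerequisites":                   [f"item_{i}" for i in range(1, 18)],   # 17 items
    "Flexible and supportive environment": [f"item_{i}" for i in range(18, 27)],  # 9 items
    "Deep learning":                       [f"item_{i}" for i in range(27, 33)],  # 6 items
    "Intelligent teaching":                [f"item_{i}" for i in range(33, 38)],  # 5 items
    "Learning outcomes":                   [f"item_{i}" for i in range(38, 45)],  # 7 items
}

def score_sdlive(responses: pd.DataFrame) -> pd.DataFrame:
    """Sum Likert item responses into the five subscale scores and a total score."""
    scores = pd.DataFrame({name: responses[items].sum(axis=1)
                           for name, items in FACTORS.items()})
    scores["total"] = scores.sum(axis=1)
    return scores
```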

Reliability

Cronbach’s alpha was 0.923, 0.745, 0.98, 0.93, 0.98, and 0.91 for factors 1, 2, 3, 4, 5, and the total scale, respectively. Moreover, the ICC between the test and retest was 0.81, 0.65, 0.81, 0.72, 0.89, and 0.77 for factors 1, 2, 3, 4, 5, and the total scale, respectively. Finally, the Pearson correlation coefficient was 0.84, 0.73, 0.85, 0.81, 0.85, and 0.81 for factors 1, 2, 3, 4, 5, and the total scale, respectively.

Discussion

Competition among universities for the most talented graduate students is becoming increasingly intense, and this is even more pronounced in virtual colleges. Effective strategies are therefore needed to reduce student dropout. Furthermore, at-risk students should be identified and supported to help them complete their courses and to increase program retention in virtual learning.

SDL is essential to understanding virtual student readiness for online education. It also reduces the rate of student dropouts in online learning. In this study, the SDL Scale in Virtual Environment Questionnaire (SDLIVE) was developed and validated (Additional file 1).

The SDLIVE scale has the potential to be an effective tool in online education, where increased levels of SDL are needed by virtual students to self-organize the learning process. It is also valuable for monitoring SDL competency continuously over time and for evaluating the impact of different educational programs that follow sustained strategies to support SDL in web-based courses.

The factorial structure of the SDLIVE scale supports its fitness to assess SDL in a virtual environment.

EFA of the SDLIVE scale indicated a five-factor structure that explained 80% of the total variance.

Chen and Fan's (2023) study investigated the factor structure of the Self-Directed Learning Readiness Scale (SDLRS) among undergraduate students in China. The researchers used exploratory factor analysis to identify the underlying factors of the SDLRS. They found that the SDLRS had a six-factor structure, which they named love of learning, active learning, effective learning, independent learning, learning motivation, and creative learning (Chen & Fan, 2023). These factors explained 53.30% of the total variance in the SDLRS score. Through the Delphi method, Dulloo et al. (2023) developed a readiness scale containing 43 items categorized into four factors: awareness, learning strategies and styles, motivation, and team building (Dulloo et al., 2023).

The first factor of the SDLIVE scale describes the educational environment's requirements and characteristics, as well as the attitudes, behaviors, and experiences that promote self-directed learning.

SDL requires skills, such as using technology effectively for learning, according to Gurung and Rutledge (2014). To take advantage of the affordances of the e-learning context for SDL, digital learners need to be prepared for self-direction (Dulloo et al., 2023).

According to Lai et al. (2013), e-learners' readiness for SDL is crucial to online learning success. In this study, we found that e-learners' readiness to learn with technology is an important factor influencing their SDL. Those who are equipped with the beliefs, skills, and personal qualities required for SDL will have better chances of utilizing the opportunities ICT affords in SDL (Lai et al., 2013).

The second factor of SDLIVE, which emphasizes the importance of creating a flexible and supportive online learning environment, is consistent with findings from other studies on self-directed learning. According to Garrison and Kanuka (2004), self-directed learners need a flexible and supportive learning environment that enables them to take control of their learning (Garrison & Kanuka, 2004). Song and Hill (2007) also found that a supportive and positive learning environment promotes self-directed learning among students (Song & Hill, 2007).

Furthermore, previous research supports the need for technical, educational, and emotional support identified in the second factor of the SDLIVE scale. For instance, a study by Cho and Heron (2015) found that technical support availability was a crucial factor in promoting successful online and blended learning (Cho & Heron, 2015).

The third factor of the SDLIVE scale consisted of items describing deep learning. In the SDL process, deep learning activities facilitate a deeper understanding of the content and skills being learned by learners. To develop a more comprehensive and integrated understanding of the subject matter, learners can explore and experiment with creative ideas. Learners are more likely to engage in deep learning activities when they take control of their learning process. As part of these activities, students conduct research, collaborate with others, seek feedback, and reflect on their learning experiences. Education can promote SDL by helping learners develop the skills and knowledge they need to engage in deep learning and become lifelong learners (Al Mamun et al., 2022; Kek & Huijser, 2009; Padugupati et al., 2021).

The fourth factor of the SDLIVE scale consists of items describing how dependent learners are guided toward SDL under virtual instructors' supervision and through instructional scaffolding. Using digital tools, virtual instructors facilitate the active production of knowledge by the students. The community of inquiry framework builds on existing collaborative-constructivist educational assumptions and focuses on social, cognitive, and teaching presence in an online learning environment. The model seeks to explain how to best analyze and ultimately promote higher-order learning: the cognitive and social processes associated with worthwhile and meaningful educational experiences. Teaching presence is a critical component of high-quality online learning environments. Anderson and his colleagues define teaching presence as “the design, facilitation, and direction of cognitive and social processes for the realization of personally meaningful and educationally worthwhile learning outcomes”. In their model, teaching presence has three components: instructional design and organization, facilitating discourse, and direct instruction (Garrison, 2022; Shea et al., 2022).

The fifth factor consists of items describing SDL outcomes. At the end of each SDL phase, virtual students are expected to achieve outcomes such as becoming lifelong, self-directed learners, having a sense of satisfaction, adapting to technology, persisting in the virtual education system, and emotional outcomes such as attachment and eagerness to learn.

The present study has several advantages. First, to our knowledge, this is the first study to use established and validated methods to develop an SDL tool in a virtual environment. Second, we adopted a rigorous and standard procedure for instrument design and judgment. The domains for inclusion were carefully defined by multiple judges drawn from virtual education experts in diverse virtual schools in Iran. As part of the judgment process, we used enough experts to calculate item- and scale-level CVI and CVR.

Despite its advantages, this study has some limitations. First, virtual students were recruited by convenience sampling, so sampling bias must be considered. However, our high response rate may have moderated this limitation, and our sample is probably representative of the study population. All items were written in Persian and all respondents were located in Iran. Also, since some items may not be appropriate in other cultures, further research with people in other locations is required.

In this study, SDL in online learning was measured using a self-rating instrument. In future studies, more data on the impact of SDL on student achievement in online learning should be gathered. Moreover, if future studies in other contexts find that this scale displays no significant associations with the related concept, an adaptation of this scale would be recommended.

As instrument development is an iterative process, further studies should be conducted on larger cohorts at various locations. In addition to using the SDLIVE scale for self-assessment, a virtual teacher could use it to evaluate students' SDL. Thus, future studies could explore the inter-rater reliability of the SDLIVE scale when used by virtual teachers and examine the relationship between perceived and demonstrated SDL, which would further strengthen the convergent validity of the SDLIVE scale. Furthermore, confirmatory factor analysis (CFA) could be conducted with a greater sample size to further validate the construct of the SDLIVE scale.

It is acknowledged that SDL was measured through the perception of virtual students rather than the actual demonstration of competencies. Despite researchers’ concerns about the effects of self-reported SDL measures, self-assessment can be used for reflective practice and can complement other forms of assessment.

Conclusions

Our study followed rigorous methods to develop a robust, psychometrically sound measure of SDL in a virtual environment. The use of validated tools may help evaluate strategies for improving learning in a virtual environment. Moreover, cross-cultural studies using the SDLIVE scale are encouraged: understanding intercultural differences may support more effective educational interventions and help overcome the limitations of virtual education. The SDLIVE scale is a psychometrically sound instrument for measuring SDL in a virtual environment. Overall, it is a valuable assessment tool that can be used for various purposes, such as (1) measuring SDL in online learning, (2) measuring the efficacy of courses designed to promote SDL in the online environment, (3) reducing the dropout rate in online learning, and (4) enhancing student satisfaction, learning, and persistence in virtual environments.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

References

  • Abd-El-Fattah, S. M. (2010). Garrison’s model of self-directed learning: Preliminary validation and relationship to academic achievement. The Spanish Journal of Psychology, 13, 586–596.

  • Al Mamun, M. A., Lawrie, G., & Wright, T. (2022). Exploration of learner-content interactions and learning approaches: The role of guided inquiry in the self-directed online environments. Computers & Education, 178, 104398.

  • Anderson, J., Bushey, H., Devlin, M. E., & Gould, A. J. (2020). Cultivating student engagement in a personalized online learning environment. In Handbook of research on fostering student engagement with instructional technology in higher education. IGI Global.

  • Baker, K. Q., & Moyer, D. M. (2019). The relationship between students’ characteristics and their impressions of online courses. American Journal of Distance Education, 33, 16–28.

  • Bosch, C., & Pool, J. (2019). Establishing a learning presence: Cooperative learning, blended learning, and self-directed learning. In Technology-supported teaching and research methods for educators. IGI Global.

  • Caravello, M. J., Jimãšnez, J. R., Kahl, L. J., Brachio, B., & Morote, E.-S. (2015). Self-directed learning: College students’ technology preparedness change in the last 10 years. Journal for Leadership and Instruction, 14, 18–25.

  • Charokar, K., & Dulloo, P. (2022). Self-directed learning theory to practice: A footstep towards the path of being a life-long learne. Journal of Advances in Medical Education & Professionalism, 10, 135.

  • Chen, S. L., & Fan, J. Y. (2023). Validation of the psychometric properties of the Self-Directed Learning Readiness Scale. Nursing Open, 10, 1639–1646.

  • Cho, M.-H., & Heron, M. L. (2015). Self-regulated learning: The role of motivation, emotion, and use of learning strategies in students’ learning experiences in a self-paced online mathematics course. Distance Education, 36, 80–99.

  • Curran, V., Gustafson, D. L., Simmons, K., Lannon, H., Wang, C., Garmsiri, M., Fleet, L., & Wetsch, L. (2019). Adult learners’ perceptions of self-directed learning and digital technology usage in continuing professional education: An update for the digital age. Journal of Adult and Continuing Education, 25, 74–93.

  • Dulloo, P., Singh, S., Vedi, N., & Singh, P. (2023). Development and implementation of a self-directed learning readiness scale for undergraduate health professional students. Journal of Education and Health Promotion, 12, 43.

  • Fisher, M., King, J., & Tague, G. (2001). Development of a self-directed learning readiness scale for nursing education. Nurse Education Today, 21, 516–525.

  • Fitzgerald, R., Rossiter, E., & Thompson, T. (2022). A personalized approach to learning across time and space. In European conference on e-learning (pp. 105–110).

  • Garrison, D. (2022). Shared metacognition in a community of inquiry. Online Learning, 26, 6–18.

  • Garrison, D. R. (2003). Self-directed learning and distance education. In Handbook of distance education (pp. 161–168).

  • Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education, 7, 95–105.

  • Gerard, L., Bradford, A., & Linn, M. C. (2022). Supporting teachers to customize curriculum for self-directed learning. Journal of Science Education and Technology, 31, 660–679.

  • Giddings, S. (2015). Self-directed learning (sdl) in higher education: A necessity for 21st century teaching and learning. Faculty of Education, Brock University.

  • Giuseffi, F. (2021). Renewing self-directed learning in e-learning experiences. ACM.

  • Guglielmino, L. M. (1977). Development of the self-directed learning readiness scale. University of Georgia.

  • Gurung, B., & Rutledge, D. (2014). Digital learners and the overlapping of their personal and educational digital engagement. Computers & Education, 77, 91-100.

  • Jennett, P. A. (1992). Self-directed learning: A pragmatic view. Journal of Continuing Education in the Health Professions, 12, 99–104.

  • Kek, M. Y. C. A., & Huijser, H. (2009). What makes a deep and self-directed learner: Exploring factors that influence learning approaches and self-directed learning in a PBL context at a Malaysian private university. In Proceedings of the 2nd international problem-based learning symposium (PBL 2009).

  • Knowles, M. S. (1975). Self-directed learning: A guide for learners and teachers.

  • Lai, C., Gardner, D., & Law, E. (2013). New to facilitating self-directed learning: The changing perceptions of teachers. Innovation in Language Learning and Teaching, 7, 281–294.

  • Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28, 563–575.

  • Mocker, D. W., & Spear, G. E. (1982). Lifelong learning: Formal, nonformal, informal, and self-directed, ERIC Clearinghouse on Adult, Career, and Vocational Education, the National….

  • Oddi, L. F., Ellis, A. J., & Roberson, J. E. A. (1990). Construct validation of the Oddi continuing learning inventory. Adult Education Quarterly, 40, 139–145.

  • Padugupati, S., Joshi, K. P., Chacko, T. V., & Jamadar, D. (2021). Designing flipped classroom using Kemp’s instructional model to enhance deep learning and self-directed collaborative learning of basic science concepts. Journal of Education and Health Promotion, 10, 187.

  • Safikani, M., Kohan, N., Jahani, Y., & Nouhi, E. (2021). Self-directed learning outcomes and facilitators in virtual training of graduate students of medical education. Strides in Development of Medical Education, 18, 1–6.

  • Shea, P., Richardson, J., & Swan, K. (2022). Building bridges to advance the community of inquiry framework for online learning. Educational Psychologist, 57, 148–161.

  • Song, L., & Hill, J. R. (2007). A conceptual model for understanding self-directed learning in online environments. Journal of Interactive Online Learning, 6, 27–42.

  • Teng, Y., Wu, Y., Sun, T., & Yang, S. (2019). Research on the impact of learning feedback on the engagement in the context of self-directed learning platform.

  • Van Woezik, T., Reuzel, R., Koksma, J., & Serpa, S. (2019). Exploring open space: A self-directed learning approach for higher education. Cogent Education, 6(1), 1–22.

  • Waltz, C. F., & Bausell, B. R. (1981). Nursing research: Design statistics and computer analysis. Davis Fa.

  • Williamson, S. N. (2007). Development of a self-rating scale of self-directed learning. Nurse Researcher, 14, 66–83.

  • Yildirim, Y., Camci, F., & Aygar, E. (2023). Advancing self-directed learning through artificial intelligence. In Advancing self-directed learning in higher education. IGI Global.

  • Zeb, S., Yusuf, S., Mahmood, R. A., & Zeb, R. (2018). Gender based differences in self-directed learning readiness amongst medical students of Pakistan. Rawal Medical Journal, 43, 754–7564.

Acknowledgements

We express appreciation to the participants in this study.

Funding

None.

Author information

Contributions

SA, NK, ZSM, HH, BSD, TR and AKHJ conceived and designed the study. SA, NK, ZSM and HH analyzed and interpreted the data, and drafted the manuscript. SA, NK, ZSM, HH, BSD, TR and AKHJ were involved in the composition of the study tool, supervision of the research process, and critical revision and review of the manuscript. All the authors read and approved the final manuscript.

Corresponding author

Correspondence to Noushin Kohan.

Ethics declarations

Ethical approval and consent to participate

The study procedures were carried out following the Declaration of Helsinki. This study was approved by the Ethics Committee of Tehran University of Medical Sciences. Informed consent was obtained from all the participants. There was an emphasis on maintaining privacy by keeping and reporting the information accurately without mentioning the names of the participants. The participants were given the right to leave the interview process at any time, and they were promised access to the study results if they wished.

Consent for publication

Not applicable.

Competing interests

The authors have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

SDL Scale in Virtual Environment Questionnaire (SDLIVE).

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Ahmady, S., Kohan, N., Mirmoghtadaie, Z.S. et al. Designing and psychometric analysis of an instrument to assess learning process in a virtual environment. Smart Learn. Environ. 10, 35 (2023). https://doi.org/10.1186/s40561-023-00254-w
