The effectiveness of an online grammar study scheme for Chinese undergraduate students
Smart Learning Environments volume 8, Article number: 3 (2021)
This paper describes the effectiveness of a virtual learning environment (VLE) as a grammar learning resource for first year students at a university in southern China over two academic years. The resource, named the Independent Grammar Study Scheme (IGSS), offered short grammar exercises that were completed over a 14-week period by two cohorts of year one students. The results of this study suggest that IGSS was successful in raising the grammar test scores of participants in both 2018 and 2019; that IGSS was more beneficial to students with lower English proficiency in 2018 but not in 2019; and that the VLE has been of great benefit to the university and to around one thousand students thus far.
In China, there has long been a great educational divide between the rich and the poor, but rapid economic development has allowed the country to invest heavily in an online infrastructure that, compared to the traditional context, provides more opportunities and higher quality educational settings for rich and poor alike (Wang et al., 2009, p. 78). In recent years, there has been a huge increase in the number of online schools and distance education providers in the country, offering a variety of learning opportunities (Wang et al., 2009, p. 78). In 2005, an investigation into virtual learning environments (VLEs) in mainland China showed that while 49.1% of them offered a chat room, only 22.6% had online meeting rooms and a mere 3.8% had a virtual whiteboard function (Wang & Zhang, 2005, p. 42). Since then, the online infrastructure offered by universities has come a long way and many universities have made further endeavours into the online learning world. As such, the technology and online resources available in China have become highly advanced (Cui & Wang, 2008, p. 74). In addition, since the advent of Moodle, a free and open-source learning platform that “[a]nyone can adapt, extend or modify”, thousands more initiatives have been established (Moodle, 2020). This trend is set to continue and may be given a further boost by the current pandemic that the world is facing.
The purpose of this paper is to outline a VLE developed at a university in southern China where English is the medium of instruction, and to assess its effectiveness as an independent learning resource that aims to improve knowledge of English grammar among first year students at the university. The resource, named the Independent Grammar Study Scheme (IGSS), was developed by the University’s English Language Centre (ELC) in cooperation with the Division of Science and Technology (DST) to tackle the lack of English grammar skills among some of the weakest students in the science department. Here, ‘weak’ describes those students who consistently make simple grammar mistakes when writing and who lack the ability to self-correct their work while proofreading it. As English is the medium of instruction at the university, and English is a required course for all students, improving grammar knowledge among the student population was seen as a pressing issue that needed to be addressed by both the DST and the ELC.
IGSS was planned to help fill a void in the VLEs offered by the University. In 2016, Moodle was being used for sharing documents and other files with students, but there were no independent learning platforms. That year, the Dean of the DST approached the ELC for help with creating a site for students to learn more about English grammar outside of the classroom. The Dean believed that students within the science department were struggling with English grammar even in their fourth year at the University, especially when writing their final year papers. To ease the need for subject professors to repeatedly proofread every student’s assessments, it was agreed that the ELC would create IGSS to encourage students to correct their own work and learn about English grammar through exercises that require them to notice mistakes in other people’s sentences, a key element of consciousness raising (see Schmidt, 1990, for more details).
The ELC has direct contact with all students within the University. Its experience with Chinese students would be vital in aiding the development of e-learning resources because understanding the target users of a resource helps make it more effective. Traditionally, Chinese students have faced a teacher-centred classroom dominated by Confucian thinking where “teachers imparted knowledge to the students passively” and questions, if allowed, were only asked after class (Lu et al., 2013, p. 115). In this environment, students are only expected to attend lectures and listen quietly (Lu et al., 2013, p. 115). At the same time, Chinese students expect extra help and interaction from teachers after class, something not often seen in Western environments (Kennedy, 2002, p. 434). The ELC aims to counter this traditional view of Chinese education by challenging students to become more interactive and communicative in the classroom. Kennedy also noted that “[s]tudents accustomed to more teacher-centred classrooms will need to be given time and support to make the transition to new forms of learning” (2002, p. 441), which is one of the reasons why the ELC has put great effort into developing independent learning resources for students in recent years. The University’s heavy investment in IT and computer resources has also helped make the learning environment ripe for autonomous learning.
Several previous studies (Hills, 1998; Ho & Crookall, 1995; Lee, 1998) have looked at how Chinese learners have been able to adapt their learning styles to become more autonomous and in Lee’s 1998 paper, five features that help to promote autonomous attitudes among Chinese students were listed. They are:
• Creating voluntary and flexible programmes
• Offering frequent teacher feedback and encouragement
• Allowing students to set their own pace
• Allowing students to decide what/how much they do
• Providing a peer support network
(Lee, 1998, p. 283)
It would be important to consider this range of features when creating a VLE in order to give enough flexibility and support to learners. Below, a brief review of other papers covering grammar learning and online learning is presented.
Chinese students as grammar learners
In 2016, Chen et al. gave a questionnaire to 64 Chinese learners of English to gauge their perceptions of written corrective feedback. In general, the students admitted that they had “a favourable attitude toward error corrections and comments” and that grammar instruction was a necessary component of an English writing class (Chen et al., 2016, p. 12). Despite this, the majority “held neutral or negative opinions toward explicit grammar instruction” (Chen et al., 2016, p. 12). The results of the questionnaire revealed that when it came to grammar, too much focus on accuracy could lead to antipathy among students (Chen et al., 2016, p. 14). In the end, Chen et al. (2016) concluded by saying that many of the questionnaire respondents were interested in acquiring self-correction skills (p. 11).
In 2017, Zhou gave a questionnaire to 176 Chinese high school students to find out what they thought about English grammar. The respondents stated that they found “English grammar difficult, dislike grammar learning, and do not know how to use effective learning strategies” (p. 1244). In addition, the survey revealed that the employment of grammar learning strategies among high school students was very low and that there should be more concentration on social/affective learning strategies for grammar. Zhou (2017) believes that teachers should “pay more attention to the practical application of grammar, and guide the students [to] communicate with each other and learn grammar more efficiently” (p. 1246). These two studies that looked at Chinese students and their attitudes about English grammar go some way to showing that more effort is needed to inspire Chinese learners to deepen their understanding of English grammar.
Chinese students as online learners
Several studies have probed into the efficacy of online learning for Chinese students. In 2011, Wang split 176 university students into two learning groups: one group had an online learning component, while the other did not. The group of students who engaged in online learning had six periods of traditional class time and four periods of computer-aided learning. In contrast, the control group followed a traditional teaching pattern. Along with penning a weekly journal to record their learning performance, the experimental group also answered self-assessment questions to aid them in improving learner autonomy. At the end of the treatment, it was found that the students who had the extra online learning component in their course had greater English language ability (Wang, 2011, p. 584). It was also found that the students felt “positive about computer-aided autonomous English language learning … got more motivated during the first year … [and] found that they had learned a lot” (Wang, 2011, p. 584).
In addition, Tian & Liu (2012) conducted a study with 291 undergraduate students in mainland China over a 15-week period and found that students who used Moodle to cover units from the college textbook received higher scores in the post-test than a control group who only had a traditional English class, showing that the multimedia and interactive nature of Moodle was effective in improving the students’ English proficiency (Tian & Liu, 2012, p. 116).
Similarly, Rao (2015) carried out a study with 95 students at Tongren University, China, who engaged in online learning and interaction after class by using Moodle along with an audio chat group on QQTalk, a conference call tool created by Tencent which is part of its instant messaging software package called QQ (Baidu, 2021). The results of a questionnaire suggested that 59% of students in the experiment believed that Moodle was beneficial to creating a good autonomous learning environment and that 80% of the participants greatly enjoyed using both Moodle and QQTalk as learning tools (Rao, 2015, p. 80).
The studies above indicate that Chinese students adapt well to online learning and can gain a lot from it. Indeed, more recent studies within China (Teo et al., 2019; Wen & Yang, 2020, for example) have also had much success in getting Chinese students to accept Moodle, with many students now having “very positive attitudes toward this new teaching method” (Wen & Yang, 2020, p. 469). It can be said that online materials “present a brand new and refreshing experience to both teachers and students” within China (Cui & Wang, 2008, p. 78).
Studying grammar online
There have been several successful and significant studies into autonomous grammar learning. In 2005, Al-Jarf investigated a group of 238 female Saudi students who were taking an English grammar course for the first time. On the course, two groups of students studied a set list of grammar topics and completed two grammar tests. The difference was that, for one group, the instructor shared links to grammar activities and posted grammar lessons on an online platform every week. Throughout the semester, these students were encouraged to read the grammar lessons and post their answers to the grammar activities on the site. The facilitator would not correct their mistakes but would point out the type of error made and encourage them to take a second look at their answers. At the end of the semester, the students who did the online activities showed significantly higher grammar test scores than the control group. Al-Jarf concluded that “achievement in the experimental group significantly improved as a result of using a combination of online and traditional face-to-face in-class grammar instruction” (2005, p. 177).
Li & Hegelheimer (2013) recruited undergraduate students from an intermediate writing class to test a smartphone application which asked users to edit and correct sentences for grammar mistakes. The application, named Grammar Clinic, was used “regularly as an out-of-class-assignment” in order to help the students improve their ability to correct and edit their own essays (Li & Hegelheimer, 2013, p. 140). Over the course of a semester, the quality of the participants’ essays improved and the majority of respondents in a post-project questionnaire believed that the “Grammar Clinic helped them notice errors in their own writing and in reviewing others’ papers” (Li & Hegelheimer, 2013, p. 147).
Likewise, Jin (2014) reported similar results with a group of Korean undergraduate students who were taking an English grammar course. One group of students in the study was asked to engage in “topical discussions, as well as Q&A, information sharing, and any course-related activities” on a smartphone application (Jin, 2014, p. 17). During this time, the instructor merely observed the group discussion and did not interfere. According to Jin, the participants who engaged in the online discussions gained greater grammatical knowledge than the control group (2014, pp. 22–23).
Similarly, Hashim et al. (2019) described how thirty secondary school students in Malaysia with relatively low levels of English proficiency took part in three online grammar game sessions. The research team used software such as Kahoot! and Socrative to investigate whether online games could benefit students learning English grammar (Hashim et al., 2019, p. 45). The paper concluded that “[l]earners are able to obtain better results when they learn grammar using online language games”, with the participants achieving higher scores in the post-test than in the pre-test (Hashim et al., 2019, p. 46). The results of the studies above support the effectiveness of online learning and extra-curricular activities for boosting linguistic competence, especially in English grammar.
The research outlined above has shed light on the idea that Chinese learners of English attach great importance to grammar, yet find it hard to master. In addition, this literature review supports Kennedy’s idea that learners of English in China are able to adjust to new learning styles when necessary (2002, p. 442). However, in spite of research showing the benefits of employing autonomous learning strategies to improve grammatical accuracy, there has been little research into online grammar learning for Chinese students. Several previous studies have looked at reading, writing, listening or speaking skills among Chinese learners (outlined above), yet few have focussed on grammar. Bearing in mind that grammar is considered a highly important skill for Chinese students (Chen et al., 2016; Zhou, 2017), the present study is unique in the way that it fills this gap in the literature by looking at a VLE specifically designed to help Chinese learners of English with their grammar skills.
IGSS was developed with the aim of being creative, interactive, informative, easy-to-use and useful, which are key aspects of a VLE (Lee, 1998; Teo et al., 2019). In 2017, work began on developing the system and, after a small trial run, IGSS was officially launched in 2018. IGSS has been fully run twice at the University in its current form. The first run was from September 2018 to December 2018 and the second from September 2019 to December 2019. The following sections discuss both of these runs. Below, the methodology of IGSS is briefly outlined.
IGSS was run during the University’s normal semester, which lasts for fourteen weeks. It ran alongside English classes and other major and minor courses given by the University. Apart from workshops in the first and last weeks of the semester, the students were left unguided unless they specifically asked for help, making IGSS a truly independent learning platform. The fourteen-week time-frame meant that students could work at a pace that suited them, but the coordinator emphasised the importance of the ‘little and often’ policy of IGSS in the introductory workshop. Because this was the first time many of the University’s students would engage in independent learning, a timetable was supplied with suggestions for when students should complete each section of IGSS. If students followed this timetable, they completed two grammar tests per week over twelve weeks (with the first and final weeks reserved for the introductory and summative workshops, respectively).
There are four divisions at the University, namely the Division of Business and Management (DBM), the Division of Humanities and Social Science (DHSS), the Division of Culture and Creativity (DCC) and the aforementioned Division of Science and Technology (DST). Because IGSS was a cooperative project between the ELC and the DST, only DST students took part in it, and students from the other three divisions (DBM, DHSS and DCC) were used as a control group. All participants were newly enrolled year one university students who had just left high school in mainland China. The students in the control group followed the normal “English I” course, which has a strong focus on reading and writing skills and a proportion of marks for grammar in every assessment. The experimental group also followed the “English I” course but additionally completed grammar exercises using IGSS. The total number of students in the experimental group was 469 in 2018 and 462 in 2019. All year one DST students were enrolled in IGSS, regardless of their English proficiency or previous English test scores.
IGSS contained short grammar quizzes that took an average student around ten minutes to complete. In total, there were twenty quizzes, split into four sets. The grammar quizzes in each set became progressively more difficult, with more questions appearing in each successive quiz. Five types of grammar error were assigned to each set of grammar tests and each error type was repeated three times per set to allow consistency in the quiz content. In addition, each set of grammar tests was more difficult than the last, with the fourth set covering much more complex types of error than the first. A typical set of grammar tests can be seen in Table 1.
In each set of grammar quizzes, there were around 250 individual questions, making a total of 1000 questions in IGSS. Each question was unique and assigned at random. This meant that if a student wished to retake any test on which they did poorly, they saw a different set of questions. All of the questions were multiple choice in nature and began with a sentence that contained a particular grammatical form or a grammatical error, and then required the student to change the grammatical form or identify/correct the error by selecting one of four options. The majority of the grammar questions which appeared in IGSS came directly from, or were inspired by, students’ papers at the University. They were written by experienced English teachers from the ELC, who took advantage of their access to a wide body of student work to identify common student errors and then produce questions that exemplified these errors. A sample question can be seen below in Table 2.
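The random, non-repeating assignment described above can be sketched in a few lines of Python. The paper does not specify how IGSS implements this, so the bank structure, function names, and the policy of never repeating a question a student has already seen are illustrative assumptions.

```python
import random

def draw_quiz(question_bank, quiz_size, seen):
    """Draw a quiz of quiz_size questions at random, skipping any question
    the student has already seen, so a retake shows a different set."""
    available = [q for q in question_bank if q not in seen]
    chosen = random.sample(available, quiz_size)
    seen.update(chosen)
    return chosen

# Hypothetical usage: a 50-question bank and two attempts at a 10-question quiz
bank = [f"Q{i}" for i in range(50)]
seen = set()
first_attempt = draw_quiz(bank, 10, seen)
retake = draw_quiz(bank, 10, seen)  # shares no questions with the first attempt
```

Because `draw_quiz` filters out previously seen questions before sampling, a student retaking a test on which they did poorly is guaranteed a fresh set of items, matching the behaviour described above.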
The grammar quizzes were set up to mimic features of computer games by allowing students to level up and collect points. In each grammar quiz, a participant had to reach a certain number of points before the next quiz was unlocked. The tests were set up to allow for multiple attempts and, usually, a student had three chances at each question in the quiz to receive some points. A student who selected the correct answer on their first attempt received the maximum number of points. However, the more incorrect answers a student gave, the fewer points they received, making it more difficult (and slower) to unlock the next grammar unit. This feature encouraged students to review and correct their mistakes immediately if they wanted to level up and continue. A summary of the instructions shown to students before each grammar test is shown below in Table 3. After each attempt at a quiz, a participant could go back and review their answers once again and find out where they went wrong. This encouraged the student to learn from their mistakes.
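The points-and-unlock mechanic can be sketched as follows. IGSS’s actual point values and pass mark are not reported, so the 5/3/1 decay across the three attempts and the 60% unlock threshold here are illustrative assumptions (only the five-point maximum per question is stated later in the paper).

```python
MAX_ATTEMPTS = 3
POINTS_BY_ATTEMPT = {1: 5, 2: 3, 3: 1}  # assumed decay; only the 5-point maximum is stated

def score_question(correct_on_attempt):
    """Points for a question answered correctly on the given attempt (0 if never)."""
    return POINTS_BY_ATTEMPT.get(correct_on_attempt, 0)

def next_quiz_unlocked(points_earned, points_possible, threshold=0.6):
    """Whether the next quiz unlocks; the 60% pass mark is an assumption."""
    return points_earned >= threshold * points_possible

# A first-attempt answer earns full points; repeated mistakes slow progress
total = score_question(1) + score_question(3) + score_question(0)  # 5 + 1 + 0
unlocked = next_quiz_unlocked(total, 15)
```

The design point the sketch captures is that wrong answers do not block progress outright; they simply make the unlock threshold harder to reach, nudging students to review mistakes before moving on.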
Testing IGSS’s effectiveness
To test the effectiveness of this VLE, it was important to determine whether participation in it would change a student’s comprehension of the rules of English grammar. To do this, pre-scheme and post-scheme grammar tests were employed and the results of the two tests were compared for each student. It was assumed that after participating in IGSS, a student would achieve a higher score in a grammar test. The study also wished to find out whether students with less knowledge of English grammar would benefit more from taking part in IGSS than their peers. One of the most important original motivations for IGSS’s creation was to improve the grammar knowledge of the weaker students in the University. Therefore, three hypotheses were posited and are tested in the next section. These hypotheses, with corresponding null hypotheses, are written below.
Hypothesis 1 (H1)
There will be a significant increase in grammar test scores for learners who use IGSS.
Null hypothesis 1 (H1₀)
There will be no significant increase in grammar test scores for learners who use IGSS and any increase will be due to chance.
Hypothesis 2 (H2)
There will be a significant positive correlation between participation in IGSS and gains in grammar test scores.
Null hypothesis 2 (H2₀)
There will be no significant positive correlation between participation in IGSS and gains in grammar test scores and any positive correlation will be due to chance.
Hypothesis 3 (H3)
IGSS will be significantly more beneficial for lower level English learners than for higher level learners.
Null hypothesis 3 (H3₀)
IGSS will not be any more beneficial for lower level English learners than for higher level learners and any significance in the benefits will be due to chance.
In the next section, the results of the study are discussed and special attention is given to determining whether or not there is any support for these hypotheses.
Results and discussion
Firstly, the total number of enrolled students who participated in IGSS is outlined. For the purposes of this study, a student was regarded as participating in IGSS if they had made at least one attempt at the first grammar test. As later grammar tests would not be unlocked until the first test had been completed, this was the simplest way to determine how many students used the platform. As can be seen in Figs. 1a and b, 93.8% of students from the 2018 run of IGSS were counted as participants. For 2019, the figure was 66.5%.
Given that IGSS was optional, such a high percentage of participating students was not expected. The large difference in participation between 2018 and 2019 is perhaps due to the Dean of the DST strongly supporting the project in 2018. In this first run of IGSS, the Dean asked his team of professors to encourage their students to use IGSS. This did not happen in 2019 and, even though students received many notifications and much interaction with the ELC in the second run, the staunch support from the DST was not present.
In Table 4 below, the number of students who completed each grammar set is listed and it is clear that the later grammar sets had a lower level of participation than the earlier sets. As previously stated, a participant must reach a certain score in a grammar test before the next one is unlocked. This aspect of IGSS meant that for a student to reach grammar set D, they would have to complete all five grammar tests in the previous three sets. This would require very high levels of engagement and motivation. Therefore, it could be said that the progressively lower rates of completion were to be expected.
A further point is that around 20% fewer students completed the entire first set of tests than completed the first single grammar test. In 2018, while 93.8% of students finished the first grammar test, only 77.6% completed the entire first set. Similarly, in 2019, 66.5% of enrolled users completed the initial grammar test but only 48.4% finished the entire first set. Investigating the possible reasons for this almost 20% drop in completion rates by the end of the first set, and the gradual decrease thereafter, is not the focus of this study but could be investigated in the future.
Testing the hypotheses: testing H1
To test H1, paired sample t-tests were conducted on the pre-test and post-test grammar scores of both the control and experimental groups in 2018 and 2019. If the results were significant for the experimental groups, there would be support for H1 (that there would be an increase in grammar test scores for IGSS participants). The results showed that for 2018, the difference between the experimental group’s pre-test (M = 16.6, SD = 3.8) and post-test (M = 19.0, SD = 3.1) scores was significant, t(321) = −11.1, p < .001 (one-tailed). Similarly, the 2019 experimental group’s pre-test (M = 17.9, SD = 3.7) and post-test (M = 18.4, SD = 3.4) scores were also significantly different from one another, t(287) = −1.8, p = .035 (one-tailed).
For the control groups, who were only exposed to the usual grammar content of the University’s English course, there was a significant difference between the pre-test (M = 16.9, SD = 3.5) and post-test (M = 18.3, SD = 3.4) mean scores in 2018, t(1202) = −13, p < .001 (one-tailed), but there was not a significant difference between the 2019 control group’s pre-test (M = 18.2, SD = 3.5) and post-test (M = 18.4, SD = 2.9) scores, t(750) = −1.2, p = .109 (one-tailed).
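The paired-sample comparison used here reduces to a short standard-library calculation. The scores below are invented for illustration only, not the study’s data; note also that the differences are taken as post minus pre, so a clear gain yields a positive t (the paper’s negative t values reflect the opposite subtraction order).

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-sample t statistic and degrees of freedom for post - pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Invented scores for eight hypothetical students (not the study's data)
pre = [15, 17, 16, 18, 14, 16, 17, 15]
post = [18, 19, 18, 20, 17, 18, 19, 17]
t, df = paired_t(pre, post)  # a large positive t indicates a clear gain
```

The resulting t statistic would then be compared against the t distribution with n − 1 degrees of freedom to obtain the one-tailed p-values reported above.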
As the results are significant for the experimental groups in both 2018 and 2019, H1₀ can be rejected and it can be said that the results of this study give support for H1: there is a significant increase in grammar test scores for those students at the University who took part in IGSS.
These results corroborate previous studies (Al-Jarf, 2005; Hashim et al., 2019; Jin, 2014; Li & Hegelheimer, 2013) in showing the effectiveness of online grammar learning environments for university students.
Testing the hypotheses: testing H2
To test H2, the experimental group data from 2018 and 2019 were analysed and participants were ranked by the total number of points they scored in IGSS. In IGSS, each question was worth a maximum of five points and there were around one thousand questions in total. Therefore, a student who completed more questions, with fewer incorrect attempts, would receive a higher score. The data were examined with Pearson’s Correlation Coefficient and for the purpose of testing this hypothesis, “amount of participation” was defined as the total score received in IGSS, while “gains in grammar test scores” was defined as the change in a participant’s pre-test and post-test scores.
The results of the Pearson’s r test revealed that in 2018, there was no significant correlation between grammar test gains and IGSS participation, r(308) = .07, p = .099 (one-tailed). However, in 2019, the correlation between IGSS participation and grammar test gains was significantly positive, r(286) = .11, p = .033 (one-tailed). Although there was a significant positive correlation between these two variables in 2019, it is a weak one. When coupled with the non-significant result from 2018, there is not enough support for H2 (that there will be a relationship between participation in IGSS and gains in grammar test scores) and therefore H2₀ cannot be rejected.
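The correlation between participation and score gains can likewise be computed without external libraries. The point totals and gains below are hypothetical illustrations, not the study’s data.

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical IGSS point totals and grammar-test gains for six students
igss_points = [120, 340, 200, 450, 80, 300]
score_gains = [1, 3, 2, 3, 0, 2]
r = pearson_r(igss_points, score_gains)  # positive: more practice, bigger gain
```

As in the study, r would then be tested for significance against its sampling distribution with n − 2 degrees of freedom.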
IGSS is an independent learning platform, yet students at the University were new to this learning environment. Dikli believes that platforms like these are “likely to overwhelm the ESL students” (2010, p. 123) and this is especially true for students who are not familiar with independent learning. If students became overwhelmed by IGSS, they likely would not continue on to later grammar test units. This is supported by the results in Table 4, where participation rates gradually decreased with each set of grammar tests. In addition, research by Hsu suggests that Chinese students are driven by deadlines and that many of them will submit assignments during the last few days before the due date (Hsu, 2020, p. 285). This mentality may not be compatible with platforms that do not set deadlines. For the 2018 run of IGSS, no reminders were sent to participants between week 2 and week 11. At the end of week 11, an email reminder was sent out to let students know that the deadline for completing IGSS was nearing. In 2019, three email reminders were sent out between week 2 and week 11. In these reminders, however, there was no mention of the word ‘deadline’. This could be one of the reasons why the results were not significant for this hypothesis and, as such, IGSS could be tailored more to the Chinese student mindset in the future (Hsu, 2020). In doing so, there may be a stronger correlation between participation in IGSS and grammar test scores.
Testing the hypotheses: testing H3
To test H3, the participants in the experimental group were split into three groups according to their grades on the general English course at the University. Students in a level 1 or level 2 English class were regarded as ‘high English level participants’, while those at levels 3–5 and levels 6–9 were deemed ‘medium English level participants’ and ‘low English level participants’, respectively. Each group’s mean pre-test and post-test scores were compared using ANOVA tests. To give support for H3, the results for the low English level students would have to be not significantly different from those of the mid-level or high-level English students in the post-test. This would mean that the students with lower English ability had been able to match the scores of their peers. The results of the one-way ANOVA test for the 2018 pre-test, F(2,319) = 15.5, p < .001, revealed a statistically significant difference between the three levels of English learners. For the post-test, the one-way ANOVA test, F(2,319) = 3, p = .53, showed that the difference between the groups was not significant, meaning that there was some support for H3.
For the 2019 pre-test scores, the one-way ANOVA test, F(2,285) = 19.7, p < .001, found that the difference between the three levels of English learners was significant, while the results for the post-test, F(2,285) = 9.9, p < .001, were also significant.
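The one-way ANOVA used for these comparisons reduces to a short F-statistic calculation over the three groups. The group scores below are hypothetical, not the study’s data.

```python
from statistics import mean

def one_way_f(groups):
    """F statistic and degrees of freedom for a one-way ANOVA over k groups."""
    values = [v for g in groups for v in g]
    grand = mean(values)
    k, n = len(groups), len(values)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Hypothetical post-test scores for high, medium and low English level groups
high = [19, 20, 18, 21]
medium = [18, 19, 17, 20]
low = [17, 18, 16, 19]
f, df_b, df_w = one_way_f([high, medium, low])
```

The F value is then checked against the F distribution with (k − 1, n − k) degrees of freedom, which is where the reported degrees of freedom such as F(2,319) come from.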
Due to the non-significant result in the 2018 post-test, its post-hoc Tukey HSD tests were investigated to find out where the non-significance lay. The outcome can be seen in Table 5.
From Table 5, it can be seen that there was no statistically significant difference between the high English level and medium English level groups (p = .293) or between the medium English level and low English level participants (p = .514) in the 2018 post-test. However, there was still a significant difference between the high English level and low English level students (p = .041). In other words, there is evidence to suggest that, in the 2018 post-test, the low English level participants matched the scores of the medium English level students, while the medium English level participants’ scores matched those of the high English level group. While the 2019 results did not support the hypothesis, the 2018 results do give some support for H3 and, for the 2018 run of IGSS, H3₀ can be rejected.
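For equal-sized groups, the Tukey HSD comparison can be sketched as below. The critical value q_crit must come from a studentized-range table for the given number of groups and within-group degrees of freedom; the value of 3.0 used here and the group scores are purely illustrative assumptions.

```python
import math
from statistics import mean

def tukey_differs(groups, q_crit):
    """Map each pair of equal-sized groups to True if their means differ by
    more than Tukey's HSD; q_crit is a studentized-range critical value."""
    k, n = len(groups), len(groups[0])
    ms_within = sum((v - mean(g)) ** 2 for g in groups for v in g) / (k * n - k)
    hsd = q_crit * math.sqrt(ms_within / n)
    means = [mean(g) for g in groups]
    return {(i, j): abs(means[i] - means[j]) > hsd
            for i in range(k) for j in range(i + 1, k)}

# Hypothetical data: groups 0 and 1 are close, group 2 is clearly higher
result = tukey_differs([[1, 2, 3], [2, 3, 4], [10, 11, 12]], q_crit=3.0)
```

This mirrors the pattern in Table 5: adjacent groups can fall under the HSD threshold while the extreme pair still exceeds it.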
In previous research (such as in O’Neill & Russell, 2019), students with lower English proficiency levels had concerns over the amount of feedback the online platform gave. The participants in O’Neill and Russell’s study were unsure if the grammar corrections were accurate and, in some cases, could not comprehend them (p. 50). Similarly, Dikli (2010) mentioned that lower proficiency learners may not benefit from “the redundant, extensive, and generic nature of the … feedback” given online (p. 123). However, it appears that in the present study, low English level participants of IGSS in 2018 benefited, to some extent, from its feedback.
This paper did not look into the entirety of the data collected by IGSS. Future research may want to look more closely at the reasons behind certain groups scoring differently and/or the variance within groups. It could also look further at the effect of the University’s English courses on the students’ grammar skills. Another angle for further research could be to look qualitatively at student experiences of IGSS and survey them about their reactions and interactions with it. This would help the University to gain further insight into how the students feel about IGSS and independent learning in general.
In addition, although the present study’s significant results for learning grammar online generally corroborate previous studies (Chen & Cheng, 2008; Kan & Tang, 2018; Liao, 2016; O’Neill & Russell, 2019), the study also found support for Liao’s idea that “human scaffolding should be provided … in addition to machine assistance to ensure successful learning” (2016, p. 89). In future, IGSS, and other VLEs like it, should ensure that their tasks align with “pre-existing uses and expectations of learners” (Chwo et al., 2018, p. 69).
IGSS set out to be a creative, interactive, informative, easy-to-use and useful tool for students to improve their English grammar skills in their own time. However, in future it is suggested that IGSS be enhanced through some kind of peer support network and/or a dedicated facilitator contact to help participants. IGSS may also benefit from a diversification of its content and the creation of extra questions to deepen its impact.
Despite the above limitations, IGSS has managed to help almost one thousand students in the DST over a two-year period. Some students have benefited more than others, but the nature of independent learning means that each student will develop and learn at a different pace. IGSS also met many of the criteria suggested in Lee’s (1998, p. 283) list of goals to promote independent learning. For example, IGSS is entirely voluntary and it offers plenty of flexibility for students to do as much or as little as they want every week. It also offers corrective feedback and external learning resources to participants. In short, IGSS successfully filled the independent learning vacuum at the University, and the results of this study go some way towards showing that it is of great benefit to students. It is also hoped that it will help educators in other universities in their creation and implementation of VLEs.
Availability of data and materials
The datasets used and analysed in this paper are not currently publicly available.
Virtual Learning Environment
Independent Grammar Study Scheme
English Language Centre
Division of Science and Technology
Division of Business and Management
Division of Humanities and Social Science
Division of Culture and Creativity
- H10: Null Hypothesis 1
- H20: Null Hypothesis 2
- H30: Null Hypothesis 3
Al-Jarf, R. S. (2005). The effects of online grammar instruction on low proficiency EFL college students' achievement. The Asian EFL Journal Quarterly, 7(4), 166–190 https://www.asian-efl-journal.com/main-editions-new/the-effects-of-online-grammar-instruction-on-low-proficiency-efl-college-students-achievement/.
Baidu (2021). QQTalk. https://baike.baidu.com/item/QQTalk
Chen, C. F. E., & Cheng, W. Y. E. (2008). Beyond the design of automated writing evaluation: Pedagogical practices and perceived learning effectiveness in EFL writing classes. Language Learning & Technology, 12(2), 94–112 http://llt.msu.edu/vol12num2/chencheng/.
Chen, S., Nassaji, H., & Liu, Q. (2016). EFL learners’ perceptions and preferences of written corrective feedback: A case study of university students from mainland China. Asian-Pacific Journal of Second and Foreign Language Education, 1(5), 1–17. https://doi.org/10.1186/s40862-016-0010-y.
Chwo, G. S. M., Marek, M. W., & Wu, W. C. V. (2018). Meta-analysis of MALL research and design. System, 74, 62–72. https://doi.org/10.1016/j.system.2018.02.009.
Cui, G., & Wang, S. (2008). Adopting cell phones in EFL teaching and learning. Journal of Educational Technology Development and Exchange, 1(1), 69–80. https://doi.org/10.18785/jetde.0101.06.
Dikli, S. (2010). The nature of automated essay scoring feedback. CALICO Journal, 28(1), 99–134 https://www.jstor.org/stable/calicojournal.28.1.99.
Hashim, H., Rafiq, K. R. M., & Yunus, M. M. (2019). Improving ESL learners’ grammar with Gamified-learning. Arab World English Journal, Special Issue on CALL, 5, 41–50. https://doi.org/10.24093/awej/call5.4.
Hills, K. (1998). Asian learners’ experience of Western distance learning. In R. Carr (Ed.), The Asian distance learner–conference papers, (pp. 159–163). Hong Kong: The Open University of Hong Kong.
Ho, J., & Crookall, D. (1995). Breaking with Chinese cultural traditions: Learner autonomy in English language teaching. System, 23(2), 235–243. https://doi.org/10.1016/0346-251X(95)00011-8.
Hsu, C. (2020). Collaboration through online discussion board: A discourse analysis of CALL in a normal university in China. Arab World English Journal, Special Issue on CALL, 6, 278–289. https://doi.org/10.24093/awej/call6.18.
Jin, S. H. (2014). Implementation of smartphone-based blended learning in an EFL undergraduate grammar course. Multimedia-Assisted Language Learning, 17(4), 11–37. https://doi.org/10.15702/mall.2014.17.4.11.
Kan, Q., & Tang, J. L. (2018). Researching Mobile-assisted English language learning among adult distance learners in China: Emerging practices and learner perception of teacher role. International Journal of Computer-Assisted Language Learning and Teaching, 8(3), 1–28. https://doi.org/10.4018/ijcallt.2018070101.
Kennedy, P. (2002). Learning cultures and learning styles myth-understandings about adult (Hong Kong) Chinese learners. International Journal of Lifelong Education, 21(5), 430–445. https://doi.org/10.1080/02601370210156745.
Lee, I. (1998). Supporting greater autonomy in language learning. English Language Teaching Journal, 52(4), 282–289 https://www.deepdyve.com/lp/oxford-university-press/supporting-greater-autonomy-in-language-learning-tHzCOWhtQp.
Li, Z., & Hegelheimer, V. (2013). Mobile-assisted grammar exercises: Effects on self-editing in L2 writing. Language Learning & Technology, 17(3), 135–156 https://scholarspace.manoa.hawaii.edu/bitstream/10125/44343/17_03_lihegelheimer.pdf.
Liao, H. C. (2016). Enhancing the grammatical accuracy of EFL writing by using an AWE-assisted process approach. System, 62, 77–92. https://doi.org/10.1016/j.system.2016.02.007.
Lu, J., Jiang, H., & Throssell, P. (2013). Autonomous learning in tertiary university EFL teaching and learning of the People's Republic of China: Challenges and new directions. The International Journal of Learning in Higher Education, 19(2), 111–121 https://ro.uow.edu.au/sspapers/328.
Moodle (2020). About Moodle. https://docs.moodle.org/39/en/About_Moodle
O’Neill, R., & Russell, A. M. T. (2019). Stop! Grammar time: University students’ perceptions of the automated feedback program Grammarly. Australasian Journal of Educational Technology, 35(1), 42–56 https://ajet.org.au/index.php/AJET/article/view/3795/1514.
Rao, H. H. (2015). A practical study of college English listening teaching environments based on QQTalk and Moodle [in Chinese]. Journal of Xi’an Aeronautical University, 33(2), 76–96 https://kns.cnki.net/kcms/detail/detail.aspx?FileName=XHGZ201502019&DbName=CJFQ2015.
Schmidt, R. (1990). The role of consciousness in second language learning. Applied Linguistics, 11, 129–158. https://doi.org/10.1093/APPLIN/11.2.129.
Teo, T., Huang, F., & Zhou, M. (2019). Factors that influence university students’ intention to use Moodle: A study in Macau. Educational Technology Research and Development, 67, 749–766. https://doi.org/10.1007/s11423-019-09650-x.
Tian, X. B., & Liu, X. F. (2012). Research and practice on a college English teaching model based on the Moodle platform [in Chinese]. Journal of Tongren University, 14(4), 112–116 https://kns.cnki.net/kcms/detail/detail.aspx?FileName=TRSF201204033&DbName=CJFQ2012.
Wang, J. (2011). Improve college students’ autonomous English learning effectiveness with new learning model. Journal of Language Teaching and Research, 2(3), 580–587. https://doi.org/10.4304/jltr.2.3.580-587.
Wang, Q., Zhu, Z., Chen, L., & Yan, H. (2009). E-learning in China. Campus-Wide Information Systems, 26(2), 77–81. https://doi.org/10.1108/10650740910946783.
Wang, Z. Z., & Zhang, W. Y. (2005). An analysis of the current state of online teaching platforms and website construction in China's regular higher education institutions [in Chinese]. Journal of Distance Education in China, 242(2), 40–44 https://kns.cnki.net/kcms/detail/detail.aspx?FileName=DDJY200502010&DbName=CJFQ2005.
Wen, J., & Yang, F. (2020). Use of Moodle in college English language teaching (Reading and listening) in China: A narrative review of the literature. International Journal of Information and Education Technology, 10(6), 466–470. https://doi.org/10.18178/ijiet.2020.10.6.1408.
Zhou, Z. (2017). The investigation of the English grammar learning strategy of high school students in China. Theory and Practice in Language Studies, 7(12), 1243–1248. https://doi.org/10.17507/tpls.0712.11.
The author declares that he has no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Windsor, R.J. The effectiveness of an online grammar study scheme for Chinese undergraduate students. Smart Learn. Environ. 8, 3 (2021). https://doi.org/10.1186/s40561-021-00147-w
- English grammar
- Chinese students
- Higher education