From: Predicting Master’s students’ academic performance: an empirical study in Germany
Authors | Prediction type | Used features | Used algorithms | Results |
---|---|---|---|---|
Nghe et al. (2007) | Academic performance | Demographics (marital status, Gross National Income, age, gender); Pre-enrolment features (academic institute, entry GPA, English proficiency, TOEFL score, etc.) | C4.5; Bayesian Networks | They found that C4.5 performs better than Bayesian Networks. They also found that the prediction accuracy for two classes (pass and fail) is much higher than that for three or four classes. Their results also show that the highest accuracy is achieved for the largest class (“Very Good” students) |
Yadav et al. (2011) | Academic performance | Post-enrolment features (attendance, test grade, seminar grade, assignment grade, and lab work) | CART; ID3; C4.5 | They found that CART produced the best accuracy (56.25%) followed by ID3 (52.08%), then C4.5 (45.83%) |
Zimmermann et al. (2011) | Academic performance | Pre-enrolment features (undergraduate achievements) | Random Forest | They found that third-year bachelor’s achievements are more predictive of master’s students’ GPA than first-year grades |
Zewotir et al. (2015) | Time to graduate or dropout | Demographics (race, gender, age, and financial aid); Post-enrolment features (major and term records) | Survival analysis | They found that age and financial aid affect the prediction, whereas gender does not. They also found that race has no effect on predicting dropout but does influence time to graduation. Moreover, students’ study duration is affected by their major |
Badr et al. (2016) | Course grade | Post-enrolment features (English grade and course grade) | CBA rule-generation | They found that CBA rule-generation produced acceptable accuracy (between 62.75% and 67.33%) |
Calisir et al. (2016) | Academic performance | Demographics (gender and employment status); Pre-enrolment features (ALES score, English proficiency exam score, and undergraduate GPA) | Logistic Regression | They found that ALES score, English proficiency exam score, undergraduate GPA, and employment status are the important features for their predictions |
Jeno et al. (2018) | Academic performance & drop out of a degree | Post-enrolment features (controlled motivation, autonomous motivation, competence, need-supportive teachers, and students’ intrinsic aspirations) | Regression | They found that “autonomous motivation” and “perceived competence” positively predict academic achievement and negatively predict dropout intentions |
Abu Zohair (2019) | Course grade | Demographics (age); Pre-enrolment features (bachelor’s degree type and bachelor’s degree GPA); Post-enrolment features (course grades and instructors’ names) | Neural Networks; Naive Bayes; Support Vector Machine; K-nearest Neighbor; Linear Discriminant Analysis | They found that Support Vector Machine and Linear Discriminant Analysis perform best in comparison with the rest of the classifiers |
Rotem et al. (2020) | Drop out of a degree | Demographics (background features); Post-enrolment features (academic performance) | Logistic Regression | They found that Logistic Regression can accurately predict academic failure, and that academic performance features predict dropout better than background features |
Zhao et al. (2020) | Academic performance | Demographics (age, marital status, gender, and citizenship); Pre-enrolment features (GRE grade, TOEFL grade, months since bachelor’s degree, previous GPA, previous major, previous school rank, previous school country, and previous school language) | Decision Tree; Support Vector Machine; Neural Networks; Naive Bayes; K-nearest Neighbor; Ensemble Learner L; Random Forest; Logistic Regression | They found that Random Forest and Ensemble Learner L achieved the best two overall predictive accuracies |
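Most of the surveyed studies share the same methodology: train several classifiers on student features and compare them on held-out accuracy. The sketch below illustrates that workflow with scikit-learn on purely synthetic data; the feature names, value ranges, and labels are invented for illustration and do not come from any of the studies above (note also that scikit-learn's decision tree implements CART, not C4.5 or ID3).

```python
# Illustrative sketch of the classifier-comparison workflow used in the
# surveyed studies. All data here is synthetic; no results are reproduced.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier        # CART (as in Yadav et al.)
from sklearn.ensemble import RandomForestClassifier    # as in Zimmermann et al., Zhao et al.
from sklearn.linear_model import LogisticRegression    # as in Calisir et al., Rotem et al.
from sklearn.naive_bayes import GaussianNB             # Naive Bayes (as in Zhao et al.)
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500
# Hypothetical pre-enrolment features: undergraduate GPA, English score, age.
X = np.column_stack([
    rng.uniform(2.0, 4.0, n),   # undergraduate GPA
    rng.uniform(40, 120, n),    # English proficiency score
    rng.integers(21, 35, n),    # age at enrolment
])
# Binary pass/fail label loosely tied to GPA plus noise (purely synthetic).
y = (X[:, 0] + rng.normal(0, 0.5, n) > 3.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

models = {
    "CART": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Naive Bayes": GaussianNB(),
}
# Fit each model and report held-out accuracy, as the studies do.
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.3f}")
```

With real enrolment records, the same loop would simply swap in the studies' actual features and labels (pass/fail, course grade, or dropout).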