Type: Improvement
Resolution: Won't Do
Priority: Minor
Affects Version: 3.5
Affected Branches: MOODLE_35_STABLE
Currently, the accuracy of models is reported only in the Log action menu. Model accuracy is a key aspect of learning analytics and needs to be displayed more prominently. Accuracy varies with the time-splitting method, but only one time-splitting method can be active for a model at a time, so the accuracy for the currently used time-splitting method should be displayed in the list of models. More detail about the accuracy of each model should also be available.
One simple way to display this information is with a ROC curve, which plots the true positive rate of the model predictions against the false positive rate: https://en.wikipedia.org/wiki/Receiver_operating_characteristic
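As a rough illustration of the calculation involved (not existing Moodle code; the function names and data below are hypothetical), the ROC points and the area under the curve can be derived directly from a model's scored predictions and the known outcomes, for example in Python with NumPy:

```python
import numpy as np

def roc_points(scores, labels):
    """(FPR, TPR) pairs for every prediction-score threshold.

    scores: predicted probabilities of the positive class
    labels: known binary outcomes (1 = condition positive, 0 = negative)
    """
    order = np.argsort(-np.asarray(scores))    # sort by score, descending
    labels = np.asarray(labels)[order]
    tps = np.cumsum(labels)                    # true positives at each cut-off
    fps = np.cumsum(1 - labels)                # false positives at each cut-off
    tpr = tps / labels.sum()                   # recall / sensitivity
    fpr = fps / (len(labels) - labels.sum())   # fall-out
    return np.concatenate(([0.0], fpr)), np.concatenate(([0.0], tpr))

def area_under_curve(fpr, tpr):
    """Area under the ROC curve via the trapezoidal rule."""
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))

# Toy data: scores from a binary model and the corresponding true outcomes.
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,    1,   0,   0]
fpr, tpr = roc_points(scores, labels)
print(area_under_curve(fpr, tpr))   # 1.0 = perfect model, 0.5 = no better than chance
```

Plotting fpr against tpr gives the ROC curve itself; the same points could be passed to whatever charting library the UI ends up using.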
Display the following details for each model:
- Contingency distribution (graph)
- ROC curve (graph)
- Area under curve (number)
- Contingency table and diagnostics (note that most of these are simple calculations of a small number of basic values; see the first code sketch after this list):
  - True positive n
  - Power
  - Recall/Sensitivity (TPR)
  - False positive n
  - Type I error
  - Fall-out (FPR)
  - False negative n
  - Type II error
  - Miss rate (FNR)
  - True negative n
  - Specificity (TNR)
  - Predicted positive n
  - Prevalence
  - Precision (PPV)
  - LR+
  - Predicted negative n
  - False omission (FOR)
  - LR-
  - Condition positive n
  - Condition negative n
  - Total n
- Diagnostics:
  - Accuracy
  - False discovery rate
  - Negative predictive value
  - Diagnostic odds ratio
  - F1 score
- Detail of indicators (see the second code sketch after this list):
  - Selected indicator ROC curves (graph), with a control to select how many indicators to display
  - Model effect size (standardized)
  - Indicator graphs:
    - Indicator significance (p-value)
    - Standardized effect sizes with confidence intervals for all indicators
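As noted in the contingency table item above, almost all of the listed diagnostics reduce to arithmetic on the four basic counts (true/false positives and negatives). A minimal sketch of those calculations, with a hypothetical function name and purely illustrative counts:

```python
def contingency_diagnostics(tp, fp, fn, tn):
    """Derive the diagnostics listed above from the four basic counts.
    (No guards against empty classes; this is illustration only.)"""
    condition_pos = tp + fn
    condition_neg = fp + tn
    predicted_pos = tp + fp
    predicted_neg = fn + tn
    total = tp + fp + fn + tn

    tpr = tp / condition_pos        # recall / sensitivity / power
    fpr = fp / condition_neg        # fall-out (Type I error rate)
    fnr = fn / condition_pos        # miss rate (Type II error rate)
    tnr = tn / condition_neg        # specificity
    ppv = tp / predicted_pos        # precision
    npv = tn / predicted_neg        # negative predictive value
    fdr = fp / predicted_pos        # false discovery rate
    fomr = fn / predicted_neg       # false omission rate
    lr_pos = tpr / fpr              # positive likelihood ratio (LR+)
    lr_neg = fnr / tnr              # negative likelihood ratio (LR-)
    return {
        'prevalence': condition_pos / total,
        'accuracy': (tp + tn) / total,
        'recall (TPR)': tpr, 'fall-out (FPR)': fpr,
        'miss rate (FNR)': fnr, 'specificity (TNR)': tnr,
        'precision (PPV)': ppv, 'NPV': npv, 'FDR': fdr, 'FOR': fomr,
        'LR+': lr_pos, 'LR-': lr_neg,
        'diagnostic odds ratio': lr_pos / lr_neg,
        'F1': 2 * ppv * tpr / (ppv + tpr),
    }

print(contingency_diagnostics(tp=40, fp=10, fn=5, tn=45))
```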
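For the indicator detail view, the standardized effect size, its confidence interval and a significance value for a single indicator could be computed along these lines. This is a sketch only, assuming SciPy is available: it uses Cohen's d with the Hedges-Olkin approximation for its standard error, comparing indicator values of samples with positive vs negative outcomes, and the function name and data are hypothetical.

```python
import numpy as np
from scipy import stats

def indicator_effect(values_pos, values_neg):
    """Cohen's d with an approximate 95% CI, plus a two-sample t-test p-value,
    for one indicator, comparing samples with positive vs negative outcomes."""
    values_pos, values_neg = np.asarray(values_pos), np.asarray(values_neg)
    n1, n2 = len(values_pos), len(values_neg)
    pooled_sd = np.sqrt(((n1 - 1) * values_pos.var(ddof=1) +
                         (n2 - 1) * values_neg.var(ddof=1)) / (n1 + n2 - 2))
    d = (values_pos.mean() - values_neg.mean()) / pooled_sd
    # Hedges-Olkin approximation of the standard error of d, normal 95% CI.
    se = np.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    ci = (d - 1.96 * se, d + 1.96 * se)
    p_value = stats.ttest_ind(values_pos, values_neg).pvalue
    return d, ci, p_value

# Toy indicator values for students with positive vs negative outcomes.
print(indicator_effect([0.7, 0.8, 0.9, 0.65, 0.85], [0.4, 0.5, 0.45, 0.6, 0.3]))
```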
Alternative charts, such as the detection error tradeoff graph or zROC, could be used as the header graph instead of a ROC curve. Usability studies should determine which kind of display would be easiest for most users to understand and use in decision-making.
Note that the ROC curve is only applicable to binary predictions, but at present these are the only prediction models supported by Moodle learning analytics. This could be extended to REC or RROC curves once we extend support to other types of predictions.
has a non-specific relationship to:
- MDL-62521 Access to learning analytics models should be role-based (Closed)

has been marked as being related by:
- MDL-65370 Provide tools to check for bias in analytics models (Closed)
- MDL-62443 Provide tools to conduct forensics on analytics when prediction fails (Closed)

will be (partly) resolved by:
- MDL-62302 Improve analytics models display and administration (Closed)

will help resolve:
- MDL-64434 Provide service to collect and distribute learning analytics models and training data sets (Closed)