# Add behat tests for analytics model's editing options


#### Details

• Status: Closed
• Priority: Major
• Resolution: Fixed
• Affects Version/s: 3.4, 3.8
• Fix Version/s:
• Component/s:
• Labels:
• Testing Instructions:

### Prerequisites

1. Proper Behat setup.

### Test

1. Run the following command:

    vendor/bin/behat --config /path/to/your/CFG_behat_dataroot/behatrun/behat/behat.yml --tags="@core_analytics"

2. Confirm the tests pass.
• Affected Branches:
MOODLE_34_STABLE, MOODLE_38_STABLE
• Fixed Branches:
MOODLE_38_STABLE
• Pull from Repository:
• Pull Master Branch:
MDL-58974-master
• Pull Master Diff URL:

#### Description

We need Behat tests for the built-in models. Setting up an environment to test models such as "Students at risk of dropping out" is slow: it requires finished courses, current courses, and a significant amount of student activity. Behat tests would help not only by letting us test this automatically, but also by quickly setting up the environment, so we can develop against the Behat site instead of the "production" (wwwroot) one.

Proposed approach to automatically test the behaviour without having to deal with complex context data preparations:

Step definitions:

We could have an analytics/tests/behat/behat_analytics.php with a Create model "MODELNAME" step, where MODELNAME is a closed list of 2 testing models and each MODELNAME value maps to a method in the behat_analytics class. We could have a couple of simple models, one static and one non-static: the static one to test report_insights and the non-static one to test tool_models actions.
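As a rough sketch, a feature file could then invoke that step directly. The tag, step wording and model name below are assumptions for illustration, not final names:

```gherkin
@core_analytics
Feature: Create analytics testing models
  In order to check analytics models from the UI
  As a manager
  I need testing models to be easy to create

  Scenario: Create a testing model
    # Hypothetical custom step backed by a method in behat_analytics.php
    Given I create a model "test static model"
```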

These models can use very simple core indicators, like "user set profile" or "user track forums" (both in core_user), so it is easier to prepare the context data for them to work. We don't need to carefully test all internal API details, just the UI; it is not important how good the model is at predicting. As long as we use core indicators, a core analyser (analytics/classes/local/analyser/courses.php would be enough) and a core time-splitting method (analytics/classes/local/time_splitting/no_splitting.php), we shouldn't need to add testing logic to production code to make this work (if I am not missing something).

Create model "MODELNAME" could:

• Prepare the MODELNAME context data (as said above, not much: just enough to have samples with different predictions)
• Create a new model using a testing target (analytics/tests/fixtures/test_MODELNAME_target.php) and basic core indicators. The code to create a model can be copied from admin/tool/models/db/install.php

Features:

"Manage models" -> tool_models features using a non-static model:

• Logged in as a manager
• Go to admin/tool/models/index.php and evaluate the testing model
• View the evaluation log (no need to check the model performance)
• Edit the model, add and remove some indicators
• Check that the changes were correctly applied
• Enable the model and select a time splitting range
• Execute "Get predictions" action
• Check that the "Insights" column lists contexts with available insights and that you can select one of them
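The steps above could translate into a scenario along these lines. Apart from the core login step, every step wording, model name and UI string here is an assumption, to be replaced by the real step definitions:

```gherkin
@core_analytics @javascript
Scenario: Evaluate, edit and enable a non-static testing model
  # "test non-static model" and the "I create a model" step are hypothetical
  Given I create a model "test non-static model"
  And I log in as "manager1"
  When I evaluate the "test non-static model" model
  Then I should see the model evaluation log
  When I remove the "User set profile" indicator from the model
  Then I should not see "User set profile" in the model indicators
  When I enable the model with the "No time splitting" option
  And I execute the "Get predictions" action
  Then the "Insights" column should list contexts with available insights
```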

"View insights" -> report_insights features using a static model:

• Go to admin/tool/models/index.php and get predictions
• Use the insights selector on admin/tool/models/index.php to view report/insights/insights.php?modelid=X&contextid=Y
• Check that the sample name is displayed correctly
• Check that the prediction outcome is correct
• View the prediction details
• Click on the suggested action and check that the user lands in the page specified by the testing model
• Log in as a teacher of one of the courses with insights
• Check that they receive the notification, that they can click on it and that it leads to the insights page of that model in that context
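Similarly, the report_insights checks could be sketched as follows; again, the model name, notification wording and prediction-detail steps are placeholders, not existing step definitions:

```gherkin
@core_analytics @javascript
Scenario: A teacher views insights generated by a static model
  # All custom steps below are hypothetical sketches
  Given I create a model "test static model"
  And predictions have been generated for the "test static model" model
  When I log in as "teacher1"
  Then I should receive an insights notification
  When I follow the insights notification link
  Then I should see the sample name and the predicted outcome
  And I should be able to view the prediction details
  And the suggested action should take me to the page defined by the testing model
```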

#### People

Assignee:
Víctor Déniz Falcón
Reporter:
David Monllaó
Peer reviewer:
David Monllaó
Integrator:
Tester:
CiBoT
Participants:
Component watchers:
Amaia Anabitarte, Carlos Escobedo, Ferran Recio, Ilya Tregubov, Sara Arjona (@sarjona)
Watchers:
4

#### Dates

Created:
Updated:
Resolved:
Fix Release Date:
18/Nov/19

#### Time Tracking

Estimated:
0m
Remaining:
0m
Logged:
3d 2h 38m