Moodle / MDL-57905

Provide a display of relative importance of features in engine


    Details

    • Type: Improvement
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 3.3
    • Fix Version/s: None
    • Component/s: Analytics, Reports
    • Labels:
    • Affected Branches:
      MOODLE_33_STABLE

      Description

      The idea is to show something about the features being used in the model. It doesn't have to be highly detailed. The purpose is to make sure the model is "open," not a "black box" that no one understands.

      Examples (specific to certain algorithms):

      http://scikit-learn.org/stable/auto_examples/ensemble/plot_forest_importances.html#sphx-glr-auto-examples-ensemble-plot-forest-importances-py

      http://datascience.stackexchange.com/questions/14734/feature-selection-and-classification-accuracy-relation
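The first link above demonstrates the scikit-learn approach. As an illustrative sketch only (Moodle's analytics engine itself is PHP; this mirrors the linked Python example, and the feature names are hypothetical stand-ins for model indicators), a random forest exposes per-feature relative importances that could back such a display:

```python
# Sketch of the scikit-learn example linked above: fit a random forest on
# synthetic data, then rank features by their relative importance.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Hypothetical indicator names, standing in for a model's real features.
feature_names = ["f0", "f1", "f2", "f3"]

X, y = make_classification(n_samples=200, n_features=4, n_informative=2,
                           random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# feature_importances_ sums to 1.0; a higher value means more influence,
# which is the kind of coarse, non-detailed view the description asks for.
ranked = sorted(zip(feature_names, clf.feature_importances_),
                key=lambda t: t[1], reverse=True)
for name, score in ranked:
    print(f"{name}: {score:.3f}")
```

Even a simple ranked list like this output would keep the model "open" without exposing algorithm internals.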

        Attachments

          Issue Links

            Activity

              People

              Assignee:
              Unassigned
              Reporter:
              Elizabeth Dalton (emdalton1)
              Participants:
              Component watchers:
              Elizabeth Dalton, Amaia Anabitarte, Carlos Escobedo, Ferran Recio, Ilya Tregubov, Sara Arjona (@sarjona), Adrian Greeve, Jake Dallimore, Mathew May, Mihail Geshoski, Peter Dias
              Votes:
              0
              Watchers:
              1

                Dates

                Created:
                Updated: