Moodle / MDL-65370

Provide tools to check for bias in analytics models


    • Type: New Feature
    • Resolution: Won't Do
    • Priority: Minor
    • Fix Version/s: None
    • Affects Version/s: 3.6
    • Component/s: Analytics
    • Labels: None
    • Affected Branches: MOODLE_36_STABLE

      Even when no information about minority status or other sensitive group membership is included explicitly in a model, the model risks implicit bias, because the indicators it uses may have hidden relationships to sensitive attributes. Legislation is being introduced that would require users of predictive models to audit them for bias and discrimination (e.g. https://www.technologyreview.com/s/613310/congress-wants-to-protect-you-from-biased-algorithms-deepfakes-and-other-bad-ai/).

      This proposal is to create a special category of "validation indicators" that are never included directly in predictive models, but are used to audit models for bias in a separate validation process, e.g. the slicing analysis described in "Evaluating the Fairness of Predictive Student Models Through Slicing Analysis" (Josh Gardner, Christopher Brooks and Ryan Baker, LAK'19). This evaluation would be conducted during model training, before predictions are generated; a model that does not show equal predictive performance across the sensitive subgroups would be flagged during evaluation. Tools to review the performance of individual indicators (e.g. as described in MDL-62192) should also be provided.
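      A minimal sketch of the slicing-analysis idea described above, assuming a hypothetical `slice_audit` helper (not part of any Moodle API): the group labels come from a validation indicator that was excluded from training, and the model is flagged when predictive accuracy differs across subgroups by more than a chosen tolerance.

      ```python
      # Hypothetical slicing-analysis audit; names and threshold are illustrative.
      def slice_audit(y_true, y_pred, groups, max_gap=0.05):
          """Return per-group accuracy and flag the model if the gap exceeds max_gap."""
          per_group = {}
          for g in set(groups):
              idx = [i for i, grp in enumerate(groups) if grp == g]
              correct = sum(1 for i in idx if y_true[i] == y_pred[i])
              per_group[g] = correct / len(idx)
          gap = max(per_group.values()) - min(per_group.values())
          return {"accuracy_by_group": per_group, "gap": gap, "flagged": gap > max_gap}

      # Example: a model that performs systematically worse for group "b"
      y_true = [1, 0, 1, 0, 1, 0, 1, 0]
      y_pred = [1, 0, 1, 0, 0, 1, 0, 0]
      groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
      report = slice_audit(y_true, y_pred, groups)
      print(report["flagged"])  # True: accuracy 1.0 for "a" vs 0.25 for "b"
      ```

      In a real implementation the per-group metric would more likely be AUC or another threshold-free measure, as in the LAK'19 paper, but the flagging logic is the same: compare a performance metric across slices and fail the model when the disparity exceeds a tolerance.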

            Assignee: Unassigned
            Reporter: emdalton1 (Elizabeth Dalton)
            Votes: 0
            Watchers: 1

