Landing: Athabasca University

Bigotry and learning analytics

http://boingboing.net/2016/05/24/algorithmic-risk-assessment-h.html

Unsurprisingly, when you use averages to make decisions about individual people, those decisions reinforce biases. This is exactly the basis of bigotry, racism, sexism, and a host of other well-known evils, so programming such bias into analytics software is beyond a bad idea. The article describes how algorithmic systems are used to help make decisions about things like bail and sentencing in courts. Though race is not explicitly taken into account, correlates like poverty and acquaintance with people who have police records are. In a perfectly vicious circle, the system reinforces biases over time. To make matters worse, this particular system uses secret algorithms, so there is no accountability and little feedback to correct them when they err.
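The vicious circle is easy to see in a toy simulation (entirely hypothetical, not the system the article describes): two groups behave identically, but one starts with more recorded police contacts. If the "risk score" is just that record count, and scrutiny is allocated in proportion to the score, the disparity feeds itself.

```python
# Toy feedback-loop sketch. All numbers are invented for illustration.
# Both groups have identical underlying behaviour; group B simply starts
# with more recorded contacts due to heavier past policing.

records = {"A": 10, "B": 20}  # initial recorded police contacts (assumed)

for year in range(5):
    total = sum(records.values())
    for group in records:
        # Scrutiny follows the current score, so the group with more
        # records gets watched more, and so accumulates records faster.
        share_of_scrutiny = records[group] / total
        records[group] += round(100 * share_of_scrutiny)  # 100 new recorded contacts per year

print(records)  # B's lead persists, and the absolute gap keeps widening
```

Nothing about either group's actual behaviour changed; the loop amplifies the starting disparity on its own, which is the sense in which the bias is "programmed in".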

This matters to educators because much learning analytics does something very similar (there are exceptions, especially when analytics are used solely for research purposes). It looks at past activity, however that is measured, compares it with more or less discriminatory averages or similar aggregates of other learners' past activity, and then attempts to guide the future behaviour of individuals (teachers or students) based on the differences. This last step is where things can go badly wrong, though there would be little point in the exercise without it. The better examples inform rather than adapt, leaving decisions to a human intermediary, but that is exactly what the algorithmic risk assessment described in the article does, and it is just as risky. The worst examples attempt to guide learners directly, sometimes adapting content to suit their perceived needs. That is a terribly dangerous idea.
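The average-comparison pattern can be sketched in a few lines (names and numbers are invented; this is not any real analytics product). A student is flagged "at risk" merely for falling below the cohort mean, regardless of why their activity differs:

```python
# Minimal sketch of average-based flagging. A low click count might mean
# disengagement -- or efficient study, offline reading, or a fast learner.
# The rule below cannot tell the difference.

activity = {"ana": 120, "ben": 95, "caro": 40, "dev": 30}  # e.g. weekly clicks

mean = sum(activity.values()) / len(activity)  # 71.25

# Anyone below the cohort average gets flagged, whatever their actual learning.
flagged = [name for name, clicks in activity.items() if clicks < mean]
print(flagged)
```

The flag says as much about the cohort as about the individual: change the other students' activity and the same person flips between "fine" and "at risk" without doing anything differently.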

Comments

  • Richard Huntrods May 29, 2016 - 10:18am

    Too right, Jon. In many ways, past experience is often the *WORST* predictor of future performance, because it often ignores motivation.

Example: in Grade 9 my "teacher" lost my math mid-term (how, I'll never understand). So he gave me "52% because you're probably mediocre". I was so angry that I determined to show the jerk, and got 82% on the year-end departmentals. I also went on to get high 90s in my high school math courses.

    Never underestimate the "I'll show you" effect. :-)

  • Kyle Loree May 31, 2016 - 12:14am

To quote Shawn Achor from his TED talk1, "If I asked a question like, 'How fast can a child learn how to read in a classroom?' scientists change the answer to 'How fast does the average child learn how to read in that classroom?' and we tailor the class towards the average." This ignores those above and below the curve.

I suspect that the issues surrounding analytics and their impact on our lives will become more pronounced as AI use increases. There will be many cases where an algorithm comes to the wrong conclusion. How can a clear feedback loop be established to detect and correct those mistakes?

    Richard, I like the "I'll show you" effect.  I'm going to use that one.  I've had similar personal experiences as well.

    Cheers,
    Kyle

    1 https://www.ted.com/talks/shawn_achor_the_happy_secret_to_better_work?language=en