SamLR.com

WR duo

Posted: 22nd September 2019

Two items today: an actual article and a news piece which I mostly want to talk about.

Today's article: 21st century datacenter locations driven by 19th century politics by George Moore. History is something that I utterly failed to engage with as a subject until I was about 22. How I remember it being taught was with minimal reference to today. When I was 22 I started reading Bertrand Russell's History of Western Philosophy[1], which covered a lot of historical context; I think this was one of the first times I really appreciated how historical events can still affect the modern day, and it got me interested in history. Anyway, back to the article at hand: it's a much shorter read than Russell's doorstop, but it's a lovely, compact version of "this one weird thing from a hundred years ago is still having an effect today, even beyond its obvious impact". Russell's History is well worth reading too, if you have a month or so.

And now the news piece: Artificial intelligence being used in schools to detect self-harm and bullying, which I'm highlighting because it's stuck in my brain. tl;dr: a company (Steer global) is being paid[2] for a psychological test system ("AS Tracking"; they also offer "Footprints", "Usteer" and "CAS Tracking") built around an AI[3] that claims to give early warnings of bullying, pressure and self-harm (all with accuracies of between 77% and 88%).

I applaud the effort but I have a lot of problems with this. Firstly, that's not great accuracy (and I have no idea what the false-positive rate is). Secondly, identifying at-risk children is only really useful if support is then given to help them (and piling more work onto teachers doesn't count). Thirdly, the evidence that Steer present for the actual usefulness of their product is pretty weak: in the mental health case the AI was trained on 4,000 pupils, but the claimed "improvements to self harm risk" are based on only 13 pupils (similarly, the grade-improvement claims are based on a study of 69 pupils). Finally, there's the broader ethical argument against the whole thing: is consent being obtained from the pupils? What sort of data controls are being placed around this (you're linking potentially very personal information to people's full identities; I hope the security is VERY good)? And frankly, is this actually more useful than more teachers or proven mental health provision?
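To make the accuracy complaint concrete: if the condition being screened for is rare, even a classifier that's right 85% of the time will mostly raise false alarms. Here's a minimal back-of-the-envelope sketch; the prevalence and per-class accuracy figures are my own illustrative assumptions, not Steer's published numbers:

```python
# A minimal sketch (not Steer's actual model or data): how an
# ~85%-accurate screen behaves when the condition it looks for is rare.
# All figures below are illustrative assumptions.

def screening_outcomes(pupils, prevalence, sensitivity, specificity):
    """Expected counts from screening `pupils` children."""
    at_risk = pupils * prevalence
    not_at_risk = pupils - at_risk
    true_positives = at_risk * sensitivity
    false_positives = not_at_risk * (1 - specificity)
    flagged = true_positives + false_positives
    # Precision: of the children flagged, how many are actually at risk?
    precision = true_positives / flagged
    return flagged, true_positives, false_positives, precision

# Hypothetical school of 1,200 pupils, 5% genuinely at risk, and a
# classifier that is 85% accurate in both directions.
flagged, tp, fp, precision = screening_outcomes(1200, 0.05, 0.85, 0.85)
print(f"flagged: {flagged:.0f} (true: {tp:.0f}, false: {fp:.0f})")
print(f"precision: {precision:.0%}")  # ~23%: most flags are false alarms
```

Under those (assumed) numbers, roughly three in every four flagged pupils wouldn't actually be at risk, which is exactly why the false-positive rate matters so much more than headline accuracy here.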

[1]: I think I finished reading it at about age 24; it's chunky. [back]

[2]: £25,500 for every 1,200 pupils (£21.25/pupil), which isn't exactly cheap; it's enough to pay for a newly qualified teacher (even if it shouldn't be). [back]

[3]: In this case a support vector machine. Most of this information is taken from Thinking, straight or true?, Simon P. Walker (2015), which is long, so I skimmed it; the mental health numbers come from sections 5.13 and 5.14. [back]