“Algorithm – Something programmers use when they don’t want to explain their code.” – Urban Dictionary
A FUNNY thing happened the other night at my place.
We were sitting on the couch wondering what we wanted to watch on Netflix. My girlfriend suggested I choose something, so I flicked the user account to mine and started browsing.
“Oh, no, you don’t!” she shouted.
“We always watch the good shows on your account and now Netflix thinks I’m stupid.”
This is a bigger problem than you might at first think.
Netflix the company, of course, cares little about the intelligence or otherwise of its viewers.
But the algorithms deep in the guts of its software make observations about viewers and try to use that information to suggest more shows we might like.
If my girlfriend’s account only ever accesses trashy reality shows (the sort of thing I might retire to another room to avoid) the software will only suggest more of the same.
Years ago, when I tried an early software-driven music service, I had the flipside of this problem.
That service kept asking me if I liked the music that was playing, to try to tailor a music experience for me. The problem was I’d self-report being happy with higher-brow material than I actually like. The result was playlists stuffed with Shostakovich and utterly lacking in DragonForce and Alestorm (the musical equivalent of trashy reality TV, one might say).
Modern services such as Spotify pay less attention to self-reporting and more to what we listen to without skipping.
This would all be well and good if it were just limited to the music we hear and the TV shows we watch.
But machine learning, big data and the algorithms that drive them are intruding into more and more spheres of our life.
Facebook decides what news you want to see based on how you responded in the past to other items in your feed.
AirBnB shows you potential places to stay based on how likely its software thinks the owner of the apartment is to accept your application.
Around the world, fire brigades and food-safety inspectors are targeting their inspections algorithmically.
Police and intelligence agencies are tempted to profile this way as well.
An early failure after September 11, 2001, is instructive.
A bright spark decided to map out the phone calls of everyone involved, the idea being that people who had been called by multiple conspirators would be worthy of investigation themselves.
This was moderately disastrous for the purveyors of home-delivery felafel in their areas of operation.
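For the curious, the logic behind that approach can be sketched in a few lines of Python. The call records and names here are invented purely for illustration; the real systems were doubtless far more elaborate:

```python
from collections import defaultdict

# Hypothetical call records as (caller, number_called) pairs.
calls = [
    ("suspect_a", "555-0101"),
    ("suspect_a", "555-0199"),
    ("suspect_b", "555-0101"),
    ("suspect_c", "555-0101"),
    ("suspect_c", "555-0142"),
]

# For each number, collect the distinct suspects who called it.
callers_by_number = defaultdict(set)
for caller, number in calls:
    callers_by_number[number].add(caller)

# Flag any number contacted by two or more suspects -- which is how
# the local felafel shop everyone orders from ends up on the list.
flagged = {n for n, c in callers_by_number.items() if len(c) >= 2}
print(flagged)  # {'555-0101'}
```

The flaw is plain to see: the code has no idea whether a shared contact is a co-conspirator or a takeaway.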
Machine learning often operates in what are called “black boxes”: no one knows what assumptions are being made.
It’s bad enough if the police keep pulling you over because your skin is a particular colour.
It’s much worse if you’re continually the subject of government inspections because you share a single unknown characteristic with a number of otherwise unrelated miscreants.
There’s no appeal from these increasingly computerised characterisations of ourselves.
Imagine, after that, what it’s like to come home from a long day of regulatory harassment to be offered a bunch of garbage on Netflix.
John Griffiths is the online editor of citynews.com.au