My first introduction to algorithms via computers ruling the world was through Landru on the original Star Trek series (The Return of the Archons). In many ways, I preferred the Terminator movies. At least there was an enemy to shoot, to fight. Landru was much more subversive, working quietly behind the scenes, eroding freedom and privacy, until nearly everyone was complicit in its rule.
It was the stuff of science fiction.
Still is. I saw a book with that very name in my book feed just yesterday.
Except now, we see signs of the algorithms becoming science fact.
Google, Amazon, Yahoo, and Facebook all use algorithms. Anyone who has searched online for a product has seen the rise of ads based on their search history. And then there’s the ever-so-helpful algorithm that finishes what you’re typing in the search bar. Time and again, I’ve heard programmers say they’re not sure how the algorithms work once they start learning in the ‘real’ world.
Of course, they’re not all bad. They can help match sellers and buyers by targeting those most interested in their products based on online activity.
But there’s a downside: In the 2016 elections in the US, ads featuring lynchings and other persecutions of people of color were promoted to specifically targeted users to discourage certain people from voting (and yes, the polling numbers reflect that it worked).
It has also been documented that social media algorithms will show users ever more violent/extreme posts based on their viewing history. The more you like the posts, the more violent/extreme they become. In other words, you’re indoctrinated by millimeters until you’re sucked in and they clog your feed.
The algorithms control what you see.
And what you don’t.
People have reported silence on their feeds relating to the events in Ferguson, Missouri. Apparently, the algorithms didn’t think the stories would receive enough likes, so users weren’t shown anything. At all.
Sadly, we didn’t need an algorithm for this kind of censorship to happen. When coal miners went on strike in Matewan in the 1920s, no newspapers reported the murders, violence, and abuse by the company, its enforcers, and the local police. The result was the Battle of Matewan (the Matewan Massacre), which most of the country remained happily ignorant of.
Similar to what happened to the Americans protesting the Dakota Access pipeline.
But let’s face it. Most folks don’t want to be bothered. They want their coal just like folks today want their gas. As long as they don’t pay the price, the others are just standing in the way of progress.
So what happens when the algorithms’ impact bleeds into reality in other ways? Like the judicial system, where some jurisdictions use them. The supposedly impartial algorithms brought with them racial biases and handed citizens of color harsher sentences.
But we don’t really care about criminals, do we? They’re bad people, right?
The real story lies in the average Jane. Jane, whose father has a pacemaker and whose online activity has become more and more violent/extreme, until the algorithm deems him a threat to society. Why not have the computer tap into his pacemaker’s software and deal with the problem? Jane’s dad dies peacefully in his sleep. Society doesn’t have to incur the costs of a trial and incarceration.
There could also be a malfunction in the computer integrated into a car’s systems. The car could speed up around a curve, the steering wheel aiming beyond the guard rails of a road with a steep drop.
And with the gas heater on a timer, some poor enemy of the state could live in a house where the pilot light is extinguished while gas flows, killing all occupants in one night (genetics might reveal a code similar to other criminals’, saving future taxpayers money).
Then one day, the computer decides that humanity is too prone to violence and extreme behavior. And we have to go.
And those are the stories I like to read. I just hope they’re out of the news because they’re still fiction and not because the algorithm doesn’t want me to know about them.
Until next time!