“For You” = especially for me?

“For You” – that was the tag on a present I recently received. It wasn’t giftwrapped and tied with a ribbon, but delivered to me digitally. It was a collection of music put together especially for me. Kind of like those mixtapes we used to make back in the pre-digital era.

Algo-Rhythm

“For You” was compiled specially for me by a crowd that calls themselves Algo-Rhythm. I think it might be a play on the Spanish word “algo”, meaning “something”, because it is indeed “something rhythmic”. Algo-Rhythm sent me several hours’ worth of new music that I hadn’t heard before, a selection based entirely on the kind of music I like.

This musical gift wasn’t a once-off. I get a new “For You” every week from this hard-working crew. It seems that Algo-Rhythm’s dedicated musical experts spend seven days trying to please me by curating bespoke music.

Look, they don’t always get it right. Not every one of their musical suggestions hits the spot for me. But many of the new tunes they offer do turn out to be the kind of music I enjoy. Some of the selections have been revelations and have even become new favourites. These compilations have also led me to search out more songs and albums from the musicians and groups they introduced to me.

I greatly appreciate all the effort that Algo-Rhythm puts into making recommendations for new music I may like… Whoops – wait a minute. Sorry, it seems I got the spelling wrong.

It’s not Algo-Rhythm, but Algorithm

And it’s not a bunch of people catering to my personal musical tastes, but a bunch of robots running computerized mathematical models based on my personal musical data.

So it’s algorithms that are used to prepare my weekly playlists. The robots do not spend all week on this, but run their calculations in nanoseconds. And not just for me – personalised playlists are apparently delivered each week to millions of music fans around the world under the label “For You”.
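To give a flavour of what those nanosecond calculations might look like – and this is only a toy sketch, not how Spotify or any real service actually works – a simple recommendation algorithm could score each new song by how closely its musical features match a listener’s tastes. The feature names and numbers below are invented purely for illustration.

```python
# Toy recommender: rank candidate songs by similarity to a listener's taste.
# Each song (and the listener) is described by made-up scores for a few
# hypothetical features: (acousticness, energy, danceability).

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical taste)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def recommend(listener_taste, candidates, top_n=2):
    """Return the top_n candidate song titles closest to the listener's taste."""
    ranked = sorted(candidates.items(),
                    key=lambda item: similarity(listener_taste, item[1]),
                    reverse=True)
    return [title for title, _ in ranked[:top_n]]

# A listener who prefers quiet, acoustic music.
my_taste = (0.9, 0.2, 0.3)

new_releases = {
    "Quiet Folk Ballad":   (0.8, 0.1, 0.2),
    "Stadium Rock Anthem": (0.1, 0.9, 0.6),
    "Acoustic Cover":      (0.9, 0.3, 0.3),
}

playlist = recommend(my_taste, new_releases)
print(playlist)  # the two acoustic songs rank ahead of the rock anthem
```

Real systems work with millions of listeners and far richer data – listening history, skips, what similar listeners enjoy – but the principle is the same: turn your behaviour into numbers, then rank everything against them.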

Algorithms are used in ever more aspects of our lives, from driving cars to running political campaigns, and their use is coming under increasing criticism. One reason algorithms are seen as a threat relates to the fear that robots are taking over jobs traditionally done by humans.

There is also concern about the way that algorithms influence important personal decisions, e.g. in schools and banks. You might have thought that algorithms would be less biased than a school principal or banker. You would be wrong, according to Cathy O’Neil, who describes algorithms as “opinions embedded in mathematics”.

Weapons of Math Destruction

The title of her book is a parody of WMDs, Weapons of Mass Destruction. In Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, O’Neil blames algorithms for reinforcing prejudice and discrimination. She believes that numerical WMDs do harm because, like military WMDs, their use is high-impact and opaque: large numbers of people are affected, and neither those doing the analysis nor those being analysed are fully aware of what is going on.

Such charges have given rise to new fields of Algorithmic Justice and Transparency. The ProPublica news group identified a major perpetrator of algorithmic injustice as Facebook. Its algorithms derive data from Likes and everything else that social media users happily Share about themselves. (Note to My New Old Self: try again to improve my Privacy Settings, on Facebook and all my other internet interactions.)

ProPublica’s investigation revealed that some of Facebook’s advertisers have been using categories based on “ethnic affinity” in their ads for housing, jobs and credit. A bit like the Whites Only and Blacks Only signs during the apartheid era in South Africa.

Are you at risk of Algorithmic Injustice?

Now maybe you’re thinking that this doesn’t apply to you as an older person who does not spend that much time online. Or that you can easily choose not to feed your personal data as fodder for algorithms that aim to sell you stuff.

Sorry to tell you that, unless you conduct your life from a cave, sans credit card, your Data Footprint is out there. I’m sure you’ve noticed that right after you do a Google search on X, your screen is flooded with advertisements for more than you ever wanted to know about X, including how you can buy X. You may be able to reduce the data you supply for algorithmic purposes if you’re an ace at digital management, but few can escape this entirely.

These are some of the reasons why many activists are campaigning for algorithmic literacy to become a part of our basic education. Information Science expert David Lankes warns that there will soon be “a class of people who can use algorithms and a class used by algorithms”. I think I know which class I’ll be in.
