Every time you listen to a song on Spotify, swipe right on Tinder, or buy something on Amazon, their algorithms learn more about your tastes and attempt to present you with a person or song they think you’ll like.

In this context, an algorithm is an artificial intelligence system that analyses your actions online and uses that information to personalise your online experiences.

How these digital platforms use their algorithms to collect and use data about you isn’t clear, and that’s how the companies like it: they don’t want us to understand how their algorithms work because that opacity helps them make more money, a team of researchers from the University of Auckland has found.

In a paper published in the Journal of the Royal Society of New Zealand in April, lawyers and music experts tried to better understand how the algorithms of Spotify and Tinder work.

Co-author Fabio Morreale said companies refuse to share details about their algorithms and are becoming “increasingly hostile in the face of academic research and criticism”. 

“Algorithms are incredibly influential. In a democratic system, having these companies working in an opaque way while being so influential is problematic,” Morreale said.

“It is most alarming with platforms like Facebook and Twitter guiding us towards particular news. They are essentially changing our personalities.”

International research has shown algorithms can influence your mood, political beliefs and sexual preferences.

Morreale and his team are one of many research groups around the world putting pressure on these companies to be more transparent about how their algorithms work, learning what little they can from public documents.

With no access to the algorithms themselves, Morreale and his team analysed the terms and conditions and privacy policies of Spotify and Tinder to find any details available on the companies' algorithms.

They found these documents were “ambiguous” and lacked detail about how the companies' algorithms collect and use our data.

Without details about these systems, users cannot make informed decisions about whether and how they want to engage with these platforms, the researchers said.

They also found that the recommendations the algorithms serve up can be shaped by commercial agreements, and that these agreements differ from platform to platform.

On Instagram, for example, commercial partnerships are highlighted as sponsored posts. 

But Morreale said that on other platforms, sponsored content is mixed in with options selected by the algorithm.

Morreale said giving researchers access to these systems is only the first hurdle: researchers then need to understand them.

“A lot of the time the incredibly smart people who create these algorithms don’t even know how they work,” Morreale said.

This is known as black box artificial intelligence (AI).

Engineers know how to create these systems and can get them to produce the results they want, but they aren’t always sure exactly how the AI does it.

In the case of these algorithms, an engineer instructs the AI to analyse hundreds of millions of actions from all of a platform’s users.

For example, Spotify’s AI might analyse every song you listen to, every search you make, how long you listen and at what time of day: likely hundreds or thousands of data points for each of its 422 million users.
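As a rough, hypothetical sketch of what one of those data points might look like (the field names here are illustrative, not Spotify’s actual data schema), a single listening event could be recorded something like this:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical example of one listening event a streaming service might log.
# Field names are illustrative only, not Spotify's real schema.
@dataclass
class ListenEvent:
    user_id: str          # who listened
    track_id: str         # what they listened to
    started_at: datetime  # what time of day they pressed play
    seconds_played: int   # how long they listened before stopping
    source: str           # e.g. "search", "playlist", "recommendation"
    skipped: bool         # whether they skipped before the song finished

# One user can generate thousands of these events; multiplied across hundreds
# of millions of users, this is the raw material the algorithm learns from.
event = ListenEvent("user_123", "track_456", datetime.now(), 187, "search", False)
```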

The engineers can tell the AI to gather all of that data and produce recommendations based on those factors, but how the AI reaches those conclusions is so complicated that it is often impossible for people to recreate.
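As a simplified illustration of that idea (a toy matrix factorisation recommender, not Spotify’s actual system), the sketch below learns numerical “embeddings” for users and songs from a small table of listen counts, then ranks songs for one user. The learned numbers drive the recommendations, but no individual number maps to a reason a human could read off.

```python
import numpy as np

# Toy black-box recommender: factorise a user-by-song "listen count" matrix
# into user and song embeddings, then rank songs by how well they match a user.
rng = np.random.default_rng(0)

n_users, n_songs, n_factors = 5, 8, 3
listens = rng.integers(0, 10, size=(n_users, n_songs)).astype(float)

# Start with small random embeddings and refine them with gradient descent.
users = rng.normal(scale=0.1, size=(n_users, n_factors))
songs = rng.normal(scale=0.1, size=(n_songs, n_factors))

for _ in range(500):
    error = listens - users @ songs.T   # how far off the predictions are
    users += 0.01 * error @ songs       # nudge user embeddings
    songs += 0.01 * error.T @ users     # nudge song embeddings

# Recommend the top songs for user 0: useful output, opaque reasoning.
scores = users[0] @ songs.T
print("Recommended song indices:", np.argsort(scores)[::-1][:3])
```

Even in this tiny example, the “reasoning” lives in a grid of learned numbers. Scaled up to hundreds of millions of users and far more factors, tracing why a particular song was recommended becomes practically impossible.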

“The process isn’t human or necessarily understandable to us,” Morreale said.
