Social Media and the fragility of freedom
Social media are sexist and racist by default. That’s one of the conclusions of a Social Sorting Experiment, organised yesterday evening by Radboud Reflects. ‘The problem is that we don’t have an overview of all the information we are feeding into the system.’
It’s a regular Thursday evening on campus, a quarter to eight. The time of day right after the last lectures and seminars have finished, most offices have closed and the last trays at the Refter have been cleared away. The campus is emptying while some fifty-odd people file into theatre hall C. They have come to participate in an experiment.
The so-called Social Sorting Experiment, to be precise. This experiment, organised by Radboud Reflects and carried out by The Smartphone Orchestra, is supposed to bring the concept of social media to life. It doesn’t need much: a numbered pattern on the ground and a vast number of phone screens with participants attached to them. That, however, is enough to establish a chilling atmosphere.
Once the experiment has begun, it appears everyone has stepped out of the night-time campus and into a Black Mirror episode. People are standing on numbered spots in the pattern, so close to each other that no room is left for a comfortable amount of personal space. And that’s exactly the point. Participants are supposed to be uncomfortably close to each other, alternating between bending their heads towards the phone in wait for instructions and observing the peers, mere centimetres away, whom they are supposed to rate.
The experiment is physically just as uncomfortable as its digital counterparts, the likes of Instagram or Facebook, should make everyone feel. After all, they are based on the same concept: a constant rating of first impressions that is fed into the net of big data. A net that, in turn, presents content based on preferences established through earlier ratings. It’s a psychological questionnaire everyone is filling out constantly. The largest data collection of them all, used to reinforce the likes of the past.
‘Why is that a problem?’ asks one of the participants afterwards. The problem, according to Fleur Jongepier, a professor specialising in cyber ethics and one of the speakers invited to talk after the experiment, is that we don’t have an overview of all the information we are feeding into this system. A system running on algorithms that, Jongepier says, are not neutral, because the people who built them are not neutral either.
These biases, says Rob Holland, a professor of social psychology who has been invited to speak as well, are very human. They help us function in everyday life. They can, however, also result in prejudice and harm if they are not consciously corrected in specific situations. A correction, Jongepier emphasises, that algorithms do not make, which leaves them sexist and racist by default. And those algorithms are guiding the information we receive and, ultimately, influencing the decisions we make.
The event was meant to raise awareness of the problem, but it also shows that we are as far away from a solution as ever. Social media are part of a dystopian future in the making, but they are also, undoubtedly, a huge part of our lives. And while the data fed into them, and the information we receive accordingly, are shaping important societal notions, it becomes increasingly difficult to cut loose from the networks.
A fight like that, says Jongepier, is almost impossible for individuals to champion. Not many have the time to file lawsuits against Facebook, and it is hard to communicate in 2020 without WhatsApp. According to the professor, lawmakers should take responsibility for the matter, making sure that these companies can no longer collect and sell data to the degree that they are doing right now. And that as soon as possible.
The experiment is an event organised to mark seventy-five years of freedom. And it will not produce action proportionate to the danger of its topic. People will go home, check their Instagram feeds and write WhatsApp messages, while governments won’t change overnight and neither will Facebook. We will continue to feed data into algorithms that will continue to reinforce biases accordingly. ‘It’s not all bad,’ as Jongepier said at the end of the night, but it most certainly is a sign that many appear to have forgotten how fragile freedom can be.