Tuesday 6 February 2018

The "Facebook" Effect: Just Status Quo

Facebook has an algorithmic formula for society: a system in which personal, emotionally sensitive data is categorized and sold to advertisers; a system that methodically treats the human element, human lives, as a limited resource to be traded and bartered. Rejecting this sort of system is the intuitive human response, but, as in many cases, the intuitive response is one of fear. And fear is usually wrong.

Of course, there is the inevitable argument that pits maximizing overall human happiness against the potential loss of freedom or personality. We see Facebook as a shadowy figure, silently micromanaging our online experience, taking most of the control out of our hands explicitly for profit. Like most people, we have an inveterate desire to feel in control of our own lives. It makes us feel empowered; it makes us feel truly conscious. The more influence we have, the more conscious we feel.

What's in our best interest? It might be to set aside how we feel about our interests altogether. Just because Facebook manages our interests for us doesn't mean those interests cease to exist. Facebook isn't stealing anything from us that we already have; at most, as some feel, it withholds what we might potentially see.

The internet serves two primary functions: it services our desire to be entertained, and it services our desire to be informed. In modern times, it might be necessary to combine the two, though at what expense?

Facebook manipulates what we see into what they think we want to see

This algorithmic recommendation process might not align with what we actually want. The alternative to their guiding hand sorting out all the junk for us is to drown in it. In that respect, they're right. The real conflict is, simply, that it works too well.

The news you see on a timeline feed caters to you: your sports teams, your political parties, your television shows. You only see what is relevant to you; this is what maximizing human happiness is all about. But a clear bubble effect follows. We are only presented with information that agrees with our own, already established identities: confirmation bias.

I would argue that without these algorithms, we would still actively seek out information that confirms our beliefs and ignore information we disagree with. It would just take significantly longer. There are other benefits to the analytics, too. The news on a timeline feed caters to your location, your friends (the ones who are more than just ordinary acquaintances), and your interests, even if those interests are just some form of confirmation bias.
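The filtering described above can be caricatured in a few lines of code. This is a toy sketch, not Facebook's actual system; the interest tags and the overlap-counting score are entirely made up for illustration. The point is only that once posts are ranked by how well they match your established interests, the bubble falls out naturally:

```python
# Toy sketch of interest-based feed ranking (illustrative only;
# not Facebook's real algorithm). Posts that match more of the
# user's interests rise to the top; everything else sinks.

def rank_feed(posts, user_interests):
    def score(post):
        # Hypothetical scoring: count overlapping interest tags.
        return len(set(post["tags"]) & set(user_interests))
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "tags": ["politics-left", "news"]},
    {"id": 2, "tags": ["sports", "basketball"]},
    {"id": 3, "tags": ["politics-right", "news"]},
]
feed = rank_feed(posts, ["sports", "basketball", "politics-left"])
print([p["id"] for p in feed])  # → [2, 1, 3]
```

Even this crude version never surfaces the post that disagrees with the user's politics unless everything else runs out, which is the bubble effect in miniature.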

We have a desire to be entertained, and a desire to be informed. In modern times, it might be necessary to combine the two. But we shouldn't worry about the mixture. Facebook can do that for us.
