It is no mystery that on the internet we are watched and our data is harvested. The curious thing is that, despite this, people tend to say "I know, but who cares" and rarely discuss it among friends. Users may become more aware of this surveillance while browsing social media, which stands out for its powerful algorithms built to serve ever more personalized content.
And here lies the crux of the matter: although users know that their data is being harvested, they still feel comfortable on social media because the personalized content they receive helps them feel "normal". This shapes users' emotions even when they do not notice it directly, because it makes them feel, in some way, both unique and normal. It lets us identify with social media even though, deep down, we know something is wrong.
On the other hand, saying that someone feels normal on a social network can be read the other way around: that the person lacks the ability to discriminate and lets someone else decide for them. This is nothing new; a German editor already said it in 1926: "They all surrender to American tastes, they conform, they become uniform", referring to how American cinema influenced Germans. Being "normal" in digital culture, however, is not frowned upon, because it recalls a time when audiences watched television programs that told stories about everyday life; people identified with those stories, carried them into their own lives, and could feel that they were not alone.
If we transfer this example to the digital age, we find websites that try to imitate that feeling. Expedia, where you can book hotels, is one of them. If you open the site, notifications appear beside the listing saying "10 other people are looking at this hotel" or "10 people have already booked this hotel", along with warnings to hurry or you will miss out. We will never know whether 10 other people really are looking at the same thing as us at the same time, but one thing is clear: the notification works on our unconscious so that we buy quickly, and at the same time it gives us a feeling of company.
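To make the mechanism concrete, here is a minimal sketch in Python of how a booking site could assemble such a banner. The function name, the viewer-count store, and the numbers are invented for illustration; nothing here describes Expedia's actual implementation, and whether the counts shown to visitors are live is exactly what we cannot verify.

```python
def social_proof_message(hotel_id: str, viewer_counts: dict) -> str:
    """Return a social-proof banner for a hotel listing.

    `viewer_counts` is a hypothetical store mapping hotel IDs to the number
    of concurrent viewers; whether such counts are real is invisible to us.
    """
    viewers = viewer_counts.get(hotel_id, 0)
    if viewers > 1:
        return f"{viewers} other people are looking at this hotel right now!"
    # Fall back to a generic urgency message when no count is available.
    return "Rooms are selling fast -- book soon!"

# Invented data for illustration only.
print(social_proof_message("hotel_123", {"hotel_123": 10}))
```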
But not all spaces on the internet are the same. For the most part, social networks personalize subtly. Big data tries to recognize you in order to place you in a specific "space" so that the information you see interests you and you feel at home, i.e., comfortable. This personalization makes us think of ourselves as unique rather than merely normal on the internet.
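A rough sketch of this "placement" idea, assuming invented interest vectors and a simple k-means grouping, might look like the following. Real recommender systems are far more elaborate; the topics and numbers here are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical interest vectors: each row is a user, each column the share
# of time spent on a topic (sports, fashion, politics, gaming).
interests = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.6, 0.2, 0.1],
    [0.1, 0.1, 0.7, 0.1],
    [0.6, 0.2, 0.1, 0.1],
])

# Group users into "spaces" of similar taste; content can then be drawn
# from whatever performs well inside the user's assigned cluster.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(interests)
print(model.labels_)  # e.g. users 0 and 3 end up sharing a "space"
```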
With all this in mind, the problem with personalization is precisely that it pushes us to search for our authentic selves, i.e., to want to be more than normal. For the personalization algorithms to work, we must first disclose our emotions and preferences: there is no customization without the algorithm, and this can be confusing to users. Freud already touched on this issue in his theory of personality, where the human personality is the product of the struggle between our destructive impulses and the search for pleasure; in other words, everything is based on and revolves around the "I". Building on this, the sociologist Eva Illouz speaks of emotional capitalism, in which utopias of happiness are mediated by consumption.
Nowadays it is very easy to see this emotional capitalism at work, since platforms like Facebook let us classify our emotions on each post by selecting "Like", "Haha", "Wow", "Sad", or "Angry". The same happens with publications that use emojis: the more emojis we see, the more they influence our emotions, and they are often a deciding factor in whether we interact.
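At its simplest, this classification is just counting. The sketch below tallies a hypothetical reaction log for a single post; the labels mirror Facebook's public reaction set, but the data and the closing comment about ranking are assumptions, not a description of Facebook's internals.

```python
from collections import Counter

# Hypothetical reaction log for one post; the values are invented.
reactions = ["like", "haha", "wow", "sad", "like", "angry", "like", "haha"]

counts = Counter(reactions)
dominant, n = counts.most_common(1)[0]
print(f"Dominant reaction: {dominant} ({n} of {len(reactions)})")
# A platform could feed aggregates like these back into ranking, boosting
# posts whose emotional profile keeps similar users engaged.
```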
The thing is that people do not realize that the user is not treated as an individual "I" but as a marketing target: digital capitalism collects and analyzes our data and behavior. We could call this digital capitalism "surveillance capitalism", since it translates what people do into data. The term is not new. Surveillance capitalism has developed rapidly since 2002, when Google noticed patterns that allowed it to capitalize on user behavior. In turn, this data feeds "prediction products" that anticipate human behavior. Surveillance capitalists use "instrumentarian" power to shape consumer behavior without consumers realizing it.
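To illustrate what a "prediction product" amounts to in miniature, here is a sketch that fits a small model to invented behavioral traces and outputs the probability that a user will act. The features, the data, and the model choice are hypothetical stand-ins for far larger proprietary systems.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical behavioral traces: [minutes on site, pages viewed, past clicks]
# and whether the user clicked a promoted post. The numbers are invented.
X = np.array([[5, 3, 0], [40, 20, 4], [12, 6, 1], [55, 30, 6], [3, 2, 0]])
y = np.array([0, 1, 0, 1, 0])

# The "prediction product": a model that turns behavioral surplus into a
# probability that a given user will act, which is what gets sold on.
model = LogisticRegression().fit(X, y)
new_user = np.array([[30, 15, 2]])
print(f"Predicted click probability: {model.predict_proba(new_user)[0, 1]:.2f}")
```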
Your innermost thoughts, motives, intentions, needs, preferences, moods, and personality become fodder for someone else's profit. Your responses to sales pitches or social media posts generate data, which analysts refine into improved prediction products.
On the other hand, we should start to learn that our individuality matters more than being normal, although this is not easy, because these data technologies treat almost any behavior as permissible, hiding behind the promise that everyone is included regardless of identity. Those who do not identify with this feel uncomfortable and trapped.
Big data allows everything to revolve around you and produces a division between you and your neighbors. Yet it also treats me as a universal user, slicing me into demographics: a user with an average income who lives in the suburbs of a certain city. This happens because the system constantly synthesizes one user's information and compares it with another's. In this way, you become part of a city that thinks like you, which means we are all globally related in a forced way. If you want to feel normal on the internet, that is fine; there is nothing wrong with that. But the internet could also offer spaces where we could stop being ourselves, and although digital culture is built around the authentic user, it would also be good to try to escape from it.
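As a closing illustration of that constant comparison, the sketch below measures how close one invented profile is to a neighbor's. The features and values are hypothetical; the point is the logic itself, which reduces two people to vectors and scores their resemblance.

```python
import numpy as np

# Hypothetical profile vectors: [age, income in thousands, km from city
# center, posts per week]. All values are invented.
me = np.array([34, 48, 12, 5], dtype=float)
neighbor = np.array([36, 52, 11, 7], dtype=float)

# Cosine similarity: the closer to 1, the more the system treats the two
# users as interchangeable members of the same demographic "city".
similarity = me @ neighbor / (np.linalg.norm(me) * np.linalg.norm(neighbor))
print(f"Profile similarity: {similarity:.3f}")
```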