It is a question that seems simple to answer but, given the complexity of digital relationships and interactions, can have many meanings: for you, what is fake? I believe that any type of paid boosting or purchase of followers constitutes a manipulation of reality.
If the current indicator of influence is the number of likes, is influence acquired through purchase legitimate? Whenever I see someone reaching the long-awaited milestone of one million followers, I wonder what percentage of it is actually human and the result of spontaneous interaction.
What would the philosophers Aristotle, Socrates, and Plato think of the way people have changed their behavior on social media in order to please algorithms and gain likes?
What would Freud’s and Jung’s analysis be of the act of buying followers in order to appear “more seen” than one actually is? It is not an exaggeration to say that we are facing a new kind of “digital opium” of the masses. In the algorithmic society, an interesting point of reflection concerns neutrality.
It is difficult to find a neutral position when an algorithm is built to pursue a specific outcome. We need to deal with its consequences, especially where there is misuse, excess, or abuse. In every free society, freedom and responsibility go hand in hand.
An example that illustrates the advance of this manipulated digital reality is the series of images captured by British photographer Jack Latham, portraying the illicit industry of “click farms” in Vietnam (photo above).
Over the course of a month in 2023, the photographer documented these little-known operations, which artificially inflate traffic to forge user interactions, manipulate digital algorithms and, of course, increase the profits of digital platforms and their influencers.
Using very cheap labor, they generate thousands of likes, comments, and shares for individuals and corporations worldwide. It is not difficult to imagine that the technique has already been used for political propaganda and the dissemination of misinformation during elections around the world.
It is important to collectively question the authenticity of what appears to be popular. With free access to digital media, a fundamental question emerges: can anyone be whatever they want – even if it is a lie? We are exposed to the risks of a “clickbait culture”, including the distortions that accompany this new situation.
One of the first signs that we are living in, at the very least, a strange situation is the presence of ads offering “followers for sale” and “likes for sale”.
The impression is that a person with the economic power to make these investments can transform their credibility on social networks. Since all of this has become an industry, if we accept it as a legitimate business model, the question then turns to transparency.
After all, the public has the right to know whether they are interacting with people or with machines (posing as people) on the other side of the screen. It is more about the right to choose, based on clear information, than about prohibition. But one point always challenges everyone: “who watches the watchers?”
Is there too much power in the hands of a few? After all, there is a high risk of some level of manipulation in the use of algorithms within platforms, which can not only distort reality but also steer opinion.
And this is certainly a threat to freedom and democracy, as Cathy O’Neil has already addressed in her book “Weapons of Math Destruction”.
This is why the debate on how to establish clear rules that define the boundaries between what is lawful and what is unlawful is so relevant. Where does abuse begin, and what specific and distinct penalties should apply, proportional to each of the players involved, according to their participation and responsibility?
Transparency mechanisms on social networks are fundamental for the sustainability of the technology itself and help prevent harm, just as collaborative, proactive, and preventive action by the platforms is necessary.
We are in an algorithmic technological society, in which there is a huge dependence on big tech companies, essential players within this ecosystem. Only with the joint participation of Civil Society, the State, and the market will it be possible to develop a digital and robotic society that is ethical, safe, and sustainable.
For the sake of the community, business leaders and technology pioneers need to be concerned with developing responsible technologies that can be used constructively and peacefully, thus taking humanity together to another level.

* Patricia Peck is CEO and founding partner of Peck Advogados. With 46 published books, she is a professor of Digital Law at ESPM, an appointed advisor to CnCiber (three-year term), and a former Titular Counselor of the National Data Protection Council (CNPD).