Wednesday, August 10, 2022

Legal but Harmful: Why Networks Allow Harassment but Not Nipples


A social network user reads the news on a platform.

It has always been said that those of us who couldn’t get the grades for medicine or fine arts end up with a law degree: a routine, boring course of study, but one with “plenty of career prospects.” I began studying law before Spain joined the European Union (EU), in the middle of a grinding crisis much like the one we are living through now. That I am older than dirt is no secret to anyone. Back then there was no room for dreams. Parents didn’t tell you that you were special or encourage you to have fun at work. In whose mind could one of the Lord’s punishments for original sin be fun? You did what had to be done, without expecting much from life. So I chose law as my first option, intending to sit the State Lawyer examination, which I would begin preparing for in the third year of a five-year degree.

I was bored stiff and never did prepare for that exam or any other civil-service examination. When I finished, however, I found the practice of law exciting, and law itself an endless combination of social, technical and psychological factors. Regulating citizens’ social realities, expectations and needs in a time of extreme volatility is as exciting as it is complicated. We deal with old problems, but pushed to such extremes that they become new ones. The rules we had for regulating the relationships of those same people are now useless, and the clashes with classic rights form a tangle that is difficult to unravel. Especially when corporate and economic interests show up in the relevant offices.

Disinformation, content that is legal but harmful, is a problem we have been wrestling with since Brexit and Donald Trump’s election as president of the United States. As the phenomenon grew uncontrollable, heads turned toward lawyers and politicians, demanding a solution for speech that, while protected by freedom of expression, is clearly harmful. Something so intuitive has proven very difficult to regulate, for several reasons. We operate with diffuse categories of what counts as fake news, what is misinformation, what is merely inconvenient, and which statements, however inconvenient or impolite, remain protected by freedom of expression. No manual explained how to protect society from the amplification that benefits the foolish, the wicked and the lazy.

The UK has been searching for a definition of “legal but harmful”: self-harm, bullying or eating disorders, but it has gone no further

The Anglo-Saxons, always able to find effective and imaginative solutions to complex problems thanks to the flexibility of their legal system, opened the can of worms of defining “legal but harmful” speech and ran into the same problem as everyone else. The British Online Safety Bill (a proposal which, if passed, would be the world’s toughest regulation of online content moderation) flounders when it tries to arrive at a definition of what “legal but harmful” speech actually is. Its drafters know that posts about self-harm, bullying or eating disorders would fall under that definition, but they have gone no further.

It is so complicated that the EU hasn’t even tried. The Digital Services Act (DSA), approved in July, obliges major platforms and search engines to prevent any “abuse.” To do so, they must conduct a risk analysis of that abuse, take measures to mitigate the risk, and then have everything verified by an independent third party. The risks are so general and so hard to assess (electoral disinformation or manipulation, cyber-violence against women, harm to minors) that almost any analysis and assessment can pass muster. Where European lawmakers have paid particular attention is to the definition of “illegal content”: it will be whatever each member state says it is, and content can only be removed where it is illegal under national law, not in other countries.

In the US, lawmakers have chosen to protect minors, which is always easier and more rewarding. Senators Richard Blumenthal (Democrat of Connecticut) and Marsha Blackburn (Republican of Tennessee) introduced a bill inspired by some of the DSA’s obligations, such as algorithmic transparency requirements, although these would apply only to minors, leaving adult users unprotected.

The debate is why there are no public spaces free from the interests of corporations that sit beyond the jurisdiction of our judges

All these contortions exist so that service providers and search engines such as Facebook, TikTok, Twitter and Google are not held responsible for the content they publish. These operators take refuge in the immunity granted by Section 230 and by the European rules that copy it, which allow them to publish anything without filtering or moderation. At the dawn of the commercial internet, in the 1990s, the bookseller-or-librarian principle was established: a bookseller does not need to know everything in the books they sell. If a book is illegal, the book is removed, but the bookstore is not closed. By contrast, if a newspaper editor publishes illegal content, the newspaper is liable, except for opinion columns like this one, for which it disclaims responsibility (and it is very good at doing so).

But we are no longer on the internet of gargling dial-up modems. The providers are outrageously profitable companies that teach cars to drive themselves and are perfectly capable of reading the entire library. They already do so, applying algorithms to show us not the content we have chosen but the content that serves their economic interests and those of their customers, who are not us. They already perform an editorial role, one impossible to escape, with the absolute tranquility of being responsible for nothing. In fact, this editorial work is at the root of the problem.

Many believe that lifting or eliminating these providers’ immunity would lead them, in order to shield themselves from claims worldwide, to protect their interests by censoring left and right. It is time to stop applying warm compresses. Our relationship with these companies is contractual. If we break the decency rules and post a nipple, or set the company Christmas video to a copyrighted song, our account is suspended in a flash or the content is removed; but if we harass someone until their life becomes unbearable, the companies pull out the free-speech card. Because it costs them nothing. The debate is not whether holding social networks and search engines responsible for the content they publish is an attack on freedom of expression, nor whether it is technically possible, which it is: they already do it when they prioritize garbage over other content. The debate is why there is no competition, so that citizens can debate in public spaces not subject to the interests of corporate shareholders who sit beyond the jurisdiction of our judges.

It is not a matter, as the DSA has it, of “carefully weighing” whether timid, technical measures restrict freedom of expression, but of attacking the problem at the root. It is a matter of applying the environmental principle: the polluter pays.



Source elpais.com
