WMF are idiots!
WMF idiot wrote:
It is one of the most beloved websites in the world, as well as one of the most trusted sources for up-to-date knowledge about COVID-19. All of this is only made possible by laws that protect its volunteer-led model. But now that people-powered model is getting caught in the crossfire of the DSA proposals.
Well, it's a good thing they have tireless volunteers like Ottawahitech working to deliver up-to-date information. OH WAIT,
they blocked him. I should really get around to inviting him to this forum sometime. He was attacked by the mighty Beeb of Wikipediocracy fame.
WMF wrote:
Here are four things policymakers should know before finalizing the DSA legislation:
The DSA needs to address the algorithmic systems and business models that drive the harms caused by illegal content.
DSA provisions remain overly focused on removing content through prescriptive content removal processes. The reality is that removing all illegal content from the internet as soon as it appears is as daunting as any effort to prevent and eliminate all crime in the physical world. Given that the European Union is committed to protecting human rights online and offline, lawmakers should focus on the primary cause of widespread harm online: systems that amplify and spread illegal content.
A safer internet is only possible if DSA provisions address the targeted advertising business model that drives the spread of illegal content. As the Facebook whistleblower Frances Haugen emphasized in her recent testimony in Brussels, the algorithms driving profits for ad-placements are also at the root of the problem that the DSA is seeking to address. New regulation should focus on these mechanisms that maximize the reach and impact of illegal content.
Fair enough. Surprised they have not addressed that already.
WMF wrote:
But lawmakers should not be overly focused on Facebook and similar platforms. As a non-profit website, Wikipedia is available for free to everyone, without ads, and without tracking reader behavior.
Well until you ask a basic question and some idiot like Bbb23 CUs you.
WMF wrote:
Our volunteer-led, collaborative model of content production and governance helps ensure that content on Wikipedia is neutral and reliable. Thousands of editors deliberate, debate, and work together to decide what information gets included and how it is presented. This works very differently than the centralized systems that lean on algorithms to both share information in a way that maximizes engagement, and to moderate potentially illegal or harmful content.
Until the 'right opinion' is discovered and everyone with the 'wrong opinion' is blocked.
WMF wrote:
Empowering users to share and debate facts is a powerful means to combat the use of the internet by hoaxers, foreign influence operators, and extremists. It is imperative that new legislation like the DSA fosters space for a variety of web platforms, commercial and noncommercial, to thrive.
HA!! Wikipedia's history is RIFE with hoaxes, foreign influence operators, and extremists. AND STILL IS. Right now there are probably at least a hundred hoaxes on enwiki alone. And state-backed idiots continue to edit-war to the death. And they KNOW about them, but they are protected by willing corrupt admins and arbs. Until they eventually get blocked and replaced by other state actors once ArbCom is run by a different set of corrupt idiots.
“Wikipedia has shown that it is possible to create healthy online environments that are resilient against disinformation and manipulation. Through nuance and context, Wikipedia offers a model that works well to address the intricacies required in content moderation. Yes, there might be disagreement amongst volunteers on how to present a topic, but that discussion yields better, more neutral, and reliable articles. This process is what has enabled it to be one of the most successful content moderation models in this day and age.”
You have been blocked indefinitely. --Bbb23 (talk)
Appeal declined and talk page access disabled. --331dot (talk)
UTRS appeal declined -- Yamla
idiot wrote:
Terms of service should be transparent and equitable, but regulators should not be overly prescriptive in determining how they are created and enforced.
AKA "if our terms are clear, we can't be as corrupt."
If they had to do this, they couldn't ban people like Abd for no reason and win a court case.
WMF wrote:
The draft DSA’s Article 12 currently states that an online provider has to disclose its terms of service—its rules and tools for content moderation— and that they must be enforced “in a diligent, objective, and proportionate manner.” We agree that terms of service should be as transparent and equitable as possible. However, the words “objective” and “proportionate” leave room for an open, vague interpretation. We sympathize with the intent, which is to make companies’ content moderation processes less arbitrary and opaque. But forcing platforms to be “objective” about terms of service violations would have unintended consequences. Such language could potentially lead to enforcement that would make it impossible for community-governed platforms like Wikipedia to use volunteer-driven, collaborative processes to create new rules and enforce existing ones that take context and origin of all content appropriately into account.
Lies, lies, and more lies. They don't treat people transparently, they just gang up on people they don't like and harass them. If they had to explain why in court, they might need to actually enforce rules consistently, which would mean awful things like Kumioko being an admin or Boing! said Zebedee being banned.
WMF wrote:
The policies for content and conduct on Wikipedia are developed and enforced by the people contributing to Wikipedia themselves. This model allows people who know about a topic to determine what content should exist on the site and how that content should be maintained, based on established neutrality and reliable sourcing rules. This model, while imperfect, keeps Wikipedia neutral and reliable. As more people engage in the editorial process of debating, fact-checking, and adding information, Wikipedia articles tend to become more neutral. What’s more, volunteers’ deliberation, decisions, and enforcement actions are publicly documented on the website.
Literally the opposite:
professors or anyone else knowledgeable will usually just get blocked by some idiot like Bbb23 for violating RS, IDHT, or edit warring.
WMF garbage wrote:
This approach to content creation and governance is a far cry from the top-down power structure of the commercial platforms that DSA provisions target. The DSA should protect and promote spaces on the web that allow for open collaboration instead of forcing Wikipedia to conform to a top-down model.
{{ArbComBlock}}
"Foundation Global Ban - do not reinstate. Questions can be directed to ca@wikimedia.org"
Savior complex: the foundation wrote:
Article 14 states that online platforms will be responsible for removing any illegal content that might be uploaded by users, once the platforms have been notified of that illegal content. It also states that platforms will be responsible for creating mechanisms that make it possible for users to alert platform providers of illegal content. These provisions tend to only speak to one type of platform: those with centralized content moderation systems, where users have limited ability to participate in decisions over content, and moderation instead tends to fall on a singular body run by the platform. It is unclear how platforms that fall outside this archetype will be affected by the final versions of these provisions.
"ECP talk page for grammar changes by trolls."
WMF wrote:
People cannot be replaced with algorithms when it comes to moderating content.
ClueBot NG
WMF wrote:
On Wikipedia, machine learning tools are used as an aid, not a replacement for human-led content moderation. These tools operate transparently on Wikipedia, and volunteers have the final say in what actions machine learning tools might suggest. As we have seen, putting more decision-making power into the hands of Wikipedia readers and editors makes the site more robust and reliable.
So algorithms can't replace people due to errors and bias, yet they use them anyway, alongside biased humans? Makes sense according to WMF idiots.