Italian Court Rules The Wikimedia Foundation Is Just A Hosting Provider For Wikipedia's Volunteer-Written Articles

The article is, ah, over-optimistic. This is what the article has from the Court:
In a ruling that provides strong protection for Wikipedia's community governance model, the Court once again recognized that the Wikimedia Foundation is a hosting provider, and that the volunteer editors and contributors create and control content on the Wikimedia projects. The Court also made clear that a general warning letter, without additional detail about the online location, unlawfulness, or the harmful nature of the content as recognized by a court, does not impose a removal obligation on a hosting provider like the Wikimedia Foundation.
... the Court took notice of Wikipedia's unique model of community-based content creation, and the mechanisms by which someone can suggest edits or additions to project content. It found that Wikipedia has a clear community procedure for content modification, which Mr. Previti should have used to address his concerns. He could have reached out to the volunteer editors, provided reliable sources, and suggested amendments to the article, instead of sending a general warning letter to the Foundation.
I agree with the ruling of the court, at least in part. It is not correct, however, that "Wikipedia has a clear community procedure for content modification." It has a sloppy, ad hoc, unreliable method for that. There is a fundamental problem which the court, it appears, did not address. Suppose there is a biased community, with anonymous members, prone to defaming those it dislikes. This community elects the members of the Board of Trustees of the Foundation that hosts its "encyclopedia project." If someone comes in who is seen as "not one of us" by the community -- on Wikipedia the phrase is "not here to build an encyclopedia" -- and they object to the article, their edits are reverted and they are ignored or blocked. With copyright violation, the community polices it, but the Foundation, if it rejects a takedown notice, can become liable for the violation. What about libel?
Suppose the wiki were Rightpedia, and the target of the article were Oliver D. Smith. If Rightpedia refuses to take down libel, does the owner of Rightpedia become liable? If not, then "mobs," ad hoc groups of faceless users, may harass and defame, while their targets have no effective recourse.
With the monkey selfie, the WMF decided to treat the copyright as invalid, on an argument that has never been tested in court. The issue is often misstated as a claim that the monkey holds the copyright, or cannot hold copyright, but most who have argued about this miss a clear precedent in copyright law: if one who arranges a photograph makes a significant contribution, they may be a co-owner (and under some conditions possibly the sole owner), and a co-owner may enforce copyright as if a sole owner.
In the Italian case, there appear to be technical details that led to the Court's conclusion. Had the demand been specific and clearly stated -- the Court wrote that it was not -- would it then have been binding, assuming that the article was in fact defamatory?
In the case I am considering, the content and the community process are not, as such, at issue. Some users may have personally defamed me, but the case with the Foundation is over WMF support for the defamation by issuing a public global ban, even though it does not state the reason.
There seems to be some background assumption that a "community" cannot commit libel, that it is immune. At what number of members does this immunity arise? The Court was mistaken: Wikipedia has no due process. There are suggested guidelines, and even the policies are not fixed; they are routinely disregarded. Absolutely, the WMF, with the "community" in routine control of content, is protected from a priori libel claims. But after there has been a specific, actionable complaint?
The goal of Wikipedia is to build an encyclopedia, the "sum of all human knowledge." A noble goal, but is libel "knowledge"? Is it important that there be a truly neutral process for vetting content? It is not difficult for skilled editors to create neutral content, even in the presence of their own opinions, if neutrality is the goal. But often it is not the goal; instead, it is common to present a slant on the available information, cherry-picking sources and even misrepresenting them, to promote a point of view -- sometimes even a majority point of view.
Wikipedia process is incredibly inefficient, but it's a Chinese water-torture, one drop at a time. A full-on, inclusive consensus process -- what it would take to find genuine consensus, the degree of consensus in a broad population being its best measure -- seems like too much work. Not "wiki," or "quick." And, of course, arguing with all those "POV-pushers," usually meaning "people we disagree with," would be a waste of time.
(Wikipedians in general have little clue about genuine consensus process. It is not about "arguments." It is about communication, seeking common ground, and building from there. And it can have rules, particularly civility enforcement, but not the black-and-white, draconian Wikipedian response to some angry comment; rather, process designed to regulate behavior and encourage the "assumption of good faith," instead of whacking someone for not making the assumption -- which itself involves an assumption of bad faith. Wikipedia has long been a mess, a "difficult community" as one functionary told me ... but the difficulties arise from the structure, or rather the lack of structure and coherent leadership. Lord of the Flies is one of the ways an ad hoc community can go, if protections are not built in.)
The Foundation is responsible for overseeing the community, by default, because it enables the community, and we are responsible for what we enable; the Foundation thus represents the community in responsibility for overall community behavior. I see exempting the WMF from liability for its own actions -- and its inactions upon notification -- on a pure service-provider basis as unconscionable, risking the creation of an unregulated monster, free to malign and defame, serving uncontrolled agendas. By hosting, the WMF allows relative anonymization of users, making it more difficult to hold users accountable.
This is already done through OTRS with copyright. There is little difference in principle with libel.
The claim that filters would be necessary was bogus. (Filters are already used for some kinds of content, and the global blacklist has been abused for content control.) Rather, an article need not simply be removed outright. If a subject is notable, the WMF might require some kind of arbitration -- real arbitration, not the phony "which side is right" arbitration of the Arbitration Committee -- and might, while the article is being worked on, blank it or replace it with a notice, fully protected, with strict enforcement.