
User talk:Jimbo Wales


Fox News

Fox News on porn and Wikipedia: [1] Albacore (talk) 20:10, 6 June 2012 (UTC)[reply]

The culture of Wiki is such that a brouhaha would ensue over any attempt to actually effect the change. It would require an iron-clad policy backed up by penalties for its abrogation. Cmt - i.e., it's called "institutional wp:BIAS." For example, I cast a !vote for the deletion of "Mooseknuckle" and this, along with the eventual deletion of this particular wp:DICDEF article, was deemed completely non-controversial. But, dare I say, when I immediately thereafter thought to nominate "Camel toe" for deletion, all hell broke loose, the vast majority of !voters making an insecure child's tantrum at the prospect of being deprived of a favorite teddy bear seem mild (accusing me of every type of nefarious motive imaginable: disrupting WP to make a point, ad infinitum), with but one or two commenters limiting their arguments to any actual point at hand. But, whatever.--Hodgdon's secret garden (talk) 21:25, 6 June 2012 (UTC)[reply]
Well, one thing that we could do might not be too hard. I got this idea from the NOINDEX debate. How hard would it be to apply a similar tag, one which NOINDEXes a page for searches with SafeSearch (or similar) turned on, but leaves it accessible to a search engine with SafeSearch off? - Jorgath (talk) (contribs) 21:43, 6 June 2012 (UTC)[reply]
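A minimal sketch of how such a tag could work, assuming a hypothetical __ADULTCONTENT__ magic word (MediaWiki has no such flag) and relying on the "rating" meta tag that major search engines consult for SafeSearch:

    # Illustrative sketch only. Unlike __NOINDEX__, the hypothetical flag
    # leaves the page indexable but labels it so that SafeSearch-enabled
    # queries skip it.
    ADULT_FLAG = "__ADULTCONTENT__"

    def extra_meta_tags(wikitext):
        """Return the extra <meta> tags a page renderer would emit."""
        if ADULT_FLAG in wikitext:
            # Crawled and indexed as normal; hidden only when SafeSearch is on.
            return ['<meta name="rating" content="adult">']
        return []

    print(extra_meta_tags("Example page text __ADULTCONTENT__"))
    # ['<meta name="rating" content="adult">']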
There was no consensus to permit "tagging" of images. I suspect that Wikipedia will have to "force consensus" to meet up with all the new statutes being enacted, however. Collect (talk) 21:57, 6 June 2012 (UTC)[reply]
Yeah, this might have to be a "directive from the Foundation to meet Florida law" or somesuch. - Jorgath (talk) (contribs) 22:11, 6 June 2012 (UTC)[reply]
The problem is that there is no neutral way to mark content as "safe". It is extremely viewpoint-dependent and in the worst case reflects our personal view of the world or what we assume the audience expects. While "safe" sounds nice as a word, it actually divides the content into "safe" and the opposite of safe (try to imagine suitable words for yourself). This second part feels like, and is, a kind of discrimination, since it imposes the personal view of the "tagger" onto the audience, and that cuts in both directions. --/人 ‿‿ 人\ 署名の宣言 22:25, 6 June 2012 (UTC)[reply]
It seems fairly implausible to me that a child could "stumble" upon a "vast repository of pornographic images"... I've managed to rack up 12,000-odd edits and viewed countless articles yet have very rarely come across images that are sexual in nature or depict nudity. Of course, if a child types in "penis" or "sex" then they're going to find some frank, illustrative material. I wonder what Fox News' definition of "pornographic" is? ŞůṜīΣĻ¹98¹Speak 23:02, 6 June 2012 (UTC)[reply]
Well, the day a kid looking for free pics of, say, Skittles candy at the Commons hits the 4th image down, that will sure be one interesting student-teacher conversation. Tarc (talk) 00:44, 7 June 2012 (UTC)[reply]
There are all these references to Commons, but how many of our readers actually go to Commons? I find it unlikely that many do who haven't been sent there by some sort of news article discussing this very "controversy". Now, can you stop mentioning Commons, a separate Wikimedia site, and actually discuss Wikipedia? SilverserenC 00:52, 7 June 2012 (UTC)[reply]
I don't know, Tarc has a point about File:Skittles SoftCore.jpg. It isn't helping us build an encyclopedia. Viriditas (talk) 00:58, 7 June 2012 (UTC)[reply]
The purpose of Commons is to provide a repository of educational resources in general, not just to help Wikipedia. Dcoetzee 01:10, 7 June 2012 (UTC)[reply]
What is educational about a young lady pouring Skittles on her tattooed groin? I can see the image being used for artistic purposes, which is really entertainment, not education, in this context. So, it's an entertaining image, but educational? I don't think so. Viriditas (talk) 01:13, 7 June 2012 (UTC)[reply]
I didn't look at the image until now. I would vote to delete this particular image as out-of-scope, it's a non-notable work of art and I struggle to imagine an educational purpose. Dcoetzee 01:14, 7 June 2012 (UTC)[reply]
It isn't educational in the slightest, but the Commons regulars bloc-vote to keep such things, citing them as "different" or "unique", because up until now the universe has been deprived of seeing sugar-coated candy on a woman's vagoo. The same reason why there are 50+ different pictures of penises, because each represents a unique and varied view of the World of Cock. And Seren, kindly stick a cork in it. Don't presume to instruct others on what they can and cannot discuss. Tarc (talk) 01:17, 7 June 2012 (UTC)[reply]
I was asking you to discuss Wikipedia, as that's what Suriel was talking about in terms of searching for sex-related articles, which has nothing to do with Commons. (And, yes, that image should be deleted unless there's actually an article it's relevant to and used in.) SilverserenC 05:33, 7 June 2012 (UTC)[reply]
Indeed. The Fox article specifically states "Wikipedia" in the title, "Wikipedia" and "encyclopedia" in the first paragraph and justifies quoting Sanger due to him being "Wikipedia co-founder". ŞůṜīΣĻ¹98¹Speak 07:50, 7 June 2012 (UTC)[reply]
Commons should be very inclusive and permit a wide range of content, even that with little obvious use to the random passerby. The mere fact that people here object to this particular image, not as dull and pointless but as some sort of offense, should be a clue that it has some social significance worth thinking about further. We see here a simple but colorful snapshot reminiscent of an old tradition, in which the classic Judaic analogy of sex as a forbidden fruit has been updated to reflect our substitution of sugar, corn syrup and hydrogenated palm oil for the fruits of Eden and the imagination. Truly, it is the Skittles, not the "softcore", that poses the risk to the children. Wnt (talk) 12:08, 7 June 2012 (UTC)[reply]
Fox is specifically criticizing Wikipedia, not the Commons, which is a different project with a wholly different mission. The Commons is not a point of reference or a place where "children would carry out research" (per the article) because it's simply a collection of free media. ŞůṜīΣĻ¹98¹Speak 12:37, 7 June 2012 (UTC)[reply]
That is an inaccurate statement. The media has historically conflated all Wikimedia projects under one "Wikipedia" banner. When Fox News has reported on this in the past, they have indeed spelled out issues at the Commons. Tarc (talk) 13:05, 7 June 2012 (UTC)[reply]
Commons might have its issues, but what you mean is that you have a problem with content that you dislike. That's all. --/人 ‿‿ 人\ 署名の宣言 14:36, 7 June 2012 (UTC)[reply]
Niabot, as you are one of the main problem users and porn peddlers at the Commons, as well as one of the shameless supporters of the now-banned Beta M, your opinion of the matter is beyond irrelevant. Tarc (talk) 17:15, 7 June 2012 (UTC)[reply]
Thank you for your usual near-personal attack and out-of-place comments whenever someone holds a different opinion on a topic than you do. --/人 ‿‿ 人\ 署名の宣言 21:34, 7 June 2012 (UTC)[reply]
Tarc, this ongoing censorship debate has created the backdrop for the disputes which have in part led to the MBisanz/Fae ArbCom case. One of the questions SirFozzie asked there is "Have the involved parties in this case attempted to use affiliations of other users as a platform to dismiss or discredit their views?" Some of your fellow critics of Fae seem to be saying that it is improper for someone to bring up that another editor is discussing someone in unflattering terms on Wikipedia Review or Encyclopedia Dramatica. Would you say that same courtesy should be extended to those who simply hold a certain policy opinion on Commons? Wnt (talk) 21:45, 7 June 2012 (UTC)[reply]
No, the same courtesy should not be extended. Commons is a cesspool which the Commons regulars have proven completely unwilling (a very different thing from not able) to clean up on their own. They could not even ban a convicted pedophile on their own a few months back; it necessitated an office action. These people are not entitled to their opinions, much less courtesy regarding them, no. Tarc (talk) 00:28, 8 June 2012 (UTC)[reply]
That isn't true. Commons was presented with some vague allegations about the beliefs of a poster, together with a dubious identification by name, by someone who I've seen cry wolf before. It wasn't clear any of it was true or relevant. It took some time for the participants in the discussion to work through and get to the truth of the matter - and of course, at the very moment that we started to see something to worry about, WMF stepped in decisively, quite possibly with additional private information still unknown to the people on Commons. That doesn't mean there's anything at all wrong with Commons, just that the WMF can take action faster than a community discussion when things look serious. Wnt (talk) 01:29, 8 June 2012 (UTC)[reply]
That wasn't the case. All we knew at the time were some rumors without any real facts to back them up. It wasn't even clear that the account and the real-life person were identical. Even now not all the needed facts are present or public. That's why nobody was willing to do anything, following basic principles like the presumption of innocence. But even if we had proof that the person/account had been found guilty in the past, what is achieved by blocking the account if the person in question can just create a new account and continue under a new, unknown identity? Overall that is an even worse scenario than the previous one. And now you wonder why people couldn't decide whether to block someone? --/人 ‿‿ 人\ 署名の宣言 06:20, 8 June 2012 (UTC)[reply]
Nice history rewrite there kids. John lilburne (talk) 13:05, 8 June 2012 (UTC)[reply]
"The media has historically conflated all wikimedia projects under one "Wikipedia" banner." – That doesn't make Fox's article any more accurate. ŞůṜīΣĻ¹98¹Speak 09:57, 8 June 2012 (UTC)[reply]
Fox News here makes a vague reference to the Children's Internet Protection Act, a law in the U.S. requiring libraries receiving "E-Rate" funds to use censorware to block random internet porn unless adults or people doing research request otherwise. Note that according to that article, within four years 1/3 of libraries chose to reject this significant subsidy rather than suffer this affront to principles of freedom of the press and inquiry. It's not like libraries have a huge pool of funds, either. Note also that using such censorware means that porn from whatever source, including Wikipedia, would be blocked to these children, so what exactly would a Wikipedia block be supposed to accomplish? Doing parents' job for them, because only Fox News knows how to raise their children? Their reporter also failed to understand that the proposed image filter would not block anyone who wanted the images, of any age, from obtaining them, as Wikipedia still fails to require a photo ID from editors when they click the Create A New Account button. Imagine that... an encyclopedia that believes that your right to knowledge shouldn't flow from having your national ID card in good standing. Truly subversive, that is. Wnt (talk) 00:20, 7 June 2012 (UTC)[reply]
Folks, you are all being trolled silly. Robert Greenwald exposed this nonsense in Fox Attacks: Decency and "Fox News Porn" in 2007.[2][3] Fox "News" is in no position to criticize Wikipedia for hosting sexual content when, according to Greenwald, Fox "News" has a long and sordid history of distributing it on their own network channel 24/7.[4] Viriditas (talk) 00:34, 7 June 2012 (UTC)[reply]
Fox News has its issues, but I myself support the idea of some kind of filter on WMF projects' adult content. Cla68 (talk) 00:41, 7 June 2012 (UTC)[reply]
That's how repressive regimes begin. First you start with the sexual content that offends people, then you move on to the religious content, and finally, the political content. Funny how it's always the people screaming "freedom" and "liberty" the loudest who are trying to curtail it. The facts show that Wikipedia is an encyclopedia that covers some topics and subjects that involve sexual content. As good people who only want the best for this site, we hope such content is conveyed with a respectful and reasoned approach, in an educational manner and with an eye on informing readers and improving access to knowledge. Nothing about this statement says that we must cover all subjects, just that coverage should be relevant and informative to human knowledge. Can Fox "News" say the same? No, they cannot, and more importantly, will not, because their primary impetus is not to inform and educate but to disinform and promote ignorance. More to the point, they sexualize the content they report in a demeaning and gratuitous manner, so much so that many people would call Fox "news porn". Viriditas (talk) 00:51, 7 June 2012 (UTC)[reply]
The cinema has had age ratings for decades. It has not affected the ability of people on any end of the political spectrum to make movies one way or the other. Wikipedians are the only ones who see their ability to show the most bizarre types of porn to children as somehow inextricably linked with human freedom. And most of them don't even have children. JN466 19:23, 7 June 2012 (UTC)[reply]
Nonsense. The heavy hand of the censor is brutally apparent in American movies under this ostensibly "voluntary" system. Think of how many films didn't dare to show even a purely romantic same-sex kiss until just a few years ago, and the impact that this had on youth already facing significant persecution. There have been an appalling number of suicides by teens who just couldn't take the constant wearing down. Censorship doesn't just make for bad movies, it kills people. It is palpably and deliberately evil. Wnt (talk) 21:59, 7 June 2012 (UTC)[reply]
Agreed. Now, can someone explain why the people who are always trying to censor sexual content have no interest whatsoever in censoring violence? Why is it unacceptable to use a dirty word or show a breast, but perfectly acceptable to point a gun at someone, threaten to kill them, and then, using realistic special effects, show damage to the human body and dramatize emotional and physical trauma? In other words, why are we arguing about sexuality and pleasure, when threats to commit violence and the depiction of violent imagery have the greatest social harm? If someone can answer this glaring contradiction, I would be most grateful. Viriditas (talk) 22:08, 7 June 2012 (UTC)[reply]
Both of youse, Wnt and Viriditas, are into some heavy Godwin's-law territory. Somehow, unless Commons is allowed to show low-quality photos of people sticking toothbrushes into all sorts of places, people will kill themselves. Seriously? And teen suicides are all about the fact that there's an NC-17 rating? And then there's the whole red herring of violence... how is this exactly related? Unless you're talking about stuff like the crappy misogynist Donkey Punch video, which the same group of Commons admins fought to keep and insert into Wikipedia articles. What the hell does it have to do with the topic? Way to derail the subject. And welcome to planet Insane. VolunteerMarek 23:44, 7 June 2012 (UTC)[reply]
I'll say again - if you cite Godwin's law when no one else has mentioned Hitler, you're the first to mention Hitler (by reference), which makes you the loser of a Usenet argument. Besides, Mike Godwin is, alas, not the WMF counsel anymore. What I cited was not a comparison to Nazism, only an example of one of the many ways that censorship kills people. Not a stretch, not hyperbole, but a commonplace. Censorship killed people in the 1980s when TV stations were too "moral" to run condom ads, and even the Surgeon General was being daring to mention the word. Censorship killed people when protesters were infiltrated and disrupted from stopping the war in Vietnam or the terrorist attacks on Nicaragua. It kills people when bestgore.com is threatened with an obscenity prosecution for the crime of catching Luka Magnotta, and people stop talking about the horrors of the world.
Wnt, just shut up. If you really think that the existence of an NC-17 rating for movies "brutally kills people" or something, then your opinion really has no place in intelligent discourse. And now you're going off on some crazy tangents about Vietnam and Nicaragua, and comparing the obviously horrible things that happened there to... a lack of ads for condoms during the 1980s. Why do normal, reasonable, common-sense, constructive Wikipedia editors have to put up with this batshit crazy stuff, and why are reasonable proposals (and ones supported by both Jimbo and the WMF - just so we're clear here) held hostage to nutzoids like you? And people wonder about the dismal editor retention on this project. VolunteerMarek 02:00, 8 June 2012 (UTC)[reply]
Never mind Godwin. If you want a Godwin, I'll give you something beyond a Godwin, an answer to Viriditas about why it is important to protect the right to view violence, often even more than the right to view sexual matter. I present you with a modern day Christian martyr, a man who surely shall stand beside Perpetua in the Kingdom of Heaven. I present you with that quite possibly "obscene" site, a truly terrible video, though it was freely broadcast in Egypt: [5] And I say this: what is most remarkable about this video is not the blood, or the severing of the vertebrae, but the calm and resolute faith of the man, even as his life is so brutally ended, his willingness to refuse even to make a token white lie of recantation and acceptance to Islam. Never mind the powers of rules and knives; the power of belief, see that power conquer all, in our world and the next. Wnt (talk) 01:48, 8 June 2012 (UTC)[reply]
What the hell are you talking about? Please at least try to be coherent. It's not possible to have a conversation otherwise.VolunteerMarek 02:00, 8 June 2012 (UTC)[reply]
At this point, Wnt's words are just trolling-tinged hysterics. This is what the Commons porn crowd does; whenever anyone tries to make reasonable suggestions questioning the need to host dozens upon dozens of penis pics and dozens upon dozens of boob pics or the Skittles-on-vagina or whatever, they scream censorship and just derail the conversation. Tarc (talk) 03:07, 8 June 2012 (UTC)[reply]
Observation: Most of the porn on Commons originates from Flickr, which has a filter. Go figure. John lilburne (talk) 06:53, 8 June 2012 (UTC)[reply]
Yep. Even funnier, German accounts can't see the restricted Flickr stuff at all, because German law is incredibly strict and requires an Age Verification System for these media, which Flickr doesn't have. But if you transfer restricted Flickr media to Wikimedia, Germans can see them too. :) Just another way in which Wikimedia is helping. --JN466 04:28, 9 June 2012 (UTC)[reply]
You complain that we justify everything by crying "censorship". But interestingly, Larry seems to be quite a censor when it comes to comments on the video that tell the audience the truth about it: [6] Clearly a pure trolling and canvassing campaign, since I see quite a lot of people from WR in this discussion. ^^ --/人 ‿‿ 人\ 署名の宣言 07:05, 8 June 2012 (UTC)[reply]

Here is an example of what happens when Wikipedia censors:

File:Scunthorpe problem.png

Scunthorpe problem

--Guy Macon (talk) 02:51, 7 June 2012 (UTC)[reply]
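
For readers who don't follow the link: the Scunthorpe problem is the false positive a naive substring filter produces on innocent words. A minimal Python sketch, with an illustrative two-entry blocklist:

    # A bare substring scan flags innocent words that merely contain
    # a blocked string -- the classic Scunthorpe problem.
    BLOCKLIST = {"cunt", "sex"}

    def naive_profanity_filter(text):
        """Return True if any blocked string occurs anywhere in the text."""
        lowered = text.lower()
        return any(bad in lowered for bad in BLOCKLIST)

    print(naive_profanity_filter("Scunthorpe"))    # True: "S-cunt-horpe"
    print(naive_profanity_filter("Essex"))         # True: "Es-sex"
    print(naive_profanity_filter("encyclopedia"))  # False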

Wow! Thanks, now I learned something today AND was highly amused! - Jorgath (talk) (contribs) 04:34, 7 June 2012 (UTC)[reply]
To get all Wikipedian on you, WP:OTHERSTUFFEXISTS is not a valid argument. Also, those other Internet places which host porn don't try to portray themselves as altruistic, educational charities. Part of it is just that the hypocrisy grates. VolunteerMarek 23:47, 7 June 2012 (UTC)[reply]
    • I'm not really seeing how that addresses what we're talking about. Obviously it isn't the best place, but that it has a place at all is the problem. Tarc (talk) 18:23, 7 June 2012 (UTC)[reply]
      • The argument is threefold; first, the benefit of censoring Wikipedia is small. If censoring Wikipedia actually kept kids away from porn, I would be more inclined to overlook the downside. Second, there is a downside. Already in the discussion above I am seeing calls to get rid of the skittles image - an image that is not even close to being pornographic. Third, I don't trust the censors. I remember well the time that X-Stop -- software that was marketed to libraries as blocking only legally obscene material -- blocked www.quaker.org because the owners of the software had a religious disagreement with Quaker pacifism. --Guy Macon (talk) 00:00, 8 June 2012 (UTC)[reply]
Well, even to start with, we are not talking about censoring Wikipedia or anything here (and this is even ignoring the fact that the word "censor" is inapplicable in this context - we do "censor" vandals, trolls, POV pushers, etc. all the time, and there's no reason to think that this is any different - it's called 'editorial judgment'), just about not forcing readers to have someone's shitty homemade pron thrown in their faces. So... back up a bit and make an argument that is actually relevant to the discussion. VolunteerMarek 00:18, 8 June 2012 (UTC)[reply]


YouTube video

Larry made a little YouTube video ... Does Wikipedia have a porn problem? Dad investigates.

To see hardcore porn in Wikipedia, look no further than A Free Ride, or try the top result here, or try this, or this, or this. NSFW – and some of it could land you in jail, depending on what country you're in. You have been warned! --JN466 16:26, 7 June 2012 (UTC)[reply]

Again, doing a Multimedia search is much the same as Commons, in that I'm quite sure very few of our readers use it, if they even know what it means or how to use it. I know I've never used the multimedia tab in the search, since I'm looking for an article, not anything else. If I'm looking for an image, I'm going to use Google. And if I'm determined to find it on Wikipedia for some reason, then I'm going to go to whatever relevant article is related to the image I want and look for the image in that article. And, I'm sorry, but why would anyone be searching for Devoirs on English Wikipedia? Let alone an image search of it? And what exactly do you think should be done with A Free Ride, a famous, historic pornographic film? SilverserenC 19:38, 7 June 2012 (UTC)[reply]
Commons' search function is one of the most used pages in Commons (one of the few that isn't about sex). As for devoirs, people might be looking for it on French Wikipedia, might they not? At any rate, the video somehow finds tens of thousands of viewers: [7], [8], etc.
Practically everyone in the world accepts age restrictions on bizarre porn. Only Wikimedia pretends that this is somehow some sort of novel idea, created as a special and inequitable imposition on Wikimedia, that threatens the survival of the civilised world – as though that survival depended on people's ability to upload their sex-tourism porn made in Thailand and images of their inflated scrotums anonymously, 18 USC 2257 record-keeping requirements be damned.
Wikimedia collects its donations on the strength of free education for everyone, especially the proverbial little girl in Africa, or the little girl in Peru or Brazil. WMF's PR work cites statements like the following:

"Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content. I know I can count on Wikipedia to give well documented answers on almost ever subject imaginable. It truly has become one of the most intriguing successes of the internet!"

"We are a family that live in the interior of Brazil in a very poor state. We have opened a learning center and work with local children from nearby villages. Wikipedia is INVALUABLE for this work. The knowledge available to them on Wikipedia is a thread of contact with the 'outside world' and empowers them!"

When it comes to the question of donations, all the lofty PR talk is about helping underprivileged children. But when it comes to their beloved porn, suddenly Wikimedians say none of their projects are for children. It is pure hypocrisy! And judging by the realities of page views, the curation of Wikimedia's media collections is governed by the desire of rich, white, first-world exhibitionists and perverts to upload photos and videos of themselves wanking, of the genitals of prostitutes they exploited on their sex holiday, or to watch out-of-copyright dog-on-nun sex. It is not governed by the needs of the little poster girl in Africa, or South America.
Jking has just told me that he has put up a page on 18 USC 2257 record-keeping requirements on Meta.
This says exactly what Jimbo was saying two years ago, before he was shouted down by the mob in Commons: that uploaders, admins and editors handling any kind of sexually explicit material whose creation involved real people engaged in sexually explicit conduct may be, personally and individually, criminally liable for failure to keep records on models' age, name and consent. This is what Jimmy said two years ago in Commons:

I would say that images that would trigger 2257 record keeping requirements are the obvious starting point. I know there is some question as to whether the 2257 requirements apply to the Foundation (apparently not), but they may very well apply to the uploader. But that's not really the point. The point of 2257 in our context is that it does provide a reasonably well-understood and objective "line" beyond which we do not go.

I do not mean to imply that if an image doesn't cross the 2257 line, it's ok. A suitably tasteful image of nudity helpful in a medical/instructional context is not the issue. "Homemade pr0nz" as Alison put it, is just stupid and should go.--Jimbo Wales (talk) 10:30, 6 May 2010 (UTC)

The main reason I'm using 2257 as a starting point is not that it applies to the Wikimedia Foundation (opinions may vary about that, but it really isn't relevant to my rationale). I like it because it is a clear definition that is pretty easy to judge. It's a starting point. Using it as a starting point has the added benefit of removing *all* legal risk to the Foundation, *and* to uploaders here. But that isn't really the point. The point is that it is a clear rule, written by someone else, that we've used and discussed a lot in the past, so people can generally understand it.--Jimbo Wales (talk) 16:22, 6 May 2010 (UTC)

That is exactly the conclusion that is now on Meta, following this discussion on Philippe's talk page. The Foundation is not legally liable, but every admin and editor involved in handling sexually explicit material may be personally liable for failure to possess and maintain records documenting age and consent. JN466 20:50, 7 June 2012 (UTC)[reply]
That is, if they're Americans, and if they are subject to "theoretical" prosecutions, and if they decide to respect an unconstitutional law. Wnt (talk) 21:55, 7 June 2012 (UTC)[reply]
So non-Americans don't need to give explicit permission? I guess it's OK to exploit them then. --SB_Johnny | talk 00:21, 8 June 2012 (UTC)[reply]
According to the new Terms of Use, all users, regardless of where they are, are legally responsible for all of their contributions, edits, and re-use of Wikimedia content under the laws of the United States of America and other applicable laws (which may include the laws where they live or where they view or edit content), and may not use the services in a manner that is inconsistent with applicable law. Participation in Wikimedia projects is predicated on accepting the Terms of Use. JN466 10:03, 8 June 2012 (UTC)[reply]
TL;DR. File:Skittles SoftCore.jpg and similar media deserve to be deleted, on the consideration that such an image can easily be recreated at any time by anyone if an article in any Wikimedia sister project ever demands it. This additional rule can help us deal with such OBJECTIONABLE media more effectively. -- Sameboat - 同舟 (talk) 03:26, 8 June 2012 (UTC)[reply]
Agreed, but the contention is that Commons regulars tend to block such deletions and default to keep rather than delete. The burden needs to be on those who argue that we need to keep such an image, not those who argue for deletion. Viriditas (talk) 03:49, 8 June 2012 (UTC)[reply]
Is it me, or does it seem like Larry Sanger in his video is talking down to us, in a sort of patronizing way, like a parent would talk to a child? But since I'm not in with the crowd that refuses to accept how much explicit content damages our prime objective, I can say I like it. Score one for Larry. Though we are not censored, and therefore we should keep images depicting innocent nudity, there's no need for most of the usual exhibitionism we see on Commons, which is entirely gratuitous and therefore unnecessary. -Stevertigo (t | c) 07:50, 8 June 2012 (UTC)[reply]
The problem is not that we have such images. The problem is that the search somehow delivers them in the first spot (it is a pure description-text search), even though the examples shown are constructed: you search for "bad images", read their descriptions, grab the keywords, and put them into the search until some word gives the expected result, showing a "bad image" at or near the top of the results. It is no coincidence that he used exactly the same words as Jayen466 did half a year ago, repeating them again and again in every discussion, to make a point. I proposed an improvement for the search engine at Commons, but so far no one has picked up the idea.
It is also very interesting that Larry would ask his kid to search for fisting if he doesn't know what it is. ^^ --/人 ‿‿ 人\ 署名の宣言 08:01, 8 June 2012 (UTC)[reply]
Niabot wrote: The problem is not that we have such images. The problem is that the search delivers them somehow at the first spot. No, you are incorrect. There may be some way for a filter to determine which usage of "cucumber" is porn-free, even with "cucumber" being in the description of the file, but that's beside the point. An intelligent discussion would have to deal with the image itself, and we thus ask ourselves whether the image is actually helping advance our prime objective. Does having an image of a woman with a cucumber in her twat advance our primary mission? -Stevertigo (t | c) 08:13, 8 June 2012 (UTC)[reply]
Yes, it does. Even sex toys, or the use of vegetables as sex toys, are part of our culture, should be explained, and are a subject of common knowledge. Even images that were thought to be replaceable at any time when they were created are today a subject of research. Additionally, you should remember that Commons is a repository for freely licensed images that can be used by anybody for various purposes, not just for Wikipedia itself. -- /人 ‿‿ 人\ 署名の宣言 08:45, 8 June 2012 (UTC)[reply]
So according to that logic, should we also have a picture of a woman with a cucumber in her anus (肛門)? Would this advance the so-called "research" you referred to? The anus is a part of our body, so is sticking things into it automatically to be regarded as a part of our culture also? When a website shows a picture of some kind of anal insertion, is it always for "research" purposes, or is it sometimes done for prurient reasons?
Additionally, you should remember that Commons is attached at the hip to Wikipedia, and it's Wikipedia's prime objective of providing a free global resource that trumps Commons' idea of being a catch-all. I agree that not all Commons material needs to be encyclopedic, but things sort of got out of hand when we allowed exhibitionists and pornographers free rein. -Stevertigo (t | c) 09:52, 8 June 2012 (UTC)[reply]
I'm aware of an incident in which a contributor to a Wikimedia project used a photo of a penis (on-wiki) and was contacted (again, on-wiki) by the alleged creator of said photo. The message was along the lines of "I just wanted you to know that is my penis in that picture". ŞůṜīΣĻ¹98¹Speak 10:08, 8 June 2012 (UTC)[reply]
You will find that the Commons porn patrol will argue that Commons should have multiple such images, and I kid you not: 1) because each one will show a different angle of insertion, 2) will have different lighting, 3) will have a different girth, 4) will have a different length of penetration, 5) may be better to fap to. John lilburne (talk) 10:25, 8 June 2012 (UTC)[reply]
I gave up on arguing with people on Commons years ago (and more recently with the, er, "gifted" editors on Wikinews). At some point, the Foundation will need to understand that several of its sister projects have gone rogue and are no longer representing its best interests. Viriditas (talk) 11:07, 8 June 2012 (UTC)[reply]
Wikipedia is not just for "rich, white, first-world exhibitionists and perverts", who do not deserve to be denigrated for their ethnic or sexual identity; it is also not just for the girl in Brazil. It should be all things to all people, and it can be, provided we focus our efforts toward building resources, rather than allowing a destructive phenomenon of censorship that will result in people tearing down each other's work in the lust for political power, a looter's agenda to dominate and seize this common resource so many people together have created, to break this wild and free creation to saddle and bit as the private asset of those with one particular agenda. Wnt (talk) 11:48, 8 June 2012 (UTC)[reply]
Nice try, Wnt. I restored a bunch of penis and vulva images to articles after they had been hidden by some random newbie months ago. Yet now I'm a politician who supports the agenda of censorship, because I said no to unused media that can so easily be re-created by exhibitionists. Media without historic significance or immediate usefulness is not a resource in my book. Thank you. -- Sameboat - 同舟 (talk) 12:06, 8 June 2012 (UTC)[reply]
Eliminating some particular low-quality image is not the same as censorship, whatever Fox News, Sanger, and some of their friends here would have it be. Even so, we must be careful that those decisions are being made for the right reasons, and not in deference to their wishes.
Wnt, I don't support any form of censorship, but arguing with a Commons admin or regular user over there is a waste of time because they live in their own fantasy world and cannot be reasoned with in the first place. Most of the arguments made by the badsites crowd here have a kernel of truth. While it is fun to talk about, in reality people just can't do what they want in some kind of libertarian anarchist paradise. That place only exists in the minds of teenagers and in the streets of Somalia. Viriditas (talk) 12:32, 8 June 2012 (UTC)[reply]
Why do you say they "just can't do what they want"? Commons has been doing what it wants, and that has generated a truly remarkable resource, a massive treasure trove of content. Sure, the gold and diamonds are stowed away with the occasional old leather boot and cracked jar of pickles, but I don't trust those guys with the sacks at the gate looking to haul away the "trash" for us.
Remember that Commons emerged right here in America, as a celebration of our values. Don't forget that anarchy started here - it's what our revolution was called, what was copied in 1789 and 1848 and only much later interpreted, in less effective forms, by the offspring of the First International. We're the country of Henry David Thoreau and Lysander Spooner; the very first political May Day was our Haymarket Square protest. America is the place where the workers stood up for their 40-hour week and their right to unionize and got the good wages and the time off to do cool projects on the Internet. So don't tell me that this successful, ongoing, triumph of our values is something that can only exist in "Somalia". Wnt (talk) 13:12, 8 June 2012 (UTC)[reply]
Anarchy did not start "here." In fact, the American Revolution was about the "rule of law" and not remotely based on "anarchy." When making such over-reaching claims, be careful of what the claim is - when it is too far afield, others will notice. Collect (talk) 13:19, 8 June 2012 (UTC)[reply]
"Shall we plunge at once into anarchy, and reject all accommodation with a Government (by the confession of the wisest men in Europe, the freest and the noblest Government on the records of history,) because there are imperfections in it, as there are in all things, and in all men?" -- Letter from a Virginia to the Members of Congress at Philadelphia, 1774 (unsigned - but by Wnt apparently)
Um -- a polemic from a person vehemently opposed to the American Revolution is not a really strong source for what the supporters of the American Revolution sought. You left out some of the "good stuff", such as Can we seriously hope that a great Nation, a proud Nation, will be insulted and degraded with impunity by her Colonies, that the Americans are virtually represented in the British government and cannot assert taxation without representation <g>, and that the American Revolution is a war which must begin where wars commonly end, in the ruin of our trade, in the surrender of our ports and capitals, in the misery of thousands. Then try asserting that the supporters of the American Revolution were founded in "anarchy" <g>. Sorry -- your post does not show any such thing at all. Collect (talk) 14:24, 8 June 2012 (UTC)[reply]
If inexplicable comparisons between the Commons and the Declaration of Independence have been made, then Godwin's law will doubtless be invoked soon. But I think that perfectly illustrates the major problem with governance of the Commons - there is an intensely vocal group who believe that any suggestion to remove questionable material constitutes a vile persecution of their civil liberties and a descent into fascism. ŞůṜīΣĻ¹98¹Speak 13:54, 8 June 2012 (UTC)[reply]
You people say Commons can't possibly be the way it is. You suggest it is un-American. I show it is the way it is because America, surprising as this may be, actually does have some traditions of supporting freedom. Then you complain that I disagree with you. Wnt (talk) 14:05, 8 June 2012 (UTC)[reply]
Here's the thing, though: contrary to popular opinion, "freedom" can't be forced on people. True freedom isn't unidirectional, it's being able to freely choose to have access to something and to be free from it. It's really about personal responsibility, and whenever we feel less "free", it's because we've either given up our own personal responsibility, or it has been taken from us by force. You have a number of editors in this thread alone saying they want to be free from having to view the images. And, they want their children to be free from having to view them as well. Viriditas (talk) 14:18, 8 June 2012 (UTC)[reply]
They are free to install a "net-nanny". They are free to set up a free project to make a free net-nanny, or indeed (alas) can expect such a 'service' free from many libraries and schools. They are free to copy Wikipedia and take out the stuff they don't like. The only thing I don't want them to be free to do is to tell other contributors that they're not free to upload and use content of interest to them, solely on account of its type of content. Wnt (talk) 14:25, 8 June 2012 (UTC)[reply]

Wnt says: They are free to install a "net-nanny". They are free to set up a free project to make a free net-nanny, or indeed (alas) can expect such a 'service' free from many libraries and schools. They are free to copy Wikipedia and take out the stuff they don't like. - It would be hard to come up with worse answers for the porn problem than these. Installing net-nanny and "copy[ing] Wikipedia" are not reasonable solutions. The reasonable solution would be to remove any material that's exhibitionist and prurient, and get back to the job of writing an encyclopedia that's available to everyone. -Stevertigo (t | c) 20:48, 8 June 2012 (UTC)[reply]

  • Good luck getting Skittles Softcore deleted at Commons. Administrators close debates, and several of them are porn hobbyists; anybody who has spent any time at all over there knows who they are... If you've got another, better Skittles Softcore image, or if this one is slightly out of focus, you might have a chance of a Delete result. Outside of that, their Commons Mission trumps the encyclopedia's mission and common sense. Carrite (talk) 18:00, 9 June 2012 (UTC)[reply]
We differ on opinions of "reasonable", where the serious informational resources of Wikipedia and Commons are concerned, but in all this we have gotten away from Sanger's main focus on Simple Wikipedia, which is perhaps more problematic. I don't pay attention to it because I never got the point of it, but it does have an identity crisis. The emphasis on children on its main page is indeed potentially misleading, when the "Multimedia" search puts you straight through to Wikimedia Commons, and when, more importantly, like any WMF project, it's free for anyone to edit. That's no safe haven for kids in any case. It's my understanding that a true "safe for children" site requires intensive community monitoring, not just a few admins you can call if there's trouble, but literally somebody watching every post in every chatroom and somebody watching them also. Wnt (talk) 18:08, 9 June 2012 (UTC)[reply]

Wikimedia's fundraising "stories"

The stories here – http://wikimediafoundation.org/wiki/Stories2/en – are linked from pages like http://wikimediafoundation.org/wiki/Ways_to_Give/en and many emphasise how useful Wikipedia can be to underprivileged children. These are just some examples:

  • We are a family that live in the interior of Brazil in a very poor state. We have opened a learning center and work with local children from nearby villages. Wikipedia is INVALUABLE for this work. The knowledge available to them on Wikipedia is a thread of contact with the 'outside world' and empowers them!
  • Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content. I know I can count on Wikipedia to give well documented answers on almost ever subject imaginable. It truly has become one of the most intriguing successes of the internet!
  • What I'm trying to say is that Wikipedia has provided children with a supplemented education outside of the formal school system with the freedom to learn what they find interesting. Wikipedia is helping to create a better future across the globe.
  • Wikipedia has been a wonderful recourse for my children and me to learn new terms, knowledge, and culture background as an immigrant family. It is a safe and trustworthy website for children to do their research. I especially moved by the spirits of all the volunteers around the world to make this happen.
  • Thanks to websites like 'Wikipedia', children of all ages can continue their endeavor in learning. Kudos Wikipedia on creating a human interface that allows us all to teach and be taught! The future is NOW!
  • For my children the ability to get information in Serbian is also an extremely useful tool.
  • Sometimes at bedtime, for a few years now, we pick a subject of interest to our young children and look the subject up on Wikipedia. It almost never fails to interest and leads us to other subjects. We think it leads to a greater curiosity about the world and a way to find answers.
  • I am happy that my children have the likes of Wikipedia and that my granddaughter will be able to use its ever-growing base of knowledge... a true living book to take one one on an ever-expanding universe of knowledge.
  • It's amazing wealth of knowledge benefits me as an adult, and I can't even begin to imagine the impact it will have on children who, like myself, are enamored with learning about the world around them. You can't put a price on making that learning possible.
  • Wikipedia is like an enormous virtual research library that can be employed in an instant. It's a resource that didn't exist in my childhood, and I am delighted with how it will enrich my own children's lives.
  • I'm an English teacher. My job is to spread knowledge and language and to give people the skills to communicate globally. Wikipedia is invaluable to me and my students. Having the combined knowledge of millions at our fingertips lends a whole new dimension to my classes. It allows my students and I to explore any topic at the drop of a hat and to learn and grow outside the limits of musty old textbooks. Without Wikipedia, I am certain that I would not be the teacher I am today.
  • Our grandchildren use Wikipedia for general info on various topics, both here and at their home computer. Thank you!

Yet when the topic of a controversial content filter is raised, people opposed to any kind of filter on Wikipedia will typically refer to the

  • "ridiculous proposition that Wikipedia claims to be child-friendly" ... "Wikipedia is NOT an all-ages-appropriate encyclopedia." ... "the people who should be responsible parents should block Wikimedia on there home network" ... "Were I a parent, I would place stiff restrictions on my kids' use of the internet, including Wikipedia" ..."it's the parents responsibility to monitor what the kids are looking at." ... "Parents must parent!"

If you reply,

  • "Well, what about children whose parents have told them that Wikipedia is a good site to surf on?"

then the same people will ask you,

  • "What moron would tell his or her child that 'Wikipedia is a reputable educational site good for [you] to use"?'

All the above quotes are taken from actual image filter discussions. Obviously, the morons are Wikimedia donors who have fallen for the fundraising spiel!

Larry is right. He says that of the following three, Wikipedia can only choose two:

  1. call yourself kid-friendly;
  2. host lots of porn;
  3. be filter-free.

In other words,

  1. If you call yourself kid-friendly and host lots of porn, you cannot be filter-free.
  2. If you host lots of porn and are filter-free, you cannot call yourself kid-friendly.
  3. If you call yourself kid-friendly and are filter-free, you cannot host lots of porn.

Which is it to be? --JN466 18:47, 8 June 2012 (UTC)[reply]

I disagree with the premise. Kids can cope much better with random porn than many grown-ups realize. Moreover, the problem is very much overblown. I've been on Wikipedia for nearly 9 years. The only time I've seen anything remotely pornographic was when following quite particular instructions from the recurring porn crisis discussions. --Stephan Schulz (talk) 19:09, 8 June 2012 (UTC)[reply]
Let's just re-emphasize the major premise that your argument is based on: "Kids can cope much better with random porn than many grown-ups realize." You want to go out there and make that argument to, you know, people who actually have kids, with a straight face? Part of the insanity of this discussion is precisely attitudes/beliefs/claims/weirdness like that. VolunteerMarek 01:02, 9 June 2012 (UTC)[reply]
Jayen is right. Porn on Commons is a problem, and it does damage to our credibility and integrity. It doesn't matter that more dedicated porn sites exist; the fact that we host any of it goes against our prime objective of providing an educational resource for the world's young people. Plus, Stephan, your anecdotes about not seeing anything pornographic in your Wiki travels are not a real argument. Larry's video made it clear that we have all too many images for "fisting" and that even searching for "cucumber" can locate pornographic imagery. Regards, -Stevertigo (t | c) 20:59, 8 June 2012 (UTC)[reply]
Stephan, just make sure Wikimedia put that in the fundraiser then, as one of the "stories", would you? "Kids can cope much better with random porn than many grown-ups realize." "Thousands of them watch our porn, even the bestiality porn, and they're none the worse for it, honestly." And drop the hypocritical testimonials about how Wikimedia is "Free from bias, banter, commercial interests and risky content", and suitable for people's grandchildren. Okay? If porn is so great for kids, put your money where your mouth is, and do your fundraising with that message. --JN466 00:18, 9 June 2012 (UTC)[reply]
I wasn't aware that stats.grok.se now has a children detector. Any source for that? --Stephan Schulz (talk) 21:02, 9 June 2012 (UTC)[reply]
Yeah, according to Wikipedia administrator Stephan Schulz, thousands of kids watch Wikipedia porn and they are none the worse for it; advertise the foundation's educational/charity status with that. Youreallycan 00:57, 9 June 2012 (UTC)[reply]
Please refrain from making grossly wrong statements about me. Thanks. --Stephan Schulz (talk) 20:59, 9 June 2012 (UTC)[reply]
Where did he say "thousands", eh? Stephan's right. In all the time I've been on Wikipedia, I've only found porn when I've gone looking for it. What's truly amusing about this tempest in a teapot is that when you get right down to it, Wikipedia is the internet equivalent of the Sears Catalog. There's nothing here that any kid searching for porn didn't already find 100 better images/movies of via a simple Google search. That's not to say there is no reason to consider our little corner of the internet, but seriously, the mother-hen screaming about it will never create the kind of discussion you guys are hoping for. Mostly you'll just annoy people. Resolute 01:13, 9 June 2012 (UTC)[reply]
The occasional explicit image is a useful shot across the bow. This is an encyclopedia anyone can edit. We can't promise parents that there aren't pedophiles looking for chances to recruit their kids off the site. We can't tell parents it's safe for their kids to post material here, when any pictures could be picked apart for EXIF data, sometimes even GPS from cell phones, when anything embarrassing can and probably will be spread and posted all over the internet. If there's something wrong here, it's any impression of safety given in the green quotes, not the reality of the red ones. Wnt (talk) 01:17, 9 June 2012 (UTC)[reply]
It's not simply that anyone can edit; it's that Commons admins can edit, adding categories, names, and descriptions to things, making those images all the more likely to turn up in random searches. Additionally, each new image that is uploaded with some fetish or slang term adds another image to the search results; as time progresses, porn becomes more likely in the results, not less. John lilburne (talk) 07:05, 9 June 2012 (UTC)[reply]
You don't need to be a Commons admin to add, or indeed remove, categories on Commons. For example, I took an inappropriate image out of a Commons category where the word used for the category also had an American English meaning that applied to the image I recategorised. So I wouldn't be so sure that the tendency is for porn to become more visible; after all, no-one is arguing for each category to contain a random and unrelated porn image. Maybe part of the solution is for those who care about such things to make sure that porn images are correctly categorised and described. ϢereSpielChequers 10:27, 10 June 2012 (UTC)[reply]
Here is an example: File:Naked_in_train.jpg. Look at all the categories added there, for example by a Commons admin. That edit made sure that the streaker now streaks through every one of the category pages added in it.
That's the kind of edit the public might expect a troll to make, but not a Wikimedia admin. It's a joke. JN466 13:18, 10 June 2012 (UTC)[reply]
So you're saying we should leave images out of categories they belong to, because they might look bad in a search? And sanction people who call a British Rail Class 450 a British Rail Class 450? I would suggest we fix the search, or implement personal display options which do not require Wikipedia to formally agree on which images are good or bad. But even the proposed image-filter idea, with its endless battleground of rating images, would be better than asking people to break the category system itself and rate admins based on how they rate images. Wnt (talk) 12:21, 13 June 2012 (UTC)[reply]
Depending on the age and maturity of a child, someone more mature should generally be overseeing what they read and look at; this includes the factually incorrect science text, the poorly written literature text, the unwholesome general media, and the internet, which contains the preceding perils and more. This does not mean that children cannot use Wikipedia to their profit (as the green quotes state), just that they should be parented or guided (as the red quotes state). In short, the two types of quotes are not contradictory; they just address different things. There are "kid friendly" encyclopedias, but Wikipedia is a general-audience one. As to the general issue, it would seem to make more sense to have a "Wikipedia for kids" Project, and of course eliminate all content that does not suit the purpose of this general-audience Pedia, but no one should be mistaken (and everyone responsible for the education of a child has a duty to know) about how this general-audience project is put together and edited, with all the good and bad that entails. Alanscottwalker (talk) 12:56, 9 June 2012 (UTC)[reply]
You are being ridiculously literal in interpreting those words. No human being just learning about Wikipedia is going to read those green sentences and think "that only means Wikipedia is suitable for children if they have adult supervision". That's just not the tone and emphasis of the quotes, even if their literal wording allows that to be true.
(And even then, I'd note that some of the quotes can't even be literally read that way. One of them says that Wikipedia is free from risky content. Is anyone going to read that as "Wikipedia has images that are not kid friendly"?) Ken Arromdee (talk) 16:21, 10 June 2012 (UTC)[reply]
Since your comments contradict themselves it is difficult to see the point of them. At any rate, readers are assumed to exercise ordinary judgment. We trust them too. Alanscottwalker (talk) 01:33, 11 June 2012 (UTC)[reply]
You didn't bother to explain how you think my comments contradict themselves, and there's no contradiction. There is a difference between the way that something is literally worded, and how it is written. The message that all those green quotes are trying to communicate is that Wikipedia does not contain material that is unsuitable for children. Just because the literal wording of many of those quotes can be read differently doesn't change that; sentences communicate more than just their literal words. Ken Arromdee (talk) 23:59, 11 June 2012 (UTC)[reply]
The first paragraph of your penultimate comment said I was reading literally, and the second suggested I was not reading literally enough. As the green comments are said to be the experience of those users, there is no reason given not to take them at their word that those are their experiences. They are not claiming to be other people's experiences. Alanscottwalker (talk) 11:09, 12 June 2012 (UTC)[reply]
You are reading most of the remarks against their intent but in accordance with their literal words; you are reading the remaining ones against both their intent and their literal words.
The intent of those statements is to say that Wikipedia is suitable for children. This contradicts the claim in the porn discussion that Wikipedia is not for children. Saying "well, it is suitable for children, as long as they're supervised to keep them away from the porn" contradicts what those quotes actually say, even if it doesn't always contradict their literal meaning. Ken Arromdee (talk) 16:48, 12 June 2012 (UTC)[reply]
No. The intent of the green comments is to relate those users' experience on the pedia. And no, there are a lot of things children should be protected from. Alanscottwalker (talk) 21:40, 12 June 2012 (UTC)[reply]
Well, then why not add a few alternative testimonials on the topic? I would be happy to supply some. Let's make the "stories" page NPOV, rather than a PR effort, in line with fundamental Foundation policy. JN466 17:33, 13 June 2012 (UTC)[reply]
What is the basis for the claim that NPOV governs all Wikimedia Foundation projects? If I recall correctly some Wikimedia projects have rejected NPOV as incompatible with their mission. There do appear to be some universal precepts listed in Resolution:Controversial content, including "not censored" "educational" and "free" but NPOV is not among them. It would seem strange that the Foundation's own website, from which those green quotes are taken, does not have any POV. The Foundation website does not appear to be governed by NPOV. However, if you wish to forward stories to the Foundation, please do; it will be up to them how they use them when you license those stories to them. Alanscottwalker (talk) 11:29, 14 June 2012 (UTC)[reply]
No general-purpose encyclopedia features NSFW illustrations like these or these or these, and only a denizen of Wikiland would state that the site was correctly advertised to the public as "a safe and trustworthy website for children to do their research". JN466 18:32, 10 June 2012 (UTC)[reply]
So, the person who said that is a "denizen of Wikiland," whatever that means -- what else do you know about him or her, or of their experiences with the pedia and protecting their children? Do you have anything else you want to tell them about themselves? Alanscottwalker (talk) 01:43, 11 June 2012 (UTC)[reply]
The fallacy is that you act as if Wikipedia can be a "safe" place for children. We know better. It's no longer a matter of speculation that when anybody can edit an encyclopedia, that includes pedophiles and even the occasional serial killer - it's not even an argument anymore. We're as public as a city street. If a parent wants to protect a child from "tit torture", or something more important like being drawn off-site to a pedophile's special project, it's going to take additional effort. The same has long been true of the Internet as a whole, but that hasn't stopped kids from getting a lot of good, and relatively safe, use out of it. Wnt (talk) 06:02, 15 June 2012 (UTC)[reply]

Why not an image filter?

I've just watched Larry Sanger's video, and I think it does raise a serious issue, no matter how much some people would prefer to stick their heads in the sand and pretend it doesn't exist. But it seems to me there's a fairly simple solution: why doesn't Wikipedia have an optional 'safe search' function, which would filter out images likely to be inappropriate for kids? Plenty of websites have a 'child-safe' mode or something similar. (e.g. Google, YouTube, Flickr, Deviantart.) I can't imagine it would be that hard to implement. Yes, there would inevitably be disputes over whether certain images are 'adult content', but it should be possible to reach a broad consensus on certain types of images, such as anything pornographic. And no, no filter would ever be perfect, but even an imperfect one would be better than what we have at the moment.

I seem to remember similar proposals being raised during the controversy over images of Muhammad. That problem could be addressed with a filter as well, by allowing users to choose not to see images which could be religiously offensive. It seems like such an obvious solution that I don't know why we haven't done it yet. Isn't it about time we recognised that not all of our content is going to be considered appropriate by every user, and gave our users the ability to choose not to see things they find inappropriate? Robofish (talk) 14:16, 8 June 2012 (UTC)[reply]

Consensus is one thing, but who gets to tag the images for the safe search filter? What if some people start tagging images that show too much thigh, or people wearing shirts that don't cover their belly button? SilverserenC 14:21, 8 June 2012 (UTC)[reply]
In the early days of the discussion about image filters I suggested a method of filtering that anti-censorship people would accept, which doesn't compromise WMF's legal position. You merely enable each user to keep a private blacklist of images he doesn't want to have served, and to transclude other users' blacklists into his own. That way the source files are untouched, and there are no edit wars about it, and no calls for banning or blocking users who set the rating to the "wrong" value. Wnt (talk) 14:28, 8 June 2012 (UTC)[reply]
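A minimal sketch of how such transcluded blacklists might resolve, assuming a hypothetical in-memory store of user subpages (on a real wiki these would be user subpages fetched through the MediaWiki API; all page and file names below are invented):

```javascript
// Hypothetical store of blacklist pages; each entry is either a file name
// to suppress or a "{{...}}" reference transcluding another user's list.
const blacklistPages = {
  "User:Alice/Blacklist": ["File:Example1.jpg", "{{User:Bob/Blacklist}}"],
  "User:Bob/Blacklist": ["File:Example2.jpg"],
};

// Resolve a blacklist into a flat set of file names, following
// transclusions and guarding against circular references.
function resolveBlacklist(pageName, seen = new Set()) {
  const files = new Set();
  if (seen.has(pageName)) return files; // already expanded; avoid loops
  seen.add(pageName);
  for (const entry of blacklistPages[pageName] || []) {
    const m = entry.match(/^\{\{(.+)\}\}$/); // a transcluded list
    if (m) {
      for (const f of resolveBlacklist(m[1], seen)) files.add(f);
    } else {
      files.add(entry);
    }
  }
  return files;
}

// Usage: decide per image whether to serve it to this user.
const hidden = resolveBlacklist("User:Alice/Blacklist");
console.log(hidden.has("File:Example2.jpg")); // true, via Bob's list
```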
Well, that's just what I proposed. Personal filter lists, specific to each account like a watchlist, with crowdsourced standard lists for different requirements. All written up and discussed here: [9]. WereSpielChequers made another user-driven filtering proposal. It was on the table, Wnt. Nobody from the Foundation ever even bothered looking at the proposals. JN466 17:58, 8 June 2012 (UTC)[reply]
The easiest solution is for people to select their own filter, and just provide the facilities to enable filters to do their work. If pastafarians want to exclude all images of their spaghetti monster, then one or more of them would set up a filter using, say, categories, actual file names and patterns (see the sketch below). Hopefully most of the work could be done on the user side using some standard JavaScript. New users could search amongst the filters and find one they liked; often-used ones could have protection applied, just like templates are often protected. Dmcq (talk) 15:11, 8 June 2012 (UTC)[reply]
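As a rough illustration of this idea, a filter could be a small user-maintained spec of categories, exact file names and name patterns; the filter object and all names below are hypothetical:

```javascript
// A hypothetical user-maintained filter spec: categories to exclude,
// exact file names, and regular-expression patterns over file names.
const spaghettiMonsterFilter = {
  categories: ["Category:Flying Spaghetti Monster"],
  fileNames: ["File:FSM logo.svg"],
  patterns: [/spaghetti[_ ]monster/i],
};

// True if a file (given its category list) should be suppressed.
function matchesFilter(fileName, fileCategories, filter) {
  return (
    filter.fileNames.includes(fileName) ||
    fileCategories.some((c) => filter.categories.includes(c)) ||
    filter.patterns.some((re) => re.test(fileName))
  );
}

console.log(
  matchesFilter(
    "File:Touched by His Noodly Appendage.jpg",
    ["Category:Flying Spaghetti Monster"],
    spaghettiMonsterFilter
  )
); // true, via the category match
```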
Silverseren - it would be subject to consensus like anything else. If there was no consensus on whether an image should be tagged as inappropriate, it would default to not being tagged. But there'd be nothing to stop us having different levels of filtering - one filter for people who only want to filter sex images, a broader one that might cover skimpy clothing, and so on. There could also be filters for images of violence and gore, animal cruelty, religiously offensive images and so on.
Wnt's suggestion is an alternative way of doing this, although it would put more work on the users by requiring each user to create their own personal blacklist. That seems to me to defeat the purpose of site-wide filters somewhat. But it would still be a step in the right direction. We should be doing something about this, and optional filters (that would of course be 'off' by default) really have nothing to do with 'censorship'. Robofish (talk) 15:07, 8 June 2012 (UTC)[reply]
I expect that in practice there would only be a few users who get a reputation for watching their blacklists, and most of those using blacklists would have a page like "User:So-and-so/Blacklist" containing only a short text like "{{User:Defender Of The Faith/Blacklist/PG-13 subset}}". Wnt (talk) 15:12, 8 June 2012 (UTC)[reply]
Yep, that gets my vote. A couple of things would help a bit, like easy access to the categories of files on Commons; currently you need to go indirectly via the image page on Wikipedia. But essentially that is what I envisage too, except with better filters and the ability to find and specify popular ones easily. Dmcq (talk) 15:38, 8 June 2012 (UTC)[reply]
Okay, I've just described this idea in more detail at User:Wnt/Personal image blocking. Wnt (talk) 16:01, 8 June 2012 (UTC)[reply]
Wnt, could I suggest you append your proposals to the many other ignored proposals on http://meta.wikimedia.org/wiki/Controversial_content/Brainstorming ? That's where they are all collected. Cheers, --JN466 18:02, 8 June 2012 (UTC)[reply]
It was proposed at [10], without further outcome. Wnt (talk) 09:08, 9 June 2012 (UTC) D'oh! No, that was a different idea. I shouldn't edit WP when I get woken up in the middle of the night. Ah, here it was. Wnt (talk) 17:16, 9 June 2012 (UTC)[reply]
A personal blacklist would only work for logged-in users. Most users of Wikipedia do not log in. And I suspect that especially those offended by images of M&Ms will overwhelmingly not log in. Neither will those looking out "for the CHILDRUN" (who probably get their porn directly from Google or Hogtied, anyways (ambiguity noted and left in intentionally as most likely true either way)). --Stephan Schulz (talk) 18:58, 8 June 2012 (UTC)[reply]
Creating an account is free and as I understand it will remain so. If people want an image filter is it so unreasonable to require them to register a free account in order to opt in to such a free service? If there is significant demand out there for such a filter that could lead to a large increase in registered users who don't edit, but that wouldn't be a problem would it? ϢereSpielChequers 05:33, 11 June 2012 (UTC)[reply]
Robofish, the Foundation has urged us to implement an image filter but it was rejected during a community discussion that I wasn't a party to - on Commons I think. JN466 can probably point us to the relevant discussion. Personally, I think the community has had long enough on this, and the Foundation should simply do the right thing per Wikipedia:You don't own Wikipedia. --Anthonyhcole (talk) 15:41, 8 June 2012 (UTC)[reply]
Jayen, I'll follow those links and get back to you. --Anthonyhcole (talk) 18:08, 8 June 2012 (UTC)[reply]
I remember the second discussion. I came in late with, I think, the best suggestion on the table: that we ask for Google's help with the filter design. They're well-disposed toward us; and they would have already thought through a lot of this. Didn't I just see that Britannica and Bing are in some kind of alliance?
Did anyone from the Foundation even look at that discussion? --Anthonyhcole (talk) 18:51, 8 June 2012 (UTC)[reply]
Perhaps they did. Who knows? When asked about it, none of them replied. And none ever to my knowledge made any other specific comment about the proposals on that page and their feasibility or suitability either. JN466 00:51, 9 June 2012 (UTC)[reply]
I find it interesting to note how "doing the right thing" always seems to equate to "doing what I want". Unbelievable arrogance, that. Resolute 01:18, 9 June 2012 (UTC)[reply]
It's the right thing. It's a question of access. Readers should be able to click a drop-down menu to select different degrees of image filtration on our site, the way we do on Google or Bing. Universal access is being sacrificed to make some puerile point about freedom to offend. Great. --Anthonyhcole (talk) 06:57, 9 June 2012 (UTC)[reply]
I imagine we've all been guilty of that at some point or another. But it does seem especially prevalent in these censorship debates—the attitude, that is, that those who oppose censorship are out to do harm. Perhaps a better description is that they think the other side shares their viewpoint (that this content is unacceptable and harmful) and wishes to retain it out of active malice, rather than the far more likely scenario that the other side disagrees that this IS harmful, and wishes to retain it out of a good-faith belief that it could be useful to one of our projects. That attitude is unfortunate and always unproductive. Given Commons' broad scope, it is not difficult to see why some would hold the position that if there's any chance of someone finding a use for a file (a Wikipedia article on the particular practice being depicted, a Wikibook on human sexuality or a facet of it, a Wikiversity course on sexuality in different cultures...), the file should be retained.
As to big red letters, if we really want to put something in big red letters, we could always shout "WARNING: INTERNET MAY CONTAIN DAMN NEAR ANYTHING, AND BELIEVE ME I MEAN ANYTHING. SUPERVISION OF CHILDREN STRONGLY ADVISED DURING USE." What we should not do is attempt to reduce the Internet, or Wikimedia projects, to a "kid-friendly" level, in order to relieve parents of this responsibility of...well, parenting. And for all the assertions above that those who do not believe in Internet censorship do not have children, well, I do. They use the Internet, I've taught them how to recognize potentially dangerous situations, and I check on what they're doing periodically. How hard is that? I don't need everyone else on Wikimedia projects, or the Internet, to parent my children for me, and I don't want them to. What is and is not appropriate for my children is my call, not theirs.
All that being said, I can't imagine there would be a ton of objection to a 100% opt-in image filter, provided each user, not an editor, makes the judgment as to what is offensive and should be blocked. That's been the main sticking point in previous proposals—putting an "Offensive" stamp on something is inherently POV, namely a POV that sexuality, or Muhammad, or what have you, is "offensive", and that those offended by it are right to be. But if a particular user wants to set such a filter on their own account, I don't see why we can't let them (could even be done via cookies or other persistence tools for anonymous users). The only sticking point I would have is, to prevent use as a censorship tool, blocked media should not be blocked completely, but rather readily accessible at a single click. This should be a "surprise prevention" tool, never one to keep people from seeing something they want to see, and never one usable for that purpose. Seraphimblade Talk to me 06:41, 9 June 2012 (UTC)[reply]
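A minimal browser-side sketch of the "surprise prevention" behaviour described above: filtered images collapse behind a one-click placeholder and are never made unreachable, with the opt-in flag kept in a cookie so it could work for logged-out readers. The class name and cookie name are illustrative, not an existing gadget:

```javascript
// Opt-in check: assume a (hypothetical) cookie set from a preferences UI.
function filterOptedIn() {
  return document.cookie.split("; ").includes("imageFilter=on");
}

// Swap an image for a placeholder button; one click restores it in place.
function collapseImage(img) {
  const placeholder = document.createElement("button");
  placeholder.textContent = "Image hidden by your filter: click to show";
  placeholder.addEventListener("click", () => {
    placeholder.replaceWith(img);
  });
  img.replaceWith(placeholder);
}

if (filterOptedIn()) {
  // Assume filtered images carry a (hypothetical) marker class.
  document.querySelectorAll("img.filterable").forEach(collapseImage);
}
```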
No. A filter is a user option; to term it censorship is quite false. If I'm at work and need to look for an image of a human female, I don't want to have to wade through a puddle of urination images; at home it might be different. A filter that allows me to direct what is or is not returned by the search, based on need, is an improvement in the user experience. Terming it censorship is a gross caricature. John lilburne (talk) 07:20, 9 June 2012 (UTC)[reply]
The proposal of a 100% opt-in image filter where each user, not an editor, makes the judgment as to what is offensive and should be blocked has been on the table since November 2011 at least, and possibly longer. Well before the image filter project was quietly and unceremoniously dropped, at any rate. JN466 12:59, 9 June 2012 (UTC)[reply]
I have in the past advocated for a client-side filter designed specifically for Wikipedia, compatible with all major browsers, that can be customized to suit the needs of individual parents and communities using crowdsourced community blacklists. As a client-side mechanism, it would not require logging in, and could be made nontrivial to circumvent. For people interested in self-censorship, it could easily provide (via Javascript) mechanisms to hide/show individual images and classes of images. I believe such a solution would satisfy many people on both sides of the issue, and could be a valuable business opportunity, but it requires a substantial development investment. Dcoetzee 07:34, 9 June 2012 (UTC)[reply]
I fully agree that it has to be a user choice rather than a drop-down list in Wikipedia. Suppose the filter is just a list of files, as Wnt says, with a brief description by the maintainer: how does a user get to hear about it and use it? I'm not keen on stars and votes, but I think it would have to work some way like that. A person fills in criteria like 'flying spaghetti monster', and the first page of filters returned would be ones relevant to excluding images of the spaghetti monster, with stars beside them saying how well they were rated. Things always get misused on the internet, so a worry is that we'd be advertising the images or articles which caused most offence by listing the names of all these files. The number of stars would probably still be okay as a measure, though.
A facility to generate such a list from simple criteria would, I believe, be quite useful in Wikipedia generally. I can see people wanting, for instance, an easy personal facility to get a list of all the files in particular categories, or satisfying some other search filter, and to compare the list with what was there a week ago. Dmcq (talk) 07:49, 9 June 2012 (UTC)[reply]
Seems a technical matter which should come with the other functionalities the Foundation provides for viewing; I take it they have not figured it out but I agree it's odd they have not. Are there issues (legal or otherwise) around failure rate? Alanscottwalker (talk) 13:21, 9 June 2012 (UTC)[reply]
I've added my filter design to the brainstorming session. m:Controversial content/Brainstorming#"Report this picture". --Anthonyhcole (talk) 10:31, 10 June 2012 (UTC)[reply]

I think this can be investigated. AI is pretty efficient at identifying nudity in particular. This would give users the option to filter content they do not want. Humans could still mark incorrect filtering, or tag images that escaped the filter, which would train the machine-learning algorithm to improve its detection. I think this is the more feasible option for implementation. -- A Certain White Cat chi? 14:41, 9 June 2012 (UTC)

I don't believe that there are any automatic AI filters that are capable of determining inappropriate images in any reliable manner. John lilburne (talk) 14:57, 9 June 2012 (UTC)[reply]
The tin god you propose has a little man hiding in it. The software would still be trying to implement the various attitudes of those programming or "training" it. You're not talking about a way to avoid categorizing the various nude and nude-ish images on Wikipedia, but only a way to automate the process. Wnt (talk) 17:39, 9 June 2012 (UTC)[reply]
The algorithm is actually pretty accurate. I don't know how capable the stackoverflow.com programmers on that thread are, but nudity identification isn't typically perceived as a challenge at AI workshops. An algorithm is far better than regular users tagging, as an algorithm would not be as biased as an individual. If the objective is detecting penises, this can work. -- A Certain White Cat chi? 18:17, 9 June 2012 (UTC)
Well, I'm fairly used to my judgement not being as good as a recommender system or statistics at making a judgement, but the criteria still need to be set. Somebody needs to say 'I do not want to be shown images of the ineffable spaghetti monster', somebody needs to train the system so it includes some plates of spaghetti with meatballs and excludes others, and they need to write it up saying what is filtered. That is a considered judgement of someone or a group of people. Dmcq (talk) 19:56, 9 June 2012 (UTC)[reply]
This is true. Another possibility is that instead of having a master filter, each user can train their own filter by, for example, selecting categories, i.e. selecting the category for male genitalia, or spaghetti, or both. The algorithm would hide similar images, noting how many are hidden, with perhaps a link to show them if the user wants (there are different UI approaches which could be perfected). This wouldn't be as trivial as hiding genitalia, but it wouldn't be impossible either. I have seen far more difficult tasks handled at workshops, including systems that could determine the sentiment conveyed by an image or identify objects in it. Accuracy can be further increased by using metadata as well as image description pages. If you like, you can read my abstract (for Wikimania 2012) and/or report (for the CLEF 2011 conference - particularly ImageCLEF) on such possibilities. -- A Certain White Cat chi? 00:47, 10 June 2012 (UTC)
AI is pretty rubbish at differentiating between the flesh in porn and swimming galas. But we could use algorithms to match people who seem to want to filter the same things and enable them to share their filter lists anonymously - I've specced it at m:Controversial content/Brainstorming/personal private filters ϢereSpielChequers 10:09, 10 June 2012 (UTC)[reply]
It depends on how you optimize the algorithm. If you are just looking for color, you will also miss B&W content; if you use edge detection plus pattern detection, that will work better. You can also feed the algorithm swimming galas as non-problematic examples. AI far exceeds what humans are capable of in terms of recall and precision, as well as the amount of time it takes to filter content. The goal includes hiding fresh uploads that are used for vandalism; humans cannot react to that fast enough, particularly on smaller wikis where admin activity is scarce or non-existent. -- A Certain White Cat chi? 13:12, 10 June 2012 (UTC)
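To make the color-versus-B&W point concrete, here is a toy illustration (not a real classifier, and not anyone's actual proposal): a common skin-tone rule requires R > G > B with minimum separations, which greyscale pixels (R = G = B) can never satisfy, so a color-only filter passes black-and-white material straight through.

```javascript
// Toy skin-tone heuristic over RGB triples; greyscale pixels always fail
// the r > g and g > b tests because all three channels are equal.
function looksLikeSkin([r, g, b]) {
  return r > 95 && g > 40 && b > 20 && r - b > 15 && r > g && g > b;
}

// Fraction of pixels the heuristic flags as skin.
function skinRatio(pixels) {
  return pixels.filter(looksLikeSkin).length / pixels.length;
}

const colourPhoto = [[210, 150, 120], [200, 140, 110], [60, 60, 200]];
const greyscalePhoto = [[180, 180, 180], [120, 120, 120], [60, 60, 60]];

console.log(skinRatio(colourPhoto));    // ~0.67, would be flagged
console.log(skinRatio(greyscalePhoto)); // 0, slips past a color-only rule
```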
The biggest problem I see above is in the words 'inappropriate content'. Wikipedia has a problem with that. The best we can go for is helping people to select a filter as recommended by others. The actual selection of images or other material is not a huge problem, though some tools can help. The big problems are, first, how to enable people to select a filter they like whilst not having an official 'sex' or 'violence' filter, and second, how parents can ensure their children use the selected filter. To solve the latter it would probably be best to get advice from firms already involved in filtering and have the child-protection part dealt with by them, but we could certainly provide some facility to set up the filters and have users rate them. In fact they'd also be rating the provider, as one would want a filter that is updated every so often. Dmcq (talk) 10:37, 10 June 2012 (UTC)[reply]
That could be an easy implementation. You could have a "common" list of material that people often find offensive or inappropriate (e.g. sex, violence) as "default" options, with the ability to add more "custom" options. The default filters could be crowdsourced, so people can mark files (files the filter missed, files incorrectly in the filter) to adjust their score on the filter, as sketched below. It could even be fine-tuned under preferences. Custom lists could also be shared among users like .css/.js items. There are different ways to handle the UI. -- A Certain White Cat chi? 13:12, 10 June 2012 (UTC)
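A minimal sketch of the crowdsourced scoring just described, in which user reports nudge a per-file score and the filter hides a file once the score crosses a threshold; the data structure, field names and threshold are all hypothetical:

```javascript
// Per-file scores for one filter list; positive reports say the filter
// missed the file, negative reports say it was hidden wrongly.
const scores = new Map();

function report(fileName, { missed }) {
  const delta = missed ? 1 : -1;
  scores.set(fileName, (scores.get(fileName) || 0) + delta);
}

// Hide once enough net reports accumulate; the threshold damps abuse
// by lone taggers, one of the concerns raised above.
function shouldHide(fileName, threshold = 3) {
  return (scores.get(fileName) || 0) >= threshold;
}

// Usage: three readers flag a file the filter missed.
["a", "b", "c"].forEach(() => report("File:Example.jpg", { missed: true }));
console.log(shouldHide("File:Example.jpg")); // true
```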
I think you're missing the point. It just won't fly if it is listed as a standard list on Wikipedia. It would have to be wholly outside Wikipedia, or not involve Wikipedia in classifying things that way. Besides, such a common official list on Wikipedia, if it were allowed, would lead to quite dreadful wars. The only way it will work is if the filters are user filters, set up and maintained by editors or groups of editors under their own names, with no central support for any specific ones. Dmcq (talk) 13:40, 10 June 2012 (UTC)[reply]
That was indeed the proposal. At most there might be a Wikimedia page listing filter lists maintained by third parties. Although from a youth protection law viewpoint it would be preferable to filter certain things by default, making full access dependent on registering an account which can then opt out of filtering – basically, the Flickr method. --JN466 14:11, 10 June 2012 (UTC)[reply]
I can't believe people are still talking about this. Preferences > Safe Browsing > On. Update Safe Browsing list. Mark images as Nudity > Depiction of sexual acts > Crime scenes. Are we done? Viriditas (talk) 14:29, 10 June 2012 (UTC)[reply]
Yep, if we had such settings under Preferences, at the top of this screen, that would be swell. As long as we don't, we'll be talking about this. --JN466 18:42, 10 June 2012 (UTC)[reply]
There are other options as have been outlined quite well enough above, it is not your way or no way. Dmcq (talk) 20:29, 10 June 2012 (UTC)[reply]

German court case

I was just checking in on German Wikipedia, and read in the German Signpost ( http://de.wikipedia.org/wiki/Wikipedia:Kurier ) that a well-known German admin, User:Achim Raschka, was recently taken to court in Germany for adding one of these black-and-white early porn films to the German Wikipedia's article on pornography. The following summary is based on the Kurier article, written by Achim himself.

Achim, who is well known as the author of the German vulva article that made an illustrated main page appearance in German Wikipedia a couple of years ago, was prosecuted for having violated a German law against "distribution of pornographic writings". According to the German WP article on it, the law aims to prevent pornographic writings from getting into the hands of minors without the knowledge of their parents, and also aims to protect adults from unwanted exposure to pornography. The prosecution came about because a woman reader in Germany had reported the German Wikipedia to the police, who determined that Achim had been the one to insert the video – which showed clear pictures of an erect penis, oral sex and penetration. They contacted him by e-mail in February.

Wikimedia Germany (Achim is a past board member) and the JBB legal firm helped Achim defend himself against the charges. They recently succeeded in having the case dropped, as a minor offence not worth prosecuting. Achim says in his Kurier article that the prosecutor's office did not respond to requests for a clarification of the legal position. Achim took the video out of the article in February and has said he will not put it back in; he also refrained from linking to it in his Kurier article, saying he would rather not have a corresponding police record. --JN466 15:48, 9 June 2012 (UTC)[reply]

Between this and the Fae case, it sounds like these national boards are the target of significant political firepower. Then I read these: [11][12]. Somebody here knows what's really going on, and I wish they'd share it with the rest of us. Wnt (talk) 13:41, 17 June 2012 (UTC)[reply]

2257 compliance statement

Further to [13] and Jesse's research on Wikimedians' potential criminal liability for failure to keep age records for sexually explicit material they upload, insert on a page or "manage" ( see http://meta.wikimedia.org/wiki/Legal_and_Community_Advocacy/Age_Record_Requirement ), I have asked Jesse to have a look at [14], which says:

(d) A computer site or service or Web address containing a digitally- or computer-manipulated image, digital image, or picture shall contain the required statement on every page of a Web site on which a visual depiction of an actual human being engaged in actual or simulated sexually explicit conduct appears. Such computer site or service or Web address may choose to display the required statement in a separate window that opens upon the viewer's clicking or mousing-over a hypertext link that states, “18 U.S.C. 2257 [and/or 2257A, as appropriate] Record-Keeping Requirements Compliance Statement.”

I am not aware of any Wikimedia project currently using such compliance statements. JN466 17:26, 9 June 2012 (UTC)[reply]
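For what it's worth, the click-through mechanism the regulation describes is technically trivial; a minimal sketch, with a hypothetical URL and markup, might look like this:

```javascript
// Build the hypertext link the regulation quoted above describes: its
// click opens the record-keeping statement in a separate window.
// The statement URL and window dimensions are hypothetical.
const link = document.createElement("a");
link.href = "#";
link.textContent =
  "18 U.S.C. 2257 Record-Keeping Requirements Compliance Statement";
link.addEventListener("click", (e) => {
  e.preventDefault();
  window.open("/compliance-statement.html", "compliance",
    "width=600,height=400");
});
document.body.appendChild(link);
```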

That's handled by the WP:OTRS records people are asked for if they upload images subject to that. It doesn't affect the main point about user image filtering; it is more about protecting people who might be exploited by the sex industry. Dmcq (talk) 18:49, 9 June 2012 (UTC)[reply]
Talk about chilling effects. I'm just waiting for them to try to prosecute someone for what essentially amounts to linking to a sexual image that was already made. I have enough faith in the courts that they would recognize that, first, linking to an already-made photo cannot in any reasonable sense be construed as "producing" it, and that secondly, imposing "record keeping" requirements on a downstream reuser that they have no hope of fulfilling amounts to de facto censorship of such speech/reuse and violates the First Amendment. Your German case, on the other hand, is frightening. Someone who speaks German and doesn't live in Germany needs to register a deep-anonymity account and be given ipblock-exempt so they can help with that. It's sad it would come to that in a First World, purportedly "free" country. Seraphimblade Talk to me 19:06, 9 June 2012 (UTC)[reply]
@User:Seraphimblade - are you suggesting that Wikipedia admins help someone to be unidentifiable and to replace the content? Youreallycan 19:10, 9 June 2012 (UTC)[reply]
It seems to be worse than that. He's suggesting a joint enterprise to circumvent child protection laws in another country. John lilburne (talk) 19:18, 9 June 2012 (UTC)[reply]
That's handled by OTRS? OTRS contact uploaders and obtain written documentation of models' name, age and consent? I really, really think you're quite severely mistaken here, Dmcq. If what you are saying were true, we would not have Flickr account holders posting complaints like this one, and there would be no need for board resolutions like this one. --JN466 00:28, 10 June 2012 (UTC)[reply]
What you are talking about is the systems for dealing with this regulation not working well in a number of cases; the place is run by volunteers checking the pictures others put up. The whole business of that regulation is just totally irrelevant, and your bit is irrelevant too. It makes no difference to user filtering whether we have the OTRS records or not. That regulation is only relevant to looking after the welfare of, for instance, minors who might be exploited in the sex industry, and it protects people from having their images used without their permission. Dmcq (talk) 09:15, 10 June 2012 (UTC)[reply]
I agree with you that the image filter and 2257 record-keeping requirements are separate issues. I don't share your view however that compliance with US legal requirements like the one posted at the top of this section is irrelevant. --JN466 13:25, 11 June 2012 (UTC)[reply]
If you read the definition of the word "producer" at the top of the document, I think you will see that entities such as Wikipedia are specifically excluded. Looie496 (talk) 19:36, 9 June 2012 (UTC)[reply]
If you read http://meta.wikimedia.org/wiki/Legal_and_Community_Advocacy/Age_Record_Requirement you'll find that while the Wikimedia Foundation is excluded, individual Wikimedians are not. Wikimedians uploading, inserting or managing this content may be considered primary or secondary producers in the eyes of the law. Their actions have to comply with the law. Remember, Achim Raschka was prosecuted because someone reported Wikipedia. Wikipedia was not prosecuted: he was, as an individual. And incidentally, the new http://meta.wikimedia.org/wiki/Terms_of_use affirm that "you are legally responsible for all of your contributions, edits, and re-use of Wikimedia content under the laws of the United States of America and other applicable laws". JN466 19:51, 9 June 2012 (UTC)[reply]
@John lilburne/Youreallycan: Well, yes, of course. We have frank articles on human sexuality, just as we do on everything else. If editors living in Germany would be at risk from editing/illustrating those types of articles, we should not ask them to take that chance, but instead have editors who are not facing that risk do it. Incidentally, I'd advocate just the same if German editors would be at risk for putting a swastika in the article about Nazism—one of us from another country should come along and help out. I certainly don't blame the German editor who under duress removed the material, but we shouldn't just let that happen! As to "child protection", it is the job of parents to parent, not Wikimedia. There are a whole lot of child-unfriendly things here (and on the Internet at large), and that goes well beyond sex images. Letting kids use the Internet unsupervised is roughly equivalent to dropping them off at a large shopping mall. If you wouldn't trust them to do one, you ought not do the other either. Seraphimblade Talk to me 20:49, 9 June 2012 (UTC)[reply]
Seraphimblade: As to "child protection", it is the job of parents to parent, not Wikimedia. - Sure, that's true. But if Wikimedia/Wikipedia/Commons aims to make the parents' job harder in this respect, then it's got no business calling itself "kid friendly". VolunteerMarek 00:38, 10 June 2012 (UTC)[reply]
I don't really think we need a list of laws you would be prepared to conspire against internationally, circumventing laws on child protection is enough. John lilburne (talk) 21:03, 9 June 2012 (UTC)[reply]
Oh no, John, it's no bother! Let's start with some other laws I'd be willing to "conspire against internationally". I would be willing to "conspire against" Sharia laws banning showing unveiled women or Mohammed, Chinese laws prohibiting display of the Tank Man image or other realistic depictions of Tiananmen Square, Thai laws forbidding criticism of rulers, North Korean laws prohibiting anything but the loudest praise of their dictators, and that list can keep going for some pages. And before you say "That's not the same thing!", consider that every one of those governments would tell you those laws exist to protect "the children" and their society as a whole. Seraphimblade Talk to me 21:22, 9 June 2012 (UTC)[reply]
Laws dealing with the corruption of minors have some direct counterparts in the US too. John lilburne (talk) 21:36, 9 June 2012 (UTC)[reply]
John, I sure hope you didn't just compare illustrating an article on a topic to sexually assaulting a child, but that's sure what your last comment looks like to me. Could you please clarify if I'm misinterpreting? Seraphimblade Talk to me 21:57, 9 June 2012 (UTC)[reply]
Perhaps you missed "Ebert also sentenced Rathbun to two years of probation on a corruption of minors charge for showing pornography to another girl." (emphasis mine) Bielle (talk) 22:06, 9 June 2012 (UTC)[reply]
Precisely. We aren't talking about Thai royalty, or Tiananmen Square, or North Korea, or unveiled women; we are talking about conspiracy to circumvent laws against showing pornography to kids. John lilburne (talk) 22:36, 9 June 2012 (UTC)[reply]
Seraphimblade, laws making it illegal to criticise a country's ruler are geographically highly restricted. Showing kids porn, on the other hand, is considered inappropriate in widely diverging countries all over the world, with the notable exception of Wikiland, where it seems to be a cherished part of site policy. JN466 00:56, 10 June 2012 (UTC)[reply]
JN, you're getting straw all over the place. Illustrating a sexuality article with illustrations on sex is in no way the same thing as "showing kids porn". It is the responsibility of the children's guardians to ensure they do not view material inappropriate for them, it is not the rest of society's responsibility to ensure such material doesn't even exist. Seraphimblade Talk to me 03:57, 10 June 2012 (UTC)[reply]
One of the ways that the parents/guardians of children ensure that they are less likely to be inappropriately exposed to sexually-explicit material is to support the passing of laws which make making such material available to children illegal. Such laws exist. If you wish to argue that they shouldn't, do it somewhere else. Wikipedia isn't an 'anti-censorship' campaign. AndyTheGrump (talk) 05:00, 10 June 2012 (UTC)[reply]
The above court case in Germany was about a Wikimedia admin illustrating an article on pornography with an actual, R18-rated hardcore pornography video. JN466 12:41, 10 June 2012 (UTC)[reply]
Unlucky for us, but maybe at the same time lucky for us and the editor, the case was closed by the prosecutor's office itself. This leaves the question of whether it was legal or illegal wide open. I doubt that the case would have had any success because of "R18-rated hardcore pornography"; more likely (if at all!) because of zoophilia. § 184a StGB. All in all we are as wise as before. --/人 ‿‿ 人\ 署名の宣言 18:43, 10 June 2012 (UTC)[reply]
You could always be brave, put it back into the article, then report it to the authorities, and insist on your day in court. John lilburne (talk) 18:52, 10 June 2012 (UTC)[reply]
No one would be happy about that. My argument is: even after these actions it is entirely unclear whether it is legal or illegal, and for what reasons. It is neither an argument against it nor the other way around. --/人 ‿‿ 人\ 署名の宣言 19:09, 10 June 2012 (UTC)[reply]
The film shows an erect penis and penetration; it has always been my understanding that, in Germany, such depictions almost always had to be behind some age verification; that the law applied when the material was targeted at Germans (use of the German language is a trigger); and that if the organisation had any assets in Germany, or the maker was German and available, then bingo, prosecution is sure to follow. Which part of that is not true? John lilburne (talk) 19:26, 10 June 2012 (UTC)[reply]
You missed two points. The first is that these laws (§184 StGB and the corresponding definitions) aren't the only laws which apply in such cases; a judge will always have to weigh them against granted freedoms. The second is the definitions inside the StGB, which do not apply to every case and have a lot to do with intention. Showing "an erect penis and penetration" to minors simply to make a profit (pornography) would most likely or even surely be illegal. But showing "an erect penis and penetration" in an educational context or in art would fail the definition of pornography, which would then nullify §184 StGB entirely.
Just picking a German or other foreign law (e.g. §184 StGB) and applying English definitions to the translated words inside its paragraphs won't work in most cases. A law can only apply if the definitions also apply without doubt. This also means that your example is incomplete, and not sufficient to be a template for a generic/general rule. --/人 ‿‿ 人\ 署名の宣言 19:51, 10 June 2012 (UTC)[reply]
Back in 2007 Yahoo lawyers took the view that non-commercial adult material uploaded to flickr would render the site liable under German law unless an age-verification system was put in place. Rather than do that, they limited what Germans could see on the site; the other alternative, which they dismissed, would have been to make specific notified content unavailable (delete it). So I think that simply saying there has to be a profit motive is wrong. But why not just restore the film to the page and see whether you get an email from the prosecutor's office? John lilburne (talk) 20:31, 10 June 2012 (UTC)[reply]
The material would be non-commercial, but the service itself is commercial and doesn't limit itself (its users) to educational, artistic or other purposes that aren't touched by the law. It has an entirely different mission, which is also reflected in the laws. The lawyers of Yahoo took the position that an age-verification system would be too complicated/unattractive. Since removing/deleting such content would be an "expensive task", they dismissed this idea as well. The result was the current solution, which suits a purely commercially driven decision process.
I guess your intention is to compare flickr with Commons, which suits the slogan "Let's do it like the others!" There is a reason why I don't like this comparison, but that is already contained in, and between, the previous lines. --/人 ‿‿ 人\ 署名の宣言 20:55, 10 June 2012 (UTC)[reply]
If you believe that WMF's status means that you are exempt then walk the walk and readd the video to the page. --John lilburne (talk) 02:01, 11 June 2012 (UTC)[reply]
(ec) The case was dropped after lengthy arguments from Achim's lawyers, and was dropped only because it was a "minor offence" ("aufgrund geringer Schuld", roughly "due to minor culpability", Achim writes), not because there wasn't an offence. That's not unreasonable, given that they were talking about a single edit. JN466 18:55, 10 June 2012 (UTC)[reply]
That a prosecutor drops a case "aufgrund geringer Schuld" only means that the prosecutor accepted the case rather than rejecting it immediately. It is entirely unclear what the prosecutor's opinion was after the response, before he dropped it. At least we can assume that there was doubt about success, otherwise it wouldn't have been dropped. --/人 ‿‿ 人\ 署名の宣言 19:09, 10 June 2012 (UTC)[reply]
I suspect it was more a case of the Wikimedia Germany lawyers begging and pleading, explaining that Achim had only done it out of misguided enthusiasm for knowledge, and promising that he wouldn't do it again. ;) At which point the prosecutors likely thought it would waste taxpayers' money to take the matter to a full trial, given that he had already taken the media out again and promised not to reinsert it. JN466 02:19, 11 June 2012 (UTC)[reply]
This is one theory out of hundreds, without proof for any. If I remember correctly, Achim's lawyers also asked for a statement on what basis the case was accepted. But the response didn't mention any reason why it was accepted in the first place, nor why it was dropped. As I said, this leaves the question completely open.[15] --/人 ‿‿ 人\ 署名の宣言 08:47, 11 June 2012 (UTC)[reply]

(indent reset) In which case, we're back to needing a non-German editor, preferably highly anonymous, to help out with any media the authorities there might attempt to come after. I'd happily volunteer but speak almost no German. Seraphimblade Talk to me 02:53, 11 June 2012 (UTC)[reply]

Indeed; if my point didn't mirror yours above, I'd say more, but I'll briefly sum up my thoughts. If a government's idea of what's legal conflicts with our aims, then to the extent that something is legal here in the US, other governments can go to hell if it exists. That it's the German government this time, as opposed to Burma or China, doesn't change anything in my mind; the only distinction is the number of times and issues on which we have to say that. I'm an American, and while I'm here I shouldn't be worried about the bitching of anyone's legal system but my own. Incidentally, I'm not the biggest fan of Commons, but just endlessly sniping about it here I know won't get anything done. I know I've brought this up before, but do read what Point 6 said until fairly recently. The Blade of the Northern Lights (話して下さい) 03:51, 11 June 2012 (UTC)[reply]

UK law

While we're looking for laws that Seraphimblade can campaign about, can I point out that in UK law, having an indecent image of a child download into your temporary internet cache may constitute "making" such an image ("making", as used by this legislation, includes making a copy), and knowing it's there and not immediately deleting it may constitute "possession". Commons' lax approach to image management may therefore constitute a threat to editors (although realistically, one would have to have more than a couple of images from Commons in there to have any risk of prosecution - unless the cops were already looking for something to charge you with). Elen of the Roads (talk) 14:31, 11 June 2012 (UTC)[reply]

Given that the precautionary principle is liberally applied elsewhere, that's one reason I don't understand why Wikimedia does not simply withdraw editing privileges from anyone who
  • uploads sexually explicit images without the legally required 2257 documentation,
  • consistently votes to keep such images,
  • inserts such images on Wikipedia pages without adding a 2257 compliance statement (see section above).
The Terms of Use say that all contributors have to comply with applicable US law. Both the Foundation and contributors would be a lot safer if they did, and the Terms of Use give the Foundation a way of acting without interfering directly in content management. --JN466 16:36, 11 June 2012 (UTC)[reply]
In practice, Elen, viewing a possibly/marginally indecent image on Commons isn't going to get you in trouble of the jailtime sort (of course, if you get spotted with it on screen it may end up with you coming under investigation - which is the more likely issue). To be perfectly honest, of the more marginal images on Commons that I've seen or been made aware of, none would be flagged up by any competent investigator (YMMV etc.). FWIW I am not sure I see what useful endgame 2257 compliance brings us, other than basically the deletion of all nude/sexual content (maybe that is the end game?). It certainly doesn't seem a useful way to bypass the ridiculous standard of Commons and force a cleanup. At the end of the day, actual legal experts have stated that whether or not 2257 even applies is years from resolution (if ever), so there seems no way of convincing the community it is an urgent matter. --Errant (chat!) 16:51, 11 June 2012 (UTC)[reply]
The end game is not to get rid of all nude/sexual content, but to get rid of all anonymous sexual content. I really don't think it behooves an educational charity to use such content when it may be revenge porn, violate privacy, or have been uploaded without the knowledge of the person depicted. Wikimedia could spend some of the $20 million it took in last year and pay some nude models and a photographer for professionally done shots covering all our encyclopedic needs. Leave anonymous porn image sharing to 4chan. Besides, just think how many volunteer hours would be freed up if anonymous wank videos and penis shots that will never have 2257 documentation were simply deleted, rather than debated endlessly. And if someone has their stuff together enough to contribute professionally (by which I mean to a professional standard of competence), with proper documentation, then they are of course very welcome. JN466 17:09, 11 June 2012 (UTC)[reply]
How do you propose we handle 2257 compliance? (especially given the mess made of image permissions already). Enforcing 2257 compliance basically means deleting all of the existing sexual nudity - if that is what you are after then fine, but as a pragmatist I seriously doubt you will see it happen. And even if you do, it doesn't really solve the problem as there are plenty of nudey graphics appearing in search results. --Errant (chat!) 17:19, 11 June 2012 (UTC)[reply]
The way applicable US law requires it to be handled. IANAL, but broadly, there has to be a record, that record has to be available to reusers, and there has to be a compliance statement on every page that features such material. --JN466 17:21, 11 June 2012 (UTC)[reply]

You misunderstand; if someone uploads an image and adds the compliance template... how do we police that they are compliant? --Errant (chat!) 17:37, 11 June 2012 (UTC)[reply]

Cf. [16], which spells out the requirements. Someone reminded me of a prior discussion of this the other day. Verifying that the records exist would likely be an OTRS job. Wikimedians simply need to design a bureaucratic process that meets legal requirements. Lots of people follow these requirements as a matter of course – it's not like Wikimedians are the first people ever faced with this task. All I can tell you is that at the moment Wikimedians are not even trying to comply. JN466 19:37, 11 June 2012 (UTC)[reply]
(edit conflict) Well, first off, Elen, while I am most flattered that you think my opinion matters so much that I could conduct such a campaign alone, I'd direct you to the several other editors who have spoken up in support, and the many editors who routinely object to censorship both here and at Commons. I'm just the one who's not generally afraid to speak up and say what I think, despite JN's implied threat that apparently I (and many others) should be blocked for expressing an opinion or making good-faith content edits. In this case, what I think is pretty clear—not censored means not censored. If we illustrate the article about the Empire State Building or the beagle with freely-licensed images of the article's subject, we ought to do that with sexuality articles as well, provided we have such images with an appropriate license. Refusing to appropriately illustrate articles when we have illustrations available is a type of systemic bias, and embodies a point of view that human sexuality is dirty and wrong, and should, if discussed at all, be restricted to text and the occasional line drawing. It's no different whatsoever than if we deleted all sexuality articles outright because of their content. If someone wants to make Kidopedia, with only articles widely thought to be "child-friendly" in a given culture, they can pull a DB dump, nuke whatever they don't like, and host it wherever they want to. You could even petition WMF to start such a project. But that's not the aim of this project. The aim of this project is to be comprehensive, and the real world is not always kid friendly.
Personally, I find the images and subject material at Rape of Nanking, let alone Holocaust, to be far more shocking and distasteful than a depiction of sex acts (including, and perhaps especially, the image at Rape of Nanking which quite graphically depicts a nude murdered woman). But they're real and notable things, and on a comprehensive project, it's our job to accurately document the abhorrent along with the angelic, and to show media which really represents the article's subject, if we have such media under a free license. If you find a subject area personally distasteful, by all means, don't visit or edit the articles. I don't even have any trouble with an opt-in image filter, where the images/categories to be filtered are chosen by the user, and images can be clicked once to be unfiltered. But project-wide censorship of sexuality topics, or sexuality depictions? Yep, that I've got a problem with.
Additionally, I'd note that we're talking about sexual images featuring adults here, not children. Sexual images of children are illegal in the US as well. And in closing, I'd note that if WMF's General Counsel thought the project were in actual legal jeopardy from this, an office action would be taken, as in all such cases. I note that hasn't happened. Seraphimblade Talk to me 17:54, 11 June 2012 (UTC)[reply]
I think that there are a few things that people on both sides can agree on - (a) that Wikimedia isn't willing to host illegal child pornography (by the U.S. definition), and (b) WMF isn't going to use donor funds to hire actors for pornography of any sort. Following all other nations' child pornography laws is a lot harder than you'd think, because for example Australia bans adult women with small breasts and Homer Simpson (fanfic) cartoons as child pornography.[17][18] Even in the U.S. there could be considerable debate as to when something is educational as opposed to prurient - for example, we could receive a contribution of highly relevant images of female genital mutilation, but are people looking out of scientific curiosity or in search of cheap thrills? (Is there a difference?) Wnt (talk) 18:43, 11 June 2012 (UTC)[reply]
Seraphimblade, you simply do not understand that the Wikimedia Foundation is covered by Section 230, but you, as an individual editor, are not. The WMF legal department keep saying that they represent the interests of the Foundation only, and not those of individual editors, but no one seems to be listening. In Germany, the prosecutors came after Achim, not after Wikipedia, although according to Achim the original complaint by the member of the public was about Wikipedia, and not about him. The police identified Achim from the edit history. As for our images featuring adults, there is often no way of knowing that without records. Take the article ejaculation. The four-image panel in that article was uploaded by User:Shadowhead69, whose user page was deleted with the reason "(Speedy deleted per CSD G3, pure vandalism or blatant and obvious misinformation. using TW)". Why is Wikipedia using media uploaded by such an account? Without the legally required records, you have no way of knowing whether the person whose penis is shown ejaculating in that article (or the people in all the other anonymously uploaded masturbation videos in Commons) was 16, 19, or 35 at the time. There are enough teenagers sexting for there to be a realistic chance that the person was underage. You have no idea where that file came from, and under what circumstances it was produced. If you feel happy handling such files, that's up to you, but it isn't a good strategy for Wikimedia. --JN466 19:52, 11 June 2012 (UTC)[reply]
Jayen, Shadowhead never created a userpage. A different editor put harassment of him (likely because of the image) on his userpage. Since that was the only edit ever made to it, it was deleted as vandalism. You might have in your head that this editor had a whole bunch of porn on the userpage, or something like that, but it's not so. Any other admin can verify that for you, if you like. As to the account itself, I imagine it's probably a legitimate alternate account for privacy.
As to the rest, yes, I know you want to delete anything mildly suggestive, because it could conceivably be a minor. But it's very likely not. We don't require documented, individual proof of copyright and source for every image we accept; rather, we take the uploader's word unless someone shows otherwise. If the uploader lied, it is the uploader who should be held responsible for that action, not others who were later victims of their dishonesty. If it looks like a kid, we nuke it faster than you can say "Delete that thing!".
Far as the legalities go, I cannot imagine that the U.S. Supreme Court would find that a law that essentially bans any nude image of the human body, except for professional organizations that can afford the onerous documentation requirements, passes First Amendment scrutiny. I imagine that's doubly true when the images are intended for educational purposes, and not just sexual titillation, since such content has always been granted a greater degree of deference than content intended solely for sexual gratification. And to date, no prosecutions against amateur nude/porn uploaders have been tried under that statute, because of the very real risk that they would drag defendants into court for uploading images of themselves. The law is horribly overbroad and clearly unconstitutional, and so no, I'm not all that worried about it. So same thing again—if you are worried about it, by all means, don't touch the things. But don't tell everyone else that they must follow your lead too. If you're not uploading or using the images, you're at no risk, so if your concern is legal issues for individual editors, why not leave well enough alone at that? Seraphimblade Talk to me 20:25, 11 June 2012 (UTC)[reply]
Complete balderdash. Have you ever read the law? I can't believe you have. Sexually explicit conduct is defined as (i) sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; (ii) bestiality; (iii) masturbation; (iv) sadistic or masochistic abuse; or (v) lascivious exhibition of the genitals or pubic area of any person. [19] How is that "any nude image"? --JN466 23:57, 11 June 2012 (UTC)[reply]
At this point I think it's safe to say you've bludgeoned us with your personal opinion on this matter. We get what you're saying, but we're actually entitled to disagree on the matter, and your opinion on it isn't any more valuable than that of anyone else. Simply saying it over and over and over and over and over and over and over and over and over and over and over again doesn't do anything other than annoy people who don't feel the need to cater to the most extreme possible position. If it was as big a deal as you're making it out to be, somehow I think WMF's legal people would have done something, and I note that they haven't; screaming about how it's A HUGE PROBLEM OH MY GOD!!!!!!!! comes off as attempting to create a scare where there is none to foist your personal analysis on us. I don't actually think that's your intention, and I certainly agree there are real problems that have to be resolved, but endlessly flooding this page with opinions disguised as facts and treating it like a battleground isn't helping. Wikipedia doesn't exist to cater to the people with the most extreme positions who use the most empty, insubstantial rhetoric to get their way, else we wouldn't have policies like this; that's applicable here too. Once again, I find myself agreeing with the substance of Seraphimblade's comment above, so I'll say no more for now. The Blade of the Northern Lights (話して下さい) 22:56, 11 June 2012 (UTC)[reply]
Exactly what i was thinking since the very beginning of this discussions. --/人 ‿‿ 人\ 署名の宣言 10:41, 12 June 2012 (UTC)[reply]
The WMF legal team represent the Foundation; they do not represent editors. They do try to make that clear, but it seems to have trouble sinking in. One of the first conditions mentioned in the Wikimedia Terms of Use is "Responsibility — You take responsibility for your edits (since we only host your content)." As Achim found out, if you create illegal content, the law will come after you, not after the Foundation. They only host the content and are not responsible for it. Even so, the Foundation kindly did ask Jesse in the Legal Department to research how contributors might be affected by the law, and put the results of that research up in Meta. That's really all they can do. As an adult (I presume), you can avail yourself of that information or not. The page states very clearly that even though the government may choose not to prosecute individual users for § 2257 violations, the scope of the law is broad enough to allow them to do so. JN466 13:10, 12 June 2012 (UTC)[reply]

A man who informed police when he found child abuse images on his computer has not been allowed to be alone with his daughter for four months (BBC Humberside, 6 March 2012). Elen of the Roads (talk) 22:47, 11 June 2012 (UTC)[reply]

Elen, it's not entirely clear which side of the debate you are attempting to support with that link. On the one hand, it suggests that Wikipedia has little to fear in terms of UK law, because someone's possession of child porn can be at a level where they are considered by Social Services to be a risk to their own children, yet not at a level where the police feel it is worth prosecuting. On the other hand, it may suggest that perfectly innocent internet activity can land people in official bother, but only if you feel able to take the protagonist's tale at face value (of course, it is not impossible that he is being entirely candid, but we have no way of knowing). Formerip (talk) 00:21, 12 June 2012 (UTC)[reply]
Given the rarity, severity, lack of logic, and unpredictability of obscenity-related charges, I think that they're more readily compared to terrorist actions (such as murder attempts against people who draw Muhammad cartoons) than proper law enforcement. There's simply some point past which the number of charges laid is going to depend solely on the number of ambitious prosecutors, not on how diligently users try to follow the law. Wnt (talk) 00:59, 13 June 2012 (UTC)[reply]
Personally, I'm sympathetic to those expressing concerns about sex-related images and arguments expressing support for filtering, but a lot of these concerns sound like they are coming straight from the American Family Association and related groups. Viriditas (talk) 01:56, 15 June 2012 (UTC)[reply]
This is simply about professionalism, rather than religion. It is about mainstream views, reflected in the modus operandi of all major websites, not fringe right-wing religious or political views. --JN466 02:13, 15 June 2012 (UTC)[reply]
My point is, what does it mean when one can't tell the difference between an argument made by an extremist group and one from a moderate group? The American Family Association has been making these same arguments since the 1980s. Viriditas (talk) 02:17, 15 June 2012 (UTC)[reply]
By "all major websites", JN is excluding Wikipedia - one of the world's major websites. How many others? It is true that some companies "filter" to try to avoid disturbing customers - just as some "family-friendly" bookstores might not contain all the strange beatnik stories and anarchist sci-fi sex fantasies that you would expect to find stocked at any proper city public library. But Wikipedia should be aligning itself with the public libraries - the ones who turned down funding rather than accept CIPA restrictions - because Wikipedia is a similarly wide-ranging educational enterprise where academic freedom is paramount. Wnt (talk) 05:56, 15 June 2012 (UTC)[reply]
This is supposed to be a mainstream, NPOV project. Whatever your views are, they are not mainstream. --JN466 14:37, 15 June 2012 (UTC)[reply]
You can't be mainstream and NPOV. You can't be mainstream and cover "the sum total of all human knowledge". "Mainstream" is a commercial product, carefully crafted to sell. It is not the truth, the whole truth, and nothing but the truth; it is a limited and inoffensive truth tucked away in a small little can, only as much as the casual browser is willing to buy that day. The American Library Association is not "mainstream" either, but they're a powerful and well-subscribed voice of reason, which represents the way things should be done, if you want to cover the whole truth, and educate rather than indoctrinate. Wnt (talk) 21:42, 15 June 2012 (UTC)[reply]
Google, Flickr and YouTube do not present the whole truth? There is very little that is legal that Google will not find for you with safe search off. There is very little you cannot find on Flickr, equipped with an adult account. YouTube contains some absolutely disgusting stuff, videos of suicides and God knows what, but they still all manage to make at least an effort to keep this stuff out of view of children and of all those adults who really do not want to see it. And do you really believe that everyone except Wikimedia is here to indoctrinate? The entire world's scholarship? What a conceit! And let us note that Wikimedia does its own indoctrination. Within a couple of clicks from Fuck, an article any Wikipedia-using child is guaranteed to look up at some time during their childhood, you can be at ass to mouth, felching, facial and rusty trombone, thanks to the helpful navigation template. The amount of space Wikipedia gives to the most bizarre (and even the completely made-up) kinds of kink and porn is completely out of proportion to the amount of space it gives to more common sexual behaviour. The sexual slang template has all the above terms, and many other obscure ones most people have never heard of, but lacks pedestrian and well-known slang terms that are in wide use, such as "come"/"cum", "blowjob"/"blow", "go down on", "fuckbuddy" or "heavy petting". Wikipedia inflates the importance of the rare and bizarre, while neglecting the ordinary. Kids looking up gel bracelet find more about the urban legend of sex bracelets than they do about gel bracelets (and are given a helpful link to gerbilling). It's no wonder that some kids today think that having sex must somehow involve doing something bizarre. --JN466 14:15, 16 June 2012 (UTC)[reply]
Which argument specifically do you mean? That Wikipedia should comply with legal requirements, or be professional? I am re-reading what you wrote, above: "I'm sympathetic to those expressing concerns about sex-related images and arguments expressing support for filtering, but a lot of these concerns sound like they are coming straight from the American Family Association and related groups. ...what does it mean when one can't tell the difference between an argument made by an extremist group and one from a moderate group?" It just means that Wikipedia is taking an extreme stance. When moderates and extremists agree on something, it does not mean that all moderates are covert extremists: it usually means that the view they agree on is an extreme view. For example, far right views are opposed by both moderates and the loony left. That does not mean that anyone opposing far right views belongs to the loony left, does it?
As for the AFA, the Wikimedia Foundation is careful to appeal to the AFA demographic with quotes like these in its fundraising materials: "Wikipedia helps me teach my children about the world in a safe, clean and trustworthy manner. Free from bias, banter, commercial interests and risky content." "Wikipedia has been a wonderful recourse for my children and me to learn new terms, knowledge, and culture background as an immigrant family. It is a safe and trustworthy website for children to do their research." "School districts are increasingly adding assignment for children to complete as homework via the internet. Many of which require research on sites such as the one you had the insight to create." "Wikipedia has become indispensable. We never cease to be amazed at how much our children have learned." "I worked for a non-profit in India and even the poorest children who were receiving education there knew about Wikipedia and were familiar with the site." You cannot with intellectual and moral integrity tout your website as God's gift to children when you are speaking to donors, and then say that Wikipedia is not for children when asked to install even rudimentary child protection measures. --JN466 14:37, 15 June 2012 (UTC)[reply]
Considering your bold quotes: Wikipedia provides material for all ages. Whether such material is or is not suitable for younger people is not our major concern.¹ It can't be. If it were, then we would be writing "The Free Schoolbook Encyclopedia" or "The Children's Encyclopedia". It's up to the parents, teachers, and every other person who has to interact with children to select suitable content. We don't teach children differentiation in first grade. Neither do we tell them the truth behind criminality, war or sexual activities. You can find plenty of examples of material that is not suited for children of x years, and that's not limited to Wikipedia, or even to the WWW as a whole. It is up to the parents or legal guardians whether they let a child search for its interests on its own, with guidance, or not at all.
¹ Our major concern is to write good articles that cover most, if not all, topics of common knowledge.
Side note: Every time I read one of your comments, I get the strong impression that you portray the average understanding as an extreme leftist view, so as to argue that the centre of the right-wing spectrum should be our goal. Am I right? --/人 ‿‿ 人\ 署名の宣言 15:38, 15 June 2012 (UTC)[reply]
No, you are not. I have never voted for a conservative party candidate in my entire life. When I lived in Germany, I loathed the CDU/CSU and all it stood for. As for your other point, Flickr, too, hosts vast amounts of sexually explicit material. However, they host it responsibly: users don't see it unless they say they want to see it, and register with an adult account. JN466 15:47, 15 June 2012 (UTC)[reply]
I already explained under 2257 compliance statement why it isn't a wise decision to compare Wikipedia or Commons with Flickr. --/人 ‿‿ 人\ 署名の宣言 16:21, 15 June 2012 (UTC)[reply]
And I've already explained above that a German prosecutor's office recently indicted a German admin under a law designed to prevent the spreading of pornography to children, as well as the involuntary exposure of adults to pornography. There is a reason why, when you walk through a pedestrian zone, you do not see images of hardcore pornography. Commercial organisations follow these guidelines just as much as non-commercial ones. That's just a reality, and on balance I prefer it that way. I do not want to see explicit adverts for porn sites on double-decker buses, and nor do most people. I am not aware of any significant movement whatsoever to change this state of affairs. It enjoys very broad consensus. --JN466 16:53, 15 June 2012 (UTC)[reply]
I think the movement is called the internet. Seriously, dislike of porn censorship has been a major driving force, perhaps the major driving force, in its development from the first days of uuencoded multi-part text files to the present day. You can't get away with telling us that "pornographic" images are the most popular files on Wikipedia, and saying they have no popular support. Wnt (talk) 21:57, 15 June 2012 (UTC)[reply]
Viewing pornography online is not at all the same as advocating for its presence in advertisements on double-decker buses, or advocating that children should be exposed to it, just as 30 years ago buying a VCR, and going to an adult store to buy porn videocassettes did not mean that any person doing so, and enjoying porn at home by themselves or with their partner, was in favour of porn videos being sold to their children, alongside videos of children's programmes.
We have somehow moved very far from the question of why we don't have 2257 compliance statements anywhere in our projects. Elen raised a very important point, above, which I think has not gotten enough attention. Without compliance statements, readers of Wikipedia have no assurance whatsoever that the sexual media they are seeing in Wikipedia all depict someone aged 18 or older. (See [20].) This actually exposes them to a legal risk, as well. Basically, it seems highly inadvisable for the Foundation to allow anonymous uploads of sexually explicit material without the legally required documentation. JN466 13:39, 16 June 2012 (UTC)[reply]
Seriously - you're telling me that someone would knowingly upload child pornography to Wikipedia, but would balk at the thought of lying and saying he has 2257 documentation on file? Or are you suggesting that Wikipedia actually get copies of every driver's license of every subject of the photos for our hacker-proof vaults? (then run each one by the state DMV to check that they haven't simply been photoshopped with a different picture or age) Wnt (talk) 14:53, 17 June 2012 (UTC)[reply]
All that would be required would be cast iron identification of the uploader, and a copy of the compliance statement emailed to OTRS. It may reduce the number of autofellatio images we host, but better safe than sorry. --Anthonyhcole (talk) 15:29, 17 June 2012 (UTC)[reply]

I don't believe that's true (although IANAL). Sure, § 2257 is unclear about some details, and is written so broadly that much of it may be found unconstitutional. But assuming it stands as written, I believe that:

  1. The identification needed is of the performers, not the uploader (unless the uploader happens to be a performer).
  2. The records that must be kept about each performer include name, date of birth, and any aliases or nicknames they have.
  3. Although Wikimedia would not be subject to § 2257 requirements, our editors who "produce" sexually explicit content might well be.
  4. "Produce" may have a much broader meaning that you might think, including editors who simply "make editorial or managerial decisions concerning the sexually explicit content". This might include uploading the content, adding the content to an article, adding a category to the content, or voting "Keep" in a deletion discussion. (28 CFR § 75.1(c)(2) and (k))
  5. Although § 2257 is titled "Record Keeping Requirements", producers are also required to "affix" a statement to every copy of the content saying where the records are kept. Here (and I quote) "the term “copy” includes every page of a website on which matter described in subsection (a) appears." (§ 2257(e)(1))
  6. People who simply "sell or otherwise transfer" materials containing such content interstate need only ensure that a compliance statement is affixed, without needing to verify its veracity or keep records. (§ 2257(f)(4)) Presumably this is why you think we would only need a copy of the compliance statement from the uploader. But I don't see any such exception for producers, a group which might include many editors here, not just those who upload the content.

For details, see meta:Wikilegal/Age Record Requirement and the relevant legislation linked above. --Avenue (talk) 02:59, 19 June 2012 (UTC)[reply]

Thanks for clarifying that. So the producer would have to put an actual address on each web page containing the image, but not their name?
The name of an individual producer does not have to be shown. If the producer is an organisation, the relevant employee's name and job title must be shown. See § 2257(e). --Avenue (talk) 12:50, 19 June 2012 (UTC)[reply]
Also, the EFF says the details do not have to be on the web page - a link from each page to a compliance statement with relevant details would do. If we had a central repository for the records, a link on the main page might be enough. --Avenue (talk) 21:39, 19 June 2012 (UTC)[reply]
Although it matters that we clarify the legal position, there is a moral perspective too. As a community, we care that children are not exploited sexually and that no one's privacy is violated. Obliging uploaders to comply with 2257 would help to ensure we are not encouraging or enabling that. The degree of legal compulsion we're under here is important to clarify but, regardless of that, 2257 provides us with a useful framework for doing what we want to do anyway. --Anthonyhcole (talk) 04:50, 19 June 2012 (UTC)[reply]
So, since WAANL (We All Are Not Lawyers), why not let any editor who's concerned about any potential legal issue talk to someone who is? In the meantime, WMF's GC has made his statement (which raises more questions than answers, and that does seem to be the current status of the law), and since WMF isn't directly affected, they're not going to jump in. (That, and if they were going to, they already would have.) As to the "moral position," the vast majority of images I've seen cited in this discussion involve either A: Models who are very clearly over 18 and show clear awareness that they are being photographed/filmed, or B: Subjects who are unidentifiable, because not enough of them is shown. If a particular image fails one or both of those criteria, I'd say we should carefully consider whether we should retain it (or in the case of it obviously being a kid, nuke it and call the police.) But I've seen no examples whatsoever, where the model is identifiable and appears to be a child and/or photographed without knowledge or against his or her will. Seraphimblade Talk to me 08:12, 19 June 2012 (UTC)[reply]
It is often very difficult to say what the age of a performer is. For example, the media on the ejaculation page might conceivably show a minor, which would mean that everybody viewing the page is breaking the law. Images uploaded to Commons that only show genitals may turn out to be cropped versions of an image available elsewhere that shows an identifiable minor. Basically, we have no idea what age any of the people in these and other images are. Sloppy. JN466 12:34, 19 June 2012 (UTC)[reply]
Purely theoretical: no one knows the age or the person's identity. How could it be illegal, or someone be found guilty, without knowing age and identity? --/人 ‿‿ 人\ 署名の宣言 15:30, 19 June 2012 (UTC)[reply]
Law enforcement does. They will have access to hundreds of thousands of images, and that image of someone's genitalia that was copied off the web, and that someone on Commons has been arguing 'keep' on or adding categories to, may well be just one of a series in which the identity and age are NOT beyond dispute. John lilburne (talk) 15:46, 19 June 2012 (UTC)[reply]
*cough* and whilst that case involved secret filming, I note that one of the appeal judges in the case commented "Jahnke's argument would mean that his girlfriend's privacy interest in not being recorded in the nude was left unprotected any time she permitted anyone, under any circumstances, to view her nude, he said, and if it were legal to record her in those circumstances, it would be legal to distribute the recording." Legal opinion is increasingly turning to the view that privacy rights are not terminated simply because one allows something to happen in private. John lilburne (talk) 12:54, 19 June 2012 (UTC)[reply]
Some people use a 2257 record-keeping service, which means that they do not have to give their address and open their home to record inspectors. Images shared purely privately between spouses e.g. are exempt, but images "traded" with strangers online are not, even if they are of oneself. The law actually requires you to make a photocopy of your own ID, and keep it on file, available for inspection. JN466 12:34, 19 June 2012 (UTC)[reply]
JN, you keep ducking this question when I make any other point with it, so here it is by itself. If you're concerned about legalities, but we've already agreed that individual editors, not the project, are the only ones at any conceivable risk, why don't you stay away from the area if you find it too risky, and let others willing to accept the risk do so if they want to? Since the project itself is in no jeopardy, why are you arguing we should make legal decisions for individual editors based upon murky, untested, and quite possibly unconstitutional law? Even the GC, who has a law degree and a significant amount of experience in practice (unlike, I would imagine, any of us here), stated in his analysis that the law is very unclear and untested, especially in a case like ours. Seraphimblade Talk to me 16:44, 19 June 2012 (UTC)[reply]
You could similarly argue that WMF is not responsible for copyright violations, at least if it responds to DMCA takedown notices, so why bother removing them until a notice is filed? Individual editors would be the ones legally responsible. Do you think that would be desirable? I take your point that the law is untested, quite possibly unconstitutional, and I'd agree that it's unclear exactly which editors would be at risk. But does that mean the community should ignore the law completely? --Avenue (talk) 21:22, 19 June 2012 (UTC)[reply]
The reason is that Wikipedia was a challenge to copyright law. Its purpose was to create material everyone could use. To have copyrighted stuff mixed in with the other content means that a reuser is less likely to be able to use that content, and he can't tell the difference: nothing on the copied text or image tells you whether copyright is going to be applied to it or not. And copyright has been highly internationalized, so the idea is that text that is OK in one place will be OK in many others. But a reuser can tell, by looking at a photo directly, whether it is going to violate some country's local censorship law, so that's not something that has to be done in advance for him; and even if it were, what to do in advance would vary for every end user. Wnt (talk) 22:32, 19 June 2012 (UTC)[reply]
That's a good point about copyright problems being invisible, in contrast to sexually explicit content, which should be obvious to any reuser examining it. However, if we don't collect and release enough information to enable them to satisfy their obligations under § 2257, US reusers will not be able to legally use it. It would therefore not really be free content, regardless of its copyright/licensing status. To allow legal reuse of sexually explicit content under § 2257, I think we would have to not only keep records of performers' names and dates of birth, but also copies of their identification documents, and make these available to reusers. (This might discourage some potential content donors.) I don't think we can worry about censorship laws in every country, but the US is a special case. --Avenue (talk) 01:00, 20 June 2012 (UTC)[reply]
There are legal mysteries here I can't answer - my assumption would be that if a photo was taken outside the U.S., 2257 doesn't apply to it. I also assume that if a foreign citizen uploads an image and doesn't tell you it was from the U.S., you can't be expected to keep 2257 paperwork. And American citizens uploading U.S.-made recent images are already warned by the WMF counsel of the possibility that this unconstitutional law might be used against them. But one of two things should be true: either any American photo should be 'washable' (copyright licensing permitting) by some foreigner stripping out the 2257 crap and posting it on some server, or else using any random foreign photo puts unknowing Americans at great risk of 2257 prosecution. I'm thinking the former. Wnt (talk) 16:57, 20 June 2012 (UTC)[reply]

Hello Jimbo. Am I to understand that your putting the above into your archive without bothering to answer my post or my e-mail means that you don't think it necessary to take any action, and that you want me to take it to court? And if so, are you sure that that is in the best interest of Wikipedia? I'd appreciate an answer. Thanks, Ajnem (talk) 07:12, 18 June 2012 (UTC)[reply]

Please see Wikipedia:No legal threats. I'd advise reading it all the way through carefully. If you want to engage in legal stuff like that then that is your business, but you should get in contact with the Wikimedia Foundation and not go on about it here and you should cease any editing on any relevant subject. Dmcq (talk) 07:41, 18 June 2012 (UTC)[reply]
It's no threat, and it's not a legal action against Wikipedia I'm referring to. And I'm afraid I don't understand the rest of the post. I do have an e-mail address, though. Ajnem (talk) 08:49, 18 June 2012 (UTC)[reply]
"and that you want me to take it to court?" sounds like a legal threat to me. I have started WP:AN/I#Possible legal threat on Jimbo's talk page since you don't seem to be taking the message onboard. Dmcq (talk) 10:26, 18 June 2012 (UTC)[reply]
Ajnem has been blocked indefinitely, or until such time as they clarify on their talk page that they are not contemplating legal action. Dmcq (talk) 12:14, 18 June 2012 (UTC)[reply]
Are you thinking of a pink elephant right now? Or intending to later ;) Penyulap 08:22, 19 Jun 2012 (UTC)
In any event, I can barely read German and so I'm not the right person to ask about this. If legal action is contemplated, then a chat with Geoff Brigham may be useful, or someone at the German chapter may be able to provide some feedback. --Jimbo Wales (talk) 14:31, 18 June 2012 (UTC)[reply]

Thinking 52,700 weekly 10,100 daily editors

See thread below: #Evidence many high-count editors working higher

In trying to estimate the editor-count statistics for May 2012 (which seem to have been under-counted, at roughly 71%-74% of prior levels), I noticed the monthly counts for "≥3 edits" and "≥25 edits", which IMHO are more what I would follow, as proxies for "weekly edits" and "daily edits". The current garbled data table, reproduced in the thread below, shows these editor counts.

A person who edits once per week always counts in "≥3 edits" but perhaps never in "≥5 edits"; that bracket also includes semi-active editors, whom I have seen edit only 4 times in some months. Also, the counts for "≥25 edits" would (finally) include people who made only 99 edits per month (which is a lot like 100 per month!). Hence, a person who makes a few daily edits, such as 1, 2 or 3 edits per day (31/62/93 per month), would count among the 10,100 editors who make "≥25 edits" per month.

Avoiding extreme edit-counts: While I think the counts of editors making 3+ or 25+ monthly edits are better statistics, I would beware 1-edit or 1,000-edit counts, as perhaps very misleading:

  • I see a 1-edit month as almost a mistake, or as "They quit but had 1 edit left to say".
  • But 1,000 edits is more like "click edit 1,000 times" (or reverts).

The rationale for questioning 1,000 monthly edits would be to beware rote, 1-word changes, compared with making perhaps 95 monthly edits which each expanded text or bundled several changes into one edit. I think there will be a strong correlation there, between high monthly edit-counts and 1-word changes or reverts. Meanwhile, the counts for 100-edit levels are still valuable, but omitting the editors who make 25-99 monthly edits (as not being "highly active") gives the false impression that people who make only 2-3 edits per day are not doing much beyond those who make 5 edits per month. Hence, I recommend watching the counts for "≥3 edits" (as "weekly editors") and "≥25 edits" (as "daily editors"). In that sense, we have, on average, more than 10,100 daily editors (≥25 edits), more than triple the official "highly active" editors (3,500 at ≥100). So that figure, exceeding 10,100 editors, helps to explain what happens in day-to-day activity during the month on English Wikipedia. Again, the count of daily editors (≥25 edits) is more than triple the ≥100 count, and a better indicator of editing patterns. -Wikid77 (talk) 15:14, 18 June 2012, revised 09:10, 19 June 2012 (UTC)[reply]
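To make the proposed brackets concrete, here is a minimal sketch (in Python) of how monthly edit totals would map onto them; the thresholds (≥3 as "weekly", ≥25 as "daily") are the ones proposed above, while the sample counts are invented purely for illustration:

```python
# Classify a monthly edit count into the activity brackets proposed
# above: >=3 edits/month ("weekly editor"), >=25 edits/month ("daily
# editor"). The sample counts below are invented, for illustration only.
THRESHOLDS = [
    (25, "daily editor (>=25 edits/month)"),
    (3, "weekly editor (>=3 edits/month)"),
    (1, "occasional editor (1-2 edits/month)"),
]

def classify(monthly_edits: int) -> str:
    for minimum, label in THRESHOLDS:
        if monthly_edits >= minimum:
            return label
    return "inactive (0 edits/month)"

for count in [0, 1, 4, 31, 99, 1000]:
    print(f"{count:>4} edits/month -> {classify(count)}")
```

The exact cut-offs are, as the rest of this thread shows, debatable; the sketch only makes the bucketing mechanical.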

  • Other considerations: There are several other issues involved in the counting of active editors.
  • The 3-table split by article/talk/other namespaces lowers the active-editor counts: a person who made 95 article edits and 90 talk-page edits plus 70 template or image edits (255 in total) would not be counted as "very highly active" (>250 edits), but merely as an "active editor" with occasional edits.
  • Editor levels should be compared as per-day counts: divide January's total by 31 days and February's by 28 (or 29 in a leap year) to show that February levels are actually higher, even though the February whole-month totals seem lower than January's (see the sketch after this list).
  • The trends need to be confirmed by the full data counts, when the truncated-count data file is updated at end of June 2012.
More issues later. -Wikid77 (talk) 12:02, 20 June 2012 (UTC)[reply]
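Here is a minimal sketch of the per-day comparison suggested in the list above, using the January and February 2012 "≥1 edit" article-editor counts from the data table in the thread below; the month lengths come from Python's standard calendar module:

```python
import calendar

# Normalise whole-month editor counts to per-day rates so that short
# months (February) can be compared fairly against long ones (January).
# Counts are the ">=1 edit" article-editor figures for early 2012,
# taken from the data table quoted in the thread below.
monthly_counts = {
    (2012, 1): 88887,  # January 2012
    (2012, 2): 85915,  # February 2012 (leap year: 29 days)
}

for (year, month), count in sorted(monthly_counts.items()):
    days = calendar.monthrange(year, month)[1]  # days in that month
    print(f"{year}-{month:02d}: {count} editors / {days} days "
          f"= {count / days:.0f} per day")
```

Run as-is, this shows February 2012 slightly ahead of January on a per-day basis (2963 vs 2867 editors per day), exactly the effect described above.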
  • Looking at large numbers of contributor histories suggests that few editors totalling fewer than 100 edits per month contribute at a steady 1-5 edits per day. Much more typical are irregular bouts of, say, 10-20 edits or more in one session. Many editors have a pattern of such sessions at intervals greater than one month - i.e. they do nothing for, say, 3 months, then write an article, check over old ones, or whatever. A separate table of average per-day figures derived from the monthlies would be useful, though the number of weekends and holidays in a month will also affect these. Johnbod (talk) 12:56, 20 June 2012 (UTC)[reply]
Would counts >35, >55 or >65 better reflect daily activity? Especially considering the short months, perhaps >65 would be more representative of "daily" editors. I have seen several cases of 70-edit editors counted among the merely occasional >5 editors. So the suggested level might be ">65" (or ">55", or similar) to separate the >3 editors from the "daily editors". Otherwise, counting 99 edits the same as 5 under-reports the editors who are quite active, making 99 article changes per month. I am not sure Jimbo has thought about how many edits per month would indicate "daily" editors. -Wikid77 (talk) 16:34, 20 June 2012 (UTC)[reply]

Evidence many high-count editors working higher

See thread above: #Thinking 52,700 weekly 10,100 daily editors.

The truncated data tables (down 26%-29%) show a slight drop this year in the occasional editors, with fewer at the lower monthly-edit levels; meanwhile, many others seem to be working at higher activity. The corrected data tables are likely to show the same trends. It is just not true that "everyone" has reduced to "fewer edits per month"; instead, for whichever editors slow down or quit, other editors quicken to replace them, on average, each month. Viewing the monthly editor counts (at 3+, 5+, 10+, 25+, 100+, 250+, 1000+, 2500+ edits), the number of editors logged at the lower edit counts has been shrinking during the past 16 months. However, on average, the number of editors making >100 or >250 edits per month (the latter being 8+ per day) has increased (to "970" editors at >250), compared with the "25,377" editors making >25 edits per month. Contrary to talk of slowed activity, the number of high-activity editors has grown larger, relative to the other groups.

Even though the data tables, this week, are based on truncated counts, varying from 71%-74% of typical monthly counts, the relative counts show there is a slight shift in monthly-edit patterns: the pool of semi-active editors is slowly shrinking, as the highly-active editors increase.

Counts of editors making changes to articles

Edits ≥       1      3      5     10     25    100    250   1000   2500  10000  25000
May 2012  85389  38240  25377  14704   7334   2449    970    136     25      2
Apr 2012  83849  37270  24928  14684   7459   2376    905    108     25      2      1
Mar 2012  84950  38159  25309  14798   7408   2422    963    123     22      3
Feb 2012  85915  37494  24568  14325   7198   2289    870    103     23      3
Jan 2012  88887  38835  25587  14813   7451   2486    968    122     28      3
Dec 2011  83402  37144  24995  14652   7300   2345    935    102     26      2
Nov 2011  86604  37747  24991  14467   7208   2330    862    116     28      2      1
Oct 2011  89143  38744  25374  14712   7210   2409    907    121     18      1
Sep 2011  88094  38214  25233  14507   7208   2300    906    116     26
Aug 2011  89351  39373  26072  15213   7572   2434    975    105     28      2
Jul 2011  87818  39160  26017  15209   7563   2349    862    118     26      2
Jun 2011  89447  39316  25954  15095   7419   2336    866    111     25      1
May 2011  93028  40417  26584  15434   7456   2399    879    121     22
Apr 2011  93575  40862  26710  15486   7524   2357    873     96     17
Mar 2011  98298  42387  27608  15951   7818   2483    908    110     18
Feb 2011  90999  40029  26263  15117   7452   2375    912    101     26      1
Jan 2011  91088  40483  27052  15773   7857   2552   1014    127     23      1
Dec 2010  81897  36515  24306  14207   7028   2298    847    107     19      1
Nov 2010  84393  37406  24846  14391   7237   2397    896    128     31      2
Oct 2010  86501  38517  25563  14968   7572   2494   1004    139     24
Sep 2010  84251  37900  25220  14819   7559   2471    933    113     32      2      1
Aug 2010  85957  38676  25828  15275   7753   2549    963    127     27      2
Jul 2010  85451  38358  25606  15087   7584   2517    931    129     24      1
Jun 2010  86939  38513  25703  14859   7392   2396    885    112     27      1
May 2010  95299  42113  27852  16113   7957   2543    994    137     25      2
Apr 2010  94754  41780  27420  15791   7906   2590    958    123     21      1
Mar 2010  98218  43014  28295  16311   8098   2616   1030    131     26
Feb 2010  93522  40756  26897  15533   7744   2489    959    121     16
Jan 2010  97615  42634  28187  16366   8118   2714   1101     99     15
Edits ≥       1      3      5     10     25    100    250   1000   2500  10000  25000

Data from table #2 in: http://stats.wikimedia.org/EN/TablesWikipediaEN.htm

Again, it is just not true that "numerous editors" have all reduced to "fewer edits per month"; instead, for whichever editors have slowed down or quit, other editors have quickened to replace them, on average, each month. The preliminary data shows that, as the pool of semi-active editors has been shrinking by thousands, the core of highly active editors has been growing by hundreds who make several edits per day (over the past 4 months). -Wikid77 (talk) 09:10, 19 June 2012, revised 06:17, 20 June 2012 (UTC)[reply]
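As a rough check on this claim, the following minimal sketch computes, from the table above, what share of active article editors (≥25 edits/month) were highly active (≥250 edits/month) in three sample months; the figures are copied straight from the table, and the May 2012 row is from the truncated data:

```python
# Share of active article editors (>=25 edits/month) who are highly
# active (>=250 edits/month), using values copied from the table above.
# Note: the May 2012 figures come from the truncated data dump.
data = {
    "May 2010": (7957, 994),
    "May 2011": (7456, 879),
    "May 2012": (7334, 970),
}

for month, (active, highly_active) in data.items():
    share = 100 * highly_active / active
    print(f"{month}: {highly_active}/{active} = {share:.1f}% highly active")
```

On these figures the highly-active share dips from about 12.5% (May 2010) to 11.8% (May 2011) and then rises to about 13.2% (May 2012); since the truncation affects numerator and denominator alike, the ratios should be roughly comparable across years.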

Please correct me if I am wrong, but I think you must mean to say "working more," however, I would not argue with the premise in your choice of section header. Now, please tell us whether they are working smarter. 17:55, 19 June 2012 (UTC) — Preceding unsigned comment added by 207.174.73.89 (talk)
  • Data shows remaining editors talking more and perhaps smarter: Well, we hope the remaining editors are working smarter, but at least the data shows they are talking more, in the sense that talk-page editor counts, during the past 13 months, have remained at the same monthly levels, or slightly higher, while perhaps 2,000 to 7,000 semi-active article editors are no longer active (editing articles). So there are more talk-page edits, per person, averaged over the fewer people who edit articles. Hence, people are talking more. The data tables are split into 3 groups: article editors, talk-page editors, and other editors. Perhaps the semi-active editors have been reduced by the blocking of many occasional troublemakers, who rarely edited talk pages. Anyway, with more than 10,100 users who edit articles >25 times per month, it is difficult to "summarize" who those people are. It is good to know that talk-page edits are level, or higher, so there is zero evidence that numerous departed talkers have left grumpy non-talkers behind in frustration. Instead, the numerous editors who no longer edit articles monthly had essentially no effect on talk-page edit levels; it is as though they weren't talking much, and they simply left. Meanwhile, the 3rd group of "other editors", such as those working on templates, fair-use images, or help/policy pages, has increased slightly (3%) during the past 3 months. So, if working on templates, uploading images, or writing help/policy pages means those people are smart, then the group is in fact working "smarter". Overall, the higher editor counts have been pushing higher, not lower, during the past 4 months, compared with the prior year. -Wikid77 (talk) 06:17, 20 June 2012 (UTC)[reply]
I agree that the proportion of users working on templates is a better metric than word or sentence length in measuring the extent to which editors are more knowledgeable, but there is a point of diminishing returns. If you really want to monitor this, you might be interested in measuring the rate at which articles in WP:SPVA grow. You can get their size from the mw:API. 75.166.206.120 (talk) 19:23, 20 June 2012 (UTC)[reply]
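For anyone who wants to experiment with that, here is a minimal sketch of fetching current article sizes through the MediaWiki API; `action=query` with `prop=info` returns a per-page "length" field (bytes of wikitext). The two titles are placeholders, to be replaced with articles listed at WP:SPVA; polling this periodically would give a growth rate:

```python
import json
import urllib.parse
import urllib.request

# Fetch current article sizes (bytes of wikitext) from the MediaWiki
# API, using action=query with prop=info, which reports a "length"
# field per page. Titles below are placeholders for WP:SPVA articles.
titles = ["Earth", "Moon"]  # placeholder article titles

params = urllib.parse.urlencode({
    "action": "query",
    "prop": "info",
    "titles": "|".join(titles),
    "format": "json",
})
url = "https://en.wikipedia.org/w/api.php?" + params
request = urllib.request.Request(
    url, headers={"User-Agent": "spva-size-check/0.1"}
)

with urllib.request.urlopen(request) as response:
    pages = json.load(response)["query"]["pages"]

for page in pages.values():
    print(f"{page['title']}: {page['length']} bytes")
```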

your opinion

Please read this thread: [21]. I would like your opinion on the last comment by user:frotz, [22].-- altetendekrabbe  15:21, 18 June 2012 (UTC)[reply]

To the extent he's calling for reliable sources, I agree with him. To the extent he's personally editorializing, well, I have my private opinion but - like his - it isn't important for Wikipedia.--Jimbo Wales (talk) 16:30, 19 June 2012 (UTC)[reply]

Discussion at Talk:Rape_culture#RFC_-_Multiple_Factors

No strong opinion--Jimbo Wales (talk) 16:28, 19 June 2012 (UTC)[reply]
The following discussion has been closed. Please do not modify it.

You are invited to join the discussion at Talk:Rape_culture#RFC_-_Multiple_Factors. 4 Points for consideration - Synonymic Usage, Quotations, Sources. Media-Hound 'D 3rd P^) (talk) 20:13, 18 June 2012 (UTC)[reply]

Ha! Penyulap 08:27, 19 Jun 2012 (UTC)

WMF demands control of Wikinews.com domain

This happened back in April, but I think it's still relevant, and I didn't notice a discussion about it in the archives. So, basically, the WMF allowed the website to run for seven years, everything perfectly fine and dandy, as Wikinews.com had a link to our Wikinews on their site. And besides, their site was made before our Wikinews, so they are the "legitimate" one, more or less, even if we get vastly more traffic. But then the WMF suddenly demands ownership of the domain and offers to "pay the registration fees"? We all know that is pretty much offering nothing. Not to mention that the Wikinews.com domain hasn't even been transferred yet, just shut down. It looks like the WMF is right on track to becoming a corporate monolith that only cares about itself and not about the people it harms. I was going to say accidentally harms, but this seems pretty intentional. SilverserenC 20:56, 19 June 2012 (UTC)[reply]

-1 uncaring corporate monolith. 64.134.221.135 (talk) 21:18, 19 June 2012 (UTC)[reply]
I don't get it. What harm? According to the story you linked, the site had no content other than a pointer to the wikinews.org site, and had not paid its domain name registration fee for several years. It seems that the owner just wants to get some money in spite of never having invested any effort in the site. Looie496 (talk) 21:30, 19 June 2012 (UTC)[reply]
If it was something that the owner didn't want, then the WMF could have worked out a deal with him. But their first action was to invoke the UDRP and send a cease and desist letter. If this is how the WMF acts outwardly, then it is extremely concerning. If the WMF had filed this sort of thing when they originally obtained the trademark for Wikinews, then it would make sense, not years later. And, again, they have yet to even do anything with the domain, so it seems like this was some sort of posturing, i'm not even sure. SilverserenC 21:42, 19 June 2012 (UTC)[reply]
This is standard procedure for all Internet entities interested in protecting their brand, so nothing to see here. Call me when they desysop and block the knuckle dragging cretinous morons currently running the Wikinews site, at which point I will send the WMF a large check. Viriditas (talk) 22:33, 19 June 2012 (UTC)[reply]
If you were going to fork Wikinews, are there any policy changes you would institute, or do you think the problem lies entirely with personalities? 75.166.206.120 (talk) 04:40, 20 June 2012 (UTC)[reply]
Closing interesting but off-topic disagreement between Viriditas and William S. Saturn. I encourage people who are interested to take this particular discussion to wikinews.
The following discussion has been closed. Please do not modify it.
I think implementing a global policy initiative and tightly integrating the wikis into more of a "federation" would help a great deal. The Wikinews admins are fond of exclaiming "This isn't Wikipedia!" and it would put a stop to the irrationality over there. We don't really need to posit a forking thought experiment; what we need are metrics and "secret shoppers"—WMF employees who go undercover as new users while abiding by a set of protocols to discover what works and what doesn't. It's pretty much a given that the inmates have taken over the asylum over at Commons, and IMO, also over at Wikinews. (I've heard this said about Wikiversity, but I don't know if this is true.) In any case, if the WMF isn't actively pursuing a mystery shopping-type program to find out how new users are being treated and what kind of hoops editors have to jump through in order to get anything done, then they should. Viriditas (talk) 05:24, 20 June 2012 (UTC)[reply]
There is nothing irrational about the processes at Wikinews. However, sometimes irrational editors go there, disrupt the project, and treat the volunteers with a lot of disrespect. [23].--William S. Saturn (talk) 05:31, 20 June 2012 (UTC)[reply]
I said the admins are irrational over at Wikinews, not the processes. Try to read closer for comprehension, Mr. Saturn. I haven't disrupted any project, nor do I have the inclination or the free time to do so. Repeatedly saying something over and over again doesn't make it true. The fact of the matter, Mr. Saturn, is that Wikinews attracts disruptive users, users who have managed to bring the site to a crawl with few editors and even fewer edits. The prevailing admins on that site have done everything possible to scare away new users and to discourage editing. They have, in effect, created a walled garden, a site that nobody can use and that nobody reads, and I maintain they have done this on purpose to promote their own interests. Well done, Mr. Saturn. Viriditas (talk) 05:55, 20 June 2012 (UTC)[reply]
Undoubtedly you have an issue with the process, since you believe a review should take only a few seconds. But let's look at how much credibility you have on the matter before judging your comments: you tried to use sources from before an event happened to prove that the event happened; you called for the number one reviewer to step down because he failed your article, based on your inability to understand that things must be verified after they happen; you added more source links than necessary to an article, just to create more work for a reviewer. Plus there is strong evidence you tried to use sockpuppets to continue to disrupt the project after you were unanimously banned by the community.
The fact is that at the moment Wikinews is doing very well with original reporting. New editors such as LauraHale, Crtew, Bddpaux, and others have come along recently and demonstrated excellent reporting. Incompetent users who fail to understand the processes are few and far between.--William S. Saturn (talk) 06:20, 20 June 2012 (UTC)[reply]
The only problem with your "assessment" is that it is completely untrue. I have never done any of the things you claim I have done; this is illustrative of the greater problems over at Wikinews. Your "number one reviewer" made up some crazy sockpuppet story, and others like yourself keep repeating it. There is, at this very moment, approximately zero evidence for such a claim, and I fully addressed it on Cirt's talk page in February. Apparently, your "number one reviewer" blocked multiple users under the false allegation that they were me. This is exactly the kind of bad behavior I'm talking about, and this is why the WMF should take over that site. Such is the state of Wikinews: falsities wrapped up in lies wrapped up in character assassination. Putting aside your penchant for generating and perpetuating fiction, within the last 24 hours there has been virtually no activity on Wikinews, except, of course, activity from your "number one reviewer". Oh wait, according to the recent changes log, an IP did make two edits in the last 24 hours. Even your admins claim that there are only "20 active editors" on Wikinews, but apparently they too have disappeared. Wikinews is not a wiki; it is a closed community with almost no activity, due to the efforts of your "number one reviewer", who has managed to prevent new editors from using the site, block others for bogus reasons, and invent lies about "sock puppets" to keep others blocked. The recent changes log doesn't lie: two edits by one IP in the last 24 hours, with all the other edits made by your "number one reviewer". The evidence speaks louder than the nonsense you've invented. Viriditas (talk) 06:36, 20 June 2012 (UTC)[reply]
Thankfully, readers can go to the site and judge it for themselves. It's all documented there. But I must say, your conspiracy theory doesn't make much sense. Why would the admins want to run people away? What is your evidence for this? The fact that a reviewer failed your article because it used sources from before an event to verify that the event happened? As a "new" user, was it too unreasonable for you to find recent sources that verified your news story rather than simply attack a reviewer? --William S. Saturn (talk) 06:49, 20 June 2012 (UTC)[reply]
Yet another false statement. No, it is not "all documented there", as your "number one reviewer" deleted most of the discussions on the news talk pages that were the locus of this dispute, and that includes multiple news articles. All that is left to review are multiple personal attacks made against me by two admins on my former talk page. Furthermore, the sock puppet/joe job/character assassination was typical of the Wikinews admins. This discussion has nothing to do with my contributions to Wikinews or my dispute with the admin(s). It has to do with the fact that Wikinews is not operating as a wiki, but as a personal web server run by your "number one reviewer", and has almost no new editors nor any activity as a result. The only reason I was even active over there was because I had been personally invited. My experience is that Wikinews does not uphold the values and principles of the WMF in any way and is operating as a rogue wiki. I would encourage the WMF to go undercover, create new accounts, and attempt to write news stories to see it for themselves. Also, be sure to review the recent changes and new editor creation logs. Viriditas (talk) 06:57, 20 June 2012 (UTC)[reply]
I also encourage people to edit wikinews. New users are always welcome.--William S. Saturn (talk) 07:02, 20 June 2012 (UTC)[reply]

I don't know anything about this particular issue, but I would very much not jump to conclusions. As I understand the timeline (correct me if I am wrong), the WMF took no action until the domain was already lapsed. It's really not fair to characterize an offer to go to UDRP as coercing someone - when you have a dispute, you should go through the dispute resolution process. If the previous owner had a right to the name, the process would presumably have found so. (For those who don't know, the UDRP was designed precisely for this kind of situation, to avoid people ending up in very expensive court battles - it's a low-cost way for people to put forward their competing views before a neutral arbitrator!) As it stands, even according to the very biased blog post about this, the owner of the domain was given an offer and accepted it - nothing was taken from him in court or by the UDRP process. It sounds to me that the WMF was quite generous in the offer. Saying that the WMF "allowed the website to run for seven years" is a bit of a stretch - there was never anything materially there.--Jimbo Wales (talk) 09:18, 20 June 2012 (UTC)[reply]

This story looks similar to that of steam.com, which is "not for sale" even though Valve Corporation would pay a fortune for it, and even though that website is swamped by steampowered.com. The WMF should just move on and realize that their wikinews.org site gets orders of magnitude more hits than wikinews.com. Wer900talkcoordinationconsensus defined 16:26, 20 June 2012 (UTC)
A significant order of magnitude as well; it got practically no traffic. Likely, this is because people get to Wikinews not by typing in the URL, but by following a link from some other wiki page, like our front page, or just by running into a Wikinews article on Google. No, Jimbo, the issue I have is two-fold. 1) The article states that the UDRP standard was not met, as there was no bad faith intended by the URL's owner (as even stated by Jay Walsh). Therefore, we had nothing to stand on, even less so considering the URL was taken before we ever made Wikinews. And 2) It's been two months and still nothing has been done with the URL. If you're going to bully someone into handing it over (and it's quite clear from Castells' comments that this is indeed bullying), then at least make use of it. Otherwise, it's like we're punching him in the face a second time, because we're showing we didn't actually need the URL at all and this was all a show. SilverserenC 19:23, 20 June 2012 (UTC)[reply]