I've noticed a problem with overzealous deletionists on Commons. While this may be something of a legal and political issue, it's also operational and affects multiple *[m,p]edias at the same time.
I've spent some time over the years convincing public figures that we need official pictures released for articles, rather than relying on fan (or publicity or staff) produced pictures. Because of my own experience in the academic, computing, political, and music industries, I've had a modicum of success.
I also ask them to create an official user identity for posting the pictures. Since Single User Login (SUL), this has the added benefit that nobody else can impersonate them. From their point of view, it's the same reason they make sure they have official Facebook, LinkedIn, and Twitter accounts.
This week, one of the Commons administrators (Yann) ran a script of some sort that flagged hundreds of pictures for deletion, apparently triggered by the presence of the word "facebook" in the description. At a rate of more than one nomination per minute, there was no time for actual legal analysis. The only rationale given was: "From Facebook. No permission."
https://commons.wikimedia.org/wiki/Commons:Deletion_requests/File:Sharon_Agu...
In this case, timestamps indicate the Commons photo was posted before the Facebook photo, and the Facebook version is somewhat smaller, so there's not even a hint that it was copied "From Facebook." Besides, many public figures also have Facebook accounts, so it shouldn't matter that a photo appears in both places.
A bot posted a link to the deletion notice on the talk page of the en.wiki article that used the photo, which is how it turned up in my watchlist.
Then, despite my protest noting that the correct copyright release was included, the administrator (Yann) argued that "The EXIF data says that the author is John Taylor. The uploader has another name, so I don't think he is allowed to decide a license."
That appears to be a post-hoc explanation, since the "From Facebook" rationale obviously wasn't applicable: a self-justifying strawman argument.
In this case, as is usual in most industries, it's the *camera* owner who appears in the EXIF data. A public figure who pays a studio for headshots owns the picture itself. The photographer would need the public figure's permission to distribute the photo!
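Aside, for anyone who hasn't poked at this: the EXIF "Artist" field is just free text configured in the camera's setup menu, so it records whoever set up the camera body, not who holds the copyright. A quick illustration with Pillow; the filename and output are made up:

    from PIL import Image

    # TIFF/EXIF tag 315 (0x013B) is "Artist": free text typed once into the
    # camera's setup menu and then stamped on every frame that body shoots.
    ARTIST_TAG = 315

    exif = Image.open("headshot.jpg").getexif()   # hypothetical filename
    print(exif.get(ARTIST_TAG))                   # e.g. "John Taylor"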
After I pointed out that the nomination didn't even remotely meet the deletion policy's nomination requirements (which I cited and quoted), this administrator wrote: "I see that discussion with you is quite useless."
Then, minutes later, another administrator, Béria Lima, deleted the photo without waiting for the official 7-day comment period to expire. That indicates collusion, not independent review.
There are a number of obvious technical issues here. YouTube and others have had to handle this; it's time for us to do the same.
1) The DMCA doesn't require a takedown until there's been a complaint. We really shouldn't allow deletion until there's been an actual complaint. We need technical means for recording official notices and appeals (a record-keeping sketch follows the list). Informal opinions of ill-informed volunteers aren't helpful.
2) Fast scripting and insufficient notice lead to flapping of images, and to confusion among the owners of the works (and the editors of articles; 2 days is much *much* too short for most of us). We need something to enforce minimum review times (a second sketch follows the list).
3) Folks in other industries aren't monitoring Talk pages, so they get no effective notice that their photos are being deleted. The Talk mechanism is really not a good method for anybody other than very active Wikipedians. We need better email and other social notifications.
4) We really don't have a method to "prove" that a username is actually under control of the public figure. Hard to do. Needs discussion.
5) We probably could use some kind of comparison utility to help confirm or deny that a photo or article is derived from another source (a perceptual-hash sketch follows).
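On point 1, here's a minimal sketch (Python) of the kind of structured record I have in mind for official notices and appeals. The field names and statuses are purely illustrative, not a schema proposal:

    from dataclasses import dataclass, field
    from datetime import datetime
    from enum import Enum

    class NoticeStatus(Enum):
        RECEIVED = "received"
        UNDER_REVIEW = "under_review"
        TAKEN_DOWN = "taken_down"
        COUNTER_NOTICE_FILED = "counter_notice_filed"
        RESTORED = "restored"

    @dataclass
    class TakedownNotice:
        """One formal complaint against one file, with its full audit trail."""
        file_title: str        # e.g. "File:Example_headshot.jpg" (made up)
        complainant: str       # contact details of the party claiming infringement
        sworn_statement: str   # the statutory good-faith declaration
        received_at: datetime
        status: NoticeStatus = NoticeStatus.RECEIVED
        events: list = field(default_factory=list)   # timestamped actions/appeals

        def log(self, action: str) -> None:
            self.events.append((datetime.utcnow(), action))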
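On point 2, enforcement could be as simple as a guard that refuses to action a deletion request before the advertised comment period has elapsed. A sketch, where close_request() and the request object are hypothetical, not part of any existing API:

    from datetime import datetime, timedelta

    MIN_REVIEW_PERIOD = timedelta(days=7)   # the advertised comment period

    def may_close(nominated_at, now=None):
        """True only once the full comment period has elapsed."""
        now = now or datetime.utcnow()
        return now - nominated_at >= MIN_REVIEW_PERIOD

    def close_request(request):
        # "request" is a hypothetical deletion-request object with
        # .nominated_at and .delete(); neither exists in any current API.
        if not may_close(request.nominated_at):
            raise PermissionError("still inside the %d-day comment period"
                                  % MIN_REVIEW_PERIOD.days)
        request.delete()   # only reachable after the period has expired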
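On point 5, perceptual hashing is one well-worn way to spot likely copies even after resizing or recompression. A minimal difference-hash ("dHash") sketch using only Pillow; the filenames and the 10-bit threshold are made up for illustration:

    from PIL import Image

    def dhash(path, hash_size=8):
        """Difference hash: survives resizing/recompression, unlike a byte compare."""
        img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
        pixels = list(img.getdata())
        bits = []
        for row in range(hash_size):
            for col in range(hash_size):
                left = pixels[row * (hash_size + 1) + col]
                right = pixels[row * (hash_size + 1) + col + 1]
                bits.append(left > right)
        return bits

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    # Small Hamming distance => very likely the same picture, regardless of
    # which site happened to host which copy first. Filenames are hypothetical.
    distance = hamming(dhash("commons_version.jpg"), dhash("facebook_version.jpg"))
    print("probable copy" if distance <= 10 else "probably different images")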
If there's a better place to discuss this, please indicate.