"The GCHQ mass tapping operation has been built up over five years by attaching intercept probes to transatlantic fibre-optic cables where they land on British shores carrying data to western Europe from telephone exchanges and internet servers in north America.
This was done under secret agreements with commercial companies, described in one document as "intercept partners"."
http://www.guardian.co.uk/uk/2013/jun/21/gchq-cables-secret-world-communicat...
Fred
Someone said a long time ago, I think it was someone from the EFF, that the problem will not be Big Brother, but all of the Little Brothers.
----- Original Message ----- From: Fred Bauder fredbaud@fairpoint.net To: mediawiki-l@lists.wikimedia.org Cc: Sent: Saturday, June 22, 2013 4:59 AM Subject: [MediaWiki-l] Tapping into the Backbone
"The GCHQ mass tapping operation has been built up over five years by attaching intercept probes to transatlantic fibre-optic cables where they land on British shores carrying data to western Europe from telephone exchanges and internet servers in north America.
This was done under secret agreements with commercial companies, described in one document as "intercept partners"."
http://www.guardian.co.uk/uk/2013/jun/21/gchq-cables-secret-world-communicat...
Fred
_______________________________________________ MediaWiki-l mailing list MediaWiki-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
Doh! I meant to reply privately.
----- Original Message ----- From: Al alj62888@yahoo.com To: "fredbaud@fairpoint.net" fredbaud@fairpoint.net; MediaWiki announcements and site admin list mediawiki-l@lists.wikimedia.org Cc: Sent: Saturday, June 22, 2013 7:26 AM Subject: Re: [MediaWiki-l] Tapping into the Backbone
Someone said a long time ago, I think from EFF, that the problem will not be Big Brother, but all of the Little Brothers.
Hi, we have a fairly large MediaWiki install and I've been porting it to run on Heroku. Part of this has been upgrading from 1.13 to 1.20 and moving images from the on-disk /images directory to S3 (the move to S3 is required because Heroku doesn't provide a persistent filesystem).
We are using the LocalS3Repo extension (modified), running on Heroku…
When I upload a file it only puts the file on S3. It does not put the file in a /thumb/ directory. This seems to be a fact of life right now. But the wiki will generate URLs for images as if the thumbnail image had also been uploaded...
For example: if I upload a file called "test_upload1.jpg" I see this file on S3:
http://s3.amazonaws.com/[BUCKET]/images/a/a7/Test_upload1.jpg
Which works with this Wiki Markup:
[[File:test-upload1.jpg]]
But if I use the following Wiki Markup:
[[File:test-upload1.jpg|50px]]
the URL for that IMG tag is :
http://s3.amazonaws.com/[BUCKET]/images/thumb/a/a7/Test_upload1.jpg/50px...
This won't work on my system because the thumbnail is never generated.
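For context, the thumb URL being generated there follows MediaWiki's standard hashed-layout convention rather than anything LocalS3Repo-specific; a rough sketch of how the pieces combine (the function name is mine, purely illustrative, not the actual repo code):

```php
<?php
// Illustrative sketch of MediaWiki's conventional hashed thumb layout:
// the two hash directories come from the MD5 of the file name, and the
// thumbnail lives in a directory named after the original file,
// prefixed with the requested width.
function thumbUrlSketch( $base, $name, $width ) {
    $hash = md5( $name );
    $dir  = $hash[0] . '/' . substr( $hash, 0, 2 );
    return "$base/images/thumb/$dir/$name/{$width}px-$name";
}

echo thumbUrlSketch( 'http://s3.amazonaws.com/[BUCKET]', 'Test_upload1.jpg', 50 ), "\n";
// -> http://s3.amazonaws.com/[BUCKET]/images/thumb/a/a7/Test_upload1.jpg/50px-Test_upload1.jpg
```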
----------------- How do I make the wiki always use the SAME URL:
http://s3.amazonaws.com/[BUCKET]/images/a/a7/Test_upload1.jpg
no matter what size is specified in the File: wiki markup? (I know the IMG tag's width and height attributes will resize it anyway.) I know it's a performance hit.
----------------- Please note: I don't care that this is wrong; the LocalS3Repo extension is horrible and I don't want to spend any more time combing through thousands of lines of code to fix this. I just want images to work.
And also please note: on Heroku the file upload process seems NOT to be able to upload the thumb file (likely because we're running multiple dynos/servers and the request for generating the thumbnail ends up on a machine that does not have the image file, or maybe for some other reason). Either way, thumbnail generation is not going to work on a multiple-server setup.
Any pointers appreciated. But a pointer to the code that decides on the URL based on the file name would be great.
-- Mike Papper
Set $wgUseImageResize = false; or adjust $wgThumbnailScriptPath.
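Spelled out as a LocalSettings.php fragment (a sketch only; untested against LocalS3Repo):

```php
# Option 1: disable server-side thumbnailing. [[File:...|50px]] then
# emits the original upload's URL and lets the browser scale the image
# via the <img> width/height attributes (the performance hit you noted).
$wgUseImageResize = false;

# Option 2: keep thumbnailing, but route thumb URLs through a script
# that renders them on demand, instead of linking to pre-generated
# files under images/thumb/.
# $wgThumbnailScriptPath = "$wgScriptPath/thumb.php";
```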
Hi, we have a fairly large MediaWiki install and I've been porting it to run on Heroku. Part of this has been upgrading from 1.13 to 1.20 and moving images from the on-disk /images directory to S3 (the move to S3 is required because Heroku doesn't provide a persistent filesystem).
We are using the LocalS3Repo extension (modified), running on Heroku…
When we upload a duplicate image, we get the familiar "warning this file is a duplicate of…". From this point if you select "proceed anyway/ignore warning" it will upload and then crash. The crash looks like the S3 plugin is calling the File class to get the image properties (and the image/file does not exist on disk).
However, if I upload a duplicate image and pre-check "ignore warnings", it uploads without any problems.
Example: I upload the same image on http://localhost/page/File:Upload_test_1.jpg and http://localhost/page/File:Upload_test_2.jpg and on the second upload it displays: This file is a duplicate of the following file:
I proceed and click "ignore warning and proceed anyway" and then see this in the log file:
-----------------------------------
Start request POST /page/Special:Upload
[HTTP headers, cookies, and session/cache chatter trimmed]
UploadBase::createFromRequest: class name: UploadFromStash
UploadFromStash::__construct creating new UploadStash instance with no user
File::getPropsFromPath: Getting file info for en/images/temp/b/bf/20130627052607!phpQGkrFq.png
Use of File::getPropsFromPath was deprecated in MediaWiki 1.19.
[Called from FSs3Repo::getFileProps in /Users/mpapper/work/devforce/dfc-wiki/extensions/LocalS3Repo/FSs3Repo.php at line 661]
FSFile::getProps: Getting file info for en/images/temp/b/bf/20130627052607!phpQGkrFq.png
FSFile::getProps: en/images/temp/b/bf/20130627052607!phpQGkrFq.png NOT FOUND!
------------------------------------------------
I also see the PHP error:
PHP Fatal error: Call to a member function bind() on a non-object in /Users/mpapper/work/devforce/dfc-wiki/includes/upload/UploadBase.php on line 247, referer: http://wiki.local.developerforce.com/page/Special:Upload
That line is this function:
/**
 * @param $srcPath String: the source path
 * @return string the real path if it was a virtual URL
 */
function getRealPath( $srcPath ) {
    wfProfileIn( __METHOD__ );
    $repo = RepoGroup::singleton()->getLocalRepo();
    if ( $repo->isVirtualUrl( $srcPath ) ) {
        // @TODO: just make uploads work with storage paths
        // UploadFromStash loads files via virtual URLs
        $tmpFile = $repo->getLocalCopy( $srcPath );
        $tmpFile->bind( $this ); // keep alive with $thumb   <======== ERROR LINE 247
        wfProfileOut( __METHOD__ );
        return $tmpFile->getPath();
    }
    wfProfileOut( __METHOD__ );
    return $srcPath;
}
Question: how can I make it not crash, or perhaps just always upload and ignore warnings? (I would like to avoid hacking the core MediaWiki code.)
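For what it's worth, a defensive variant of that function would at least avoid the fatal. This is only a sketch (untested; it assumes that falling back to the original path is tolerable for your repo, and it still touches core, just minimally; the real fix is making getLocalCopy() work when the temp file lives only on S3):

```php
function getRealPath( $srcPath ) {
    wfProfileIn( __METHOD__ );
    $repo = RepoGroup::singleton()->getLocalRepo();
    if ( $repo->isVirtualUrl( $srcPath ) ) {
        $tmpFile = $repo->getLocalCopy( $srcPath );
        if ( !is_object( $tmpFile ) ) {
            // getLocalCopy() failed, e.g. because the stashed temp file
            // exists only on S3 and not on this dyno's disk; return the
            // original path instead of calling bind() on a non-object.
            wfProfileOut( __METHOD__ );
            return $srcPath;
        }
        $tmpFile->bind( $this ); // keep alive with $thumb
        wfProfileOut( __METHOD__ );
        return $tmpFile->getPath();
    }
    wfProfileOut( __METHOD__ );
    return $srcPath;
}
```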
Mike Papper
Hi,
I'm not sure why this was posted on the MediaWiki list? It seems to be completely off-topic, like a lot of other topics from your side lately.
While repeating the news for those who may have missed it may be a laudable goal, it is a bit amiss on this mailing list. This list is intended as a help/support list for those running and configuring MediaWiki; it also provides announcements of new versions, bug fixes, and security issues.
Hence, your news is beyond the topic scope of this list and far more appropriate for a personal blog or, perhaps, other lists devoted to spreading news. Now, if you had news that the NSA or GCHQ has exploited a security flaw in MediaWiki, that would most certainly be on topic for this mailing list and would be greatly appreciated.
Or if they managed to uncover the Wikimedia Secret Cabal(tm)!
On Jun 22, 2013, at 10:40 AM, Magnus Manske wrote:
Or if they managed to uncover the Wikimedia Secret Cabal(tm)!
Ah! So true. However, the majority of the world refers to them by their proper title: Developers.
On Sat, 22 Jun 2013 07:27:42 -0700, Stephen Villano stephen.p.villano@gmail.com wrote:
Now, if you have news that the NSA or GCHQ has exploited a security flaw in MediaWiki, that would most certainly be on topic for this mailing list and would be greatly appreciated.
Actually, it would still be off-topic, since it would be better suited for wikitech-l, where the people who can actually fix a security flaw subscribe. Better yet, since it would be a security issue, it should go to security@wikimedia.org or the Security project in Bugzilla.
Thanks for the addition; you are quite correct. I didn't know about security@wikimedia.org. I'll note it in case I bump into something in the future.
On Jun 22, 2013, at 12:23 PM, Daniel Friesen wrote:
-- ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]