Are Foundation servers able to withstand Online Certificate Status
Protocol certificate revocations, such as might occur according to RFC
5280 when a government agency declares a private key compromised
because of secret evidence?
Hi,
Due to a current vandalism/spam threat, I've enabled administrator approval
of all new accounts. Hopefully this won't have to stay enabled long.
Will update when it's turned back off.
-Chad
Most of the time we assume that writing code like:
wfMessage( 'foo' )->params( $this->getRequest()->getVal( 'bar' ) )->parse();
is totally safe. However, on a wiki with $wgRawHTML = true; this code
would be an XSS vector. I've looked through core and couldn't find any
examples of unsanitized URL parameters being used as parameters in a
parsed message, but this sort of thing seems like an accident waiting
to happen.
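The risk is easier to see in a framework-free sketch. The renderer functions below are hypothetical stand-ins, not MediaWiki APIs; they only illustrate why a raw-HTML message context turns an unsanitized parameter into an XSS, while an escaped context neutralises it:

```php
<?php
// Hypothetical stand-in for a message rendered with raw HTML enabled:
// the parameter is interpolated verbatim, as <html> content would be.
function renderRaw( string $param ): string {
	return "<p>Value: $param</p>";
}

// Stand-in for escaped rendering: HTML metacharacters are neutralised,
// so attacker-controlled input becomes inert text.
function renderEscaped( string $param ): string {
	return '<p>Value: ' . htmlspecialchars( $param, ENT_QUOTES ) . '</p>';
}

$evil = '<script>alert(1)</script>';
echo renderRaw( $evil ), "\n";     // the script tag survives: XSS
echo renderEscaped( $evil ), "\n"; // &lt;script&gt;…: harmless text
```

The same input is dangerous or harmless purely depending on which context it lands in, which is why an unsanitized request parameter reaching a parsed message matters at all.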
I would like to propose that $wgRawHTML only apply to actual pages.
The <html> parser tag should not be active in wfMessage() or other
parser contexts. I don't think this would break anything, but I'd like
feedback on whether anyone can think of something this could break.
For more information see https://phabricator.wikimedia.org/T156184 .
Please post any feedback about this idea on that bug.
Remember, the deadline for proposing developer wishes is TODAY, January 31
at 23:59 UTC
<https://www.timeanddate.com/worldclock/fixedtime.html?iso=20170131T235959&p…>
.
The Developer Wishlist voting phase is planned to start next Monday,
February 6.
https://www.mediawiki.org/wiki/Developer_Wishlist#Timeline
On Sat, Jan 21, 2017 at 12:00 AM, Srishti Sethi <ssethi(a)wikimedia.org>
wrote:
> That's right! At the Wikimedia Developer Summit, we decided to organize a
> Developer Wishlist Survey, and here we go:
>
> https://www.mediawiki.org/wiki/Developer_Wishlist
>
> The Wikimedia technical community seeks input from developers for
> developers, to create a high-profile list of desired improvements. The
> scope of the survey includes the MediaWiki platform (core software, APIs,
> developer environment, enablers for extensions, gadgets, templates, bots,
> dumps), the Wikimedia server infrastructure, the contribution process, and
> documentation.
>
> The best part: we want to have the results published by Wednesday,
> February 15. Yes, in a month, to have a higher chance to influence the
> Wikimedia Foundation annual plan FY 2017-18.
>
> There's no time to lose. *Propose your ideas before the end of January*,
> either by pushing existing tasks in Phabricator or by creating new ones.
> You can find instructions on the wiki page
> <https://www.mediawiki.org/wiki/Developer_Wishlist>. Questions and
> feedback are welcome especially on the related Talk page.
>
> The voting phase is expected to start on February 6 (tentative). Watch
> this space (or even better, the wiki page).
>
> Cheers,
> Srishti Sethi
> Developer Advocate, Technical Collaboration team
> Wikimedia Foundation
>
> https://www.mediawiki.org/wiki/User:SSethi_(WMF)
>
>
> _______________________________________________
> Wikitech-ambassadors mailing list
> Wikitech-ambassadors(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors
>
>
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello all,
I would like to announce the release of MediaWiki Language Extension
Bundle 2017.01. The bundle is compatible with MediaWiki 1.26 and 1.27
and requires PHP 5.5.9 or above.
The next MLEB is expected to be released in three months. If there are
major changes or important bug fixes, we will make an intermediate release.
Please give us your feedback at
[[Talk:MLEB|https://www.mediawiki.org/wiki/Talk:MLEB]].
* Download: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2017.01.tar…
* sha256sum: 89f4a029f33ea9f9225c8379367bc526fa63353845a2873290ba82560fb314c9
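To check the download against the sha256sum above, one option is PHP's built-in hash_file(); this is a sketch, and the filename in the comment is a placeholder since the download URL is truncated in this archive:

```php
<?php
// Returns true when the file's SHA-256 digest matches the published one.
function checksumMatches( string $file, string $expectedSha256 ): bool {
	return hash_file( 'sha256', $file ) === strtolower( $expectedSha256 );
}

// Usage (placeholder path; substitute the actual downloaded tarball):
// checksumMatches( 'mleb-2017.01.tar.bz2',
//     '89f4a029f33ea9f9225c8379367bc526fa63353845a2873290ba82560fb314c9' );
```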
Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://phabricator.wikimedia.org/
* Talk with us at: #mediawiki-i18n @ Freenode
Release notes for each extension are below.
-- Kartik Mistry
== Highlights and upgrade notes ==
== Babel ==
* Kunal Mehta fixed compatibility with MediaWiki 1.27.
* This, that and the other fixed categorization of 'pt-br' and similar
language codes.
* This, that and the other fixed Babel AutoCreate's edit summaries to
use the content language instead of the user's UI language.
== CLDR ==
* Reedy updated CLDR to 30.0.2
== CleanChanges ==
* Maintenance and localisation updates only.
== LocalisationUpdate ==
* Niklas Laxström added a warning for the deprecated PHP entry point.
Use <code>wfLoadExtensions( array( 'LocalisationUpdate' ) );</code> in
LocalSettings.php to load the extension now.
== Translate ==
* Niklas Laxström made insertables more touch friendly.
* Niklas Laxström made several improvements for MessageTable.
* Niklas Laxström moved SpecialPage(Preparation|Migration) to tag, as
they are related to page translation.
* Niklas Laxström made several improvements in Special:TranslationStats.
* Niklas Laxström implemented plural aware message content comparison.
* Federico Leva added an option to keep outdated page translations
without removing them. (T60429)
== UniversalLanguageSelector ==
* Fomafix added a patch to remove the bundled jquery.i18n library,
which has been available in MediaWiki core since 1.26.
* Niklas Laxström added a warning for the deprecated PHP entry point.
Use <code>wfLoadExtensions( array( 'UniversalLanguageSelector' ) );</code>
in LocalSettings.php to load the extension now.
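Both deprecation notes above point at the same migration. As a sketch, a LocalSettings.php that previously used require_once on the extensions' PHP entry points would instead load them with a single registration call:

```php
// LocalSettings.php (sketch): replaces the deprecated
// require_once "$IP/extensions/<Name>/<Name>.php"; entry points.
wfLoadExtensions( array( 'LocalisationUpdate', 'UniversalLanguageSelector' ) );
```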
--
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com
Hi all,
Scap version 3.5.1-1 has been released and with it comes a few changes.
tl;dr highlights:
== MediaWiki Deploys ==
* Subcommands are now the only way to run scap (e.g., scap sync-file
instead of sync-file). The old stub entry points (sync-file, sync-dir,
mwversionsinuse, etc.) are gone; in the previous release these binstubs
simply exited with a non-zero exit code.
* MediaWiki canary deploys now check for both HHVM and MediaWiki errors.
* scap sync-file and scap sync-dir are now the same command internally;
scap sync-dir is now deprecated.
== Scap3/Service Deploys ==
* Scap's rollback behavior has been greatly improved. Scap supports a global
`failure_limit` and a per-group `failure_limit` -- if a deployment exceeds the
number or percentage of failures specified by this limit, the deploy will fail
and you will be prompted to roll back. Also, if you opt *not* to continue a
deployment on the remaining deploy groups, you will be given the option to
roll back. (Fixes T149008)
* Scap3: This release has some rollback logic fixes. First, if there is an
initial ssh failure for a host, scap will no longer attempt a rollback on that
host (since the same ssh failure would likely make the rollback fail too).
Next, all previously deployed groups of servers will now be rolled back -- not
just the group of servers that had failures. (Fixes T150267, T145460)
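Scap itself is written in Python, but the count-versus-percentage semantics of `failure_limit` described above can be sketched in a few lines (in PHP here, to match the rest of this digest; the function name is hypothetical and not part of scap):

```php
<?php
// Hypothetical sketch of the failure_limit check described above:
// the limit is either an absolute count ("3") or a percentage ("10%").
function exceedsFailureLimit( int $failed, int $total, string $limit ): bool {
	if ( substr( $limit, -1 ) === '%' ) {
		$pct = (float)substr( $limit, 0, -1 );
		return ( $failed / $total ) * 100 > $pct;
	}
	return $failed > (int)$limit;
}
```

For example, 2 failures out of 10 hosts exceeds a '10%' limit (20% > 10%), while 1 out of 10 does not.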
You can see the full changelog in the repo[0].
<3 -- The Scap Folks
[0]. <https://phabricator.wikimedia.org/source/scap/browse/release/debian/changel…>
Resistance Manual <https://www.resistancemanual.org/Resistance_Manual_Home>
is a guide for organizing resistance to the policies of the Trump
administration in the United States. The site is running MediaWiki 1.28,
and its admins are looking for help maintaining the site. The main page
says to reach out to info(a)staywoke.org if interested.
Hi all,
my change for https://phabricator.wikimedia.org/T155892 was blocked by
hashar because he does not know what impact deploying it would have on
our infrastructure. Could somebody have a look at it?
Thanks in advance,
Martin Urbanec
Urbanecm at phab
Hello,
I'm managing some MediaWiki 1.27.1 installations running CirrusSearch 0.2
with Elasticsearch 1.7.5. I've been noticing that search results are
often missing, so I started running the forceSearchIndex.php script each
night from a cron job.
But I'm still finding missing results. Today I re-ran the script
manually and found that one of the missing results showed up and
that the result count for that term had increased from 18 to 23. I ran
the script again and it increased further, to 37. I ran it more times,
but the result count did not increase any more.
The commands I've been doing are:
forceSearchIndex.php --skipLinks --indexOnSkip
forceSearchIndex.php --skipParse
Is this the correct way to do a full index rebuild? Is there some
parameter that can ensure no pages get missed?
Thanks,
Aran