Many of you on the mailing list will be aware of the trouble that
the style attribute brings to mobile [1,2] and the number of hacks [3]
that we have to introduce to work around it.
I still truly believe the only way we can resolve this is a long-term
rethink of how we approach custom styling on wikis. I have also heard
from Chris Steipp that there are security implications to allowing
inline styles, which such a move would address.
I have submitted a patch [4], mostly to share ideas and prompt
discussion. Before you pounce on it, be aware that I have -2ed it to
allow discussion of whether there is a better way to do this; for
instance, it might be worthy of a new namespace, it might need more
protection, etc.
All the patch does is allow Template:Foo to have an associated
stylesheet, Template:Foo.css, which is included in pages that use it.
So if the San Francisco article uses templates Foo, Bar and Baz, a
style tag will be constructed from the content of Template:Foo.css,
Template:Bar.css and Template:Baz.css and inserted into the page. When
the templates change, the entire San Francisco page is re-rendered and
thus the new styling is applied.
This would reduce the need for CSS hacks in mobile and keep power in
editors' hands.
On the assumption that this patch makes it into core in some form, the
mobile site could in future strip any style attributes from content
and use the template CSS files instead, and thus benefit from the
ability to use media queries. This could be a long, tedious process,
but I think it needs to be done.
Thanks in advance for your discussion and thoughts around this
long-standing issue!
~Jon
[1] https://www.mediawiki.org/wiki/Requests_for_comment/Deprecating_inline_styl…
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=35704
[3] https://github.com/wikimedia/mediawiki-extensions-MobileFrontend/blob/maste…
[4] https://gerrit.wikimedia.org/r/68123
Hi everybody!
I tried to set up a Wikidata development environment using Vagrant [1]. At first everything seemed to be okay: the “precise64” VM was downloaded and the automated setup with Puppet started. But then, after Apache, MySQL and PHP were installed, the following error occurred:
debug: Puppet::Type::Package::ProviderApt: Executing '/usr/bin/dpkg-query -W --showformat ${Status} ${Package} ${Version}\n php5-dev'
debug: Class[Php]: The container Stage[main] will propagate my refresh event
debug: Finishing transaction 70306158277360
debug: Storing state
info: Creating state file /var/lib/puppet/state/state.yaml
debug: Stored state in 0.04 seconds
notice: Finished catalog run in 318.46 seconds
debug: /File[/var/lib/puppet/rrd]/ensure: created
debug: Finishing transaction 70306152454460
debug: Received report to process from precise64.hw.local
debug: Processing report from precise64.lmw.local with processor Puppet::Reports::Store
The following SSH command responded with a non-zero exit status.
Vagrant assumes that this means the command failed!
cd /tmp/vagrant-puppet/manifests && puppet apply --verbose --debug --modulepath '/etc/puppet/modules:/tmp/vagrant-puppet/modules-0' base.pp --detailed-exitcodes || [ $? -eq 2 ]
When I execute this last command manually on the VM, I get the error:
err: /Stage[main]/Generic/Exec[fix-sources]/returns: change from notrun to 0 failed: sed -i'' -e 's/us\.archive/archive/g' /etc/apt/sources.list returned 4 instead of one of [0] at /tmp/vagrant-puppet/manifests/base.pp:13
This seems to be due to file permissions. Executing the command as root works fine, but afterwards this error occurs:
err: /Stage[main]/Wikidata::Repo/Exec[repo_setup]/returns: change from notrun to 0 failed: /usr/bin/php /srv/repo/maintenance/install.php Wikidata-repo admin --pass vagrant --dbname repo --dbuser root --dbpass vagrant --server 'http://localhost:8080' --scriptpath '/repo' --confpath '/srv/orig-repo/' returned 1 instead of one of [0] at /tmp/vagrant-puppet/modules-0/wikidata/manifests/init.pp:38
The “/srv/repo/” folder is almost empty, which is probably because of earlier errors.
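For reference, the substitution that the failing fix-sources exec performs can be reproduced safely on a throwaway copy. The sample sources.list line below is an assumption; the real target is /etc/apt/sources.list, which only root can modify — hence the non-zero sed exit status when run unprivileged:

```shell
# Write a sample line like those in /etc/apt/sources.list
# (the sample content is an assumption for illustration).
printf 'deb http://us.archive.ubuntu.com/ubuntu precise main\n' > /tmp/sources.list.sample

# The same sed invocation base.pp runs: rewrite us.archive -> archive
sed -i'' -e 's/us\.archive/archive/g' /tmp/sources.list.sample

cat /tmp/sources.list.sample
```

Running the original command against /etc/apt/sources.list requires sudo, which matches the observation that it succeeds as root.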
I am able to connect to “http://127.0.0.1:8080” and see the “welcome” page. But both links (“repo” and “client”) are dead.
Does anybody have an idea what the problem might be? I’m using a Windows 8 machine as the host.
Thanks!
[1] http://meta.wikimedia.org/wiki/Wikidata/Development/Setup
--
Robert Vogel
For your information:
Source: http://php.net/archive/2013.php#id2013-06-20-17
The PHP development team is proud to announce the immediate availability
of PHP 5.5.0.
This release includes a large number of new features and bug fixes.
The key features of PHP 5.5.0 include:
- Added generators and coroutines.
- Added the finally keyword.
- Added a simplified password hashing API.
- Added support for constant array/string dereferencing.
- Added scalar class name resolution via ::class.
- Added support for using empty() on the result of function calls and
  other expressions.
- Added support for non-scalar Iterator keys in foreach.
- Added support for list() constructs in foreach statements.
- Added the Zend OPcache extension for opcode caching.
- Upgraded the GD library to version 2.1, adding new functions and
  improving existing functionality.
- Many more improvements and fixes.
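Two of the headline features can be sketched briefly. This is a minimal illustration, not taken from the release announcement; the function name countTo is made up:

```php
<?php
// Generators (new in 5.5): a function that yields values lazily
// instead of building and returning an array.
function countTo($n) {
    for ($i = 1; $i <= $n; $i++) {
        yield $i;
    }
}

foreach (countTo(3) as $value) {
    echo $value, "\n"; // prints 1, 2, 3 on separate lines
}

// Simplified password hashing API (new in 5.5): password_hash()
// chooses a strong algorithm and salt; password_verify() checks.
$hash = password_hash('secret', PASSWORD_DEFAULT);
var_dump(password_verify('secret', $hash)); // bool(true)
```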
Hi,
The E3 team (that's Dario, Matt, S, Steven, & me) proposes to generalize
the implementation of watchlists by adding support in core for multiple
generic, user-specific lists of pages. We think that such functionality
could be used to implement a broad range of useful features, some of which
have been requested repeatedly over the years. We also think that
implementing this would be a matter of clarifying and simplifying
infrastructure that already exists in core, albeit obscured by an
unreasonably tight coupling to the notion of watchlists.
We'd like to know what you think, so we're filing an RFC. You can read it
here:
https://www.mediawiki.org/wiki/Requests_for_comment/Support_for_user-specif…
---
Ori Livneh
ori(a)wikimedia.org
I added a link to http://tinyurl.com/n3twd8k to the channel topic of
#wikimedia-tech & #wikimedia-operations. It points to a live graph of
MediaWiki's error rate over the last 24 hours. I hope to automate
monitoring of this data sometime soon, but in the meantime let's keep an
eye on it collectively, especially right after deployments.
---
Ori Livneh
ori(a)wikimedia.org
Hello all,
The MediaWiki release management Request for Proposals (RFP) [0] open
submission period has now ended [1]. Now on to the fun part: feedback!
= The Submissions =
We have two great submissions:
https://www.mediawiki.org/wiki/Release_Management_RFP/NicheWork_and_Hallo_W…!
and
https://www.mediawiki.org/wiki/Release_Management_RFP/EIJL
= Feedback / Review =
Now begins the two weeks of community feedback. Please review and leave
questions/comments on the submissions. Your feedback is what will make
this RFP process successful.
On each submission there is space at the bottom for you to ask
questions/provide feedback publicly, or you can use the Talk: page.
Additionally: you can provide private feedback directly to either me
or Robla (robla(a)wikimedia.org) if desired.
= Office Hours =
Next week we will have an IRC "office hour" with myself, Rob Lanphier,
and the parties who submitted proposals. This is a time for
anyone and everyone to ask questions in real time of both the RFP
submitters and us (WMF: Robla and me). This is yet to be scheduled,
but it is looking like Wednesday or Thursday (the 19th or 20th) in the
morning Pacific time (around 4 or 5pm UTC).
I will send out a note with the final date/time as soon as possible.
Thanks!
Greg
[0] https://www.mediawiki.org/wiki/Release_Management_RFP
[1] https://www.mediawiki.org/wiki/Release_Management_RFP#Timeline
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Hello,
Jenkins now detects parser tests registered in MediaWiki extensions,
which has some side effects (see the end of this mail).
Today I merged a change in MediaWiki core that lets it recognize
parser tests in extensions. That is done by looking at
$wgParserTestFiles: https://gerrit.wikimedia.org/r/#/c/63164/
Then I updated our MediaWiki extension Jenkins jobs to have them
invoke phpunit with '--testsuite extensions', which makes use of the
above code. The job change is https://gerrit.wikimedia.org/r/#/c/62622/
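For extension authors, the registration Jenkins picks up looks roughly like the fragment below. The extension name and file path are assumptions for illustration; only the $wgParserTestFiles global is from the change above:

```php
<?php
// In the extension's setup file, e.g. MyExtension/MyExtension.php
// (extension and file names here are hypothetical).
// MediaWiki core initializes $wgParserTestFiles; the extension
// appends its parser test file so the '--testsuite extensions'
// PHPUnit run can discover and execute it.
$wgParserTestFiles[] = __DIR__ . '/tests/parser/parserTests.txt';
```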
Some extensions were missing and have been added to Jenkins and Zuul:
- Arrays
- BookManager
- NaturalLanguageList
- ReplaceSet
- Transliterator
- Variables
Others are now running unit tests in addition to lint checks:
- Cite
- Poem
- wikihiero
Some extensions are failing parser tests and need action:
- BookManager: https://bugzilla.wikimedia.org/49879
- NaturalLanguageList: https://bugzilla.wikimedia.org/49881
- Transliterator: https://bugzilla.wikimedia.org/49882
The Math extension does not pass its tests since texvc is not built.
We need a job to verify that it compiles properly and that the tests
pass. That is logged as https://bugzilla.wikimedia.org/49884
I have disabled voting for the four related jobs.
--
Antoine "hashar" Musso