Hey Steffen and Andy,
Continuing what I started on Twitter here, as some more characters might be
helpful :)
It seems that both our projects (FLOW3 and Wikidata) are in a similar
situation. We are using Gerrit as our code review tool, and Travis CI to run
our tests. And we both want Travis to run the tests for all patchsets submitted
to Gerrit, and then vote +1 or -1 on Verified based on the build passing or
failing. To what extent have you gotten such a thing to work on your
project? Is there code available anywhere? If both projects can use the
same code for this, I'd be happy to contribute to what you already have.
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
My take on assertions, which I also tried to stick to in Wikibase, is as follows:
* A failing assertion indicates a "local" error in the code or a bug in PHP
itself. They should not be used to check preconditions or validate input. That's what
InvalidArgumentException is for (and I wish type hints would trigger that, and
not a "fatal error"). Precondition checks can always fail, never trust the
caller. Assertions are things that should *always* be true.
* Use assertions to check postconditions (and perhaps invariants). That is, use
them to assert that the code in the method (and maybe class) that contains the
assert is correct. Do not use them to enforce caller behavior.
* Use boolean expressions in assertions, not strings. The speed advantage of
strings is not big, since the expression should be a very basic one anyway, and
strings are awkward to read, write, and, as mentioned before, potentially
dangerous, because they are eval()ed.
* The notion of "bailing out" on "fatal errors" is a misguided remnant from the
days when PHP didn't have exceptions. In my mind, assertions should just throw
a (usually unhandled) exception, like Java's AssertionError.
I think if we stick with this, assertions are potentially useful, and harmless
at worst. But if there is consensus that they should not be used anywhere, ever,
we'll remove them. I don't really see how the resulting boilerplate would be
cleaner or safer:
if ( $foo > $bar ) {
    throw new OMGWTFError();
}
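To illustrate the last point, a minimal sketch of an exception-throwing assertion could look like this (AssertionException and assertThat are invented names for this sketch, not existing MediaWiki or Wikibase code):

```php
<?php

// Invented for this sketch, analogous to Java's AssertionError.
class AssertionException extends RuntimeException {
}

// Takes a boolean expression, not a string, so nothing is eval()ed,
// and throws instead of emitting a warning or bailing out.
function assertThat( $condition, $message = 'Assertion failed' ) {
    if ( !$condition ) {
        throw new AssertionException( $message );
    }
}

// Postcondition check at the end of a method:
// assertThat( $foo <= $bar, 'postcondition: $foo must not exceed $bar' );
```

A caller that hits a failing assertThat gets a stack trace via the normal exception machinery, rather than a half-suppressed warning.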
-- daniel
On 31.07.2013 00:28, Tim Starling wrote:
> On 31/07/13 07:28, Max Semenik wrote:
>> I remember we discussed using asserts and decided they're a bad
>> idea for WMF-deployed code - yet I see
>>
>> Warning: assert() [<a href='function.assert'>function.assert</a>]:
>> Assertion failed in
>> /usr/local/apache/common-local/php-1.22wmf12/extensions/WikibaseDataModel/DataModel/Claim/Claims.php
>> on line 291
>
> The original discussion is here:
>
> <http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/59620>
>
> Judge for yourself.
>
> -- Tim Starling
>
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
These ideas are also relevant to Wikidata, so I'm forwarding to the
general and tech lists. Besides the people listed in the forwarded
email, I was also in the discussion.
Matt Flaschen
Hi,
recently I tested several sites that use HTTPS. Most of them
communicate with my Chromium browser over TLS 1.1, but
wikipedia/wikimedia is still using TLS 1.0.
ssllabs (see link below) shows a warning notice that you should upgrade
to the newer version. I don't think there is an urgent security reason for
this, but even if it's only preventive, upgrading wouldn't be wrong, right?
example:
https://encrypted.google.com/ TLS 1.1
https://mega.co.nz/ TLS 1.1
https://www.ixquick.com/ TLS 1.1
https://btc-e.com/ TLS 1.1
https://www.wsws.org/ TLS 1.1
https://linksunten.indymedia.org/ TLS 1.1
https://en.wikipedia.org TLS 1.0
https://commons.wikimedia.org/ TLS 1.0
https://www.taz.de/ TLS 1.0
https://duckduckgo.com/ TLS 1.0
https://www.ssllabs.com/ssltest/analyze.html?d=https://en.wikipedia.org
Hopefully this is the right mailing list. Greetings, 0x0000@anche.no
--
Due to Wikimania and general unbearableness of the weather, the next
deployment and branching will be a bit later.
The next branching is planned for August 21st (that will be branch
1.22wmf16, as it is currently called).
This will be deployed on Thu, August 22 to test,
on Mon, August 26 to Wikidata and Wikivoyage,
and on Thu, August 29 to Wikipedia.
This should be enough time to get the data values split done and let the
dust settle, so we probably should start with that soon.
Cheers,
Denny
P.S.: is anyone on -intern who is not on the -tech list? If no one
complains, I will stop sending these emails as double-postings.
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Society for the Promotion of Free Knowledge e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the
tax office for corporations I Berlin, tax number 27/681/51985.
In poking at the code for supporting wikivoyage, it looks to me that the
site_identifiers table is populated incorrectly for non-wikipedias.
For wikipedia sites, it has interwiki keys such as 'en'. For wikivoyage, it
is 'enwikivoyage'. These are used to construct interwiki links and
'enwikivoyage:Berlin' is not valid. Thus it's problematic to use the sites
code for interwiki links.
Is there a good reason for populating the table in this way?
I think the table should have si_key = 'en' for English Wikivoyage. That
means the database rows for enwiki and enwikivoyage would be:
4977 | interwiki | en |
4984 | interwiki | en |
****but**** the table defines si_type and si_key as a unique key, so this
does not currently work:
UNIQUE KEY `site_ids_type` (`si_type`,`si_key`),
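To make the conflict concrete: with si_key = 'en' for both wikis, the second insert would violate that unique key (column names follow the existing schema; the site ids are the ones from the rows above):

```sql
INSERT INTO site_identifiers (si_site, si_type, si_key)
VALUES (4977, 'interwiki', 'en');  -- enwiki: fine

INSERT INTO site_identifiers (si_site, si_type, si_key)
VALUES (4984, 'interwiki', 'en');  -- enwikivoyage: rejected,
-- duplicate entry for UNIQUE KEY `site_ids_type` (`si_type`,`si_key`)
```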
Any insights on why things are the way they are?
Cheers,
Katie
--
Katie Filbert
Wikidata Developer
Wikimedia Germany e.V. | NEW: Obentrautstr. 72 | 10963 Berlin
Phone (030) 219 158 26-0
http://wikimedia.de
Wikimedia Germany - Society for the Promotion of Free Knowledge e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B; recognized as charitable by the
tax office for corporations I Berlin, tax number 27/681/51985.
Hey,
I found an excellent talk on Bower, a package manager for client side
JavaScript.
https://www.youtube.com/watch?v=o9Xo_WFAyqg
It is very similar to NPM for server side JS, or to Composer, the PHP
thing. In fact, you can watch this talk as an introduction to Composer :D
This talk briefly touches on some other tools as well, namely RequireJS,
Grunt and Yeoman. When I went to IPC a few weeks back, I returned with
basically the same list of tools, based on the various JS related talks
there. The people at the local (in Berlin) JS user groups generally seem to
have these tools on their "the good stuff out there you want to be using"
list as well.
Given how promising these things look, I continue to recommend we
investigate them to see how we can use them to improve our workflow and
code. If I'm not mistaken, Danwe is planning to poke at some of this
"soonish".
This talk gives one an idea of what this tool is, why it is useful, and how
it can be used. I encourage everyone doing JS in the project who is not
familiar with Bower yet to at least watch the video, as this seems to be
something self-respecting JS devs all know about nowadays.
Cheers
--
Hi,
sorry for another long Email today.
Currently, when you change a Wikidata item, its associated Wikipedia
articles get told to update, too. So your change to the IMDB ID of a movie
in Wikidata will be pushed to all language versions of that article on
Wikipedia. Yay!
There are two use cases that currently are not possible:
* a Wikipedia article on a city might display the mayor. Now someone
changes the label of the mayor on Wikidata - the Wikipedia article will get
updated the next time the page is rendered, but there is no active update
of the page.
* a Wikipedia article might want to include data about an item other than
the associated item - most importantly for references, where I might be
interested in the author of a book, its year of publication, etc. This
feature is currently disabled (even though it would be trivial to switch it
on) because this information would only get updated when the page is
actively rerendered.
In order to enable these use cases we need to track on which pages (on
Wikipedia) an item (from Wikidata) is used. We are thinking of doing this
in two tables:
* EntityUsage: one table per client. It has two columns, one with the
pageId and one with the entityId, indexed on both columns (and one column
with a pk, I guess, for OSC).
* Subscriptions: one table on the client. It has two columns, one with the
pageId and one with the siteId, indexed on both columns (and one column
with a pk, I guess, for OSC).
EntityUsage is a potentially big table (something like pagelinks-size).
On a change on Wikidata, Wikidata consults the Subscriptions table, and
based on that it dispatches the changes to all clients listed there for a
given change. Then the client receives the changes and based on the
EntityUsage table performs the necessary updates.
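If I read the proposal right, the two tables could look roughly like this (table and column names are placeholders for illustration, not a finished schema):

```sql
-- On each client wiki: which entities are used on which local pages.
CREATE TABLE entity_usage (
    eu_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY, -- pk for OSC
    eu_page_id INT UNSIGNED NOT NULL,    -- local page id
    eu_entity_id VARBINARY(32) NOT NULL, -- e.g. 'Q64'
    KEY eu_page_id (eu_page_id),
    KEY eu_entity_id (eu_entity_id)
);

-- Which pages on which sites need to be told about changes.
CREATE TABLE subscriptions (
    su_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY, -- pk for OSC
    su_page_id INT UNSIGNED NOT NULL,    -- page id
    su_site_id VARBINARY(32) NOT NULL,   -- e.g. 'enwiki'
    KEY su_page_id (su_page_id),
    KEY su_site_id (su_site_id)
);
```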
We wanted to ask for input on this approach, and if you see problems or
improvements that we should put in.
Cheers,
Denny
--
Hey,
I'm unhappy with the yyyy-mm-dd tags we are creating for all our repos.
ATM these tags are created for deployment purposes (right?). They are
however rather meaningless to consumers of the software, and they stick
around forever. This is not a very standard thing to do, and it is causing
problems for Composer usage. For instance, if I want to install a stable
release of Diff, it is going to get me "2013-06-25" rather than "0.7".
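As a concrete illustration (assuming the package is published as diff/diff; treat the name as an example), a consumer asking Composer for the latest stable version with

```json
{
    "require": {
        "diff/diff": "*"
    }
}
```

gets the date tag, since Composer parses "2013-06-25" as a date-based version that sorts above 0.7. Pinning "~0.7" works around it, but then every consumer has to know that the date tags are not real releases.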
I propose stopping this tagging policy and removing these existing tags.
For deployment we can have a wmfdeployment branch or so. That makes it
clear to people what it is, causes less clutter, does not play havoc with
Composer, and presumably makes things easier on the deployment side, as the
reference remains the same. Any reasons to not do this? Alternative
solutions?
Cheers
--