On Wed, Jul 8, 2009 at 2:43 AM, Marco Schuster <marco(a)harddisk.is-a-geek.org> wrote:
> We should not recommend Chrome - as good as it is, it has serious
> privacy problems.
> Opera is not open source, so I think we'd best stay with Firefox, even
> if Chrome/Opera begin to support the video tag.
I don't think we should use these kinds of ideological criteria when
making any sort of recommendation here. We should state in a purely
neutral fashion that browsers X, Y, and Z will play the video better
on your computer than your current browser does. It would be
misleading to imply that Firefox is superior to these other browsers
for the purpose of playing the video tag.
On Wed, Jul 8, 2009 at 2:42 PM, Gregory Maxwell <gmaxwell(a)gmail.com> wrote:
> I'd drop the word experience. It's superfluous marketing speak.
>
> So the notice chain I'm planning on adding to the simple <video/>
> compatibility JS is something like this:
>
> If the user is using Safari 4 on a desktop system and doesn't have XiphQT:
> * Advise the user to install XiphQT (note, there should be a good
> installer available soon)
>
> The rationale being that if they are known to use Safari now, they
> probably will in the future; better to get them to install XiphQT than
> to hope they'll continue using another browser.
>
> If the user is using any of a list of platforms known to support Firefox:
> * Advise them to use Firefox 3.5
>
> Otherwise say nothing.
> It would be silly at this time to be advising users of some
> non-Firefox-supporting mobile device that Firefox 3.5 provides the
> best experience. ;)
That sounds good. Why not recommend Safari plus XiphQT as well, if
the goal is only to tell them what browsers support good <video>
playback?
Hello all,
First of all, I should say that, since I'm not a native English
speaker, there may be some rough English below. I apologize for any
inconvenience.
As you may know, we have two kinds of Chinese characters, zh-hans and
zh-hant, so we have a character converter on zhwiki. Right now we have
-{}- for a single conversion, -{T|}- for title conversion, -{A|}- for
article-wide conversion, and a template {{noteTA}} for a better
display. But when working on templates we found this very frustrating,
because we can only use -{}- over and over again: any of the other tags
would affect the rest of every article that transcludes or substitutes
the template. We therefore propose an -{R||}- tag that only affects the
quoted text. It could be used like this:

  -{R|zh-cn:FOO; zh-tw:BAR;|Some FOO and some BAR.}-

This would convert only the span from "Some" to "BAR.", with no effect
on other text in the article, such as "FOOBAR" elsewhere. That way we
could easily handle these conversions inside a template. Thanks.
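To make the behavior concrete, a template could then carry its own
conversion rule without leaking it into articles. For example (the
template name and the words are made up for illustration; -{R||}-
itself is only a proposal at this point), [[Template:Example]] could
contain:

  -{R|zh-cn:计算机; zh-tw:電腦;|The 计算机 is fast.}-

An article transcluding {{Example}} would get only that sentence
converted by the rule; any other occurrence of the word in the article
would follow the reader's variant preference as usual.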
Best regards,
Jimmy Xu
--- On Thu, 7/9/09, Chad <innocentkiller(a)gmail.com> wrote:
>
> Hell, we barely have unit tests for MediaWiki itself, much less the
> many, many extensions in SVN. I can't think of a single one, offhand.
>
> FWIW, handling updates between versions is a mess. There are two
> accepted and documented ways to apply an extension's schema updates.
> There needs to be one, period. There also needs to be a cleaner
> Update interface so things like this can be handled more cleanly.
>
> It's nice and great to talk about automated regression testing of
> the software, but in reality there is no clean way to do it right
> now. I really admire Gerard and Kim's work on this, but it's really
> a hack on top of a system that should support this stuff natively.
>
> Regression testing should be automatic, the test cases should be
> standardized, and extensions should have an easy way to add their
> own tests to the core set. None of these is currently the case.
> There's a bug open about running parserTests and/or test cases in
> CodeReview so we can easily and verifiably track regressions in the
> software. Can't do that until the system makes some sense to begin
> with :)
>
> -Chad
Hmm. Not the perfect situation :D . But, as a manager once told me: baby steps, Dan, baby steps. So I think an informal plan to incrementally improve testing of MediaWiki would be useful. One idea is to broadcast an appeal for testing engineers to help rectify the situation. I am retired myself, and I suspect there is a bunch of retired testing engineers out there who might be willing to help. Of course, figuring out how to reach them is the main problem.
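For anyone who hasn't seen them, the parserTests cases Chad mentions
live in maintenance/parserTests.txt and use a simple plain-text format,
something like this (quoted from memory, so the exact expected HTML may
differ):

  !! test
  Simple paragraph
  !! input
  Hello world
  !! result
  <p>Hello world
  </p>
  !! end

An incremental step in the direction Chad describes might be letting
extensions ship their own files in this format that get run alongside
the core set.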
Dan
Hi Gerard,
I am very interested in the tool you mention. Let's keep the discussion on the list, since I suspect there are others who might want to set up a regression test environment either now or later.
Can you provide some pointers on how to use this tool? Is it described in the standard SVN documentation, or is it documented somewhere else? What is its name? Would using it allow testing against the most recent version in trunk?
My initial thoughts for such a regression test installation are:
I think some extensions involve database schema changes. My initial idea is to create a new installation, make all of the schema changes necessary for any extensions in the regression test set and then dump the database. This dump could then be used to set up new regression test installations (or reinitialize existing test installations). Perhaps the dump could be placed in Subversion in a "test" area.
Just loading a set of extensions in an installation doesn't really provide much in the way of verifying they work together. The test needs to use them together. One way to do this would be to create a set of pages that concurrently access the extensions when the pages are rendered. Knowing which extensions to use together is a key to this approach. Perhaps there is information in Bugzilla that would help determine that. Also, extension authors might provide some tests that could be incorporated into the test set. If you or others have some ideas on how to test extensions that would be very helpful.
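One way to start: a single test page that exercises several extensions
in one render, so a regression in any of them, or in their interaction,
shows up as a diff against a known-good rendering. A minimal sketch
using three extensions from the Wikimedia list in my earlier message
(the particular combination is just an example):

  {{#if: {{#expr: 1 + 1 }} | two<ref>A footnote rendered by Cite.</ref> | unreachable }}
  <categorytree>Help</categorytree>
  <references />

Rendering that page touches ParserFunctions (#if, #expr), Cite
(ref/references), and CategoryTree at once.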
When I worked on Solaris at Sun in the mid-90s, developers were required to regression test their changes before submitting them (through a gatekeeper) for inclusion in the nightly build. Those who failed to do so and broke the build had their hands slapped. Perhaps something similar might be established for the MediaWiki development process. Extension authors might be required to: 1) provide some extension tests that could be included in the regression test set (if their extensions ever become important enough for that), and 2) run their extension tests and the standard tests against a standard regression test installation and provide evidence that there are no problems before their extensions are included in the MediaWiki extensions matrix.
Dan
--- On Wed, 7/8/09, Gerard Meijssen <gerard.meijssen(a)gmail.com> wrote:
> From: Gerard Meijssen <gerard.meijssen(a)gmail.com>
> Subject: Re: [Wikitech-l] Defining a configuration for regression testing
> To: "Wikimedia developers" <wikitech-l(a)lists.wikimedia.org>
> Cc: "Kim Bruning" <kim(a)bruning.xs4all.nl>
> Date: Wednesday, July 8, 2009, 10:28 PM
> Hoi.
> In Subversion there is a tool created for setting up environments.
> What it does well is set up an environment with a specific
> configuration. The configuration for a specific environment lives in
> a file, so it is possible to pin a specific revision or tag for
> either MediaWiki itself or for an extension. The software also lets
> you specify multiple languages for an environment. As there are
> important differences because of the language, you want to be able
> to test against a key subset of wikis; duplicating an initially
> created wiki works well.
>
> When you look at ALL the extensions used anywhere on WMF projects,
> you will find that they are not a homogeneous bunch; they are not
> used together. This means that you may want to have multiple
> environments configured. At this time there is a Wikipedia
> configuration and a Usability Initiative configuration. Given that
> the configuration is in a file, there is room to indicate a specific
> revision.
>
> As you can imagine, there are scripts to install particular
> extensions that cannot be installed in a default way.
>
> When you have an interest, contact me or ask on this list.
> Thanks,
> GerardM
>
> 2009/7/9 dan nessett <dnessett(a)yahoo.com>
>
> >
> > I am setting up a testing environment for MediaWiki, and the first
> > thing that came to mind is testing new extensions against a
> > "regression test configuration". That raises the question of what
> > should constitute such a configuration. One issue is which
> > extensions should be loaded.
> >
> > There are over 2000 extensions in the MediaWiki extensions matrix
> > and 512 stable extensions. It would be impractical to run a
> > configuration with all of either class. So, I asked around and
> > received a suggestion that at the very least the extensions on the
> > Wikimedia servers should be loaded. I went to
> > http://noc.wikimedia.org/conf/ and copied CommonSettings.php. From
> > it I extracted 75 extensions that are used on Wikimedia's servers.
> > I list these below.
> >
> > A question for readers of this list is: should a regression test
> > configuration load only these extensions, or should it load others?
> > Another question is: what other settings should define a regression
> > test configuration?
> >
> > Wikimedia installed extensions:
> >
> > Timeline, wikihiero, SiteMatrix, CharInsert, CheckUser,
> > SpecialMakesysop, Makebot, ParserFunctions, Cite, InputBox,
> > ExpandTemplates, ImageMap, SyntaxHighlight_GeSHi, DoubleWiki, Poem,
> > PovWatch, AjaxTest, UnicodeConverter, CategoryTree, ProofreadPage,
> > lst, SpamBlacklist, UploadBlacklist, TitleBlacklist, Quiz, Gadgets,
> > OggHandler, AssertEdit, FormPreloadPostCache, SkinPerPage,
> > Schulenburg, Tomas, ContributionReporting, ContributionTracking,
> > ContactPage, ExtensionDistributor, GlobalBlocking, TrustedXFF,
> > ContactPage, SecurePoll, OAIRepo, DynamicPageList, Nogomatch,
> > SpecialCrossNamespaceLinks, SpecialRenameuser, SpecialNuke, AntiBot,
> > TorBlock, CookieBlock, ScanSet, SpecialCite, FixedImage,
> > UserThrottle, ConfirmEdit, FancyCaptcha, HideRevision, AntiSpoof,
> > CentralAuth, DismissableSiteNotice, UsernameBlacklist, MiniDonation,
> > CentralNotice, TitleKey, WikimediaMessages, SimpleAntiSpam,
> > Collection, NewUserMessage, CodeReview, Drafts, Configure,
> > AbuseFilter, ClientSide, CommunityVoice, PdfHandler,
> > UsabilityInitiative
> >
> > Regards,
> >
> > Dan Nessett
Hello.
I'm not sure if this is the appropriate list for this kind of question;
I'd appreciate it if someone could direct me to the proper one.
I have the task of making a local, fully functional mirror of a
Wikipedia subdomain (articles, images, etc. must be located on the
local server). Currently there are not many articles, so downloading a
dump once a day may be an option. But there is a problem: how can
changes made to the local copy be synchronized back to Wikipedia? Is
there any piece of software that could help?
I would appreciate any help.
Sincerely,
Artyom.
As many folks have noted, our current templating system works ok for
simple things, but doesn't scale well -- even moderately complex
conditionals or text-munging will quickly turn your template source into
what appears to be line noise.
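For a taste, here's a contrived snippet in the style of today's
meta-templates (made up for illustration, not lifted from a real
template):

  {{#if:{{{1|}}}|{{#ifeq:{{lc:{{{1}}}}}|yes|{{#expr:{{{count|0}}}+1}}|{{{count|0}}}}}|}}

In any real scripting language that would be an if/else and an
increment.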
And we all thought Perl was bad! ;)
There's been talk of Lua as an embedded templating language for a while,
and there's even an extension implementation.
One advantage of Lua over other languages is that its implementation is
optimized for use as an embedded language, and it looks kind of pretty.
An _inherent_ disadvantage is that it's a fairly rarely used language,
so it still requires special learning on potential template programmers' part.
An _implementation_ disadvantage is that it currently is dependent on an
external Lua binary installation -- something that probably won't be
present on third-party installs, meaning Lua templates couldn't be
easily copied to non-Wikimedia wikis.
There are perhaps three primary alternative contenders that don't
involve making up our own scripting language (something I'd dearly like
to avoid):
* PHP
Advantage: Lots of webbish people have some experience with PHP or can
easily find references.
Advantage: we're pretty much guaranteed to have a PHP interpreter
available. :)
Disadvantage: PHP is difficult to lock down for secure execution.
* JavaScript
Advantage: Even more folks have been exposed to JavaScript programming,
including Wikipedia power-users.
Disadvantage: Server-side interpreter not guaranteed to be present. Like
Lua, would either restrict our portability or would require an
interpreter reimplementation. :P
* Python
Advantage: A Python interpreter will be present on most web servers,
though not necessarily all (Windows-based servers especially).
Wash: Python is probably better known than Lua, but not as well as PHP
or JS.
Disadvantage: Like PHP, Python is difficult to lock down securely.
Any thoughts? Does anybody happen to have a PHP implementation of a Lua
or JavaScript interpreter? ;)
-- brion
I am setting up a testing environment for MediaWiki, and the first thing that came to mind is testing new extensions against a "regression test configuration". That raises the question of what should constitute such a configuration. One issue is which extensions should be loaded.
There are over 2000 extensions in the MediaWiki extensions matrix and 512 stable extensions. It would be impractical to run a configuration with all of either class. So, I asked around and received a suggestion that at the very least the extensions on the Wikimedia servers should be loaded. I went to http://noc.wikimedia.org/conf/ and copied CommonSettings.php. From it I extracted 75 extensions that are used on Wikimedia's servers. I list these below.
A question for readers of this list is: should a regression test configuration load only these extensions, or should it load others? Another question is: what other settings should define a regression test configuration?
Wikimedia installed extensions:
Timeline, wikihiero, SiteMatrix, CharInsert, CheckUser,
SpecialMakesysop, Makebot, ParserFunctions, Cite, InputBox,
ExpandTemplates, ImageMap, SyntaxHighlight_GeSHi, DoubleWiki, Poem,
PovWatch, AjaxTest, UnicodeConverter, CategoryTree, ProofreadPage, lst,
SpamBlacklist, UploadBlacklist, TitleBlacklist, Quiz, Gadgets,
OggHandler, AssertEdit, FormPreloadPostCache, SkinPerPage, Schulenburg,
Tomas, ContributionReporting, ContributionTracking, ContactPage,
ExtensionDistributor, GlobalBlocking, TrustedXFF, ContactPage,
SecurePoll, OAIRepo, DynamicPageList, Nogomatch,
SpecialCrossNamespaceLinks, SpecialRenameuser, SpecialNuke, AntiBot,
TorBlock, CookieBlock, ScanSet, SpecialCite, FixedImage, UserThrottle,
ConfirmEdit, FancyCaptcha, HideRevision, AntiSpoof, CentralAuth,
DismissableSiteNotice, UsernameBlacklist, MiniDonation, CentralNotice,
TitleKey, WikimediaMessages, SimpleAntiSpam, Collection, NewUserMessage,
CodeReview, Drafts, Configure, AbuseFilter, ClientSide, CommunityVoice,
PdfHandler, UsabilityInitiative
Regards,
Dan Nessett
(Originally asked at [[Wikipedia talk:Searching]] and [[WP:VPT]].)
Is there any existing way to search Wikipedia, or MediaWiki in general,
using full-fledged regular expressions, such as those found in Perl,
PCRE, Python, or JavaScript?
I started writing a Perl program that uses Parse::MediaWikiDump, goes
over a dump and searches for regexes, but there are two problems with
this:
1. Such a program probably already exists, although I don't know
where. Can anyone point me to an existing tool? It can be in any other
language, not necessarily Perl, but it should be portable - not
Windows-only/Mac-only/Linux-only.
2. The info won't be up to date. Would it be too much to ask to search
the database directly using regexes?
If problem number 2 is too hard to solve and nobody knows the answer
to problem number 1, then I guess I'll publish my Perl dump-searching
program for the common good. (Why not Python? Because I know Perl
better and Parse::MediaWikiDump works well enough for me.)
--
Amir Elisha Aharoni
http://aharoni.wordpress.com
"We're living in pieces,
I want to live in peace." - T. Moore
wjhonson(a)aol.com wrote:
> My entire point, Neil, was simply that "short-time-to-learn" should also be a consideration. To me, a language that borrows heavily from an *already known* source like English or even BASIC is easier to learn than one which requires that every command be learned again without any prior foundation. I am not a subscriber to tech. I don't think I want to be.
>
>
Wikitech-l is undoubtedly the right forum for this discussion, so we
really should continue it there.
I find it rather difficult to understand exactly what you want here.
Could you please give an example, even a rough one, of the sort of
syntax you are proposing?
For example, how would you write something like, say, this artificial
example:
{{#switch:
{{#iferror: {{#expr: {{{1}}} + {{{2}}} }} | error | correct }}
| error = that's an error
| correct = {{{1}}} + {{{2}}} = {{#expr: {{{1}}} + {{{2}}} }}}}
in your new notation?
-- Neil
Drop a note on the [[Commit access requests]] page
on MediaWiki.org too. We're trying to keep requests all
in one place these days :)
-Chad
On Jul 7, 2009 7:00 AM, "Christian Becker" <chris(a)beckr.org> wrote:
Hi all,
I'm developing the new OSM SlippyMap with Aude & Avar. As our code has now
made it into the Wikimedia trunk, I could use SVN commit access.
As for my contributions, I externalized the JavaScript code and made it
object-oriented, added support for image placeholders (i.e. click to get a
dynamic map), and did lots of refactoring (see [1], our previous external
repository).
I'd prefer the username "beckr"; my public key is at [2].
Cheers,
Christian
[1] http://code.google.com/p/wikimaps/updates/list
[2] http://beckr.org/key.pub