Hello,
On 2012-12-12 20:04, James Alexander wrote:
On Wed, Dec 12, 2012 at 8:57 AM, Andre Klapper <aklapper@wikimedia.org> wrote:
On Tue, 2012-12-11 at 19:30 -0800, James Forrester wrote:
This is not the final form of the VisualEditor in lots of different ways. We know of a number of bugs, and we expect you to find more. We do not recommend people trying to use the VisualEditor for their regular editing yet. We would love your feedback on what we have done so far – whether it’s a problem you discovered, an aspect that you find confusing, what area you think we should work on next, or anything else, please do let us know.[1]
[1] - https://en.wikipedia.org/wiki/Wikipedia:VisualEditor/Feedback
Playing the bad cop who's reading random feedback pages daily:
As https://www.mediawiki.org/wiki/VisualEditor/Feedback also exists, I wonder whether the VisualEditor deployment on en.wp and its related feedback is so different from upstream that it needs a separate feedback page (instead of e.g. a soft redirect to the mw: one), or whether there are other reasons. Or does the en.wp one somehow make it easier for testers to report issues? When we deploy VE to other Wikipedias, will there also be separate VE feedback pages (maybe due to the different languages)?
Note: I'm not criticizing it, I'm just trying to understand, and I'm picking VE as the most recent example.
Thanks in advance for explaining,
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
Risker covered many of the reasons, but the biggest one is that a large portion of testers will not move to another wiki. Opening up a local spot for feedback drastically increases the amount of feedback you get, which can be really helpful. Personally I think we should do it on as many wikis as we can for major projects like this, but it is obviously difficult to do on many of them because of both the language barriers and the burden of watching so many feedback channels.
Yet another thing that a product like Echo could help with once it works cross-wiki :) but that's still a ways away.
The Wikidata team is focusing its testing on RTL wikis. The first Wikipedia ever will be huwiki, because its community decided it wanted to be first. Itwiki wanted to be second, but the Wikidata team wanted to test RTL, so they asked hewiki instead.
I think i18n is very important here as well, though I think it was already tested many years ago. The PHP regex construct, a.k.a. the wiki parser, which only works in one direction, has to be reimplemented as a real parser in JS.
Implementing this is not easy, but the developers may be able to reuse some of the old ideas. Going the other way around has to be written from scratch, but it is easier because everything is in a tree, not in a single text string.
Neither parsing nor serializing involves the editing surface, so testing can be automated quite easily: the output of the new parser can be compared with that of the conventional one, and the result of serialization can be compared with the original markup.
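To make the round-trip idea concrete, here is a minimal TypeScript sketch (a toy under my own assumptions, not Parsoid's actual code or API) with a parser that only understands plain text and '''bold''': once the markup is in a tree, serializing back and comparing with the original string is trivial to automate.

// Toy wikitext model: plain text and '''bold''' only (hypothetical, for illustration).
type WikiNode =
  | { type: "text"; value: string }
  | { type: "bold"; children: WikiNode[] };

// Parse a flat string into a tree. Real wikitext is far messier; this only
// splits on the bold marker to keep the example short.
function parse(source: string): WikiNode[] {
  const nodes: WikiNode[] = [];
  source.split("'''").forEach((part, i) => {
    if (part === "") return;
    if (i % 2 === 0) {
      nodes.push({ type: "text", value: part });
    } else {
      nodes.push({ type: "bold", children: [{ type: "text", value: part }] });
    }
  });
  return nodes;
}

// Serializing is the easy direction: walk the tree and emit markup per node.
function serialize(nodes: WikiNode[]): string {
  return nodes
    .map(n => (n.type === "text" ? n.value : `'''${serialize(n.children)}'''`))
    .join("");
}

// Round-trip test: no editing surface involved, just a string comparison.
const sample = "Hello '''world''' again";
console.log(serialize(parse(sample)) === sample); // true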
The components that include a surface will certainly need i18n. Using icons instead of text in menus can help get around this issue very easily. Even with many icons we will still need some text, but IMHO that should be kept to a minimum, also in anticipation of the need for translation. ;-)
So this early version is for testing users' experience with the surface. I think this is really great, and I am impressed by the result. It is a bit slow, but, guys, it's still alpha 1; what else should I expect?
It also does not work in all browsers, as mentioned. E.g. the latest Firefox (a.k.a. Iceweasel) available on Debian Wheezy (not yet released) is not supported. That is okay for me.
I also think that people using outdated browsers should not expect the same great experience. Everything necessary should still work, but is the VE really essential?
Cheers
Marco