Cross-posting - this is great!
---------- Forwarded message ----------
From: Antoine Musso <hashar+wmf(a)free.fr>
Date: Thu, Nov 19, 2015 at 5:19 AM
Subject: [Wikitech-l] CI and cross repository dependencies
To: wikitech-l(a)lists.wikimedia.org
Hello,
We often have the case of a change to an extension depending on a
pending patch to MediaWiki core. I have upgraded our CI scheduler -
Zuul - a couple weeks ago and it now supports marking dependencies even
in different repositories.
Why does it matter? To make sure the dependency is fulfilled, one
usually either:
* CR-2s the patch until the dependent change is merged, or
* writes a test that exercises the required patch in MediaWiki.
With the first solution (no test), once both are merged, nothing
prevents one from cherry-picking a patch without its dependency, for
example for MediaWiki minor releases or Wikimedia deployment branches.
When a test covers the dependency, it will fail until the dependent
change is merged, which is rather annoying.
Zuul now recognizes the header 'Depends-On' in git commit messages,
similar to 'Change-Id' and 'Bug'. 'Depends-On' takes a change-id as its
parameter, and multiple ones can be added.
When a patch is proposed in Gerrit, Zuul looks for Gerrit changes
matching the 'Depends-On' headers and verifies whether any are still
open. If so, it crafts git references for the open patches so all the
dependencies can be tested as if they had been merged.
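For illustration, a commit message carrying this trailer might look like
the following (the subject line, body, bug number, and first Change-Id
are invented; the Depends-On value is the Wikidata change-id from the
real-world example in this mail):

```
Guard section rendering against missing Wikibase items

The renderer assumed every page has a linked item; handle
the lookup returning null instead.

Bug: T12345
Change-Id: I6a21bc0c4b9e6b7d8e9f0a1b2c3d4e5f60718293
Depends-On: I0312c23628d706deb507b5534b868480945b6163
```

Like 'Change-Id', the trailer goes in the footer block at the end of the
commit message, one line per dependency.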
Real world example
------------------
The ContentTranslation extension is tested together with the Wikidata
one and was not passing the tests. Wikidata created a patch, and we did
not want to merge it until we had confirmed that the ContentTranslation
tests passed properly.
The Wikidata patch is https://gerrit.wikimedia.org/r/#/c/252227/
Change-Id: I0312c23628d706deb507b5534b868480945b6163
On ContentTranslation we indicated the dependency:
https://gerrit.wikimedia.org/r/#/c/252172/1..2//COMMIT_MSG
+ Depends-On: I0312c23628d706deb507b5534b868480945b6163
Which is the Wikidata patch.
Zuul:
* received the patch for ContentTranslation
* looked up the change-id and found the Wikidata patch
* created git references in both repos pointing to the proper patches
Jenkins:
* zuul-cloner cloned both repos and fetched the references created by
the Zuul service
* ran the tests
* SUCCESS
That confirmed to us that the Wikidata patch actually fixed the issue
for ContentTranslation. Hence we CR+2'ed both, and everything merged
fine.
Please take a moment to read upstream documentation:
http://docs.openstack.org/infra/zuul/gating.html#cross-repository-dependenc…
Wikidata/ContentTranslation task:
https://phabricator.wikimedia.org/T118263
--
Antoine "hashar" Musso
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Good for some general perspective on (lack of) user privacy in the app industry:
http://techscience.org/a/2015103001/
"Who Knows What About Me? A Survey of Behind the Scenes Personal Data
Sharing to Third Parties by Mobile Apps"
summarized by Ars Technica here:
http://arstechnica.com/security/2015/11/user-data-plundering-by-android-and…
"Apps in both Google Play and the Apple App Store frequently send
users' highly personal information to third parties, often with little
or no notice, according to recently published research that studied
110 apps. ... 49 percent of Android apps sent users' names, 33 percent
transmitted users' current GPS coordinates, 25 percent sent addresses,
and 24 percent sent a phone's IMEI or other details. An app from
Drugs.com, meanwhile, sent the medical search terms "herpes" and
"interferon" to five domains, including doubleclick.net,
googlesyndication.com, intellitxt.com, quantserve.com, and
scorecardresearch.com ..."
The Wikipedia apps were not among those tested - sadly, because it
might have been a nice opportunity to highlight the fact that we don't
share data with third parties at all. (In another context, we made
that point with this tweet:
https://twitter.com/Wikipedia/status/579220963755044864 )
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Hi all,
here is the weekly look at our most important readership metrics (CCing
Wikitech-l too this time).
As laid out earlier
<https://lists.wikimedia.org/pipermail/mobile-l/2015-September/009773.html>,
the main purpose is to raise awareness about how these are developing, call
out the impact of any unusual events in the preceding week, and facilitate
thinking about core metrics in general. We are still iterating on the
presentation and eventually want to create dashboards for those that
are not already available in that form. Feedback and discussion
continue to be welcome.
As it might be of interest for readers of this report who haven’t already
seen the news on Analytics-l or Wikitech-l, I’d like to mention the
exciting news that the monthly pageview data on Wikistats
<https://stats.wikimedia.org/EN/TablesPageViewsMonthlyCombined.htm> has
been transitioned to the new pageview definition
<https://lists.wikimedia.org/pipermail/analytics/2015-November/004502.html>.
Now to the usual data, introducing one new metric this time as well.
(All numbers below are averages for November 2-8, 2015 unless otherwise
noted.)
Pageviews
Total: 536 million/day (+2.2% from the previous week)
Context (April 2015-November 2015):
We more than reversed the -1.5% drop from last week, yay!
(See also the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>)
Desktop: 57.5%
Mobile web: 41.3%
Apps: 1.2%
Global North ratio: 77.5% of total pageviews (previous week: 77.1%)
Context (April 2015-November 2015):
New app installations
Android: 60.6k/day (+63.2% from the previous week)
Daily installs per device, from Google Play
Context (last three months):
On November 5, the app started to get featured in the "New and Updated
Apps" section of the Google Play store, enabled by the Android team’s
recent update work. The effect is already clearly visible here; we’ll have
a fuller view of the impact in the next report after the placement ends
today.
iOS: 4.41k/day (+11.2% from the previous week)
Download numbers from App Annie
Context (September 2014-September 2015):
Things are back to normal after the iOS app had been featured in the App
Store in mid-October. (Much of the 11.2% rise over the preceding week can
be tied to the - still unexplained - drop on Oct 30.)
App user retention
With this issue of the report, we’re adding a new metric that should be
more directly tied to how new users perceive the quality and usefulness of
the apps. Day-7 retention (D7) is defined as the proportion of users who
used the app again on the seventh day after they first opened it. The iOS
team has set themselves a quarterly goal to bring this “stickiness” metric
to at least 15% with their 5.0 update
<https://commons.wikimedia.org/wiki/File:IOS_Wikipedia_App_5.0_Update.pdf>
(p.5 of that doc contains some further context on this metric and links on
how it is perceived elsewhere in the industry; the following post is also
useful for perspective: “losing 80% of mobile users is normal, and why the
best apps do better”
<http://andrewchen.co/new-data-shows-why-losing-80-of-your-mobile-users-is-n…>).
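As a rough sketch of this definition (not the actual production
computation), here it is in Python, assuming per-user activity is
available as sets of days since first open; the sample data is made up:

```python
def d7_retention(activity_by_user):
    """Share of users who used the app again on day 7 after first open.

    activity_by_user maps a user id to the set of days (0 = the day of
    first open) on which the app was used.
    """
    cohort = [days for days in activity_by_user.values() if 0 in days]
    if not cohort:
        return 0.0
    retained = sum(1 for days in cohort if 7 in days)
    return retained / len(cohort)

# Three users opened the app; only one came back exactly on day 7.
sample = {
    "a": {0, 1, 7},
    "b": {0, 2},
    "c": {0},
}
print(d7_retention(sample))  # 1 of 3 users retained, ~0.33
```

Note that this counts a return on the seventh day specifically, not
"within seven days", matching the definition above.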
Android: 13.9% (previous week: 11.5%)
(1:100 sample)
Context (last three months):
iOS: 13.1% (previous week: 10.6%)
(from iTunes Connect, opt-in only = ca. 20-30% of all users)
Context (October 11-November 8, 2015):
Unique app users
Android: 1.185 million / day (+2.0% from the previous week)
Context (last three months):
There are already signs of a small but discernible rise in active users due
to the app being featured, but we’ll need to wait until later to fully
assess this.
iOS: 280k / day (+1.0% from the previous week)
Context (last three months):
No news here
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/wiki/Category:Wikimedia_readership_metrics_re…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-02"
AND "2015-11-08";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) as date,
sum(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
SUM(view_count) AS allviews FROM wmf.projectview_hourly WHERE year=2015 AND
agent_type = 'user' GROUP BY year, month, day ORDER BY year, month, day
LIMIT 1000;
hive (wmf)> SELECT access_method, SUM(view_count)/7 FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-02"
AND "2015-11-08" GROUP BY access_method;
hive (wmf)> SELECT SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0))/SUM(view_count) FROM wmf.projectview_hourly WHERE
agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-11-02"
AND "2015-11-08";
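As a sketch of what the FIND_IN_SET query above computes, the same ratio
in Python (the view counts are made-up sample data, and GLOBAL_NORTH
here is only a small subset of the full country list used in the query):

```python
# Subset of the Global North country list from the query above.
GLOBAL_NORTH = {"DE", "FR", "GB", "US", "JP"}

def global_north_ratio(views_by_country):
    """Share of total pageviews coming from Global North countries."""
    total = sum(views_by_country.values())
    north = sum(v for c, v in views_by_country.items()
                if c in GLOBAL_NORTH)
    return north / total

# Hypothetical per-country daily view counts.
sample = {"US": 60, "DE": 20, "IN": 15, "BR": 5}
print(global_north_ratio(sample))  # 0.8
```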
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")), SUM(view_count) AS
all, SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0)) AS Global_North_views FROM wmf.projectview_hourly
WHERE year = 2015 AND agent_type='user' GROUP BY year, month, day ORDER BY
year, month, day LIMIT 1000;
https://console.developers.google.com/storage/browser/pubsite_prod_rev_0281…
(“overview”)
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
SELECT LEFT(timestamp, 8) AS date, SUM(IF(event_appInstallAgeDays = 0, 1,
0)) AS day0_active, SUM(IF(event_appInstallAgeDays = 7, 1, 0)) AS
day7_active FROM log.MobileWikiAppDailyStats_12637385 WHERE timestamp LIKE
'201510%' AND userAgent LIKE '%-r-%' AND userAgent NOT LIKE '%Googlebot%'
GROUP BY date ORDER BY DATE;
(with the retention rate calculated as day7_active divided by day0_active
from seven days earlier, of course)
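That division can be sketched as follows (the dates and counts below are
invented for illustration):

```python
from datetime import date, timedelta

def d7_rate(day0_active, day7_active, day):
    """day7_active on `day` divided by day0_active seven days earlier.

    day0_active and day7_active map dates to the counts returned by the
    query above.
    """
    cohort_day = day - timedelta(days=7)
    return day7_active[day] / day0_active[cohort_day]

# Hypothetical counts: 50k first opens on Oct 1, 6k of that cohort
# active again on Oct 8.
day0 = {date(2015, 10, 1): 50000}
day7 = {date(2015, 10, 8): 6000}
print(d7_rate(day0, day7, date(2015, 10, 8)))  # 0.12
```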
https://analytics.itunes.apple.com/#/retention?app=324715238
hive (wmf)> SELECT SUM(IF(platform = 'Android',unique_count,0))/7 AS
avg_Android_DAU_last_week, SUM(IF(platform = 'iOS',unique_count,0))/7 AS
avg_iOS_DAU_last_week FROM wmf.mobile_apps_uniques_daily WHERE
CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20151102 AND
20151108;
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS Android_DAU FROM wmf.mobile_apps_uniques_daily
WHERE platform = 'Android';
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS iOS_DAU FROM wmf.mobile_apps_uniques_daily WHERE
platform = 'iOS';
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
There are currently plans to deploy the Blueprint skin on mediawiki.org
<https://phabricator.wikimedia.org/T93613>. Besides my work on UI
Standardization I'll also continue to work on the skin. I think Blueprint
should be on that list.
Best,
Volker
On Thu, Jul 23, 2015 at 6:25 PM, Adam Baso <abaso(a)wikimedia.org> wrote:
> Would you please share this on the list?
>
> On Thursday, July 23, 2015, Volker Eckl <veckl(a)wikimedia.org> wrote:
>
>> Hi Adam,
>> there are currently plans on deploying skin Blueprint on mediawiki.org
>> <https://phabricator.wikimedia.org/T93613>. Besides my work on UI
>> Standardization I'll also continue to work on Blueprint. Although UI
>> Standardization is a "special case", formally we belong to Reading and
>> therefore I think Blueprint should be on that list.
>>
>>
>> Best,
>> Volker
>>
>> On Mon, Jul 20, 2015 at 10:58 PM, Adam Baso <abaso(a)wikimedia.org> wrote:
>>
>>> Hi all -
>>>
>>> I've been reviewing a list of extensions with Reading Engineering and
>>> Reading Infrastructure leads - props to James Forrester for promoting this
>>> discussion. Here's a list of extensions we believe currently falls under
>>> Reading for triage (n.b., not all extensions will get active development
>>> support).
>>>
>>> https://www.mediawiki.org/wiki/User:ABaso_(WMF)/Extension_Responsibility
>>>
>>> Presuming no major issues with this, I think we should move the page to
>>> mw:Reading/Extension_Responsibility.
>>>
>>> One important outstanding question:
>>>
>>> Is MultimediaViewer appropriate for Reading given its
>>> consumption-oriented nature? Or is this actually better suited to Editing
>>> (where there exists a team named Multimedia)?
>>>
>>> Some other notes:
>>>
>>> * For skins with low utilization, we in time probably should coordinate
>>> handover to interested community members (or discuss with community members
>>> practical approaches for EOL).
>>>
>>> * Regarding the Nostalgia skin, we believe it's only used on
>>> https://nostalgia.wikipedia.org/wiki/HomePage, so maintenance would be
>>> updating for breaking skin changes or security issues only.
>>>
>>> * JsonConfig, ZeroBanner, ZeroPortal - we'll need to examine this more
>>> closely. Yuri (who has deepest PHP knowledge on extensions) is now over in
>>> Discovery, Jeff (JS & Lua) is in Reading, and now I'm managing instead of
>>> writing lots of code.
>>>
>>> * Collection probably belongs in Services
>>>
>>>
>>>
>>> _______________________________________________
>>> Mobile-l mailing list
>>> Mobile-l(a)lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/mobile-l
>>>
>>>
>>
https://twitter.com/paul_irish/status/664507136710316032?s=09
There are a couple of links to interesting articles about native apps,
discoverability, and the sustainability of the app stores for app
developers. Not sure how this impacts us, but it's interesting.
Hello!
I wanted to send an update with results from the "Try the free Wikipedia
app" banners that we ran in Finland.
The Reading team wanted to learn more about how readers perceive the native
Wikipedia apps. From August 27 - September 3, 2015 we ran banners
encouraging readers to download the Android app in Finland in order to see
if this increased awareness and usage of the app. We chose to test in
Finland because it would not interfere with other fundraising and community
banner campaigns.
While we saw a big increase in installs and opens, we found that the
campaign did not significantly increase pageviews, leading us to conclude
that encouraging readers to use the native app would not benefit them right
now.
For more complete results, including screenshots of the banners, check out
this wiki page:
https://www.mediawiki.org/wiki/Wikimedia_Apps/Finland_Banner_Test
Thanks,
Anne
--
*Anne Gomez* // Product Manager, Fundraising & Reading
https://wikimediafoundation.org/
*Imagine a world in which every single human being can freely share in the
sum of all knowledge. That's our commitment. Donate
<http://donate.wikimedia.org>. *
Hi all,
here is the usual weekly look at our most important readership metrics
(CCing the Analytics-l mailing list too this time).
As laid out earlier, the main purpose is to raise awareness about how these
are developing, call out the impact of any unusual events in the preceding
week, and facilitate thinking about core metrics in general. We are still
iterating on the presentation (e.g. to better take seasonality into
account, in particular including year-over-year comparisons) and eventually
want to create dashboards for those that are not already available in
that form. Feedback and discussion continue to be welcome.
Now to the usual data. (All numbers below are averages for October
26-November 1, 2015 unless otherwise noted.)
Pageviews
Total: 525 million/day (-1.5% from the previous week)
Context (April 2015-October 2015):
(see also the Vital Signs dashboard
<https://vital-signs.wmflabs.org/#projects=all/metrics=Pageviews>)
1.5% is a somewhat noticeable drop, and as in the last report, I ran a
query for the countries with the largest changes from the previous week.
There is some interesting data, but not enough to attribute the overall
drop to a particular area:
* Ireland +40.2%
* Romania -35.6% (previous week: +60%)
* France +14.5%
* Philippines -12.4% (previous week: -12.4%(!))
* Mexico -10.3%
* Colombia -9.2%
* Ecuador -8.9%
* Israel -7.7%
* Malaysia -7.6%
* Sweden -7.5%
Desktop: 57.7%
Mobile web: 41.1%
Apps: 1.2%
(same as previous week)
Global North ratio: 77.1% of total pageviews (previous week: 76.9%)
Context (April 2015-November 2015):
This week, instead of plotting the absolute numbers as usual
<https://commons.wikimedia.org/wiki/File:Wikimedia_pageviews,_Global_South_v…>,
let’s chart the percentage:
It’s not definite proof, but this chart shows a pretty clear rise in
the Global North ratio (or, conversely, a decrease in the share of
traffic from the Global South) during the time of the staggered
HTTPS-only rollout in June.
Unique app users
Android: 1.161 million /day (-0.0% from the previous week)
Context (August-November 2015):
iOS: 278k / day (-0.7% from the previous week)
Context (August-November 2015):
As anticipated, the marked increase in new installations while the app was
featured recently (see below) did not move the DAU needle much.
New app installations
Android: 37.2k/day (-1.2% from the previous week)
Daily installs per device, from Google Play
Context (August-November 2015):
iOS: 3.96k/day (-35.0% from the previous week)
Download numbers from App Annie
Context (August-November 2015):
A slightly conspicuous drop last Thursday. But most of the large
week-over-week decrease came from the app having been featured in the App
Store previously (see earlier weekly reports).
And since you read this far, a little reward in form of a link
<http://www.adweek.com/news/advertising-branding/ad-day-adobe-knows-what-you…>
to a mildly amusing 1 minute video ad that mocks the data analytics mishaps
of a fictitious but easily recognized large encyclopedia project ;)
----
For reference, the queries and source links used are listed below (access
is needed for each). Most of the above charts are available on Commons, too
<https://commons.wikimedia.org/wiki/Category:Wikimedia_readership_metrics_re…>
.
hive (wmf)> SELECT SUM(view_count)/7000000 AS avg_daily_views_millions FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-10-26"
AND "2015-11-01";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) as date,
sum(IF(access_method <> 'desktop', view_count, null)) AS mobileviews,
SUM(view_count) AS allviews FROM wmf.projectview_hourly WHERE year=2015 AND
agent_type = 'user' GROUP BY year, month, day ORDER BY year, month, day
LIMIT 1000;
hive (wmf)> SELECT access_method, SUM(view_count)/7 FROM
wmf.projectview_hourly WHERE agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-10-26"
AND "2015-11-01" GROUP BY access_method;
hive (wmf)> SELECT SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0))/SUM(view_count) FROM wmf.projectview_hourly WHERE
agent_type = 'user' AND
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")) BETWEEN "2015-10-26"
AND "2015-11-01";
hive (wmf)> SELECT year, month, day,
CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0")), SUM(view_count) AS
all, SUM(IF (FIND_IN_SET(country_code,
'AD,AL,AT,AX,BA,BE,BG,CH,CY,CZ,DE,DK,EE,ES,FI,FO,FR,FX,GB,GG,GI,GL,GR,HR,HU,IE,IL,IM,IS,IT,JE,LI,LU,LV,MC,MD,ME,MK,MT,NL,NO,PL,PT,RO,RS,RU,SE,SI,SJ,SK,SM,TR,VA,AU,CA,HK,MO,NZ,JP,SG,KR,TW,US')
> 0, view_count, 0)) AS Global_North_views FROM wmf.projectview_hourly
WHERE year = 2015 AND agent_type='user' GROUP BY year, month, day ORDER BY
year, month, day LIMIT 1000;
SELECT country_code, changeratio, ROUND(milliondailyviewsthisweek,1) AS
milliondailyviewsthisweek FROM
(SELECT country_code, ROUND(100*SUM(IF((day>25 AND month=10) OR (day<2
AND month=11), view_count, null))/SUM(IF(day>18 AND day<26, view_count,
null))-100,1) AS changeratio, SUM(IF((day>25 AND month=10) OR (day<2 AND
month=11), view_count, null))/7000000 AS milliondailyviewsthisweek
FROM wmf.projectview_hourly
WHERE
year = 2015
AND month > 9
AND agent_type = "user"
GROUP BY country_code)
AS countrylist
WHERE milliondailyviewsthisweek > 1 GROUP BY country_code, changeratio,
milliondailyviewsthisweek ORDER BY ABS(changeratio) DESC LIMIT 10;
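The changeratio expression in this query can be sketched in Python as
follows (the view counts here are made up, not real data):

```python
def change_ratio(this_week, last_week):
    """Week-over-week percentage change, matching the query's
    100*this/last - 100, rounded to one decimal place."""
    return round(100 * this_week / last_week - 100, 1)

# Hypothetical weekly view totals for one country.
print(change_ratio(1402000, 1000000))  # 40.2 (a +40.2% rise)
print(change_ratio(644000, 1000000))   # -35.6 (a -35.6% drop)
```

The outer query then keeps only countries above one million daily views
and sorts by the absolute value of this ratio.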
hive (wmf)> SELECT SUM(IF(platform = 'Android',unique_count,0))/7 AS
avg_Android_DAU_last_week, SUM(IF(platform = 'iOS',unique_count,0))/7 AS
avg_iOS_DAU_last_week FROM wmf.mobile_apps_uniques_daily WHERE
CONCAT(year,LPAD(month,2,"0"),LPAD(day,2,"0")) BETWEEN 20151026 AND
20151101;
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS Android_DAU FROM wmf.mobile_apps_uniques_daily
WHERE platform = 'Android';
hive (wmf)> SELECT CONCAT(year,"-",LPAD(month,2,"0"),"-",LPAD(day,2,"0"))
as date, unique_count AS iOS_DAU FROM wmf.mobile_apps_uniques_daily WHERE
platform = 'iOS';
https://console.developers.google.com/storage/browser/pubsite_prod_rev_0281…
(“overview”)
https://www.appannie.com/dashboard/252257/item/324715238/downloads/?breakdo…
(select “Total”)
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
Hi Everyone --
We wanted to give you an update on where we were with the strategy process
and provide some more avenues for feedback and participation.
When we last spoke, we had presented an initial draft of our strategy along
with some details of the process we are using. This did not get the
feedback that we had hoped for[1]. We considered this feedback and came up
with some conclusions that we've used to move forward:
1. The strategic problem itself was too broad and unclear to be a
suitable foundation for discussion or specifics.
2. The process is difficult to understand if you haven't been immersed
in it.
3. Strategy is hard and collaborative strategy, particularly with a
large group of distributed stakeholders, is harder.
In response to these conclusions, we've done the following:
1. Simplified and pared down our scope and made our strategic problem
more accessible and actionable.
   2. Changed our communications plan to focus more on the outcomes of the
strategic process (what we propose to do vs. how we came to these ideas),
with the belief that this will be easier to discuss.
   3. Moved forward with the process to get to more specific plans and,
most importantly, started reverse-engineering them to see whether they
will be effective. We also made a video to explain where we are and have
started a process to gather input on design from the community.
We'd like to ask for your help in the following way:
1. Please review this page:
https://www.mediawiki.org/wiki/Reading/Strategy/Strategy_Process/Testing for
details on how you can provide feedback and become more involved in the
process. We're also very happy to receive feedback on the process itself
and on the strategic options we are reverse-engineering.
- The Reading Team
[1] Our favorite was something along the lines of "It sounds like you put
all of the buzzwords into a bag, shook it up, and poured it out on a table"
-- this was good.