Hi,
Does anybody know where this comes from, i.e., "view structure for view
'page_view'" (see below)? I was not able to find anything about this. Is
this from MediaWiki core or from some extension I do not know about?
Google suggests some Google Analytics-related stuff, but that does not
look like it to me. For context: line 50013 prevents me from importing the
database correctly.
Thanks a lot for a pointer or some insight.
Cheers, Karsten
--
-- Final view structure for view `page_view`
--
/*!50001 DROP VIEW IF EXISTS `page_view`*/;
/*!50001 SET @saved_cs_client = @@character_set_client */;
/*!50001 SET @saved_cs_results = @@character_set_results */;
/*!50001 SET @saved_col_connection = @@collation_connection */;
/*!50001 SET character_set_client = utf8 */;
/*!50001 SET character_set_results = utf8 */;
/*!50001 SET collation_connection = utf8_general_ci */;
/*!50001 CREATE ALGORITHM=UNDEFINED */
/*!50013 DEFINER=`root`@`localhost` SQL SECURITY DEFINER */
/*!50001 VIEW `page_view` AS select `page`.`page_title` AS
`page_title`,`page`.`page_counter` AS `page_counter` from `page` */;
/*!50001 SET character_set_client = @saved_cs_client */;
/*!50001 SET character_set_results = @saved_cs_results */;
/*!50001 SET collation_connection = @saved_col_connection */;
/*!40103 SET TIME_ZONE=@OLD_TIME_ZONE */;
/*!40101 SET SQL_MODE=@OLD_SQL_MODE */;
/*!40014 SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS */;
/*!40014 SET UNIQUE_CHECKS=@OLD_UNIQUE_CHECKS */;
/*!40101 SET CHARACTER_SET_CLIENT=@OLD_CHARACTER_SET_CLIENT */;
/*!40101 SET CHARACTER_SET_RESULTS=@OLD_CHARACTER_SET_RESULTS */;
/*!40101 SET COLLATION_CONNECTION=@OLD_COLLATION_CONNECTION */;
/*!40111 SET SQL_NOTES=@OLD_SQL_NOTES */;
Hi everyone,
Do you want to present at Wikimania 2023 <https://www.wikimania.org>?
Wikimania program submissions are open now until the end of day Tuesday 28
March, anywhere on earth <https://zonestamp.toolforge.org/1680091140>.
After the postponed 2020 edition and the two online editions that followed,
Wikimania is now back with an *in-person*, *hybrid* and
*on-demand* event in Singapore, from 16 to 19 August 2023!
The theme for Wikimania 2023 is *Diversity, Collaboration, Future*. There
are 11 tracks to choose from: familiar ones like Community Initiatives,
Governance, GLAM, or Technology; and new ones like Open Data and Wild
Ideas. You can submit an interactive workshop or panel, a lecture, a short
lightning talk, or a poster for our dedicated poster session. Making a
submission is easy and we have upcoming conversation hours on Sunday March
19 at 00:00 and 14:00 UTC to help you out. You can also reach out to us on
the talk page or on Telegram. All the information you need is available on
wiki <https://wikimania.wikimedia.org/wiki/2023:Program/Submissions>.
To explore suggested topics for the tracks, please see here
<https://wikimania.wikimedia.org/wiki/2023:Program/Submissions#Tracks>; to
browse through the already submitted proposals, please go to the Program
Submission category
<https://wikimania.wikimedia.org/wiki/Category:Wikimania_2023_Program_submis…>
on the Wikimania wiki.
We welcome your session proposals. Read more on *Diff*
<https://diff.wikimedia.org/2023/02/28/be-part-of-the-wikimania-2023-program/> and
start preparing your ideas!
Kind regards,
*Ciell*
On behalf of the Wikimania 2023 Core Organizing Team
Hi,
Andre suggested mailing this list about task T236671 [0]. It is about
a misleading error message issued when having a senior moment and trying
to run "update.php" against a non-existent database.
The issue is not a big deal, but it may be possible to improve usability
without considerable effort. I cannot assess that myself, though.
Thanks a lot for having a peep at this.
Cheers, Karsten
[0] https://phabricator.wikimedia.org/T236671
Hello,
I have deployed a change to Gerrit which makes it display the ongoing
CI/Zuul build if there is any.
If Jenkins jobs are running, you will see, below the commit message, gray
chips with the name of the Zuul pipeline (test, gate-and-submit,
...). The Checks tab shows the job details
(https://phabricator.wikimedia.org/F36925186).
By exposing the Zuul CI status directly in the Gerrit web UI, people
will notice a build error earlier. That also saves the hassle of having
to monitor https://integration.wikimedia.org/zuul/.
There are a few glitches:
* the way I have implemented it abuses the model proposed by Gerrit:
in-progress jobs are always shown as SUCCESS, but they will be
marked as ERROR when they fail.
* the code runs entirely in the client browser. It is unable to send
notifications or trigger any email when a build has failed. (The
EarlyWarning bot by Kosta Harlan
<https://phabricator.wikimedia.org/J302> does that, though.)
* I am not a JavaScript developer per se, but I learned about TypeScript
for static analysis and rediscovered QUnit, so at least there are
some basic guarantees.
* there are surely a bunch of edge cases that I have not properly handled
For those who are curious, the code is at
https://gerrit.wikimedia.org/g/operations/software/gerrit/+/refs/heads/depl…
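To give a rough idea of the client-side approach, below is a simplified
sketch of polling Zuul's status data for a change and summarizing its job
states. This is not the actual plugin code: the status.json endpoint and
the JSON shape shown here are assumptions based on Zuul's public status
page, and the types are trimmed down to what the sketch needs.

    // Simplified sketch (TypeScript): poll Zuul's status JSON and collect
    // the job states for a given Gerrit change number. Endpoint and JSON
    // shape are assumptions, not guaranteed to match the deployed plugin.
    interface ZuulJob { name: string; result: string | null; } // null while running
    interface ZuulChange { id: string; jobs: ZuulJob[]; }      // id is "<change>,<patchset>"
    interface ZuulQueue { heads: ZuulChange[][]; }
    interface ZuulPipeline { name: string; change_queues: ZuulQueue[]; }
    interface ZuulStatus { pipelines: ZuulPipeline[]; }

    const ZUUL_STATUS_URL = 'https://integration.wikimedia.org/zuul/status.json';

    async function findChange(
      changeNumber: string,
    ): Promise<{ pipeline: string; jobs: ZuulJob[] }[]> {
      const res = await fetch(ZUUL_STATUS_URL);
      const status: ZuulStatus = await res.json();
      const hits: { pipeline: string; jobs: ZuulJob[] }[] = [];
      for (const pipeline of status.pipelines) {
        for (const queue of pipeline.change_queues) {
          for (const head of queue.heads) {
            for (const change of head) {
              if (change.id.startsWith(changeNumber + ',')) {
                hits.push({ pipeline: pipeline.name, jobs: change.jobs });
              }
            }
          }
        }
      }
      return hits; // one entry per pipeline currently running this change
    }

In this sketch, a null result corresponds to a job still in progress, which
relates to the first glitch listed above.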
If you see problems, JavaScript errors, etc., please paste them on
https://phabricator.wikimedia.org/T214068 :)
Antoine "hashar" Musso
TL;DR: The legacy Mobile Content Service is going away in July 2023. Please
switch to Parsoid or another API before then to ensure service continuity.
Hello World,
I'm writing about a service decommission we hope to complete mid-July 2023.
The service to be decommissioned is the legacy Mobile Content Service
("MCS"), which is maintained by the Wikimedia Foundation's Content
Transform Team. We will be marking this service as deprecated soon.
We hope that with this notice, people will have ample time to update their
systems to use other endpoints such as Parsoid [1] (n.b., MCS uses
Parsoid HTML).
The MCS endpoints are the ones with the relative URL path pattern
/page/mobile-sections* on the Wikipedias. For examples of the URLs, see the
"Mobile" section of the online Swagger (OpenAPI) specification
documentation here:
https://en.wikipedia.org/api/rest_v1/#/Mobile
== History ==
The Mobile Content Service ("MCS") is the historical aggregate service that
originally provided support for the article reading experience on the
Wikipedia for Android native app, as well as some other experiences. We
have noticed that there are other users of the service. We are not able to
determine all of the users, as it's hard to tell with confidence from the
web logs.
The Wikimedia Foundation transitioned the Wikipedia Android and iOS
apps to the newer Page Content Service ("PCS") several years ago. PCS has
some similarities with MCS in terms of its mobile focus, but it also has
different request-response signatures in practice. PCS, like MCS, is
intended primarily to serve Wikimedia Foundation-maintained user
experiences only, and so it is classified with the "unstable" moniker.
== Looking ahead ==
Generally, as noted in the lead, we recommend that folks who use MCS (or
PCS, for that matter) switch over to Parsoid, as the most predictable way
of accessing Wikipedia article content programmatically.
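As a concrete illustration, a client that currently requests
/page/mobile-sections/{title} could instead fetch Parsoid HTML from the
/page/html/{title} route documented alongside it in the same REST API. A
minimal sketch follows; the Api-User-Agent value is a placeholder and
error handling is kept deliberately simple.

    // Minimal sketch: fetch Parsoid HTML for an article title via the
    // public REST API, replacing a deprecated /page/mobile-sections* call.
    // Replace the Api-User-Agent placeholder with your own contact info.
    async function fetchParsoidHtml(title: string): Promise<string> {
      const url =
        'https://en.wikipedia.org/api/rest_v1/page/html/' + encodeURIComponent(title);
      const res = await fetch(url, {
        headers: { 'Api-User-Agent': 'example-mcs-migration/0.1 (you@example.org)' },
      });
      if (!res.ok) {
        throw new Error('Parsoid HTML request failed: ' + res.status);
      }
      return res.text(); // versioned Parsoid HTML, per the spec in [2]
    }

Note that any parsing logic written against the mobile-sections response
format will need to be adapted to the Parsoid HTML output.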
The HTML produced by Parsoid has a versioned specification [2], and because
Parsoid is accessed regularly by a number of components across the globe,
its responses tend to be fairly well cached. However, please note that
Parsoid may be subject to stricter rate limits under certain
traffic patterns.
At this point, I do also want to note that in order to keep up with
contemporary HTML standards, particularly those favoring accessibility and
machine readability enhancements, Parsoid HTML will undergo change as we
further converge parsing stacks [3]. Generally, you should expect iteration
on the Parsoid HTML spec, and, as you may have come to appreciate,
the shape of the HTML in practice can vary nontrivially wiki by wiki as
practices across wikis vary.
You may also want to consider Wikimedia Enterprise API options, which range
from no-cost access to paid higher-volume options.
https://meta.wikimedia.org/wiki/Wikimedia_Enterprise#Access
== Forking okay, but not recommended ==
Because MCS acts as a service aggregate and makes multiple backend API
calls, caveats can apply to those subresources - the possibility of API
changes, deprecation, and the like. We do not recommend a plain fork of the
MCS code because of this subresource fetch behavior. That said, of course
you are welcome to fork in a way compatible with MCS's license.
== Help spread the word ==
Although we are aware of the top two remaining consumers of MCS, we are
not sure who else is accessing it and anticipate that some downstream
tech may break when MCS is turned off. We are cross-posting this
message in the hope that most people who have come to rely upon MCS will
see it. Please feel free to forward it to contacts if you know
they are using MCS.
== Help ==
Although we intend to decommission MCS in July 2023, we would like to share
resources if you need some help. We plan to hold office hours in case you
would like to meet with us to discuss this or other Content Transform Team
matters. We will host these events on Google Meet. We will provide notice
of these office hours on the wikitech-l mailing list in the coming weeks
and months.
Additionally, if you would like to discuss your MCS transition plans,
please visit the Content Transform Team talk page:
https://www.mediawiki.org/wiki/Talk:Content_Transform_Team
Finally, some Content Transform Team members will also be at the Wikimedia
Hackathon [4] if you would like some in-person support.
Thank you.
Adam Baso (he/him/his/Adam), on behalf of the Content Transform Team
Director of Engineering
Wikimedia Foundation
[1] https://www.mediawiki.org/wiki/Parsoid
[2] https://www.mediawiki.org/wiki/Specs/HTML
[3] https://www.mediawiki.org/wiki/Parsoid/Parser_Unification/Updates
[4] https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2023
Wikimedia Phabricator tasks have a "Priority" dropdown field (which is
not consistently used) with several discrete values, among them
"Lowest". See [1] for the full list of Priority field values.
Since [2], "Low" and "Lowest" priority share the same definition.
I propose to disable setting the "Lowest" Priority value in Phab tasks.
"Lowest priority" can sound demotivating / disrespectful ("there is
nothing that could be even less important"). However, bikeshedding
about changing the name of the value would ignore further observations:
* About half of the people who are most active in setting initial
Priority values do not ever set Priority to "Lowest"[3].
* There is no significant difference between the median age of open tasks
with Low and with Lowest priority[4], so nothing seems to get
meaningfully differentiated here.
* Personally, I also assume Lowest priority is sometimes used instead
of honestly declining a task (meaning: "this is not a good idea"[5]).
But of course that is rather hard to prove.
If you have opinions and/or ideas on how to interpret the data differently,
please add them to https://phabricator.wikimedia.org/T228759 to keep
them in a single place - either as a text comment, or via "Award Token"
to express support or disagreement without adding words.
Thanks,
andre
[1] https://www.mediawiki.org/wiki/Phabricator/Project_management#Priority_leve…
[2] https://phabricator.wikimedia.org/T317533
[3] https://phabricator.wikimedia.org/T228759#6988320
[4] https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/…
[5] https://www.mediawiki.org/wiki/Bug_management/Bug_report_life_cycle
--
Andre Klapper (he/him) | Bugwrangler / Developer Advocate
https://blogs.gnome.org/aklapper/
The mission of Wikimedia Performance is for our sites to transcend socioeconomic barriers around reliable and fast access to find and contribute knowledge. We provide tools and expertise to empower developers, and directly inform or undertake high-yield engineering projects. [1][2]
Below is a periodic introduction and summary of recent changes to our guides. If you haven't read these before, or if it's been more than six months, I recommend taking a fresh look, especially if you work on frontend or backend components in a MediaWiki extension or in MediaWiki core.
== *Current best practices* ==
The practices guides help set direction. Use them to guide new developments, or to identify areas for improvement in current code. If you're short on time, focus on the "Getting started" section atop each guide.
*Frontend*: https://wikitech.wikimedia.org/wiki/Performance/Guides/Frontend_performance…
* The introduction details the principles that drive our platform's architecture, and how to get the most out of it.
* Changed: The CSS "@embed" optimisation is now only recommended for very small icons (up to 0.3KB). The guide explains why and how.
*Backend*: https://wikitech.wikimedia.org/wiki/Performance/Guides/Backend_performance_…
* New "Getting started" section, with pointers to specific chapters for detailed guidance.
* Rewritten "Shared resources" chapter, now with a more accessible explanation of MySQL deadlocks and how to avoid them.
* Update "Multi-datacenter deployment" guidance. (No changes are needed to existing code.) We first adopted Multi-DC practices in 2015. WANObjectCache and JobQueue interfaces have gotten simpler since. Cross-DC purges and job queuing "just work", with no awareness or responsibility on calling code. MediaWiki now automatically pins a browser to the primary DC for a few seconds after publishing an edit. This allowed us to remove cross-DC complexity around the ChronologyProtector <https://doc.wikimedia.org/mediawiki-core/master/php/classWikimedia_1_1Rdbms…>.
== *Measuring* ==
The new "measure" guides help assess performance of existing code, and can help iterate development of proposed changes.
Frontend includes browser dev tools and continuous monitoring through dedicated perf testing infrastructure:
https://wikitech.wikimedia.org/wiki/Performance/Guides/Measure_frontend_per…
Backend includes flame graphs, benchmarking, and automatic Grafana stats if you adopt WANObjectCache:
https://wikitech.wikimedia.org/wiki/Performance/Guides/Measure_backend_perf…
== *More* ==
The above guides and an overview of datasets, tools, recommended Grafana dashboards, and infrastructure diagrams are available at:
https://wikitech.wikimedia.org/wiki/Performance
On behalf of the Performance Team,
-- Timo Tijhof.
[1] https://techblog.wikimedia.org/2018/12/12/why-performance-matters/
[2] https://www.mediawiki.org/wiki/Wikimedia_Performance_Team#Mission