Hello everyone,
The Core Platform and Parsing teams at the Wikimedia Foundation are glad
to announce the implementation of a content negotiation protocol for
Parsoid HTML in the REST API [1]. This was deployed to the Wikimedia
cluster on October 1, 2018.
TL;DR
-----
Parsoid HTML clients can now use the Accept header to specify which
version of content they expect when requesting Parsoid HTML from the
REST API. If omitted, as before, they will get whatever version of the
HTML is in storage, regardless of any breaking changes it may contain.
Parsoid's HTML is versioned
---------------------------
An advantage of Parsoid’s HTML output is that it is both specced and
versioned [2]. By adhering to the principles of semantic versioning [3],
Parsoid can signal to clients what kinds of changes can be expected
in the output between versions.
However, until recently, Parsoid always returned the latest version
of its HTML. Naturally, this posed challenges when deploying breaking
changes since clients had to be prepared to consume the newer version.
Rolling out new HTML versions without breaking clients
------------------------------------------------------
Throughout Parsoid's history, its developers have been in close enough
contact with the developers of Parsoid clients (most of which are
internal to the Wikimedia Foundation) to coordinate the deployment
of breaking changes to the HTML. This mainly involved ensuring all
known clients were forward and backward compatible with the newer
HTML version before deploying the change. Needless to say, as more
clients came along, this informal process would not suffice;
a scalable and predictable version upgrade solution was needed.
Content Negotiation Protocol
----------------------------
To solve this problem, a content negotiation protocol [4] relying on
HTTP Accept headers was implemented. See RESTBase’s documentation [5]
for the exact details of the protocol. What follows is just an
informal description.
Parsoid clients are expected to pass an Accept header that specifies
the HTML version they can handle. If the version present in storage
does not satisfy the request, RESTBase will attempt to resolve the
inconsistency. However, if the requested version cannot be satisfied,
an (HTTP 406) error will be returned. The meaning of “satisfied” here
mostly follows semver’s caret semantics [6] (the main difference being
that the patch level is ignored).
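Informally, the "satisfied" check can be sketched like this (an
illustrative Python sketch, not RESTBase's actual implementation; it
assumes versions of the form major.minor.patch with major >= 1):

```python
def satisfies(requested, stored):
    """Does the stored HTML version satisfy the requested one?

    Follows semver caret semantics with the patch level ignored:
    the major versions must match exactly, and the stored minor
    version must be at least the requested minor version.
    """
    req_major, req_minor = (int(p) for p in requested.split(".")[:2])
    st_major, st_minor = (int(p) for p in stored.split(".")[:2])
    return st_major == req_major and st_minor >= req_minor

# A client asking for 1.6.0 is fine with stored 1.8.0 (additive
# changes only), but 2.0.0 signals a breaking change and cannot
# satisfy that request.
print(satisfies("1.6.0", "1.8.0"))  # True
print(satisfies("1.6.0", "2.0.0"))  # False
```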
If a client does not pass the Accept header, everything works exactly
like before, with all the downsides of the previous behaviour:
no protection from breaking changes; you get whatever HTML version
is currently in storage.
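Concretely, a request pinning a version might look like the following
sketch (the endpoint path and page title are illustrative; the profile
parameter names the versioned Parsoid HTML spec page):

```python
# Build the Accept header a Parsoid HTML client would send to pin
# the content version it can handle.
version = "2.0.0"
accept = ('text/html; charset=utf-8; '
          f'profile="https://www.mediawiki.org/wiki/Specs/HTML/{version}"')
print(accept)

# With e.g. the `requests` library (not run here), the call would be:
#   requests.get("https://en.wikipedia.org/api/rest_v1/page/html/Foo",
#                headers={"Accept": accept})
# An HTTP 406 response means the requested version cannot be satisfied.
```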
Caveat emptor
-------------
The deployed Parsoid version generates HTML versions 1.8.0 [7] and
2.0.0 [8]. However, the oldest version that can be requested is 1.6.0,
so if you’re sending an Accept header with a version less than 1.6.0,
your application will break. The reason for this odd constraint is
that we mistakenly released 1.6.0 without bumping the major version
[9] even though it introduced a breaking change. Mea culpa!
Also, RESTBase only stores the latest version, so as content gets
rerendered and storage gets replaced, clients requesting older content
pay a latency penalty while the stored content is downgraded to an
appropriate version. Hence, we encourage Parsoid HTML clients to
pay attention to announcements about major version changes and upgrade
promptly. Going forward, we’ll send announcements about Parsoid HTML
version changes on the mediawiki-api-announce mailing list.
How does this impact 3rd party wikis?
-------------------------------------
Finally, astute readers will have noted that this announcement
concerns the REST API. However, many 3rd party installs have VE
communicating directly with Parsoid and may be wondering how they’ll
be impacted by the change.
Parsoid has implemented a similar protocol (differing mainly in that
it respects the patch level) since the v0.9.0 release [7]. So, going
forward, when upgrading Parsoid or VE, the upgrade will be safe as
long as the HTML version requested by VE can be provided by Parsoid.
In Conclusion
-------------
Content negotiation now allows us to deploy new Parsoid features to the
Wikimedia cluster without needing prior coordination with all clients.
Clients can continue to request older versions until they are ready to
update (assuming they don’t fall too far behind since we only plan on
supporting two major versions concurrently). And, conversely, they can
request newer versions with the guarantee that they will not receive
incompatible content.
[1]: https://phabricator.wikimedia.org/T128040
[2]: https://www.mediawiki.org/wiki/Specs/HTML
[3]: https://semver.org/
[4]: https://tools.ietf.org/html/rfc7231#section-5.3
[5]:
https://www.mediawiki.org/wiki/API_versioning#Content_format_stability_and_…
[6]: https://www.npmjs.com/package/semver#caret-ranges-123-025-004
[7]: https://www.mediawiki.org/wiki/Specs/HTML/1.8.0
[8]: https://www.mediawiki.org/wiki/Specs/HTML/2.0.0
[9]:
https://lists.wikimedia.org/pipermail/mediawiki-l/2018-March/047337.html
https://www.mediawiki.org/wiki/Scrum_of_scrums/2018-11-14
=2018-11-14=
== Callouts ==
* Release Engineering
** 1.33.0-wmf.4 deployment blockers
https://phabricator.wikimedia.org/T206658
*** memcached error: A BAD KEY WAS PROVIDED/CHARACTERS OUT OF RANGE
https://phabricator.wikimedia.org/T209429
* Weird writes to codfw kafka cluster:
https://phabricator.wikimedia.org/T207994
== Audiences ==
=== Contributors ===
==== Community Tech ====
* Blocked by:
* Blocking:
* Updates:
**
==== Anti-Harassment Tools ====
* Blocked by:
* Blocking:
* Updates:
**
==== Editing ====
* Blocked by:
* Blocking:
* Updates:
**
==== Growth ====
* Blocked by:
* Blocking:
* Updates:
**
==== Language ====
* Blocked by:
* Blocking:
* Updates:
**
=== Readers ===
==== iOS native app ====
* Blocked by:
* Blocking:
* Updates:
** Breaking down the editing work (
https://phabricator.wikimedia.org/tag/ios-app-v6.2-beluga-on-a-pogo-stick/)
** Preparing for fundraising
==== Android native app ====
* Blocked by:
* Blocking:
* Updates:
**
==== Readers Web ====
* Blocked by:
* Blocking:
* Updates:
** Summary: We're performing a staged rollout of SEO changes, continuing
the MobileFrontend investment project, and trying to wrap up Proton and
page issues projects.
** Mobile website (MinervaNeue / MobileFrontend):
*** Invest in the MobileFrontend & MinervaNeue frontend architecture
https://www.mediawiki.org/wiki/Reading/Web/Projects/Invest_in_the_MobileFro…
**** Merge mobile.references and mobile.references.gateway ResourceLoader
modules T207805
**** Remove unused MW configs loaded on desktop pageviews T186062
**** T206699 Add tests for Button, Panel and Section
**** Improve View composition T209007
**** Increase test coverage for non-View files with 0% coverage T206698
**** MobileFrontend pre-commit hooks don't work on Windows T208143
*** Page issues
https://www.mediawiki.org/wiki/Reading/Web/Projects/Mobile_Page_Issues
**** [Research 4hrs] [Bug] Page issues link overlapping in multiple
languages for larger screens T206887
**** Feature flag the page issues code T206179
**** Prepare selenium browser test for page issues A/B test T206647
*** Allow users to change their mobile skin preference T173527
*** Maintenance and bug fixes T202557 T99009
** SEO:
*** Old page_random values are nonuniformly distributed T208909 (Thanks
Gergő Tisza, Tilman Bayer, Piotr Miazga, Sam Smith, Tim Starling, Brad
Jorsch, Alex Monk, Brion Vibber, Daniel Kinzler, Max Semenik)
*** Staged rollout for SEO A/B test T208755
** PDF rendering (Proton)
https://www.mediawiki.org/wiki/Reading/Web/PDF_Functionality
*** Allow BBPromise cancellation T209070
*** Rewrite Queue to Promises T204055
*** Remaining work tracked in T186748
==== Readers Infrastructure ====
* Blocked by:
* Blocking:
* Updates:
** Maps:
*** The Stretch upgrade/data loading on the maps cluster continues (
https://phabricator.wikimedia.org/T205462 )
**** Some performance changes were made this week to help speed up the
process
** ReadingLists:
*** Batched deletion queries to give more predictable performance (
https://phabricator.wikimedia.org/T189926 )
** MCS/PCS:
*** Still working on improving performance by breaking up CPU-intensive
work (https://phabricator.wikimedia.org/T202642 )
*** Working with the app teams on end-of-year fundraising announcements (
https://phabricator.wikimedia.org/T204821 )
*** Working on feed endpoint, adding attributes to "Picture of the day" (
https://phabricator.wikimedia.org/T202311 )
==== Multimedia ====
* Blocked by:
* Blocking:
* Updates
** Main focus continues to be on Structured Data on Commons, getting file
captions live on production Commons
https://phabricator.wikimedia.org/T194750
*** Thanks to colleagues in the Core Platform/MCR virtual team for work on
anti-vandalism extension support.
*** Still need progress on the https://phabricator.wikimedia.org/T194046
and https://phabricator.wikimedia.org/T200915 MCR tasks for
https://phabricator.wikimedia.org/T205891
==== Parsing ====
* Blocked by:
* Blocking:
* Updates:
==== UI Standardization ====
* Blocked by:
* Blocking:
* Updates:
** OOUI v0.29.4 & v0.29.5 were released last week, and v0.29.3 the week
before (missed SoS last week, therefore mentioning it now); highlights:
*** TagMultiselectWidget now becomes invalid if there's text in the input,
and there's a `tagLimit` option now, thx Tchanders
*** Icon-only ButtonWidgets can carry a “waterproof” accessible label since
0.29.3 that's only visible to screen readers!
- You only have to add a label as usual and the additional config option
`invisibleLabel: true`
- Updating documentation on mediawiki.org –
https://www.mediawiki.org/wiki/OOUI
*** Accessibility measurements across products, like MobileFrontend and
portals
== Technology ==
=== Analytics ===
* Blocked by:
* Blocking:
* Updates:
** Legacy Wikistats home page now links to wikistats2; deprecated the older
reportcard completely: http://stats.wikimedia.org
** Upgraded the cluster to the Cloudera 5.15 distro, which solved several
security issues. Nobody noticed the upgrade happened, so SUCCESS.
** Refactoring our privacy policy data deletion scripts for safety [wiki,
email]
** We ran into quite an issue when calculating the reconstruction of
MediaWiki history this month: there was a major refactor in MediaWiki
that we did not know had happened (comment tables), and the way that
data is surfaced on labs is very suboptimal for analytics use cases.
Working with the core team/DBAs on finding a better way to expose
comment data: https://phabricator.wikimedia.org/T209031 [wiki, email]
=== Cloud Services ===
* Blocked by:
* Blocking:
* Updates:
**
=== Fundraising Tech ===
* Blocked by:
* Blocking:
* Updates:
** CiviCRM optimizations, bug fixes and new reports
** More cleanup from PayPal's recurring payment ID changes
** Continuing CentralNotice backend work
** Supporting another card processor in Perú and Uruguay
** Testing backup card processor for US and Canada
=== MediaWiki Core Platform ===
* Blocked by:
* Blocking:
* Updates:
**
=== Performance ===
* Blocked by:
**
* Blocking:
**
* Updates:
**
=== Release Engineering ===
* Blocked by:
** 1.33.0-wmf.4 deployment blockers
https://phabricator.wikimedia.org/T206658
*** memcached error: A BAD KEY WAS PROVIDED/CHARACTERS OUT OF RANGE
https://phabricator.wikimedia.org/T209429
* Blocking:
* Updates:
**
=== Research ===
* Blocked by: None
* Blocking: None
* Updates:
** Working on the "Crosslingual Section Alignment in Wikipedia" paper
** Researching ways of resolving interlanguage conflicts for article
recommendations
=== Scoring Platform ===
* Blocked by:
* Blocking:
* Updates:
=== Search Platform ===
* Blocked by:
* Blocking:
* Updates:
** Fixed issue with highlighter breaking surrogate pairs:
https://phabricator.wikimedia.org/T208736
** Setting up A/B test for new wikidata completion models:
https://phabricator.wikimedia.org/T209402
** RDF ontology being moved out of beta:
https://phabricator.wikimedia.org/T112127
** Working on ElasticSearch 6 upgrade:
https://phabricator.wikimedia.org/T183282
** Working on running multiple Elastic instances on the same hardware:
https://phabricator.wikimedia.org/T193654
** Working on query parsing refactoring:
https://phabricator.wikimedia.org/T185108
=== Security ===
* Blocked by:
* Blocking:
* Updates: All reviews are in progress by Security Analysts.
**
=== Services ===
* Blocked by:
* Blocking:
* Updates:
**
=== Site Reliability Engineering ===
* Blocked by:
* Blocking:
* Updates:
**
== Wikidata ==
* Blocked by:
** Operations? Create a wmf production ready nginx image -
https://phabricator.wikimedia.org/T209292
* Blocking:
* Updates:
** Arbitrary data access enabled on oldwikisource and any wiktionaries that
didn't already have access
== German Technical Wishlist ==
* Blocked by:
* Blocking:
* Updates:
**
== Multi-Content Revisions ==
* Blocked by:
* Blocking:
* Updates:
**
== SoS Meeting Bookkeeping ==
* Updates:
**
Hello!
MediaWiki-CodeSniffer 23.0.0 is now available for use in your
MediaWiki extensions and other projects. This release features
new sniffs and other bug fixes.
Here's the changelog since 22.0.0:
* Add comment why @private and @protected are okay (Umherirrender)
* Add sniff to detect + for string concat (Umherirrender)
* Add sniff to detect __METHOD__ in closures (Umherirrender)
* Fix deprecation check for compounded licenses (Umherirrender)
* Recognize MediaWikiTestCaseBase as test class (Aryeh Gregor)
* Remove [optional] from types in @param (Umherirrender)
* Update message to talk about "top level" instead of "file comment"
(Thiemo Kreuz)
* Upgrade squizlabs/php_codesniffer to 3.3.2 (Kunal Mehta)
Thanks,
-- Legoktm
Hi,
In preparation for Wikimedia production switching to PHP 7.2, we need
to get CI running using 7.2 (and for the rest of the MediaWiki world
too!). But before we can do that, we'll need 7.1 to be passing first.
The following extensions are currently failing tests on 7.1:
* FileImporter: T206286
* Kartographer: T206293
* Wikibase: T205958
* WikibaseLexeme: T206100
* WikibaseMediaInfo: T206281
And one non-Wikimedia deployed extension:
* MultiMaps: T206291
Thank you to James F, Matma Rex and Umherirrender for already fixing
some other extensions that were failing.
I'd like to make 7.1 tests voting by the end of the week, and would
appreciate maintainers fixing their extensions before then. You can
currently trigger 7.1 tests to verify fixes by commenting "check
experimental" in Gerrit.
Thanks,
-- Kunal / Legoktm
> what does a few weeks mean exactly
If things go well, sometime in the next two weeks. However, if we
can't get it done (I think there are a few things to review for
deployment) by the end of November, we'll postpone until January to
avoid the English fundraiser. A small reminder, not to be too
pedantic, but most templates described should be fine (famous last
words). :)
If you want to check out your favorite template on the mobile site,
add "?minerva-issues=b" to see the new treatment.
Thanks for the reminder for Tech News! Tabbing over to do that now.
Yours,
Chris Koerner
Community Relations Specialist
Wikimedia Foundation
Hi All,
A reminder that TechCom is hosting an IRC meeting tomorrow (Wednesday
14 November) on RfC: Session storage service interface
<https://phabricator.wikimedia.org/T206010>
This RFC outlines an API for a new multi-master session storage
service for use in multi-DC session management.
The meeting is scheduled for Wednesday 14 November 2pm PST(22:00 UTC,
23:00 CET) in #wikimedia-office.
If you haven't joined a #wikimedia-office meeting before, more
information can be found here
<https://meta.wikimedia.org/wiki/IRC_office_hours#How_to_participate>
More information regarding the TechCom RFC process is available here
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Processes#RFC_…>
Thanks,
-Kate
--
Kate Chapman
Senior Program Manager, Core Platform
Wikimedia Foundation
kchapman(a)wikimedia.org
This is a more solemn email than is usual. I recognize that this email
reflects my personal view, and if this email is not something that you
appreciate then I invite you to disregard it and write your own email
regarding something that makes you happy or grateful this week.
The 11th of November is commemorated in some parts of the world as
Armistice Day, Remembrance Sunday, or Veterans Day. The year 2018 marks the
100th anniversary of Armistice Day. I would like to take a moment to
reflect on the subject of Armistice Day, and on the roles of Wikimedia --
especially Wikipedia -- in sharing knowledge of history and being a
repository of our collective memory.
"Armistice Day is commemorated... to mark the armistice signed between the
Allies of World War I and Germany at Compiègne, France, for the cessation
of hostilities on the Western Front of World War I, which took effect at
eleven o'clock in the morning—the "eleventh hour of the eleventh day of the
eleventh month" of 1918." [1 <https://en.wikipedia.org/wiki/Armistice_Day>]
World War I was one of the deadliest conflicts in human history, with a
total of approximately 17 million civilian and military deaths. [2
<https://en.wikipedia.org/wiki/World_War_I_casualties>]
I would like to share a story.
John McCrae (photo here
<https://en.wikipedia.org/wiki/File:John_McCrae_in_uniform_circa_1914.jpg>)
was a medical doctor and Canadian soldier during World War I. He wrote a
famous poem, “In Flanders Fields
<https://en.wikipedia.org/wiki/File:In_Flanders_fields_and_other_poems,_hand…>”.
The poem refers to the red poppies that grew over the graves of soldiers
who died in the Second Battle of Ypres in Belgium. There are variants of
the wording of the poem. I quote one of them below.
In Flanders fields the poppies blow
Between the crosses, row on row,
That mark our place; and in the sky
The larks, still bravely singing, fly
Scarce heard amid the guns below.
We are the Dead. Short days ago
We lived, felt dawn, saw sunset glow,
Loved and were loved, and now we lie
In Flanders fields.
Take up our quarrel with the foe:
To you from failing hands we throw
The torch; be yours to hold it high!
If ye break faith with us who die
We shall not sleep, though poppies grow
In Flanders fields.
Here are a few images:
* Poppies in the sunset on Lake Geneva, Montreux, Switzerland
<https://en.wikipedia.org/wiki/File:Poppies_in_the_Sunset_on_Lake_Geneva.jpg>
* Canadian Tomb of the Unknown Soldier
<https://en.wikipedia.org/wiki/File:Canadian_Tomb_of_the_Unknown_Soldier_wit…>
* Remembrance Day 2010 in Ottawa, Canada
<https://en.wikipedia.org/wiki/File:Remembrance_Day_National_War_Memorial_Ot…>
* Memorial of "In Flanders Fields"
<https://en.wikipedia.org/wiki/File:Inflandersfieldslestweforget01.JPG>
In our contemporary world where there are many disputes about history,
resources are limited, and sometimes it is difficult to be optimistic about
human nature, I am especially grateful for Wikipedia's aspiration to be a
place to share neutral, reliable, and verifiable information with an open
license.
Wikimedia has remarkable success at being a collaborative endeavor for the
education and information of humanity. Wikimedia content is collaboratively
developed by thousands of diverse individuals, many of whom are volunteers
and never meet in person. Content that is shared on Wikimedia sites is
viewed by millions of people around the world. Although we sometimes
caution the public that Wikipedia is not a primary source, for many people
Wikipedia seems to be a good starting point, and the references that we
provide allow people to perform their own research regarding history and
many other topics.
Thank you to everyone who documents history on Wikimedia, and to the people
who support this effort behind the scenes. We all benefit from your
generosity to our common memory. By documenting and learning about our
history, I hope that we improve our understanding of ourselves and our
potential, and can make wise decisions about our future.
I close with a poem by Catherine Munro
<https://en.wikipedia.org/wiki/User:CatherineMunro>:
THIS IS AN ENCYCLOPEDIA
One gateway to the wide garden
of knowledge, where lies
The deep rock of our past,
in which we must delve
the well of our future,
The clear water we must leave untainted
for those who come after us,
The fertile earth, in which
truth may grow in bright places,
tended by many hands,
And the broad fall of sunshine,
warming our first steps toward knowing
how much we do not know.
Ever onward,
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
(I sent this to xmldatadumps-l yesterday but just realised that this
might be a more suitable place.)
Hello,
I'm looking at the data dumps for all Wikipedia languages and noticed
that for some larger wikis, the geo_tags.sql.gz dump file does not
include any geotags found in articles. Is it possible to determine why
this is, and for which languages this is the case?
For example, the geotags dump file for Indonesian (a wiki with
>400,000 articles) is only 7 KB, and all geotags in it are from
user pages, file uploads, or file templates, but not from articles:
https://dumps.wikimedia.org/idwiki/20181020/
Yet it doesn't take much effort to find pages that are geotagged, such
as this one (see the infobox): https://id.wikipedia.org/wiki/London
I realise that there are a number of alternative geotagging
conventions. Does idwiki possibly use a geotagging scheme that is not
supported by some part of this data ingestion/export process? Which
other wikis/languages may fall in this category?
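As a sketch of how this could be cross-checked independently of the
dumps (assuming the wiki runs the GeoData extension, which both
populates geo_tags and exposes a `coordinates` prop in the action
API), one could query the live API for a page and compare against the
dump:

```python
from urllib.parse import urlencode

# Build an action API query asking GeoData for a page's coordinates.
# If this returns coordinates but the geo_tags dump lacks them, the
# problem lies in the export step rather than in tag extraction.
params = {
    "action": "query",
    "prop": "coordinates",
    "titles": "London",
    "format": "json",
}
url = "https://id.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```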
I tried to find the script(s) that populate the geo_tags table from
page content but so far had no luck, as I'm not sufficiently familiar
with WP's software architecture; if someone can point me in the right
direction I'd be happy to investigate myself.
Many thanks!
m.