Hi everybody!
As you might have noticed, the Revision class is hard deprecated in the MediaWiki core 1.35 release. The class was replaced with RevisionRecord as part of the MCR work and has been soft-deprecated since the 1.31 release. You can find out more about the deprecation process in [1].
In practice, this means that extensions using Revision will keep working with MediaWiki 1.35, but will start emitting a deprecation warning, causing your tests to fail if the $wgDevelopmentWarnings configuration variable is enabled (it is in WMF CI). After 1.35 is released, the Revision class and the fallbacks using it will start being removed from core, breaking extensions that still use it.
To help you start the update process and prepare, DannyS712 has written a comprehensive migration guide, available at [2]. If you have any questions, the Platform Engineering Team (Core Platform Team) will be answering them and providing assistance. To reach us, please use any of the following channels:
The #wikimedia-cpt IRC channel. The best time to find CPT is during office hours; the next time slot is July 23, 16:00-17:00 UTC - that's when you're guaranteed to find assistance, but if that timing doesn't work for you, please feel free to drop your questions at any convenient time.
Phabricator with #core-platform-team tag
Here on this email thread
Best regards.
Petr Pchelko
Senior Software Engineer, Core Platform Team
1. https://www.mediawiki.org/wiki/Stable_interface_policy#Deprecation_process
2. https://www.mediawiki.org/wiki/Manual:Revision.php/Migration
Hi there,
From the release notes for MW 1.33
(https://www.mediawiki.org/wiki/Release_notes/1.33):
> LinksDeletionUpdate is now a subclass of LinksUpdate. As a consequence,
> the following hooks will now be triggered upon page deletion in addition
> to page
> updates: LinksUpdateConstructed, LinksUpdate, LinksUpdateComplete.
I have an extension that uses the LinksUpdate hook to re-parse the article
in order to extract meta-data that it then stores in the DB for use
elsewhere in the extension.
I use LinksUpdate, rather than ArticleSaveComplete, as I also need to
re-parse the article when any of its templates changes. This hook captures
all changes that affect the page (by my understanding) except - up until
1.33 - deletions.
I therefore also use the ArticleDeleteComplete hook to remove the article
from the extension's meta-data tables when the article is deleted.
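For context, the setup described above amounts to something like the following sketch. The handler class, hook wiring, and the updateMetadata/deleteMetadata helpers are hypothetical placeholders for what my extension actually does:

```php
class MyExtensionHooks {
	// Registered for the LinksUpdate hook: fires on edits and on
	// template changes that affect the page.
	public static function onLinksUpdate( LinksUpdate $linksUpdate ) {
		$title = $linksUpdate->getTitle();
		// Re-parse the page and refresh the extension's metadata tables.
		self::updateMetadata( $title );
	}

	// Registered for the ArticleDeleteComplete hook: removes the
	// now-stale metadata when a page is deleted.
	public static function onArticleDeleteComplete(
		&$article, User &$user, $reason, $id
	) {
		self::deleteMetadata( $id );
	}
}
```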
In light of the above change to the LinksUpdate hook, what changes do I
need to make to my extension? Should I simply drop the ArticleDeleteComplete
hook and update my LinksUpdate handler to check whether the article has been
deleted (how?) to decide which action is required? Or is there more to it?
On a separate note, I also reparse the page when the ArticleUndelete hook is
called. Is this necessary, or is LinksUpdate also called for undeletes?
Finally, I do a similar thing in the TitleMoveComplete hook - I reparse both
the old article and the new article. Is this also redundant due to
LinksUpdate calls, or not?
Kind regards,
- Mark Clements (HappyDog)
Hello,
Today the Wikimedia Foundation would like to introduce a new community
blog. It's called "Diff" (diff.wikimedia.org) and is a blog by – and
for – the Wikimedia volunteer community to connect and share
learnings, stories, and ideas from across our movement. We'd like to
encourage you to learn more about Diff and how it can help you in
sharing and learning from your fellow Wikimedians.
Everyone is invited to contribute!
https://diff.wikimedia.org/2020/07/14/welcome-to-diff-a-community-blog-for-…
The name “Diff” is in reference to the wiki interface that displays
the difference between one version and another of a Wikipedia page. It
also reflects the “difference” our communities and movement make in
the world every day.
For some background, Diff builds on lessons and experiences from the
Wikimedia Blog, the Wikimedia Foundation News, and Wikimedia Space;
previous posts from these channels are archived on Diff. The channel
is primarily intended for community-authored posts, in which
volunteers can share their stories, learnings, and ideas with each
other.
Diff offers a simple and accessible editorial process, moderated by
Foundation communications staff and open to volunteers, to encourage
participation from all — especially emerging and under-represented
communities. Additionally, content on Diff can be written and
translated into languages to reach a wide audience. Diff also has a
code of conduct and comments can be flagged and moderated.
Still curious to learn more?
https://diff.wikimedia.org/2020/07/14/welcome-to-diff-a-community-blog-for-…
Yours,
Chris Koerner (he/him)
Community Relations Specialist
Wikimedia Foundation
Dear all,
This is a demo of how the Desktop Improvements
<https://www.mediawiki.org/wiki/Reading/Web/Desktop_Improvements#What_featur…>
project might look once it's finished:
http://demian-demo.epizy.com/wiki/Desktop_Improvements_volunteer_demo
Some details are not specified yet, therefore the end result will look
somewhat different.
I've implemented this mostly over two months of free time, around March-April.
Early, partial previews (wmf hosted):
http://patchdemo.wmflabs.org/wikis/380ec9d400f1bd8a573e09b015352723/w/
http://patchdemo.wmflabs.org/wikis/b70f4202792d685831e5f957fb5953e1/w/
About me: I'm Demian (aka. Aron), senior software architect, creator of the
most recent Wikipedia Dark Theme
<https://www.mediawiki.org/wiki/User:Aron_Manning/Skin_themes> for Vector
and Timeless skins, among other various contributions and swift bugfixes.
With this preview I'd like to demonstrate what is possible with
collaboration. These demos were kept up-to-date with the project's progress
and benefited from the WMF developers' work to iron out some details. If
that collaboration would go both ways, the project could have been at the
same stage possibly a few months ago, nearing the finish line by now.
There are surprisingly few volunteers, and even fewer professionals,
contributing to MediaWiki, despite its high visibility. The small numbers of
GitHub stars and forks on the replica repos are also negligible compared to
the thousands of stars and forks of similarly popular projects.
The following is my opinion.
There is a great pool of talent that could contribute to modernizing the
aging code-base and designs (both software and UI), but despite the WMF's
regular calls for participation, volunteering plays a very limited role,
more akin to interning on a closed-source project than to the contribution
patterns of free software. The development practices, and the community
practices, can't benefit from a wide range of volunteers. This untapped
potential is of great value, lost to the MediaWiki project and the
communities using it.
If the WMF wishes to realize its vision
<https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20/Recomme…>
of an inclusive community, innovation, and an improved user experience, basic
steps can be taken in the back office in that direction by making
development more open and welcoming to volunteers. Openness and
collaboration from the developer team would invite more talent and serve as
a foundation for a bigger, thriving developer community, similar to
open-source projects that have succeeded with this model. There is great
untapped potential here, and inviting it in is part of our core values as we
build the foundation of free knowledge.
Further resources:
https://www.mediawiki.org/wiki/Reading/Web/Desktop_Improvements/Wikimania_S…
https://www.mediawiki.org/wiki/Reading/Web/Desktop_Improvements/Feature_seq…
https://www.mediawiki.org/wiki/User:Aron_Manning/Design/Desktop_Improvement…
Demian (aka. Aron)
Hi,
for HTML version see https://www.mediawiki.org/wiki/Scrum_of_scrums/2020-07-15
Željko
--
= 2020-07-15 =
== Callouts ==
* Release Engineering
** [All] Review guidance at [[wikitech:Deployments/Covid-19]] and Code
Deployment Office Hour at 17:00UTC in #wikimedia-office
** "scap sync" will be renamed to "scap sync-world" in the next
release. If you use "scap sync" non-interactively, please add a note
to: [[phab:T250302]] (and also, explain why you're using it)
** scap sync now has option --canary-wait-time; [[phab:T217924]]
== Product ==
=== iOS native app ===
* Updates:
** Bug fixes and 1st experiment development for 6.7 - [[phab:project/view/4661]]
** WWDC research
=== Android native app ===
* Updates:
** Continuing work on onboarding design refresh - [[phab:project/view/4819/]]
=== Web ===
* Updates:
** '''Summary''': Desktop Improvements Project (DIP) deployment prep,
continuing DIP sidebar persistence, and wrapping up DIP content
max-width, Vue.js search design spec, initial WVUI library rolling
development release, building button, input, and icon for Vue.js
search.
** [[Reading/Web/Desktop_Improvements|Desktop Improvements Project
(Vector / DIP)]]:
*** [[phab:T153043|<nowiki>Align Vector skin with WikimediaUI color
palette</nowiki>]]
*** [[phab:T257518|<nowiki>[Bug] Max-Width Layout: Sidebar overlaps
footer when its height is longer than the content height</nowiki>]]
*** [[phab:T255727|<nowiki>Make collapsible sidebar persistent for
logged-in users</nowiki>]]
*** [[phab:T247790|<nowiki>wgLogos follow up work</nowiki>]]
*** [[phab:T246427|<nowiki>[Spike 8hrs] Decide how to persist state of
collapsible sidebar across sessions for logged-in users, logged-out
users</nowiki>]]
*** [[phab:T246420|<nowiki>Limit content width, and refine alignment &
styling of relevant elements</nowiki>]]
*** [[phab:T246419|<nowiki>Build collapsible sidebar and sidebar
button </nowiki>]]
*** [[phab:T167956|<nowiki>Deprecate and remove printable version
mode</nowiki>]]
*** [[phab:T256092|<nowiki>[Modern Vector] Fix broken rendering of
`main` and `dialog` elements in IE9-11</nowiki>]]
*** [[phab:T251212|<nowiki>[Dev] Drop VectorTemplate usage in Vector</nowiki>]]
*** [[phab:T244392|Vue.js search case study]]:
**** Thank you, Anne Tomasevich, for your work on wvui-icon!
**** See [[Reading/Web/Desktop Improvements/Vue.js case study/Status
log|weekly status updates]].
** Mobile website (MinervaNeue / MobileFrontend):
*** [[phab:T254287|<nowiki>Final warning: Mobile main page special
casing will be disabled July </nowiki>]]
*** [[phab:T32405|<nowiki>[EPIC] MobileFrontend extension should stop
special-casing main page</nowiki>]]
** Standardization
*** [[phab:T257167|<nowiki>Narrator not announcing popup dialog when
help button/icon is clicked.</nowiki>]]
*** [[phab:T182050|<nowiki>Deprecate and replace mediawiki.ui
components</nowiki>]]
*** [[phab:T256520|<nowiki>Consider 'normalize' stylesheet RL module</nowiki>]]
*** [[phab:T257768|<nowiki>Change filewarning to normal warning color
and amend padding slightly</nowiki>]]
*** [[phab:T257385|<nowiki>Window: focus lost when navigating with
shift + tab key</nowiki>]]
*** [[phab:T257279|<nowiki>Standardize 'mediawiki.ui' variables to CSS
variables naming scheme in preparation for WikimediaUI Base variables
takeover</nowiki>]]
*** [[phab:T257165|<nowiki>OOUI PopupWidget : need to Notify caller
when popup widget closes</nowiki>]]
*** [[phab:T248206|<nowiki>SelectFileInputWidget: "Close" button is
not accessible through keyboard.</nowiki>]]
*** [[phab:T165650|<nowiki>Directly use the WikimediaUI values in the
WikimediaUI theme in OOUI, rather than via copy-paste</nowiki>]]
*** [[phab:T255325|<nowiki>Outline control widget wrong focus order</nowiki>]]
*** [[phab:T253399|<nowiki>Focus not visible on Button on high
contrast mode</nowiki>]]
** Portals
*** [[phab:T128546|<nowiki>[Recurring Task] Update Wikipedia and
sister projects portals statistics</nowiki>]]
** QuickSurveys
*** [[phab:T246977|<nowiki>Run baseline quicksurvey on test wikis</nowiki>]]
** Miscellaneous
*** [[phab:T257630|<nowiki>SkinMustache::getTemplateData keys should
be hyphenated and existing in a dictionary</nowiki>]]
*** [[phab:T254048|<nowiki>Render the FallbackSkin and SkinApi with a
simplistic SkinMustache class</nowiki>]]
*** [[phab:T248751|<nowiki>Adopt mustache templates in Modern and
Monobook</nowiki>]]
*** [[phab:T231615|<nowiki>Use project logo wordmarks on Wikimedia
projects in Timeless</nowiki>]]
*** [[phab:T257877|<nowiki>MediaWiki installer appears unstyled</nowiki>]]
=== Product Infrastructure ===
* Updates:
** working on adding image schema.org licence data to article pages
** mediasearch improvements - vue frontend, improved linked data
search on backend
== Technology ==
=== Fundraising Tech ===
* Updates:
** Refining and improving bulk email sync
** Better filtering for matching gift policies db
** Looking into potential bugs with flow asking one-time donors to
convert to monthly donations
=== Engineering Productivity ===
==== Quality and Test Engineering ====
* Updates:
** New blog!
*** Wow. So wikimedia. Such quality. Many testing. Very team. 🐶
[[phab:phame/blog/view/21/]]
** Blog posts (by Google Summer of Code interns):
*** [[User:Vidhi-mody|Vidhi Mody]] - GSoCpedia: The journey so far
[[phab:phame/post/view/201/gsocpedia_the_journey_so_far/]]
*** [[User:AlQaholic007|Soham Parekh]] - Fanboying Cypress
[[phab:phame/post/view/202/fanboying_cypress/]]
==== Release Engineering ====
* Updates:
** [All] Deployments/Covid-19 [[wikitech:Deployments/Covid-19]]
** Train Health
*** Last week: 1.35.0-wmf.40 - [[phab:T256668]]
*** This week: 1.35.0-wmf.41 - [[phab:T256669]]
*** Next week: 1.36.0-wmf.1 - [[phab:T257969]]
The example below works, and I see it used in some extensions, but it
offers no autocompletion and does not catch typos.
Services.php:
class TranslateServices implements ContainerInterface {
	public function getParsingPlaceholderFactory(): ParsingPlaceholderFactory {
		return $this->container->get( 'Translate:ParsingPlaceholderFactory' );
	}

	public function getTranslatablePageParser(): TranslatablePageParser {
		return $this->container->get( 'Translate:TranslatablePageParser' );
	}
}
ServiceWiring.php:
return [
	'Translate:ParsingPlaceholderFactory' => function (): ParsingPlaceholderFactory {
		return new ParsingPlaceholderFactory();
	},
	'Translate:TranslatablePageParser' => function ( MediaWikiServices $services ): TranslatablePageParser {
		return new TranslatablePageParser(
			$services->get( 'Translate:ParsingPlaceholderFactory' ) # <--------
		);
	},
];
Do you see any downsides of using code like below instead?
'Translate:TranslatablePageParser' => function (): TranslatablePageParser {
	$services = TranslateServices::getInstance();
	return new TranslatablePageParser(
		$services->getParsingPlaceholderFactory()
	);
},
I looked at other extensions and I noticed a lot of small differences
among them:
* Some extensions use static methods as opposed to wrapping the core
service container
* Some extensions use constants for service identifiers
* Lots of different implementations of "To avoid name conflicts, the
service names should be prefixed with the extension's name.":
** ExtensionService
** Extension.Service
** Extension:Service
** Extension_Service
Are we at a stage yet where we can agree on some (additional) conventions
and document them somewhere? Maybe at
https://www.mediawiki.org/wiki/Dependency_Injection
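For illustration, the "constants for service identifiers" convention mentioned above could look roughly like this sketch (class and service names are made up for the example):

```php
class TranslateServices implements ContainerInterface {
	// A class constant avoids typos in the string identifier and gives
	// IDEs something to autocomplete, find usages of, and rename.
	public const PARSING_PLACEHOLDER_FACTORY = 'Translate:ParsingPlaceholderFactory';

	public function getParsingPlaceholderFactory(): ParsingPlaceholderFactory {
		return $this->container->get( self::PARSING_PLACEHOLDER_FACTORY );
	}
}

// ServiceWiring.php refers to the same constant, so both sides stay in sync:
return [
	TranslateServices::PARSING_PLACEHOLDER_FACTORY => function (): ParsingPlaceholderFactory {
		return new ParsingPlaceholderFactory();
	},
];
```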
-Niklas
Dear all:
In light of the number of extensions making use of Revision objects, I've
documented some suggestions for how to replace each method. These
suggestions are also available at
https://www.mediawiki.org/wiki/Manual:Revision.php/Migration.
tl;dr: it's complicated, but doable. I'm happy to review any patches, and if
you want to do the migration one part at a time, Jenkins will fail but can
be overridden.
----
For extensions that make widespread use of Revision objects, switching to
RevisionRecord may be challenging. It may be easier to split the migration
into multiple patches that each do part of the migration. Tests will still
fail, but if you add me (DannyS712) as a reviewer I'd be happy to review
all of the patches.
== Constructors ==
To replace uses of the static constructors (Revision::newFrom*) while still
having Revision objects, as a stop-gap measure use `new Revision(
$revisionRecord )`, with the RevisionRecord constructed via the
RevisionLookup service. The RevisionLookup methods return either a
RevisionRecord or null, and the corresponding Revision methods returned
either a Revision or null:
Instead of
$revision = Revision::newFromId( $id, $flags );
use
$revisionRecord = MediaWikiServices::getInstance()
	->getRevisionLookup()
	->getRevisionById( $id, $flags );
$revision = $revisionRecord ? new Revision( $revisionRecord ) : null;
For the other static constructors, simply replace `getRevisionById` with
the corresponding RevisionLookup method (note that the loadFrom* methods
accept different parameters than the newFrom* methods):
Revision::newFromTitle -> RevisionLookup::getRevisionByTitle
Revision::newFromPageId -> RevisionLookup::getRevisionByPageId
Revision::loadFromPageId -> RevisionLookup::loadRevisionFromPageId
Revision::loadFromTitle -> RevisionLookup::loadRevisionFromTitle
Revision::loadFromTimestamp -> RevisionLookup::loadRevisionFromTimestamp
The RevisionFactory service is used to replace Revision::newFromArchiveRow
and Revision::newFromRow; use RevisionFactory::newRevisionFromArchiveRow,
::newMutableRevisionFromArray, and ::newRevisionFromRow. Note, however,
that RevisionFactory::newMutableRevisionFromArray is itself deprecated, and
uses of it will need to be replaced with construction of a
MutableRevisionRecord.
For direct uses of `new Revision`, either existing ones or those added now,
see the notes below regarding the final removal of constructing Revision
objects.
== Fetching relative revisions ==
Revision::getNext was replaced by RevisionLookup::getNextRevision, which
returns a RevisionRecord object. Likewise, Revision::getPrevious was
replaced by RevisionLookup::getPreviousRevision.
== Revision text ==
Revision::getRevisionText returns a string of the text content for a
specific revision (based on a row from the database). To replace it, get
the RevisionRecord object corresponding to the row, and then use
RevisionRecord::getContent( SlotRecord::MAIN ) to get the relevant content
object, and if there is such an object, use Content::serialize to get the
text.
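Put together, the text retrieval described above might look like this sketch ($revisionRecord and the choice of the RAW audience are assumptions for the example):

```php
$content = $revisionRecord->getContent( SlotRecord::MAIN, RevisionRecord::RAW );
$text = null;
if ( $content !== null ) {
	// Content::serialize returns the serialized (text) form of the content.
	$text = $content->serialize();
}
```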
Revision::compressRevisionText can be replaced with
SqlBlobStore::compressData.
Revision::decompressRevisionText can be replaced with
SqlBlobStore::decompressData.
== Other static methods ==
The following static methods are replaced by non-static methods:
Revision::getQueryInfo -> RevisionStore::getQueryInfo
Revision::getArchiveQueryInfo -> RevisionStore::getArchiveQueryInfo
Revision::getParentLengths -> RevisionStore::getRevisionSizes
Revision::getTimestampFromId -> RevisionStore::getTimestampFromId (NOTE:
the Revision method's first parameter was a Title object that was ignored,
the RevisionStore method does not accept a Title, and only needs the
relevant id and any query flags)
Revision::countByPageId -> RevisionStore::countRevisionsByPageId
Revision::countByTitle -> RevisionStore::countRevisionsByTitle
Revision::userWasLastToEdit -> RevisionStore::userWasLastToEdit (NOTE: the
Revision method's first parameter was an int or an IDatabase object, the
RevisionStore method requires an IDatabase object. ADDITIONALLY, the
RevisionStore method is itself soft deprecated)
== userCan ==
Revision::userCanBitfield can be replaced with
RevisionRecord::userCanBitfield, and Revision::userCan can be replaced with
RevisionRecord::userCanBitfield with the bitfield being the int returned
from RevisionRecord::getVisibility. NOTE that both of the Revision methods
fell back to $wgUser if no user object was passed;
RevisionRecord::userCanBitfield requires that a User be provided.
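As a sketch, a check that previously used Revision::userCan could become the following ($revisionRecord, $user, and the DELETED_TEXT field are assumptions for the example):

```php
// Old: $revision->userCan( RevisionRecord::DELETED_TEXT, $user )
$canView = RevisionRecord::userCanBitfield(
	$revisionRecord->getVisibility(), // the rev_deleted bitfield
	RevisionRecord::DELETED_TEXT,     // which field we want to access
	$user                             // must now be passed explicitly
);
```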
== Corresponding Revision and RevisionRecord methods ==
The following Revision methods can be replaced with identical RevisionRecord
methods (though some have different names). Once a Revision object is
available in a relevant class or function, I suggest immediately retrieving
the corresponding RevisionRecord via Revision::getRevisionRecord and slowly
making use of the RevisionRecord instead of the Revision:
Revision::getId -> RevisionRecord::getId
Revision::getParentId -> RevisionRecord::getParentId
Revision::getPage -> RevisionRecord::getPageId
Revision::isMinor -> RevisionRecord::isMinor
Revision::isDeleted -> RevisionRecord::isDeleted
Revision::getVisibility -> RevisionRecord::getVisibility
Revision::getTimestamp -> RevisionRecord::getTimestamp
Revision::isCurrent -> RevisionRecord::isCurrent
The following Revision methods can be replaced with identical
RevisionRecord methods, BUT, while the Revision methods returned null if
the value was unknown, the RevisionRecord methods throw
RevisionAccessException exceptions:
Revision::getSize -> RevisionRecord::getSize
Revision::getSha1 -> RevisionRecord::getSha1
Revision::getTitle returned the relevant Title object for the revision. Its
replacement, RevisionRecord::getPageAsLinkTarget, returns a LinkTarget
instead of a Title. If a full Title object is needed, use
Title::newFromLinkTarget.
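For example ($revisionRecord assumed):

```php
$linkTarget = $revisionRecord->getPageAsLinkTarget();
// Only convert when a full Title object is actually required:
$title = Title::newFromLinkTarget( $linkTarget );
```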
== Audience-based information ==
The Revision class had multiple methods that accepted a specific audience
for which a value (the editing user's id and username, the edit summary
used, or the revision content) was based on whether the audience could view
the information (since it may have been revision deleted).
The constants used for specifying an audience are identical in the Revision
and RevisionRecord classes:
Revision::FOR_PUBLIC -> RevisionRecord::FOR_PUBLIC
Revision::FOR_THIS_USER -> RevisionRecord::FOR_THIS_USER
Revision::RAW -> RevisionRecord::RAW
Replacements:
Revision::getUser returned the editing user's id, or 0, and
Revision::getUserText returned the editing user's username, or an empty
string. These were replaced with RevisionRecord::getUser, which returns a
UserIdentity object if the audience specified can view the information, or
null otherwise. To get the user's id, use UserIdentity::getId, and for the
username, use UserIdentity::getName.
Revision::getComment returned the revision's edit summary (as a string), or
null. It was replaced with RevisionRecord::getComment, HOWEVER instead of a
string, RevisionRecord::getComment returns a CommentStoreComment.
Revision::getContent returned the content of the revision's main slot, or
null. It was replaced with RevisionRecord::getContent, HOWEVER the
RevisionRecord method requires that the slot for which the content should
be retrieved be specified. Use SlotRecord::MAIN to match the behaviour of
the Revision method. FURTHERMORE, the RevisionRecord method can throw a
RevisionAccessException, which the Revision method silently converted to
null.
NOTE: When the audience specified for ::getUser, ::getUserText,
::getComment, or ::getContent was FOR_THIS_USER, and no second parameter
was passed with the user, the Revision methods fell back to using $wgUser.
The RevisionRecord methods have no such fallback, and will throw an
InvalidArgumentException if attempting to use FOR_THIS_USER without passing
a User.
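Combining the notes above, audience-based retrieval might be migrated roughly like this (all variables are assumed for the example; the error handling mirrors the old behaviour):

```php
// Old: $revision->getUserText( Revision::FOR_THIS_USER, $viewer )
$userIdentity = $revisionRecord->getUser( RevisionRecord::FOR_THIS_USER, $viewer );
$userText = $userIdentity ? $userIdentity->getName() : '';

// Old: $revision->getComment() - now returns a CommentStoreComment or null
$commentObj = $revisionRecord->getComment();
$comment = $commentObj ? $commentObj->text : null;

// Old: $revision->getContent() - the slot must now be named explicitly
try {
	$content = $revisionRecord->getContent( SlotRecord::MAIN );
} catch ( RevisionAccessException $e ) {
	$content = null; // the Revision method silently returned null here
}
```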
== Content handling ==
As part of the migration to multi-content revisions, there is no longer a
single content model or format for a revision, but rather there are content
models and formats for each slot.
Revision::getContentModel returned a string for the content model of the
revision's main slot (SlotRecord::MAIN), either the model set or the
default model. To replace it, get the RevisionRecord's main slot, and use
SlotRecord::getModel. If the revision has no main slot, fall back to the
default model for the title. Use the SlotRoleRegistry service to get the
SlotRoleHandler for the relevant role (in this case SlotRecord::MAIN), and
then use SlotRoleHandler::getDefaultModel.
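A sketch of that fallback logic ($revisionRecord and $title are assumed for the example):

```php
if ( $revisionRecord->hasSlot( SlotRecord::MAIN ) ) {
	$model = $revisionRecord->getSlot( SlotRecord::MAIN )->getModel();
} else {
	// Fall back to the default model for the title via the SlotRoleRegistry.
	$slotRoleHandler = MediaWikiServices::getInstance()
		->getSlotRoleRegistry()
		->getRoleHandler( SlotRecord::MAIN );
	$model = $slotRoleHandler->getDefaultModel( $title );
}
```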
Revision::getContentFormat returned a string for the content format of the
revision's main slot, or fell back to the default format for the content
model. To replace it, get the RevisionRecord's main slot, and use
SlotRecord::getFormat. If the revision has no main slot, or if
SlotRecord::getFormat returns null, fall back to the default format for the
content model using ContentHandler::getDefaultFormat with the relevant
ContentHandler.
Revision::getContentHandler returned the relevant ContentHandler object.
Use ContentHandlerFactory::getContentHandler with the relevant content
model as a replacement.
== Setting revision information ==
Revision::setId was only supported if the Revision object corresponded to a
MutableRevisionRecord. It can be replaced with MutableRevisionRecord::setId.
Revision::setUserIdAndName was only supported if the Revision object
corresponded to a MutableRevisionRecord. It can be replaced with
MutableRevisionRecord::setUser; NOTE that the MutableRevisionRecord method
requires a UserIdentity, rather than a user id and name.
Revision::setTitle was not supported, and threw an exception if it was
called with a different title than the one already set for the revision.
== Misc ==
Revision::getTextId -> use SlotRecord::getContentAddress for retrieving an
actual content address, or RevisionRecord::hasSameContent to compare content
Revision::isUnpatrolled -> RevisionStore::getRcIdIfUnpatrolled
Revision::getRecentChange -> RevisionStore::getRecentChange
Revision::getSerializedData -> use SlotRecord::getContent for retrieving a
content object, and Content::serialize for the serialized form
Revision::insertOn -> RevisionStore::insertRevisionOn
Revision::base36Sha1 -> SlotRecord::base36Sha1
Revision::newNullRevision -> RevisionStore::newNullRevision
Revision::newKnownCurrent -> RevisionLookup::getKnownCurrentRevision
== Constructing ==
For uses of `new Revision`, there are multiple relevant replacements:
If the Revision was constructed with a RevisionRecord, just use that
RevisionRecord directly
If the Revision was constructed with an array, use
RevisionFactory::newMutableRevisionFromArray
NOTE: If neither the `user` nor `user_text` fields were set, the Revision
class fell back to using $wgUser;
RevisionFactory::newMutableRevisionFromArray includes no such fallback, and
if no user is provided the MutableRevisionRecord returned simply won't have
a user set.
If the Revision was constructed with an object, use
RevisionFactory::newRevisionFromRow
----
Sorry if the suggestions were overwhelming. Let me know if there are any
questions. Hope this helps!
--DannyS712
This is a heads-up that we are planning to replace the host keys
for the Gerrit SSH server at gerrit.wikimedia.org:29418.
The change is planned for Tuesday, July 14th in the PDT
morning right after the MediaWiki train, that's around 11:00 UTC.
(https://wikitech.wikimedia.org/wiki/Deployments#Tuesday,_July_14)
The RSA key will be replaced with a longer version and additionally we
will start to offer ecdsa_256, ecdsa_384, ecdsa_521 and ed25519.
The service will not be RSA-only anymore which some users had already
reported as an issue.
After the change on Gerrit, your git / git-review / direct ssh
commands are expected to fail with errors about mismatched or changed
host keys or host identification.
This is expected.
You will need to remove the old, no longer used host key, and verify
the new one.
To remove the old host key, follow the instructions on screen or
consult the manual of your SSH software. Once that is done, retry the
command, and you'll be prompted to verify the new host key.
You can find the new keys for verification in this email below and on
https://wikitech.wikimedia.org/wiki/Help:SSH_Fingerprints/gerrit.wikimedia.…
If they match, confirm, and your command should continue. Once you
have successfully updated the host key you should no longer see any
errors.
If you are running any bots talking to gerrit-ssh please also update
their configuration accordingly and restart where needed.
https://wikitech.wikimedia.org/wiki/Help:SSH_Fingerprints/gerrit.wikimedia.…
ssh_host_rsa_key
2048 SHA256:j9/pXXc9WzjQwYP0t7nlzqH9EBOTw6q7DgcfnamJtsY
gerrit-code-review(a)gerrit1001.wikimedia.org (RSA)
ssh_host_ecdsa_256_key
256 SHA256:58swSiByT+4LVqs30/FqJpEPj+Mwjtn3WJY5hitlEgM
gerrit-code-review(a)gerrit1001.wikimedia.org (ECDSA)
ssh_host_ecdsa_384_key
384 SHA256:vFEVzNGuagPmYiw9EIwBStzd0X+gtprZzOi8vbLxAfc
gerrit-code-review(a)gerrit1001.wikimedia.org (ECDSA)
ssh_host_ecdsa_521_key
521 SHA256:OWb1uenhapK7AFPfEB+NRxgfxhktZ1Q6C5eCy+VbgsY
gerrit-code-review(a)gerrit1001.wikimedia.org (ECDSA)
ssh_host_ed25519_key
256 SHA256:njCmWMsshq3MqQxyIFO36UNwCwzTamXERqylF1XJhd8
gerrit-code-review(a)gerrit1001.wikimedia.org (ED25519)
--
Daniel Zahn <dzahn(a)wikimedia.org>
Operations Engineer
Apologies for the cross-post; I'm doing so because the thread was
forwarded. Apologies also for the length.
On Wed, Jul 8, 2020 at 5:01 PM Maarten Dammers <maarten(a)mdammers.nl> wrote:
> Of interest to the wider community. I really hope this is not part of a
> larger pattern of the WMF ignoring community.
>
>
"Never attribute to malice that which is adequately explained by
stupidity." [1]
In this case, my own stupidity (I'm the new CTO here at the WMF, for
context), or perhaps to be a little kinder to myself, a combination of bias
and naivety: my engineering bias towards wanting to solve a problem I felt
was important to take action on (context provided shortly) which got in the
way of taking a user-centric approach first in trying to understand what
the needs and wants are of the people using the system. As I said on the
talk page, I mistakenly thought that the main feedback loops would be about
porting workflows and not about the tool itself.
Even though many have publicly said that moving from Gerrit might still be
the right decision, how we go about deciding that is just as important as
what we do and I messed that up. Given that perspective, I've asked the
team to pause with moving forward on changes to our Code Review (CR) tools
and to begin a consultation that includes the option of sticking with what
we have for CR. I've also asked my team to update some of our decision
making processes relative to topics like this to make sure we properly hear
from stakeholders (e.g. in this case, both staff developers and our broader
community of developers) along the way.
For some more context, if it is helpful:
I'm ~11 months in here and still learning every day. While I've worked in
open source for a long time, this community is new to me and different
enough that I have and continue to need to update and adjust the way I
think and the way I direct my teams to do their work.
Coming in and talking with our tech teams and folks in the community, I see
a few themes that have emerged that contributed to me wanting to move
forward faster on this decision:
1. We have a lot of tech debt[2]. In many cases, I think software,
especially software that is successful, can collapse under its own weight
if people are not careful in servicing that tech debt. The work required
to both maintain existing infrastructure, products and services while at
the same time improving what we offer is a delicate balancing act. At our
scale, there is a significant and justified bias towards production, but it
has come at a cost that has compounded over the years and has a very real
human toll. Much of this debt was created because we had to invent things
that didn't exist. Now some of those things do exist and we should check
to see whether we can replace those older, albeit well-understood-by-us
systems, with newer ones that have become standards or best in class and
are still in line with our open source values.
2. The tech debt and the sheer number of services we support (many of which
aren't fully maintained[3]) is compounded by the scale at which we support
them. The result is that a number of people, especially those on the front
line of caring for that software, are either burnt out, or approaching that
point. A global pandemic hasn't helped. I view much of my role here early
on as one of trying to help somehow reduce that burnout. Modernizing and
upgrading our processes and toolchains can, I think, help fight this, even
if there is some short term pain in the shift.
All that being said, in this particular case, we have a team of people who
work on maintaining our CI (Continuous Integration) and CR systems who have
long been looking at replacing our CI system. This system runs on an
end-of-lifed version of Python and on an end-of-lifed version of Zuul, and
it’s critical we correct this since end-of-lifed software doesn’t receive
security updates. This is primarily behind the scenes work that most people
don't have to think about. There is also a growing sense of desire by some
of our developers to adopt more mainstream, well understood toolchains like
Gitlab/Github for development, combined with my own view that CI/CR is
*not* somewhere we should be deviating from broad industry norms on
ourselves and that we should adopt workflows that are (de facto) standards
(e.g. Gitlab/Github, with Gitlab being the open one of the two) amongst
developers irrespective of their backgrounds. Those two things led to my
biased thinking that it was obvious it needed to be changed and that the
primary feedback needed would therefore be on the workflows, not the tool
itself.
While I still think it needs to be changed, I completely missed, as I said
above, the stakeholder angle here and basic community laws of not
surprising people. For that I apologize. We are now working to correct
this, even if it means it's going to take longer or we end up sticking with
the status quo on CR.
Thanks,
Grant
[1] https://en.wikipedia.org/wiki/Hanlon%27s_razor
[2] https://en.wikipedia.org/wiki/Technical_debt
[3] https://www.mediawiki.org/wiki/Developers/Maintainers
> Maarten
>
> -------- Forwarded Message --------
> Subject: Re: [Wikitech-l] CI and Code Review
> Date: Wed, 8 Jul 2020 22:40:38 +0200
> From: Maarten Dammers <maarten(a)mdammers.nl>
> Reply-To: For developers discussing technical aspects and
> organization
> of Wikimedia projects <wikitech-l(a)lists.wikimedia.org>
> To: wikitech-l(a)lists.wikimedia.org
>
>
>
> Hi Greg,
>
> On 06-07-2020 19:39, Greg Grossmeier wrote:
> > First, apologies for not announcing this last week. A short work week
> > coupled with a new fiscal year delayed this until today.
> >
> > tl;dr: Wikimedia will be moving to a self-hosted (in our datacenter(s))
> > GitLab Community Edition (CE) installation for both code review and
> > continuous integration (CI).
>
> tl;dr: WMF decides to do a major change without any community
> consultation. Community members are upset.
> More at https://www.mediawiki.org/wiki/Topic:Vpbt50rwxgb2r6qn
>
> Maarten
>
>