Even if the email addresses can’t be removed, I was hoping they could at least be automatically transformed into images, or have “at” replacements applied to the address. A similar transformation is already performed elsewhere on Gossamer Threads’ archives, so it should be possible to do it in the “From” area of message bodies too.
In fact, the archives site even has the following in their FAQ:
Q. How do you protect users email addresses from harvesters?
A. We make every effort to stop people from harvesting emails from the site. Currently, we encode, strip and transform all email addresses we can find (including parsing the message body for emails that show up in forwards). The only place the email address is viewable is by clicking on the user, and we embed the email in an image to prevent harvesting.
I tried contacting them through https://www.gossamer-threads.com/contact/, but didn’t get a reply. I was hoping that if someone from the Wikimedia Foundation contacted them, they would be more motivated to reply.
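For illustration, the kind of “at” replacement described above can be as simple as a stream rewrite (a hypothetical sketch, not Gossamer Threads’ actual code; shown on a sample line rather than a real archive page):

```shell
# Obfuscate email addresses by rewriting "user@host" to "user (at) host".
echo "From: alice@example.org" \
  | sed -E 's/([[:alnum:]._%+-]+)@([[:alnum:].-]+)/\1 (at) \2/g'
# prints: From: alice (at) example.org
```

In practice this would run over the generated archive HTML before serving it, so harvesters scraping the pages never see a raw address.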
But I don't *HAVE* a LocalSettings.php file yet. This happens either when I
do a "git clone/composer update" install, or a vanilla "tarball/extract"
install. I hit up http://mysite.dev/mw-config/index.php, and it dutifully
reports:
LocalSettings.php not found.
Please complete the installation <http://mysite.dev/mw-config/index.php> and
But the page that follows errors out with:
A LocalSettings.php file has been detected. To upgrade this installation,
please enter the value of $wgUpgradeKey in the box below. You will find it
Installer be crazy.
I've cleared my cache, used a different browser, erased the ~/.composer
directory, deleted the directory/tried again and various combinations of
trying to "reset" this, but I'm still prompted with the error.
Occasionally, though, I run across this error too:
*Fatal error*: Class 'ComposerAutoloaderInit_mediawiki_vendor' not found in
*/usr/local/www/wiki/vendor/autoload.php* on line *7*
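For what it's worth, a quick way to check the autoloader state from the shell (the wiki path here is taken from the error above; adjust MW_ROOT to your install):

```shell
# Check whether the Composer autoloader actually exists; if it is
# missing, running "composer install" in the MediaWiki root regenerates it.
MW_ROOT="${MW_ROOT:-/usr/local/www/wiki}"
if [ -f "$MW_ROOT/vendor/autoload.php" ]; then
  echo "autoloader present"
else
  echo "autoloader missing; run 'composer install --no-dev' in $MW_ROOT"
fi
```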
Does anyone have an idea of what might be causing this?
Unfortunately there's been a mistake in generating the 1.28.1 and 1.27.2
tarball releases, where the wrong version of the SyntaxHighlight_GeSHi
extension was included. The version that was included has publicly known
severe security issues.
Until such time as a new release is issued (which will hopefully be soon),
I recommend that people either disable that extension or download a new
copy of it. The version of this extension in git, or from the extension
distributor, is fine. Only the version in the most recent tarball is wrong.
The rest of the release is also fine. The 1.23.16 and 1.23.17 releases were
not affected by this issue.
Sorry for the confusion,
I would like to announce the release of MediaWiki Language Extension
Bundle 2017.04. The bundle is compatible with MediaWiki 1.27 and 1.28
or above, and requires PHP 5.5.9 or above.
The next MLEB is expected to be released in three months. If there are major
changes or important bug fixes, we will do an intermediate release.
Please give us your feedback at
* Download: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2017.04.tar…
* sha256sum: 4207398d7ed3ea9c793f35f88e75f29672b33aee4d0baf01afc869517c629c65
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
* Report bugs to: https://phabricator.wikimedia.org/
* Talk with us at: #mediawiki-i18n @ Freenode
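For anyone unsure how to use the published digest above: `sha256sum -c` compares a file against it. The pattern, demonstrated here on a throwaway file (substitute the real tarball name and the sha256sum value from the list above):

```shell
# Demonstrate sha256 verification on a generated file; for the bundle,
# put "<published-digest>  <tarball-name>" in the .sha256 file instead.
printf 'sample data\n' > bundle.tar
sha256sum bundle.tar > bundle.tar.sha256
sha256sum -c bundle.tar.sha256
# prints: bundle.tar: OK
```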
Release notes for each extension are below.
-- Kartik Mistry
== Highlights and upgrade notes ==
== Babel ==
* TTO fixed disappearing language names on Category: and File: pages.
== CLDR ==
* Reedy updated to CLDR 31.
== CleanChanges, LocalisationUpdate ==
* Localisation updates only.
== Translate ==
* Bartosz Dziewoński fixed "Mark as reviewed" tooltip positioning.
* David Causse added support for Elastic5, Multi-DC for TTMServices,
* Federico Leva fixed import of the new translations in
* Geoffrey Mon added insertables suggester class for $1, $2 and allow
* MarcoAurelio modified log message to match with all other logs
listed in Special:Log.
* Niklas Laxström added support to allow moving all pages for
translation admins (translate-manage).
* TTO updated translator-stats to account for expiring user groups.
* datguy added a confirmation box in Special:ManageTranslatorSandbox
when approving or rejecting a translator. (T60706)
== UniversalLanguageSelector ==
* Bartosz Dziewoński fixed language change tooltip position. (T161203)
* Niklas Laxström added link to Compact Language Links info page in
the beta feature description.
* Niklas Laxström fixed the broken site picks feature for Compact Language Links.
=== Fonts ===
* Kartik Mistry added Sundanese (su) font. (T162221)
=== Input methods ===
* Amir Aharoni fixed name of Arabic keyboard.
* Amir Aharoni and Felix Nartey added keyboard for Ga language.
* Amir Aharoni added Akan keyboard also for Twi language.
* fgaim added new IMEs for Eritrean languages: Tigrinya (ti), Tigre
(tig), Blin (byn).
* WikiDiscoverer fixed autonym for Gõychi Konknni (gom) language.
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
Referring to the steps provided in the article on how to enable the citation templates in the Visual Editor menus: I followed all the steps in the article and imported all the necessary components, including the CS1 component.
Also, I edited the template page and saved it, as per the reported bug in the troubleshooting section. I even added the templatedata lines from the /doc to the main template page to test it, then removed it again.
I also ran the jobqueue to make sure that there are no jobs waiting.
However, all this did not lead to any improvement. When I click on any citation component, I get this message: "The "Template:Cite book" template doesn't yet have a description, but there might be some information on the template's page."
When I try to add a generic field, I get the following error: "Lua error in package.lua at line 80: module 'Module:Citation/CS1/Suggestions' not found."
I understand that I don't have any suggestions on the /CS1/Suggestions page, but I should not need suggestions anyway if the fields show.
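(For reference, the kind of stub that would satisfy that missing-module lookup is tiny. Whether the CS1 code accepts an empty table here is an assumption, not something confirmed in this thread:)

```lua
-- Module:Citation/CS1/Suggestions stub: an empty suggestions table,
-- enough for the module require() to succeed when no suggestions exist.
return {}
```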
Also, if I click Insert / Template, search for "Cite book", and click "Add Template", I still get this message: "The "Template:Cite book" template doesn't yet have a description, but there might be some information on the template's page."
I first added my question to the discussion page here, but could not get an answer: https://www.mediawiki.org/wiki/Talk:VisualEditor/Citation_tool#Templ…
What shall I do to solve this problem?
thank you very much,
Kind Regards,
Atef Raouf
The Coptic Treasures Project
Has anyone set up a MediaWiki installation on Azure or AWS? It's possible
that I'll set up a new MediaWiki installation this summer, and using Azure
or AWS is a possibility, so I'd be interested to hear about others'
experiences. Links to relevant blog posts or other resources would be of
interest.
Quick question: is there any downside to doing this:
wfLoadExtension('Foo/Bar'); ? It loads the extension, and the extension works.
I wrote an extension for a project which other developers set up with
custom extensions outside of the public www directory. They symlinked this
custom directory back to the public extensions directory, but all custom
extensions are in their own directory. So /extensions has standard MW
extensions and /extensions/Foo contains all the custom extensions. All of
the custom extensions are called with require_once
"$IP/extensions/Foo/Bar01/Bar01.php" and so on. I know you can't do the
standard double-entry method with a deprecated message unless you use
"Foo/Bar" in the wfLoadExtension statement with the custom ones.
wfLoadExtension('Foo/Bar'); works, but going forward I wonder about any
long-term issues. The wiki is on 1.27, but I worry about long-term problems
as the other custom extensions are updated to use wfLoadExtension. There
would be multiple extensions requiring this type of call: Foo/Bar,
Foo/Bar1, Foo/Bar2, etc.
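For reference, a minimal LocalSettings.php sketch of the layout in question (the extension names are the hypothetical ones from above; this assumes each custom extension ships an extension.json):

```php
<?php
// LocalSettings.php fragment: wfLoadExtension() resolves its argument
// relative to $wgExtensionDirectory, so a "Foo/Bar01" path works as
// long as extensions/Foo/Bar01/extension.json exists.
wfLoadExtension( 'Foo/Bar01' );
wfLoadExtension( 'Foo/Bar02' );

// Or the batch form (use one form or the other, not both):
wfLoadExtensions( [ 'Foo/Bar01', 'Foo/Bar02' ] );
```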
Following the process described in the Code of Conduct for Wikimedia
technical spaces <https://www.mediawiki.org/wiki/Code_of_Conduct>, the
Wikimedia Foundation’s Technical Collaboration team has selected five
candidates to form the first Code of Conduct Committee and five candidates
to become auxiliary members.
Here are their names in alphabetical order. For details about each
candidate, please check
Committee member candidates:
Amir Sarabadani (Ladsgroup)
Lucie-Aimée Kaffee (Frimelle)
Nuria Ruiz (NRuiz-WMF)
Sébastien Santoro (Dereckson)
Tony Thomas (01tonythomas)
Auxiliary member candidates:
Ariel Glenn (ArielGlenn)
Caroline Becker (Léna)
Florian Schmidt (Florianschmidtwelzow)
This list of candidates is subject to a community review period of two
weeks starting today. If no major objections are presented about any
candidate, they will be appointed in six weeks.
You can provide feedback on these candidates via private email to
techconductcandidates(a)wikimedia.org. This feedback will be received by the
group handling this process, and will be treated with confidentiality.
We want to thank all the people who have considered supporting the Code of
Conduct with their participation in this Committee. 77 people were
contacted during the selection process, counting self-nominations and
recommendations. Of these, 21 made it to a short list of confirmed
candidates who were (according to our estimation) a potentially good fit
for the Committee. Selecting the five candidates for the Committee
has been hard, as we have tried to form a diverse group that could work
together effectively in the consolidation of the Code of Conduct. Selecting
the five auxiliary members has been even harder, and we know that we have
left out candidates who could have contributed just as much. Because these
are the first people to assume these roles, we have leaned a bit towards
more technical profiles with good knowledge of our technical spaces. We
believe that future renewals will offer better chances to other profiles
(less technical and/or less of a Wikimedia veteran), adding greater
diversity and variety of perspectives to the mix.
On Thu, Mar 9, 2017 at 12:30 PM, Quim Gil <qgil(a)wikimedia.org> wrote:
> Dear Wikimedia technical community members,
> The review of the Code of Conduct for Wikimedia technical spaces has been
> completed and now it is time to bootstrap its first committee. The
> Technical Collaboration team is looking for five candidates to form the
> Committee plus five additional auxiliary members. One of them could be you
> or someone you know!
> You can propose yourself as a candidate and you can recommend others
> *privately* at
> techconductcandidates AT wikimedia DOT org
> We want to form a very diverse list of candidates reflecting the variety
> of people, activities, and spaces in the Wikimedia technical community. We
> are also open to other candidates with experience in the field. Diversity
> in the Committee is also a way to promote fairness and independence in
> their decisions. This means that no matter who you are, where you come
> from, what you work on, or for how long, you are a potential good member of
> this Committee.
> The main requirements to join the Committee are a will to foster an open
> and welcoming community and a commitment to making participation in
> Wikimedia technical projects a respectful and harassment-free experience
> for everyone. The committee will handle reports of unacceptable behavior,
> will analyze the cases, and will resolve them according to the Code of
> Conduct. The Committee will also handle proposals to amend the Code of
> Conduct for the purpose of increasing its efficiency. The term of this
> first Committee will be one year.
> Once we have a list of 5 + 5 candidates, we will announce it here for
> review. You can learn more about the Committee and its selection process at
> https://www.mediawiki.org/wiki/Code_of_Conduct/Committee and you can ask
> questions in the related Talk page (preferred) or here.
> You can also track the progress of this bootstrapping process at
> PS: We have many technical spaces, and reaching all the people potentially
> interested is hard! Please help spread this call.
> Quim Gil
> Engineering Community Manager @ Wikimedia Foundation
Engineering Community Manager @ Wikimedia Foundation
I would like to know if anyone has migrated the monthly dump injection
process from Wikipedia (MWDumper) and a Wikipedia mirror (MediaWiki
software) to AWS, and if so, what approach you followed.
Thanks in advance,