Hi,
I am working on this bug <https://phabricator.wikimedia.org/T64305> and I
want to make sure that I have understood it correctly. Can somebody
please help?
So for a JPEG image, only something like "Full resolution: JPEG" should
appear right under the image?
And the same goes for PNG and GIF files (only "Full resolution: PNG" and
"Full resolution: GIF", respectively)?
And for other formats like SVG and TIFF, does MediaWiki automatically
convert them to something that all browsers can display?
I just checked here
<https://commons.wikimedia.org/wiki/File:%22Single%22Japanese_castle_Tenshu_…>:
it is an SVG file and the other format links it provides are PNGs.
And here
<https://commons.wikimedia.org/wiki/File:%22Ammunition!%22_And_remember_-_bo…>:
it is a TIFF file, and clicking the other resolutions gives a JPEG file,
even though that is not explicitly and clearly mentioned.
So, only for formats like that (SVG, TIFF and others of that sort),
should PNG or JPEG links at the maximum possible image size also be
clearly provided right under the image, in addition to the link to the
original file?
Is this exactly what this bug expects to be fixed?
Thanks in advance.
The next two weeks will hopefully be quiet, deployment-wise. This is due
to two weeks of in-person events that will take the majority of the
attention of WMF and community engineers.
== Week of January 19th ==
Summary:
* No MediaWiki train deploy due to WMF All Hands
** WMF All Hands is all day Wed and Thurs
* No deploys Mon/Wed/Thurs/Fri
* SWAT/etc on Tues OK
== Week of January 26th ==
Summary:
* No MediaWiki train deploy due to MediaWiki Dev Summit
** MediaWiki Dev Summit is Mon and Tues all day
* SWAT all week as normal
* No other deploys Mon/Tuesday
Thanks and as always, questions and comments welcome,
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
LDQ 2015 CALL FOR PAPERS
2nd Workshop on Linked Data Quality
co-located with ESWC 2015, Portorož, Slovenia
May 31 or June 1, 2015
http://ldq.semanticmultimedia.org/
Important Dates
* Submission of research papers: March 6, 2015
* Notification of paper acceptance: April 3, 2015
* Submission of camera-ready papers: April 17, 2015
Since the start of the Linked Open Data (LOD) Cloud, we have seen an
unprecedented volume of structured data published on the web, in most
cases as RDF and Linked (Open) Data. The integration across this LOD
Cloud, however, is hampered by the ‘publish first, refine later’
philosophy. This is due to various quality problems existing in the
published data such as incompleteness, inconsistency,
incomprehensibility, etc. These problems affect every application
domain, be it scientific (e.g., life science, environment),
governmental, or industrial applications.
We see linked datasets such as DBpedia and LinkedGeoData originating
from crowdsourced content like Wikipedia and OpenStreetMap, and also
datasets from highly curated sources, e.g. from the library domain.
Quality is defined as "fitness for use": DBpedia may thus currently be
appropriate for a simple end-user application, but could never be used
in the medical domain for treatment decisions. Quality is therefore key
to the success of the data web, and its absence is a major barrier to
further industry adoption.
Although quality is an essential concept in Linked Data, few efforts
are currently available to standardize how data quality tracking and
assurance should be implemented. Particularly in Linked Data,
ensuring data quality is a challenge as it involves a set of
autonomously evolving data sources. Additionally, detecting the quality
of datasets available and making the information explicit is yet another
challenge. This includes the (semi-)automatic identification of
problems. Moreover, none of the current approaches uses the assessment
to ultimately improve the quality of the underlying dataset.
The goal of the Workshop on Linked Data Quality is to raise the
awareness of quality issues in Linked Data and to promote approaches to
assess, monitor, maintain and improve Linked Data quality.
The workshop topics include, but are not limited to:
* Concepts
  - Quality modeling vocabularies
* Quality assessment
  - Methodologies
  - Frameworks for quality testing and evaluation
  - Inconsistency detection
  - Tools/Data validators
* Quality improvement
  - Refinement techniques for Linked Datasets
  - Linked Data cleansing
  - Error correction
  - Tools
* Quality of ontologies
* Reputation and trustworthiness of web resources
* Best practices for Linked Data management
* User experience, empirical studies
Submission guidelines
We seek novel technical research papers in the context of Linked Data
Quality, with a length of up to 8 pages for long papers and 4 pages for
short papers. Papers should be submitted in PDF format. Other
supplementary formats (e.g. HTML) are also accepted, but a PDF version
is required.
Paper submissions must be formatted in the style of the Springer
Publications format for Lecture Notes in Computer Science (LNCS). Please
submit your paper via EasyChair at
https://easychair.org/conferences/?conf=ldq2015. Submissions that do not
comply with the formatting of LNCS or that exceed the page limit will be
rejected without review. We note that the author list does not need to
be anonymized, as we do not have a double-blind review process in place.
Submissions will be peer reviewed by three independent reviewers.
Accepted papers have to be presented at the workshop.
Important Dates
All deadlines are, unless otherwise stated, at 23:59 Hawaii time.
* Submission of research papers: March 6, 2015
* Notification of paper acceptance: April 3, 2015
* Submission of camera-ready papers: April 17, 2015
* Workshop date: May 31 or June 1, 2015 (half-day)
Organizing Committee
* Anisa Rula – University of Milano-Bicocca, IT
* Amrapali Zaveri – AKSW, University of Leipzig, DE
* Magnus Knuth – Hasso Plattner Institute, University of Potsdam, DE
* Dimitris Kontokostas – AKSW, University of Leipzig, DE
Program Committee
* Maribel Acosta – Karlsruhe Institute of Technology, AIFB, DE
* Mathieu d’Aquin – Knowledge Media Institute, The Open University, UK
* Volha Bryl – University of Mannheim, DE
* Ioannis Chrysakis – ICS FORTH, GR
* Jeremy Debattista – University of Bonn, Fraunhofer IAIS, DE
* Stefan Dietze – L3S, DE
* Suzanne Embury – University of Manchester, UK
* Christian Fürber – Information Quality Institute GmbH, DE
* Jose Emilio Labra Gayo – University of Oviedo, ES
* Markus Graube – Technische Universität Dresden, DE
* Maristella Matera – Politecnico di Milano, IT
* John McCrae – CITEC, University of Bielefeld, DE
* Felix Naumann – Hasso Plattner Institute, DE
* Matteo Palmonari – University of Milan-Bicocca, IT
* Heiko Paulheim – University of Mannheim, DE
* Mariano Rico – Universidad Politécnica de Madrid, ES
* Ansgar Scherp – Kiel University, DE
* Jürgen Umbrich – Vienna University of Economics and Business, AT
* Miel Vander Sande – MultimediaLab, Ghent University, iMinds, BE
* Patrick Westphal – AKSW, University of Leipzig, DE
* Jun Zhao – Lancaster University, UK
* Antoine Zimmermann – ISCOD / LSTI, École Nationale Supérieure des
Mines de Saint-Étienne, FR
* Andrea Maurino – University of Milan-Bicocca, IT
More details can be found on the workshop website:
http://ldq.semanticmultimedia.org/
Howdy all,
Recently we've been playing with tracking our code coverage in Services
projects, and so far it's been pretty interesting.
We've learned about where the gaps are in our testing (which has even
revealed holes in our understanding of our own specifications and use
cases), and had fun watching the coverage climb with (nearly) each pull
request.
I've slapped together some notes about our experience here:
https://github.com/wikimedia/restbase/tree/master/doc/coverage#code-coverage
I'd love to hear your thoughts and learn about your related experiences.
What are your favorite code coverage tools and services?
Cheers!
James
Hi,
I am working on a theme [1] for MediaWiki, following the
Manual:Skinning page [2] on mediawiki.org. I wanted to make some changes
in the footer area but could not find them covered there, so I have
listed them below. Can anyone please tell me how I can show or hide
these sections, and is there a better doc than the one I am following
now?
- what is the easiest way to add/remove links from the footer
- show/hide lastmod
- show/hide viewcount
- show/hide numberofwatchingusers
- show/hide credits on web view and print page
- show the copyright notice as a link
- show the license, "Powered by MediaWiki" and other images as text links
[1] - https://github.com/nasirkhan/mediawiki-bootstrap
[2] - https://www.mediawiki.org/wiki/Manual:Skinning
regards
Nasir Khan
--
*Nasir Khan Saikat*
www.nasirkhn.com
Founding member, Wikimedia Bangladesh
Public Lead, Creative Commons Bangladesh
Fellow, Institute of Open Leadership
Hello Wikiwizards,
I have developed an application that reads the RCstream, performs a
filter/augmentation operation[1] and then rebroadcasts the filtered
RCstream.
The problem I am running into on Tools Labs is that when I submit to
the grid, the rebroadcasting of the filtered websocket stream happens on
a different server with each submission (i.e.
task@tools-exec-##.eqiad.wmflabs).
So how can I submit the rebroadcast job to the grid and have the
rebroadcast websocket accessible at a static address, so that external
people can read it? What is the best strategy?
Some solutions I can think of:
* Should I make an instance for this?
* Write out to the filesystem and rebroadcast from a webserver?
* Dynamically bind incoming requests to the jobs grid?
[1] The operation is to look at the content of all changes and keep
only those that involve the addition or deletion of specific wikitext
template parameters (a rough sketch of this consume-and-filter step is
below).
Make a great day,
Max Klein ‽ http://notconfusing.com/
Hi folks,
Most of you have probably heard about the new stream of RC changes
that serves as an alternative to IRC:
https://wikitech.wikimedia.org/wiki/RCStream
I simply found it complex beyond the edge of usability for many
solutions that use low-level languages or frameworks that don't support
technologies like JSON or socket.io and don't want to be bloated with
third-party libraries, so I launched this alternative provider:
https://wikitech.wikimedia.org/wiki/XmlRcs
If you aren't using JavaScript or Python and you find the new RC
provider (RCStream) too complex, you might find this useful.
[Crossposted to Wikitech-l]
Hello, all!
As a reminder, the planned partial downtime of the Labs network
filesystem for maintenance is today (January 15) at 18:00 UTC and is
scheduled to last up to 24 hours - but I have high hopes that it will be
much shorter than that.
In addition to the expected impacts during the window (copied below),
there will be a brief (less than 5 minutes) complete suspension of
service to the affected filesystems at the very beginning of the
process; this should not cause errors but may cause many services to
stall briefly.
-- Marc
On 14-12-31 12:11 PM, Marc A. Pelletier wrote:
> The expected impacts are:
>
> * Starting at the beginning of the window, /home and /data/project will
> switch to readonly mode; any attempt to write to files in those trees
> will result in EROFS errors being thrown. Reading from those
> filesystems will still work as expected, as would writing to other
> filesystems;
> * Read performance may degrade noticeably as the disk subsystem will be
> loaded to capacity;
> * It will not be possible to manipulate the gridengine queue -
> specifically, starting or stopping jobs will not work; and
> * At the end of the window, when the operation is complete, the "old"
> file system will go away and be replaced by the new one - this will
> cause any access to files or directories that were previously opened
> (including working directories) on the affected filesystems to error out
> with ESTALE. Reopening files by name will access the new copy identical
> to the one at the time the filesystems became readonly.
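For tool maintainers, a rough sketch of how a Python tool might cope
with the two error conditions described above (EROFS during the window,
ESTALE for handles held across it); the paths and retry policy are
purely illustrative and not part of the announcement.

    import errno


    def save_state(path, data):
        """Write state, tolerating the read-only window (EROFS)."""
        try:
            with open(path, 'w') as f:
                f.write(data)
            return True
        except (IOError, OSError) as e:
            if e.errno == errno.EROFS:
                # /home and /data/project are read-only during the window;
                # skip the write and retry after the maintenance is over.
                return False
            raise


    class ReopeningFile(object):
        """Hold a file open, reopening it by name if the handle goes stale."""

        def __init__(self, path):
            self.path = path
            self.handle = open(path)

        def read_all(self):
            try:
                self.handle.seek(0)
                return self.handle.read()
            except (IOError, OSError) as e:
                if e.errno != errno.ESTALE:
                    raise
                # The old filesystem was replaced by the new one; reopening
                # the file by name reaches the new copy, as noted above.
                try:
                    self.handle.close()
                except (IOError, OSError):
                    pass
                self.handle = open(self.path)
                return self.handle.read()

In practice most tools can simply be restarted after the window; the
reopen-by-name treatment only matters for handles and working
directories held open across it.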
Hello,
I have crafted and enabled two new jobs:
* mediawiki-phpunit-hhvm
* mediawiki-phpunit-zend
They are triggered whenever a patch is proposed to the repos:
mediawiki/core
mediawiki/vendor
Or to the mobile-related extensions:
Echo
JsonConfig
Mantle
MobileApp
MobileFrontend
VisualEditor
WikiGrok
ZeroBanner
ZeroPortal
The jobs clone all of those repositories and run the MediaWiki core
'extensions' test suite, i.e.:
cd tests/phpunit
php phpunit.php --testsuite extensions
They are NOT run on the old branches (REL1_19, REL1_22, REL1_23,
REL1_24); the old jobs are used there instead.
They are run (and pass) on master and wmf branches, and will be run on
future REL branches (i.e. REL1_25).
They do NOT run the MediaWiki core main test suites because that is
awfully slow. The previous jobs did not run them either.
On mediawiki/core and mediawiki/vendor we still run the core tests
though (mediawiki-phpunit-{hhvm,zend}).
   _
  / \
 / | \
/__.__\
Side effect: if one deprecates a function/method in mediawiki/core and
it is used by one of the extensions above, the job will fail until the
extensions above have been adjusted.
I hope to add more extensions to that list. Additionally I am looking to
add a command to verify whether an extension is a good candidate for
inclusion.
For more context/details see:
[WikimediaMobile] Testing extensions together
https://lists.wikimedia.org/pipermail/mobile-l/2014-December/008398.html
Gerrit change
https://gerrit.wikimedia.org/r/#/c/180494/
RFC Task
https://phabricator.wikimedia.org/T1350
The RFC itself:
https://www.mediawiki.org/wiki/Requests_for_comment/Extensions_continuous_i…
--
Antoine "hashar" Musso