FOSDEM, Brussels, 30 & 31 January
It's time to decide the broad outlines of Wikimedia's participation in
that event.
Deadlines and discussion: https://phabricator.wikimedia.org/T88414
If you are interested, join the task and have your say.
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
In the next RFC meeting, we will discuss the following RFC:
* Overhaul Interwiki map, unify with Sites and WikiMap
<https://phabricator.wikimedia.org/T113034>
The meeting will be on the IRC channel #wikimedia-office on
chat.freenode.net at the following time:
* UTC: Wednesday 21:00
* US PDT: Wednesday 14:00
* Europe CEST: Wednesday 23:00
* Australia AEDT: Thursday 08:00
-- Tim Starling
*CFP: Semantic Web Journal - Special Issue on Quality Management of
Semantic Web Assets (Data, Services and Systems):*
http://www.semantic-web-journal.net/blog/call-papers-special-issue-quality-…
<http://www.semantic-web-journal.net/blog/call-papers-special-issue-quality-…>
Submission guidelines
*Deadline (_only 1 month left_): October 31, 2015*
Submissions shall be made through the Semantic Web journal website at
http://www.semantic-web-journal.net
<http://www.semantic-web-journal.net/>. Prospective authors must take
notice of the submission guidelines posted at
http://www.semantic-web-journal.net/authors
<http://www.semantic-web-journal.net/authors>. Note that you need to
request an account on the website for submitting a paper. Please
indicate in the cover letter that it is for the Special Issue on Quality
Management of Semantic Web Assets (Data, Services and Systems).
Submissions are possible in the following categories: full research
papers, application reports, reports on tools and systems, and case
studies. While there is no upper limit, paper length must be justified
by content.
Guest editors
* Amrapali Zaveri, University of Leipzig, AKSW Group, Germany
* Dimitris Kontokostas, University of Leipzig, AKSW Group, Germany
* Sebastian Hellmann, University of Leipzig, AKSW Group, Germany
* Jürgen Umbrich, Vienna University of Economics and Business, Austria
*Overview and Topics*
The standardization and adoption of Semantic Web technologies have
resulted in a variety of assets: an unprecedented volume of
semantically enriched data, together with systems and services that
consume or publish this data. Although gathering, processing and
publishing data is a step towards further adoption of the Semantic
Web, quality does not yet play a central role in these assets (e.g.,
in the data lifecycle or in system/service development).
Quality management essentially refers to the activities and tasks
involved in guaranteeing a certain level of consistency and in meeting
the quality requirements for the assets. In general, quality management consists of
the following four phases and components: (i) quality planning, (ii)
quality control, (iii) quality assurance and (iv) quality improvement.
The quality planning phase in the Semantic Web typically involves the
design of procedures, strategies and policies to support the management
of the assets. The quality control and assurance components aim
primarily at preventing errors and at meeting the quality requirements
pertaining to the Semantic Web standards. A core part of both
components is quality assessment methods, which provide the necessary
input for the control and assurance tasks.
Quality assessment of Semantic Web Assets (data, services and systems),
in particular, presents new challenges not previously addressed in
other research areas, so adopting existing approaches for data
quality assessment is not straightforward. These challenges
are related to the openness of the Semantic Web, the diversity of the
information and the unbounded, dynamic set of autonomous data sources,
publishers and consumers (legal and software agents). Additionally,
detecting the quality of available data sources and making the
information explicit is yet another challenge. Moreover, noise in one
data set, or missing links between different data sets, propagates
throughout the Web of Data, and imposes great challenges on the data
value chain.
In the case of systems and services, different implementations follow
the RDF and SPARQL specifications to varying extents, or even propose
and offer new, non-standardized extensions. This causes strong
incompatibilities between systems, e.g., between the SPARQL features
used by query engines and those supported by RDF stores. This
potential heterogeneity and incompatibility poses several challenges
for quality assessment in and for such systems and services.
Eventually, quality improvement methods are used to further enhance the
value of the Semantic Web Assets. One important step to improve the
quality of data is identifying the root cause of the problem and then
designing corresponding data improvement solutions. These solutions
select the most effective and efficient strategies and related set of
techniques and tools to improve quality. Quality improvement for
products and services entails understanding and improving operational
processes and establishing valid and reliable service performance measures.
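To make "quality assessment methods" a little more concrete, here is a minimal, purely illustrative sketch (not tied to any tool or framework mentioned in this call; the URIs and the triple format are hypothetical) of one simple completeness metric over RDF-style triples: the share of distinct subjects that carry an rdfs:label.

```python
# Toy "label coverage" completeness metric over (subject, predicate,
# object) triples. The example data below is hypothetical.
RDFS_LABEL = "http://www.w3.org/2000/01/rdf-schema#label"

def label_coverage(triples):
    """Fraction of distinct subjects that have an rdfs:label."""
    subjects = {s for s, _, _ in triples}
    if not subjects:
        return 1.0  # vacuously complete for an empty graph
    labelled = {s for s, p, _ in triples if p == RDFS_LABEL}
    return len(labelled & subjects) / len(subjects)

data = [
    ("http://example.org/a", RDFS_LABEL, "A"),
    ("http://example.org/b", "http://example.org/related",
     "http://example.org/a"),
]
print(label_coverage(data))  # 0.5: only ex:a carries a label
```

A real assessment framework would compute many such metrics (accuracy, consistency, interlinking, ...) and feed them into the control and assurance phases described above.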
This Special Issue is addressed to members of the community interested
in providing novel methodologies or frameworks for managing, assessing,
monitoring, maintaining and improving the quality of Semantic Web data,
services and systems, and in introducing tools and user interfaces that
can effectively assist in this management.
Topics of Interest
We welcome original, high-quality submissions on (but not restricted
to) the following topics:
* Methodologies and frameworks to plan, control, assure or improve the
quality of Semantic Web Assets
* Quality exploration and analysis interfaces
* Quality monitoring
* Developing, deploying and managing quality service ecosystems
* Assessing the quality evolution of Semantic Web Assets
* Large-scale quality assessment of structured datasets
* Crowdsourcing data quality assessment
* Quality assessment leveraging background knowledge
* Use-case driven quality management
* Evaluation of trustworthiness of data
* Web Data and LOD quality benchmarks
* Data Quality improvement methods and frameworks, e.g., linkage,
alignment, cleaning, enrichment, correctness
* Service/system quality improvement methods and frameworks
* Managing sustainability issues in services
* Guarantee of service (availability, performance)
* Systems for transparent management of open data
The Discovery team is planning to run a session about the search API at
the Dev Summit https://phabricator.wikimedia.org/T113540 and we'd love your
feedback about what we should cover.
Please comment on the Phab task and let us know.
Thanks to those who already have.
--tomasz
....in Phabricator.
See this task for more background:
https://phabricator.wikimedia.org/T114486
Summary of what the problem was:
We routinely have tasks reported and included in the #Beta-Cluster
project that are software defects found when testing on Beta Cluster.
The fact that we found those issues on Beta Cluster is AWESOME and
GREAT. But they aren't issues with the Beta Cluster service itself, and
they muddy up the workboard (and any potential future reporting).
Now:
* Issues with the infrastructure of Beta Cluster? Use
#Beta-Cluster-infrastructure
* Did you find a bug in software that was deployed to the Beta Cluster?
Use #Beta-Cluster-reproducible
If you aren't sure, you can just use #Beta-Cluster (which is an
additional hashtag to #Beta-Cluster-infrastructure) and we'll take care
of it.
Greg
NB: #Beta-Cluster-reproducible isn't the best name in the world, and
we're open to better/additional names if you have them, but we went with
what made sense for now and can change it later. Naming is hard.
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Hello,
I would like to get your input for this potential Google-Code /
Outreachy task: https://phabricator.wikimedia.org/T114631
Quick summary: Right now there is no simple way to detect syntax
errors in wikitext (the markup users write as the source of a wiki
page). Such a tool would make it simple for bots and tools (like ORES,
Huggle, AWB and many others) to validate wikitext, and it could enable
new features such as real-time syntax checking in MediaWiki (so that
users would be notified about syntax errors while typing the source
code or upon saving the page).
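As a purely illustrative sketch (not an existing MediaWiki API, and the function name is made up), one basic check such a validator could perform is reporting unbalanced {{template}} and [[link]] delimiters. Real wikitext validation would of course have to handle nowiki sections, tables, parser functions and much more:

```python
import re

def find_unbalanced(text):
    """Return a list of error strings for unbalanced {{ }} / [[ ]]."""
    errors = []
    for open_tok, close_tok, name in (("{{", "}}", "template"),
                                      ("[[", "]]", "link")):
        depth = 0
        pattern = re.escape(open_tok) + "|" + re.escape(close_tok)
        for m in re.finditer(pattern, text):
            depth += 1 if m.group() == open_tok else -1
            if depth < 0:
                # A closer with no matching opener before it.
                errors.append(f"unmatched closing {name} at offset {m.start()}")
                depth = 0
        if depth > 0:
            errors.append(f"{depth} unclosed {name} delimiter(s)")
    return errors

print(find_unbalanced("{{Infobox |x=[[Foo]]"))
# → ['1 unclosed template delimiter(s)']
```

A linting service exposing checks like this could be queried by bots before saving an edit, or run incrementally by an editor for real-time feedback.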
Before this task gets proposed, I would like to discuss whether it's
needed, who would benefit from it, how it should be implemented, and
so on. Please respond on the Phabricator task if possible.
Thank you
Hi,
I'm Josephine, an Outreachy applicant who's hoping to work on the Upload to
Commons Android app project -
https://phabricator.wikimedia.org/T114358#1701261
With the guidance of Nicolas_Raoul (co-mentor for this project), I have
started contributing to the app on GitHub and have completed the microtask.
However, we still need one more mentor for this project.
If anyone is willing to help out as a mentor or point me to someone who
would, I'd hugely appreciate it.
--
Regards,
Josephine
For a while Cirrus Search was the "bee's knees"[1] around here, and we
got to the stage where all wikis were moved onto this search functionality.
Then silence. Complete and utter silence.
Presumably there has been stuff happening out of the public eye, and I
don't want to dig into personal areas; however, the silence on what was
a key development is very disappointing. Can we please have an update?
Thanks.
Regards, Billinghurst
[1] https://en.wiktionary.org/wiki/bee's_knees
Consensus was reached in the first discussion regarding the intro,
"Principles", and "Unacceptable behavior" sections. See
https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Dr…
.
However, while that was being discussed, several of us made changes to
these sections. In the future, we need to avoid this, to prevent
endless discussion.
Thus, in the future I'll send out two separate announcements, one for
"last call to work on these sections" and one for "consensus discussion
for these sections".
However, this time, that wasn't done. So I want to give people a chance
to weigh in on whether we should accept the changes that were made
during the first discussion.
Thus, there is a new discussion at
https://www.mediawiki.org/wiki/Talk:Code_of_conduct_for_technical_spaces/Dr…
.
This will only last a week. I expect to close it October 6th.
Thanks,
Matt Flaschen