Folks, I've registered the domain Wikipedia.org.br and I'm currently
forwarding it as a CNAME to pt.wikipedia.org. A pt-editor pointed out
to me that it might be better to leave this with the Wikipedia DNS servers
(maintained by the Wikipedia developers). Does that make sense? Is this
possible? Thanks. Alex
Apologies for cross-posting
----
Call for papers for the
2nd Workshop on Semantic Wikis:
Wiki-based Knowledge Engineering [WibKE2006]
http://wibke2006.semwiki.org
-> abstract deadline: in 8 days <-
-> early registration: * TODAY * <-
(register now, pay later or cancel)
co-located with
* the 2nd International Symposium on Wikis (WikiSym2006)
* ACM HyperText 2006
August 21-23, 2006, Odense, [[located in::Denmark]]
== Important Dates ==
* Abstract deadline: 27th June 2006 (an e-mail to chair(a)semwiki.org
with an abstract/topic/basic idea
is enough)
* Paper deadline: 9th July 2006 (cannot be extended)
* Notification: 31st July 2006
* Camera ready due: 14th August 2006 (cannot be extended)
* Workshop: 21-23 August 2006 (one day out of the three)
== Objectives ==
WibKE2006 is the second workshop on Semantic Wikis. It aims to explore
and collect the different ideas and motivations that have led to the
recent emergence of so-called "Semantic Wiki" systems. Wikis can be
seen as editable information portals. In this workshop, we are mostly
interested in the editing/authoring/articulation of information and
less about the portal aspect.
The main objective of the workshop is to promote and
further develop the idea of employing Wiki design principles and
implementations as enabling technology and development paradigm for
(formally) creating and structuring knowledge in distributed
scenarios. A further aim is to sensitize knowledge engineers to
dealing with vague and changing requirements in distributed, Web-based
settings.
The workshop looks for novel approaches for a seamless integration of
Wiki technologies in knowledge engineering practices, for supporting
concepts and strategies, as well as for tools and use-cases of their
application. Of special interest are ideas and applications using
Wikis for creating, managing and using formally represented knowledge
in the context of the Semantic Web.
== Topics of Interest ==
include, but are not limited to:
* Methods employing the Wiki Way in knowledge engineering,
* Concepts and strategies for integrating domain experts
and end users into the knowledge engineering process,
* Methods for automatic, Wiki-based knowledge elicitation
from collaborative environments,
* Methods for supporting the creation of structured
and (partially) formalised personal knowledge bases
* Policies, authentication and trust within agile
collaborative knowledge engineering scenarios,
* Strategies and methods joining Web 2.0, Wiki and
Semantic Web technologies for knowledge management purposes
* Semantic Wiki tools supporting semantic collaboration
* Semantic Wiki tools supporting personal knowledge management
* Wiki-driven applications enabling massively distributed knowledge elicitation,
* Requirements and use-cases for Web-scaled collaborative knowledge engineering
in relation to Wiki technologies
* Applications of Semantic Wikis, e.g. in Bio-Medicine,
Business, Software-Engineering
* Experience reports, best practices
and guidelines in the aforementioned areas
== Workshop Format ==
* Introduction and overview of current developments with regard to
the application of Wikis in knowledge engineering and knowledge management
tasks
* Paper presentation
** Position (5 pages) and full papers (10 pages) will be peer-reviewed by three independent reviewers.
** Workshop proceedings will appear at CEUR Workshop Proceedings, ISSN 1613-0073
* Panel discussion
== Prospective Audience ==
* Researchers and practitioners developing or interested
in Wiki-based methods and tools for knowledge engineering,
* Knowledge engineers, and domain experts interested in the
development of collaborative evolving content repositories,
* Participants from industry interested in employing Wikis
for organizational knowledge engineering,
* Wiki community activists.
== Organising Committee==
* Max Völkel, Universität Karlsruhe (TH) / FZI, Germany
* Sebastian Schaffert, Salzburg Research, Austria
* Elena Paslaru Bontas, Freie Universität Berlin, Germany
* Sören Auer, Universität Leipzig, Germany & University of Pennsylvania, USA
== Program Committee (to be completed) ==
* Adam Souzis (Liminal Systems)
* Bertin Klein (DFKI Kaiserslautern, Germany)
* Björn Decker (Fraunhofer IESE, Germany)
* Chris Bizer (Free University of Berlin, Germany)
* Denny Vrandecic (AIFB Karlsruhe, Germany)
* George A. Vouros (University of the Aegean, Greece)
* Giovanni Tummarello (Universita' Politecnica delle Marche, Italy)
* Jean Rohmer (Thales Communications, France)
* Klaus Fuchs-Kittowski (Fraunhofer ISST, Germany)
* Libby Miller (asemantics, US)
* Malte Kiesel (DFKI Kaiserslautern, Germany)
* Martin Hepp (DERI Innsbruck, Austria)
* Natasha Noy (Stanford University, US)
* Peter Haase (AIFB Karlsruhe, Germany)
* Peter Mika (Vrije Universiteit Amsterdam, NL)
* Rupert Westenthaler (Salzburg Research, Austria)
* Valentina Presutti (LOA-CNR, Italy)
== Related Work ==
Authors are encouraged to look at the proceedings of [[SemWiki2006]] and WikiSym 2005.
See http://wiki.ontoworld.org/wiki/SemWiki2006
Please ask questions to wibke2006(a)semwiki.org
---
Kind regards,
Max Völkel
--
Dipl.-Inform. Max Völkel, Universität Karlsruhe / FZI / http://xam.de
Two minor changes in the process for the data dumps I started earlier today:
* The intermediate "stub" XML dumps are now available for download instead of
vanishing into a temporary directory. These contain all the article and revision
metadata but not the revision text.
* The .7z version of the full-history dump is now built after the .bz2 completes
instead of both at the same time; this should make the .bz2 versions of the big
wikis available for download sooner as it won't have to wait on the slower 7-zip
compressor. (It's still using the slow single-threaded bzip2 for the moment,
though.)
The stub dumps use the same format as the full XML dumps, except that
the <text> element is empty. It carries an id attribute (not listed in the XML
Schema file, so don't enforce schema validation in your parser) indicating the
internal storage node which contains that revision's text.
This node number isn't really useful unless you're on our servers, as the raw
storage tables are not accessible from outside; but if you want to compute
statistics over the rest of the metadata fields, it will be a lot faster to
work with these lighter-weight files than with the version with full text embedded.
These are compressed with gzip for speed; the stub dump for English Wikipedia
full history runs about 2 gigabytes compressed.
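As an illustration, a script that walks one of these stub dumps and counts revisions per page might look like the sketch below. The element names and namespace URI follow the MediaWiki XML export format (0.3 was current at the time; adjust as needed), and the file path is a placeholder:

```python
import gzip
import xml.etree.ElementTree as ET

# Namespace of the MediaWiki XML export format (version 0.3, current in 2006).
NS = "{http://www.mediawiki.org/xml/export-0.3/}"

def count_revisions(path):
    """Count revisions per page title in a gzipped stub dump,
    streaming with iterparse and never touching the (empty) <text> bodies."""
    counts = {}
    title = None
    with gzip.open(path, "rb") as f:
        for _, elem in ET.iterparse(f):
            if elem.tag == NS + "title":
                title = elem.text
            elif elem.tag == NS + "revision":
                counts[title] = counts.get(title, 0) + 1
            elif elem.tag == NS + "page":
                elem.clear()  # free memory; full stub dumps are large
    return counts
```

Because iterparse streams the file and each finished <page> is cleared, memory stays flat even on the multi-gigabyte English Wikipedia stub dump.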
-- brion vibber (brion @ pobox.com)
An automated run of parserTests.php showed the following failures:
Running test External image... FAILED!
Running test External image from https... FAILED!
Running test External links: Clickable images... FAILED!
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test Language converter: output gets cut off unexpectedly (bug 5757)... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test Parsing optional HTML elements (Bug 6171)... FAILED!
Running test Inline HTML vs wiki block nesting... FAILED!
Running test Mixing markup for italics and bold... FAILED!
Passed 388 of 404 tests (96.04%) FAILED!
Uploaded files which are deleted can now be undeleted; admins can also view the
deleted files without actually undeleting them. This is integrated into the
existing Special:Undelete in what I hope is a fairly clear and intuitive manner.
It's my hope that this will encourage admins to tackle the deletion backlogs a
little more aggressively, since mistakes will be easier to undo.
I've tested it both offline and on the live servers and everything seems fine so
far, but if you do encounter problems please report them at
http://bugzilla.wikimedia.org/
-- brion vibber (brion @ pobox.com)
Hi all,
next week there will be a Wikipedia presentation in Italy [1] where we
will offer the opportunity to browse Wikipedia from a group of public
PCs. The problem is that we will probably have only one or two GPRS cards
for the internet connection, so we expect the low bandwidth to be a
problem.
I can install a Squid transparent proxy, but I don't know how to populate
the cache; I wish I could download into the cache at least all the
images stored on Commons, in order to save a lot of bandwidth.
How can I do this?
[1] http://it.wikipedia.org/wiki/Wikipedia:Raduni/Vicenza_2006
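One possible approach (a sketch, not Squid-specific): fetch every image URL once through the proxy beforehand, so the responses land in its cache. The proxy address and the URL-list file here are assumptions about the local setup:

```python
import urllib.request

def warm_cache(url_file, proxy=None):
    """Fetch each URL listed in url_file (one per line), optionally
    through an HTTP proxy, so the responses end up in the proxy's cache.
    Returns the list of URLs fetched successfully."""
    handlers = []
    if proxy:  # e.g. proxy="http://127.0.0.1:3128" for a default Squid (hypothetical)
        handlers.append(urllib.request.ProxyHandler({"http": proxy}))
    opener = urllib.request.build_opener(*handlers)
    fetched = []
    with open(url_file) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            try:
                opener.open(url, timeout=30).read()
                fetched.append(url)
            except OSError as exc:  # URLError subclasses OSError
                print(f"failed: {url}: {exc}")
    return fetched
```

The URL list itself would have to come from somewhere else, e.g. extracted from the image links of the pages you expect visitors to browse; note also that the origin servers must send cacheable responses for Squid to keep them.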
--
Flavio "Iron Bishop" Pastore
http://www.fsfe.org/fellows/ironbishop/index
Important notice: Brion totally rocks.
Images can now be undeleted...Yes, I said undeleted.
I'll be leading group worship at the Church of the Almighty Brion at 7PM
UTC tomorrow.
Essjay
--
http://en.wikipedia.org/wiki/User:Essjay
Wikipedia:The Free Encyclopedia
http://www.wikipedia.org/
The new magic word {{NUMBEROFADMINS}} doesn't seem to work as expected. On
en.wiktionary it gives 19 (while there are 38), and on en.wikipedia it
produces 496.
Cheers,
Wildrick
http://en.wiktionary.org/wiki/User:Vildricianus