*FLOSS4P2P: Call for Participation*
A 2-day London workshop in March, gathering *FLOSS projects* that are
building software for *peer production and organization*, with a focus on
*distributed* platforms. *Scholarships* to attend are offered to grassroots
communities.
**Context**
We know that the Internet was originally decentralized, with protocols and
services built by hackers. However, with the arrival of the celebrated Web
2.0, centralization and corporations' proprietary platforms seem to have
taken over. Moreover, this centralized structure is used by governments to
increase surveillance (as Snowden's revelations showed), to black out the
Internet whenever it suits them (e.g. Egypt, Syria, or San Francisco's
BART), or to choke inconvenient activist organizations (such as Wikileaks).
On the other hand, in the last few years we have seen the emergence of
Internet-enabled collaborative communities building shared libre/open
resources. Commons-Based Peer Production (CBPP) is growing rapidly:
not just for software and encyclopedias, but also for information
(OpenStreetMap, wikiHow), hardware (FabLabs, Open Source Ecology),
accommodation (Couchsurfing) and currency (Bitcoin, altcoins).
In the last few years, it has become clear to many that it is not enough to
develop free/libre/open source software (FLOSS) alternatives: we also need
to re-decentralize the Internet. Many initiatives are being undertaken
under this premise (e.g. Ethereum, Diaspora, OwnCloud, MediaGoblin,
Sandstorm). These new software tools may also be useful to boost CBPP
communities further. In this workshop, we will gather those working on
decentralized FLOSS that could help CBPP/P2P communities. Hackers,
academics, activists and interested stakeholders are welcome.
**When**
March 16-17th 2015
**Where**
Fab Lab London
http://fablablondon.org
**Call for Proposals**
We welcome proposals for:
- Lightning talks (2-5 min): summarise your idea & receive feedback
- Show & Tell presentations (20 min): explain your project/tech/research
- Tutorials on software tools (1 hour)
Please email: lu.yang(a)surrey.ac.uk with your idea/proposal.
The workshop will combine presentations with unconference-style
participatory sessions to find points of collaboration and draw
conclusions.
**Topics**
- Focus on FLOSS software with some of the following features:
  - Social: communication
    (e.g. social networking, microblogging, reworked email)
  - Social: collaboration
    (e.g. wikis, pads, wave, shared file hosting, multimedia repositories)
  - Alternative to proprietary choices
  - Federated / Distributed / Interoperable
  - Open Standards
  - Secure / Encrypted
  - Encouraging Peer Production communities
  - Encouraging the construction/maintenance of Commons
- Potential cases for discussion:
  - Diaspora (federated social network)
  - Wave (federated real-time collaboration)
  - Lorea (federated social network)
  - DarkWallet (distributed wallet & social network)
  - Ethereum (P2P infrastructure)
  - MaidSafe (P2P infrastructure)
  - Sandstorm.io (facilitates federated software)
  - Mailpile (encrypted email)
  - MediaGoblin (federated multimedia repository)
  - OwnCloud (file hosting)
  - … (your case)
**Scholarships**
A few scholarships are available for participants who wish to attend the
event. Each scholarship covers the participant's travel and subsistence
costs, up to €400. If you are interested in applying, please
email: lu.yang(a)surrey.ac.uk before 28 February 2015, with a
paragraph stating why you think your FLOSS is relevant, plus a short bio.
Priority will be given to applicants with limited resources, to innovative
FLOSS within the topics of the call, and to grassroots communities.
**More**
More info will be posted online at:
http://p2pvalue.eu/2nd-floss4p2p-workshop
Email queries to: lu.yang(a)surrey.ac.uk
Hey folks,
Dario and I just updated the scholarly citations dataset to include Digital
Object Identifiers. We found 742k citations (524k unique DOIs) in 172k
articles. Our spot checking suggests that 98% of these DOIs resolve. The
remaining 2% were extracted correctly, but they appear to be typos.
http://dx.doi.org/10.6084/m9.figshare.1299540
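For readers curious how DOI strings get pulled out of wikitext in the first place, here is a minimal, hypothetical extraction sketch. The pattern below is an approximation based on Crossref's published guidance; the dataset's actual extraction code is not described in this post:

```python
import re

# Rough DOI pattern (approximation of Crossref's guidance for modern
# DOIs); the dataset's real extraction rules may differ.
DOI_RE = re.compile(r'\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+')

def extract_dois(wikitext):
    """Return all DOI-like strings found in a chunk of wikitext."""
    return DOI_RE.findall(wikitext)
```

A pattern like this can still capture trailing punctuation or typo'd suffixes, which is consistent with the observation above that a small fraction of correctly extracted DOIs fail to resolve.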
Like the dataset that we released for PubMed Identifiers, this dataset includes
the first known occurrence of a DOI citation in an English Wikipedia
article and the associated revision metadata, based on the most recent
complete content dump of English Wikipedia.
Feel free to share this with anyone interested via:
https://twitter.com/WikiResearch/status/564908585008627712
We'll be organizing our own work and analysis of these citations here:
https://meta.wikimedia.org/wiki/Research:Scholarly_article_citations_in_Wik…
-Aaron
Laura Hale wrote:
>...
> [Avoiding SEO spamming is] a Wikipedia thing, by putting
>
> <a rel="nofollow" class="external text"
>
> in the article source code....
Of the mirrors that come and go from time to time, it always seems to
be about even odds as to whether they keep rel=nofollow in external
and citation links. So, some months being linked to from Wikipedia
does increase search engine ranking, if such mirrors are reliably
catching up, and other months not so much.
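As a rough illustration, checking whether a given mirror kept rel=nofollow on its external links can be done with a few lines of standard-library Python. This is a sketch for a single page's HTML, not a robust crawler:

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Count external <a> links with and without rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.nofollow = 0
        self.plain = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        if a.get("href", "").startswith(("http://", "https://")):
            if "nofollow" in (a.get("rel") or "").split():
                self.nofollow += 1
            else:
                self.plain += 1

def audit(html):
    """Return (nofollow_count, plain_count) for external links in html."""
    parser = NofollowAudit()
    parser.feed(html)
    return parser.nofollow, parser.plain
```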
I'm enthusiastic about the academic establishment using Wikipedia
citations as performance metrics, but agree with everyone who wants to
keep an eye on gaming. At present, commercial gaming is thousands of
times more prevalent than academics trying to enhance their CVs. Let
us know when Altmetric.com or Wikipedia is mentioned in a decision to
grant or deny tenure (or in tenure-track hires).
I would love to see this become an easier way to find secondary
peer-reviewed literature reviews, because they aren't always published
where people expect. If we attract academics in whose interest it is to
help with that, and they understand how and actually do help, that would
be great.
FYI:
http://www.altmetric.com/blog/new-source-alert-wikipedia/
Pine
*This is an Encyclopedia* <https://www.wikipedia.org/>
*One gateway to the wide garden of knowledge, where lies
The deep rock of our past, in which we must delve
The well of our future,
The clear water we must leave untainted for those who come after us,
The fertile earth, in which truth may grow in bright places, tended by many hands,
And the broad fall of sunshine, warming our first steps toward knowing how much we do not know.*
*—Catherine Munro*
LDQ 2015 CALL FOR PAPERS
2nd Workshop on Linked Data Quality
co-located with ESWC 2015, Portorož, Slovenia
June 1, 2015
http://ldq.semanticmultimedia.org/
Important Dates
* Submission of research papers: March 6, 2015
* Notification of paper acceptance: April 3, 2015
* Submission of camera-ready papers: April 17, 2015
Since the start of the Linked Open Data (LOD) Cloud, we have seen an
unprecedented volume of structured data published on the web, in most
cases as RDF and Linked (Open) Data. The integration across this LOD
Cloud, however, is hampered by the ‘publish first, refine later’
philosophy. This is due to various quality problems existing in the
published data such as incompleteness, inconsistency,
incomprehensibility, etc. These problems affect every application
domain, be it scientific (e.g., life science, environment),
governmental, or industrial applications.
We see linked datasets originating from crowdsourced content like
Wikipedia and OpenStreetMap such as DBpedia and LinkedGeoData and also
from highly curated sources, e.g. from the library domain. Quality is
defined as "fitness for use": DBpedia, for example, may currently be
appropriate for a simple end-user application, but it could never be used
in the medical domain for treatment decisions. Data quality is thus key to
the success of the data web, and its absence a major barrier to further
industry adoption.
Although quality is an essential concept for Linked Data, few efforts
currently exist to standardize how data quality tracking and assurance
should be implemented. Particularly in Linked Data,
ensuring data quality is a challenge as it involves a set of
autonomously evolving data sources. Additionally, assessing the quality
of available datasets and making that information explicit is yet another
challenge. This includes the (semi-)automatic identification of
problems. Moreover, none of the current approaches uses the assessment
to ultimately improve the quality of the underlying dataset.
The goal of the Workshop on Linked Data Quality is to raise the
awareness of quality issues in Linked Data and to promote approaches to
assess, monitor, maintain and improve Linked Data quality.
The workshop topics include, but are not limited to:
* Concepts
* - Quality modeling vocabularies
* Quality assessment
* - Methodologies
* - Frameworks for quality testing and evaluation
* - Inconsistency detection
* - Tools/Data validators
* Quality improvement
* - Refinement techniques for Linked Datasets
* - Linked Data cleansing
* - Error correction
* - Tools
* Quality of ontologies
* Reputation and trustworthiness of web resources
* Best practices for Linked Data management
* User experience, empirical studies
Submission guidelines
We seek novel technical research papers in the context of Linked Data
quality, with a length of up to 8 pages for long papers and 4 pages for
short papers. Papers should be submitted in PDF format. Other supplementary
formats (e.g. HTML) are also accepted, but a PDF version is required.
Paper submissions must be formatted in the style of the Springer
Publications format for Lecture Notes in Computer Science (LNCS). Please
submit your paper via EasyChair at
https://easychair.org/conferences/?conf=ldq2015. Submissions that do not
comply with the formatting of LNCS or that exceed the page limit will be
rejected without review. We note that the author list does not need to
be anonymized, as we do not have a double-blind review process in place.
Submissions will be peer reviewed by three independent reviewers.
Accepted papers have to be presented at the workshop.
Important Dates
All deadlines are, unless otherwise stated, at 23:59 Hawaii time.
* Submission of research papers: March 6, 2015
* Notification of paper acceptance: April 3, 2015
* Submission of camera-ready papers: April 17, 2015
* Workshop date: May 31 or June 1, 2015 (half-day)
Organizing Committee
* Anisa Rula – University of Milano-Bicocca, IT
* Amrapali Zaveri – AKSW, University of Leipzig, DE
* Magnus Knuth – Hasso Plattner Institute, University of Potsdam, DE
* Dimitris Kontokostas – AKSW, University of Leipzig, DE
Program Committee
* Maribel Acosta – Karlsruhe Institute of Technology, AIFB, DE
* Mathieu d’Aquin – Knowledge Media Institute, The Open University, UK
* Volha Bryl – University of Mannheim, DE
* Ioannis Chrysakis – ICS FORTH, GR
* Jeremy Debattista – University of Bonn, Fraunhofer IAIS, DE
* Stefan Dietze – L3S, DE
* Suzanne Embury – University of Manchester, UK
* Christian Fürber – Information Quality Institute GmbH, DE
* Jose Emilio Labra Gayo – University of Oviedo, ES
* Markus Graube – Technische Universität Dresden, DE
* Maristella Matera – Politecnico di Milano, IT
* John McCrae – CITEC, University of Bielefeld, DE
* Felix Naumann – Hasso Plattner Institute, DE
* Matteo Palmonari – University of Milan-Bicocca, IT
* Heiko Paulheim – University of Mannheim, DE
* Mariano Rico – Universidad Politécnica de Madrid, ES
* Ansgar Scherp – Kiel University, DE
* Jürgen Umbrich – Vienna University of Economics and Business, AT
* Miel Vander Sande – MultimediaLab, Ghent University, iMinds, BE
* Patrick Westphal – AKSW, University of Leipzig, DE
* Jun Zhao – Lancaster University, UK
* Antoine Zimmermann – ISCOD / LSTI, École Nationale Supérieure des
Mines de Saint-Étienne, FR
* Andrea Maurino – University of Milan-Bicocca, IT
More details can be found on the workshop website:
http://ldq.semanticmultimedia.org/
I just published https://archive.org/details/wikia_dump_20141219 :
----
Snapshot of all the known Wikia dumps. Where a Wikia public dump was
missing, we produced one ourselves. 9 broken wikis, as well as
lyricswikia and some wikis for which dumpgenerator.py failed, are still
missing; some Wikia XML files are incorrectly terminated and probably
incomplete.
In detail, this item contains dumps for 268 902 wikis in total: 21 636 full
dumps produced by Wikia, 247 266 full XML dumps produced by us, and 5610
image dumps produced by Wikia. Up to 60 752 wikis are missing.
Nonetheless, this is the most complete Wikia dump ever produced.
----
We appreciate help to:
* verify the quality of the data (for Wikia dumps I only checked valid
gzipping; for WikiTeam dumps only XML well-formedness
https://github.com/WikiTeam/wikiteam/issues/214 );
* figure out what's going on for those 60k missing wikis
https://github.com/WikiTeam/wikiteam/commit/a1921f0919c7b44cfef967f5d07ea49…
;
* improve dumpgenerator.py management of huge XML files
https://github.com/WikiTeam/wikiteam/issues/8 ;
* fix anything else! https://github.com/WikiTeam/wikiteam/issues
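The two quality checks mentioned above (valid gzipping for Wikia dumps, XML well-formedness for WikiTeam dumps) could be sketched roughly like this in Python. This is an illustration of the idea, not the actual WikiTeam tooling:

```python
import gzip
import xml.etree.ElementTree as ET

def gzip_is_valid(path):
    """Decompress the whole file to verify the gzip stream and CRC."""
    try:
        with gzip.open(path, "rb") as f:
            while f.read(1 << 20):  # read in 1 MiB chunks
                pass
        return True
    except (OSError, EOFError):
        # BadGzipFile/corrupt data raise OSError; truncation raises EOFError
        return False

def xml_is_well_formed(path):
    """Stream-parse the dump; catches truncated or unterminated XML."""
    try:
        for _ in ET.iterparse(path):
            pass
        return True
    except ET.ParseError:
        return False
```

Streaming (iterparse, chunked reads) matters here because some of these dumps are far too large to hold in memory.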
For all updates on Wikia dumps, please watchlist/subscribe to the feed
of: http://archiveteam.org/index.php?title=Wikia (notable update: future
Wikia dumps will be 7z).
Nemo
Hey all,
we just released a dataset of scholarly citations in the English Wikipedia by Pubmed / Pubmed Central ID.
http://dx.doi.org/10.6084/m9.figshare.1299540
The dataset currently includes the first known occurrence of a PMID or PMCID citation in an English Wikipedia article and the associated revision metadata, based on the most recent complete content dump of English Wikipedia. We’re planning on expanding this dataset to include other types of scholarly identifier soon.
Feel free to share this with anyone interested or spread the word via: https://twitter.com/WikiResearch/status/562422538613956608
Dario and Aaron
2nd Call for Papers
Conference on Intelligent Computer Mathematics
CICM 2015
13-17 July 2015
Washington DC, USA
Digital and computational solutions are becoming the prevalent means for the
generation, communication, processing, storage and curation of mathematical
information. Separate communities have developed to investigate and build
computer based systems for computer algebra, automated deduction, and
mathematical publishing as well as novel user interfaces. While all of these
systems excel in their own right, their integration can lead to synergies
offering significant added value. The Conference on Intelligent Computer
Mathematics (CICM) offers a venue for discussing and developing solutions
to the great challenges posed by the integration of these diverse areas.
CICM has been held annually as a joint meeting since 2008, co-locating
related conferences and workshops to advance work in these
subjects. Previous meetings have been held in Birmingham (UK 2008),
Grand Bend (Canada 2009), Paris (France 2010), Bertinoro (Italy 2011),
Bremen (Germany 2012), Bath (UK 2013), and Coimbra (Portugal 2014).
This is a (short version of the) call for papers for CICM 2015, which
will be held in Washington, D.C., 13-17 July 2015.
The full version of the CFP is available from the conference web page at
http://cicm-conference.org/2015/cicm.php
**********************************************************************
Invited Speakers:
**********************************************************************
* Leonardo de Moura, https://leodemoura.github.io/
"Formalizing mathematics using the Lean Theorem Prover"
(http://leanprover.github.io/)
* Tobias Nipkow, http://www21.in.tum.de/~nipkow/
"Analyzing the Archive of Formal Proofs"
* Jim Pitman, http://www.stat.berkeley.edu/~pitman/
* Richard Zanibbi, http://www.cs.rit.edu/~rlaz/
"Math Search for the Masses: Multimodal Search
Interfaces and Appearance-Based Retrieval"
**********************************************************************
The principal tracks of the conference will be:
**********************************************************************
* Calculemus (Symbolic Computation and Mechanised Reasoning)
Chair: Jacques Carette
* DML (Digital Mathematical Libraries)
Chair: Volker Sorge
* MKM (Mathematical Knowledge Management)
Chair: Cezary Kaliszyk
* Systems and Data
Chair: Florian Rabe
* Doctoral Programme
Chair: Umair Siddique
Publicity chair is Serge Autexier. The local arrangements will be
coordinated by the Local Arrangements Chairs, Bruce R. Miller
(National Institute of Standards and Technology, USA) and Abdou
Youssef (The George Washington University, Washington, D.C.), and the
overall programme will be organized by the General Programme Chair,
Manfred Kerber (U. Birmingham, UK).
As in previous years, we will have co-located workshops. Currently
planned are:
* Formal Mathematics for Mathematicians
* Theorem proving components for Educational software (ThEdu'15)
* MathUI
Furthermore, we will have a doctoral programme mentoring doctoral
students who give presentations, and a tutorial on the generic proof
assistant Isabelle. We also solicit project descriptions, surveys,
and work-in-progress papers.
**********************************************************************
Important Dates
**********************************************************************
Conference submissions:
Abstract submission deadline: 16 February 2015
Submission deadline: 23 February 2015
Reviews sent to authors: 6 April 2015
Rebuttals due: 9 April 2015
Notification of acceptance: 13 April 2015
Camera ready copies due: 27 April 2015
Conference: 13-17 July 2015
Work-in-progress and Doctoral Programme submissions:
Submission deadline:
(Doctoral: Abstract+CV) 4 May 2015
Notification of acceptance: 25 May 2015
Camera ready copies due: 1 June 2015
More detailed information, e.g. on submission via EasyChair
and the Springer LNAI format, can be found on
http://cicm-conference.org/2015/cicm.php
Dear list members,
please find attached a current call for research fellowships in the field of internet research.
Kind regards,
Simon
Simon Rinas | Associate Researcher
Alexander von Humboldt Institut für Internet und Gesellschaft gGmbH
Oberwallstraße 9 · 10117 Berlin
T +49 30 20 07-6082 · F +49 30 20 60-8960 · www.hiig.de
Gesellschaftssitz Berlin | Amtsgericht Berlin Charlottenburg | HRB 140911B
Steuer-ID 27/601/54619 | Geschäftsführung: Prof. Dr. Jeanette Hofmann · Prof. Dr. Dr. Ingolf
Pernice · Prof. Dr. Dr. Thomas Schildhauer · Prof. Dr. Wolfgang Schulz · Dr. Karina Preiß
SUMMER FELLOWSHIP 2015 – INTERNET AND SOCIETY
This year the Alexander von Humboldt Institute for Internet and Society once again opens its doors for summer fellows from all over the globe. We invite applications from early stage researchers pursuing transdisciplinary internet research. If you are seeking exchange regarding your research aspirations and think your objectives match or complement ours, we look forward to hearing from you!
Opportunities
Our summer fellowship offers innovative thinkers a unique opportunity to exchange experiences and launch new initiatives in an inviting intellectual environment. You will also get to know the other summer fellows during the common research period. Selected fellows are welcome to collaborate in a growing international team and to participate in the research activities at our institute. We encourage you to actively shape your stay according to your research interests. We offer a number of opportunities to get involved with our research programme and discuss your research project with the HIIG research team, such as:
- Writing a paper related to your research project, e.g. publishing a journal paper in our SSRN Internet & Society Series <http://www.hiig.de/en/ssrn-internet-society-series-2/>
- Giving a presentation on a topic of your choice in our weekly HIIG Club
- Organising a workshop related to your research topic
- Engaging in joint activities and projects with other fellows
- And more, according to your interests
Allowances
Fellows are expected to bring their own funding through their home institution or outside grants. Fellows must take care of their accommodation, insurance, childcare, and transportation arrangements. However, in specific cases we can provide a travel allowance of up to € 700,- and a visa subsidy of up to € 200,- on request.
Time frame
We offer fellowships ranging from 3 to 12 months starting from 1 June 2015 at the earliest. Fellowships must cover the period from 1 July to 31 August to guarantee the exchange among all of our fellows.
Qualifications
- Master’s degree; PhD in progress or nearing completion
- Fluency in English; command of German is appreciated
- Research experience and an Internet research project of your own
Required application documents
- curriculum vitae
- letter of motivation explaining your interest in the fellowship, your expectations and your research background (1 page)
- outline of a) your research project, b) the work you propose to conduct during the fellowship, c) contributions you plan to realise during your stay, d) projects on our research agenda that are of interest to you, and e) if possible, preferred project partners at our institute (maximum 3 pages total)
- optional: your latest publication or work sample covering Internet research (maximum of 1 paper/chapter/presentation in English or German)
Applications will only be accepted through our online application form at:
http://hiig.de/summer2015
Closing date for applications is Sunday, 1 March 2015. Please contact us with any questions via application(a)hiig.de.