Heiya, forwarding as this may be of interest to you too. Cheers, Karsten
-------- Forwarded Message --------
Subject: [Semediawiki-user] Announcing SMWCon Fall 2017
Date: Fri, 30 Jun 2017 13:53:16 +0200
From: Rutledge, Lloyd <Lloyd.Rutledge(a)ou.nl>
To: Semediawiki-user(a)lists.sourceforge.net
Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2017 - the 14th Semantic MediaWiki Conference:
Dates: October 4th to October 6th 2017 (Wednesday to Friday).
Location: Rotterdam, The Netherlands.
Conference page: https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2017
Participants: Everybody interested in semantic wikis, especially in Semantic MediaWiki, e.g., users, developers, consultants, business representatives, researchers.
SMWCon Fall 2017 will be supported by ArchiXL B.V. [0], Wikibase Solutions [1], The Open University in the Netherlands [2] and Open Semantic Data Association e. V. [3].
Following the success of this format, SMWCon will have one tutorial and workshop day preceding two conference days.
Participating in the conference: To help us plan, you can already register informally on the conference page; formal registration will be required later.
Contributing to the conference: If you want to present your work at the conference, please go to the conference page and add your talk there. To create an attractive program, we will later ask you for further information about your proposal.
Among others, we encourage contributions on the following topics:
Applications of semantic wikis:
Semantic wikis for enterprise workflows and business intelligence
Semantic wikis for corporate or personal knowledge management
Exchange on business models with semantic wikis
Lessons learned (best/worst practices) from using semantic wikis or their extensions
Semantic wikis in e-science, e-humanities, e-learning, e-health, e-government
Semantic wikis for finding a common vocabulary among a group of people
Semantic wikis for teaching students about the Semantic Web
Offering incentives for users of semantic wikis
Challenges and obstacles for semantic wikis in business environments
Development of semantic wikis:
Semantic wikis as knowledge base backends / data integration platforms
Comparisons of semantic wiki concepts and technologies
Community building, feature wishlists, roadmapping of Semantic MediaWiki
Improving user experience in a semantic wiki
Speeding up semantic wikis
Integrations and interoperability of semantic wikis with other applications and mashups
Modeling of complex domains in semantic wikis, using rules, formulas etc.
Access control and security aspects in semantic wikis
Multilingual semantic wikis
For questions about sponsorship opportunities, please do not hesitate to contact Ad Strack van Schijndel <ad at wikibase.nl>.
Hope to see you in Rotterdam!
Remco de Boer, Toine Schijvenaars, Esther Greefhorst, Lloyd Rutledge, Erwin Oord
(The Organizing Committee)
[0] http://www.archixl.nl/en/
[1] http://www.wikibase.nl/
[2] https://www.ou.nl/
[3] https://opensemanticdata.org/
I want to let some of my administrators (in the wizards group) edit
LocalSettings.php, so I used this snippet, which allows them to make
changes by editing the Project:Shared_config.php page. Then I protected the
page so that only wizards can edit it. Do you think this presents any
security issues?
(I was also going to have it save the old version to a .bak file, but I had
to comment that code out because I was getting a "call to a member function
on a non-object" error, for some reason.)
function editLocalSettingsOnPageContentSaveComplete( $article, $user, $content,
	$summary, $isMinor, $isWatch, $section, $flags, $revision, $status, $baseRevId
) {
	if ( $article->getTitle()->getFullText() !== 'Project:Shared config.php' ) {
		return true;
	}
	# Backup code, commented out for now. Revision::newFromId() returns null
	# for an invalid ID (e.g. $baseRevId is 0 when the page is first created),
	# which is the likely cause of the "call to a member function on a
	# non-object" error; a null check would guard against it:
	# $oldRevision = Revision::newFromId( $baseRevId );
	# if ( $oldRevision ) {
	#     $oldRevisionContent = $oldRevision->getContent( Revision::RAW );
	#     $oldRevisionContents = ContentHandler::getContentText( $oldRevisionContent );
	#     $oldRevisionContents = str_replace( '<source lang="php"' . ">\n", '', $oldRevisionContents );
	#     $oldRevisionContents = str_replace( '</source' . '>', '', $oldRevisionContents );
	#     file_put_contents( '/home/wiki/shared_config.bak', $oldRevisionContents );
	# }
	$contents = ContentHandler::getContentText( $content );
	$contents = str_replace( '<source lang="php"' . ">\n", '', $contents );
	$contents = str_replace( '</source' . '>', '', $contents );
	file_put_contents( '/home/wiki/shared_config.php', $contents );
	return true;
}
$wgHooks['PageContentSaveComplete'][] = 'editLocalSettingsOnPageContentSaveComplete';
# Add an additional protection level restricting edit/move/etc. to users
# with the "wizards" permission.
$wgRestrictionLevels[] = 'wizards';
# Give the "wizards" permission to users in the "wizards" group.
$wgGroupPermissions['wizards']['wizards'] = true;
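For completeness, the generated file presumably has to be pulled into
LocalSettings.php for the on-wiki edits to take effect; the original snippet
doesn't show that line, so the following is an assumption about how the
pieces fit together:

# In LocalSettings.php (assumed, not shown in the snippet above): load the
# file that the hook writes, so edits to Project:Shared config.php apply.
if ( is_readable( '/home/wiki/shared_config.php' ) ) {
	require_once '/home/wiki/shared_config.php';
}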
I am trying to convert a MediaWiki database from 1.4.5 to 1.5.8 (yes, I know I am a bit late, but the database has
served me perfectly ever since 2005). We now want to migrate to a new Linux server, so we want to upgrade to the
latest available version. I understood that I should migrate to 1.5 first.
So I downloaded https://releases.wikimedia.org/mediawiki/1.5/mediawiki-1.5.8.tar.gz
I followed the instructions at https://www.mediawiki.org/wiki/Manual:Upgrading,
but I receive the following error:
php maintenance/upgrade1_5.php
2017-06-25 15:09:10: Checking cur table for unique title index and applying if necessary
wiki: cur table has the current unique index; no duplicate entries.
2017-06-25 15:09:10: ...converting from cur/old to page/revision/text DB structure.
2017-06-25 15:09:10: Creating page and revision tables...
2017-06-25 15:09:10: Last old record is 31158
......Moving text from cur.
2017-06-25 15:09:10: Last cur entry is 31970
PHP Notice: mysql_query(): Function called without first fetching all rows from a previous unbuffered query in /var/www/html/bike/includes/Database.php on line 349
2017-06-25 15:09:11: 0.31% done on old; ETA 2017-06-25 15:12:48 [100/31970] 147.11/sec
2017-06-25 15:09:11: 100.00% done on old (last chunk 0 rows).
Unable to free MySQL result
Backtrace:
GlobalFunctions.php line 513 calls wfBacktrace()
Database.php line 495 calls wfDebugDieBacktrace()
FiveUpgrade.inc line 426 calls Database::freeResult()
FiveUpgrade.inc line 49 calls FiveUpgrade::upgradePage()
upgrade1_5.php line 22 calls FiveUpgrade::upgrade()
The following (empty) tables were created:
xxx_page
xxx_revision
All of the other 1.5 tables are still missing.
* How could I avoid the above error?
* What could I have done wrong or be missing?
vi /var/www/html/bike/includes/Database.php
:349
function doQuery( $sql ) {
	if ( $this->bufferResults() ) {
		$ret = mysql_query( $sql, $this->mConn );
	} else {
		$ret = mysql_unbuffered_query( $sql, $this->mConn );
	}
	return $ret;
}
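One possible workaround, offered as an assumption rather than a tested fix:
the backtrace shows Database::freeResult() (line 495) dying via
wfDebugDieBacktrace() when mysql_free_result() fails, which can happen once
an unbuffered result set has already been fully consumed. Assuming the 1.5
function looks roughly like the sketch below, patching it to log instead of
die might let upgrade1_5.php run to completion:

# includes/Database.php, around line 495 (sketch of a tolerant variant):
function freeResult( $res ) {
	if ( !@mysql_free_result( $res ) ) {
		# was: wfDebugDieBacktrace( "Unable to free MySQL result\n" );
		wfDebug( "Unable to free MySQL result; continuing anyway.\n" );
	}
}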
Geert Van Pamel
What are the best tools for parsing bare links to generate full citations?
E.g., converting
https://www.youtube.com/watch?v=lRcthTPP0s4
to
{{cite web|author=SecularSkin|date=11 November 2013|title=Christy0Misty's Series On Feminism|publisher=YouTube|url=https://www.youtube.com/watch?v=lRcthTPP0s4}}
The easiest way seems to be to just grab whatever's in the website's
<title> tags, put that as the title in the citation, and call it a day. But
that usually doesn't give you the date, and often doesn't give you the
author. The website or publisher you can usually infer from the URL, so
that's not as big a problem.
On my wiki, articles often reference YouTube videos, Reddit comments, etc.
Thus far, I haven't figured out a way to gather the metadata from those,
except by coming up with specialized scripts to scrape each different kind
of website. Is there any collaboration that is working on this kind of
issue?
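For what it's worth, here is a minimal sketch of the <title>-tag approach
described above; the function name and the choice of {{cite web}} fields
are my own, and it deliberately omits the date and author fields that
per-site scrapers would be needed to recover:

<?php
// Minimal sketch: fetch a page, use its <title> as the citation title,
// and infer the publisher from the host name. Error handling is minimal.
function bareLinkToCiteWeb( $url ) {
	$html = @file_get_contents( $url );
	if ( $html === false ) {
		return $url; // leave the bare link alone if the fetch fails
	}
	$title = '';
	if ( preg_match( '!<title[^>]*>(.*?)</title>!is', $html, $m ) ) {
		$title = html_entity_decode( trim( $m[1] ), ENT_QUOTES );
	}
	$publisher = parse_url( $url, PHP_URL_HOST );
	return '{{cite web|title=' . $title .
		'|publisher=' . $publisher . '|url=' . $url . '}}';
}

echo bareLinkToCiteWeb( 'https://www.youtube.com/watch?v=lRcthTPP0s4' );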
Today, a friend of mine had a few really long wiki pages whose revision
histories he wanted to delete (specifically, chapters 1-117 of The Count of
Monte Cristo), to save some space in his database. So, he went to do php
deleteOldRevisions.php --delete and then, oops, his finger slipped, and he
accidentally hit enter when he meant to enter the page IDs. So he lost the
old revisions for every page on his wiki. (See
https://www.mediawiki.org/wiki/Manual:DeleteOldRevisions.php )
I think back to how, over the past months, he spent countless hours putting
a detailed edit comment on almost every single revision, including wry
remarks, friendly banter, Twitter-style 140-character social commentary,
and provocative philosophical musings on the meaning of life and the
universe; and now all those are gone. (Now I feel smart for never bothering
to enter any edit summaries on that wiki.)
I'm thinking it's probably best for the script to ask "Are you sure
(y/n)?" when the user doesn't supply any page IDs.
The script should also probably have a parameter to allow the page titles
to be read from a text file, kind of like what deleteBatch.php offers
(https://www.mediawiki.org/wiki/Manual:DeleteBatch.php). That would
probably mitigate the potential for screw-ups; a sketch of such a
parameter follows.
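A sketch of that file parameter, modeled loosely on deleteBatch.php (again,
the --file option name and surrounding code are hypothetical, not part of
the current script):

# Sketch: read page titles from a file given via --file and turn them
# into page IDs for the deletion loop.
$fileName = $this->getOption( 'file' );
if ( $fileName !== null ) {
	$lines = file( $fileName, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES );
	foreach ( $lines as $line ) {
		$title = Title::newFromText( $line );
		if ( $title && $title->exists() ) {
			$pageIds[] = $title->getArticleID();
		}
	}
}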
On my wiki, I'm working on an architecture document [1] that explains how
best to serve MediaWiki in a scalable, performant way. One goal of this
paper is to update the information at mw:Manual:MediaWiki architecture [2]
so that people everywhere can learn how best to implement MediaWiki.
What architecture and specific techniques or technologies do YOU use for
MediaWiki?
Contributions, feedback and criticisms welcome.
[1] https://freephile.org/wiki/Architecture
[2] https://www.mediawiki.org/wiki/Manual:MediaWiki_architecture
Greg Rundlett
https://eQuality-Tech.com
https://freephile.org