In http://www.mediawiki.org/wiki/Special:Code/MediaWiki/41710 Poem Extension
is integrated with MediaWiki. Yay!
However, this was done without mentioning either me or Steve Sanbeg (who did
more work on it than me) anywhere. Worse, whoever this Nathaniel Herman is,
he did not forget to sign himself in both CREDITS and RELEASE-NOTES, so it
appears that he is the author, when in fact he has only adapted the
extension to the core.
This is pure plagiarism, and I am asking to see it rectified by adding me and
Steve to MediaWiki's CREDITS and RELEASE-NOTES as appropriate.
---------- Forwarded message ----------
From: raghu0891 <raghu0891(a)gmail.com>
Date: 2008/11/13
Subject: [WikiEN-l] How to get a static English HTML dump from Wikipedia
To: WikiEN-l(a)lists.wikimedia.org
I downloaded the whole English Wikipedia from this link
http://static.wikipedia.org/downloads/2008-06/en/wikipedia-en-html.tar.7z.
When I tried to unzip the file using 7-Zip, it gave a 'File is Broken'
error.
Any suggestion, or an alternate method to get the English static HTML
Wikipedia dump, would be appreciated.
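A 'File is Broken' error from 7-Zip usually means the download itself was corrupted in transit, so before trying other extractors it is worth comparing the file's checksum against the one published alongside the dump (assuming one is published; `7z t archive.7z` also tests archive integrity). A minimal sketch of a streaming MD5 check, demonstrated on a tiny stand-in file since the real dump is tens of gigabytes:

```python
import hashlib

def file_md5(path, chunk_size=1 << 20):
    """MD5 of a file, read in 1 MB chunks so huge dumps don't fill memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a small stand-in file (substitute the real dump's path):
with open("sample.bin", "wb") as f:
    f.write(b"hello")
print(file_md5("sample.bin"))  # -> 5d41402abc4b2a76b9719d911017c592
```

If the digest does not match the published one, re-downloading (ideally with a resumable client) is the fix, not a different unzip tool.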
Thanking you in anticipation,
D.Raghuram.
--
View this message in context:
http://www.nabble.com/How-to-get-a-static-English-HTML-dump-from-Wikipedia-…
Sent from the English Wikipedia mailing list archive at Nabble.com.
_______________________________________________
WikiEN-l mailing list
WikiEN-l(a)lists.wikimedia.org
To unsubscribe from this mailing list, visit:
https://lists.wikimedia.org/mailman/listinfo/wikien-l
http://news.slashdot.org/article.pl?sid=08/11/04/136220&from=rss
<<<<<<<<<<<<<<<
The Xiph.Org Foundation announced Monday the release of Theora 1.0.
Theora is a free/open source video codec with a small CPU footprint
that offers easy portability and requires no patent royalties.
Upcoming versions of Firefox and Opera will natively play Ogg/Theora
videos with the new HTML5 element <video src="file.ogv"></video>, and
ffmpeg2theora offers an easy way to create content. Theora developers
are already working on a 1.1 encoder that offers better
quality/bitrate ratio, while producing streams backward-compatible
with the current decoder.
>>>>>>>>>>>>>>>
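To make the quoted announcement concrete, here is a minimal sketch of a page that embeds a Theora clip with the new HTML5 element; the filename `clip.ogv` is an example (you would first produce it with something like `ffmpeg2theora input.avi -o clip.ogv`):

```python
# Write a minimal HTML5 player page for an Ogg/Theora clip.
# "clip.ogv" is a placeholder filename, not a real file.
page = """<!DOCTYPE html>
<html>
  <body>
    <video src="clip.ogv" controls>
      Your browser does not support the video element.
    </video>
  </body>
</html>
"""
with open("player.html", "w") as f:
    f.write(page)
print(page)
```

Browsers without native Theora support fall back to the text inside the element, which is why the fallback line is included.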
--
End of message.
Hello!
You are receiving this email because your project has been selected to
take part in a new effort by the PHP QA Team to make sure that your
project still works with to-be-released PHP versions. With this we
hope to make sure that you are either aware of things that might
break, or to make sure we don't introduce any strange regressions.
With this effort we hope to build a better relationship between the
PHP Team and the major projects.
If you do not want to receive these heads-up emails, please reply to
me personally and I will remove you from the list; but, we hope that
you want to actively help us make PHP a better and more stable tool.
The third & final release candidate of PHP 5.2.7 was just released and
can be downloaded from http://downloads.php.net/ilia/. Please try this
release candidate against your code and let us know should you find any
regressions. The goal is to release 5.2.7 by the end of next week, so
timely testing would be extremely helpful.
In case you think that other projects should also receive this kind of
email, please let me know privately, and I will add them to the
list of projects to contact.
Best Regards,
Ilia Alshanetsky
5.2 Release Master
Hi all,
Usually, for code documentation, we've either grepped the code for the
method we wanted to look for, or used doxygen. I'm sure many of us
have found doxygen a little too clunky, with too many clicks to find
the required information.
For fun, I ran a perl script to parse our source code and pick out
function data. Then I wrote a simple web-interface that searches for
functions as you type, displaying a list of declaration lines, with a
[show] link to show the full code of the function.
I thought this might be useful to some people who constantly grep
through the code to find out the arguments to a function.
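The idea is simple enough to sketch. A hypothetical, much-simplified version of such an extractor (the real one is a Perl script; this Python sketch only handles single-line declarations and ignores comments) scans PHP source for `function` declarations and indexes them by name:

```python
import re

# Match optional visibility/static modifiers, then "function name(args)".
# Simplified: assumes the declaration fits on one line.
PHP_FUNC = re.compile(
    r'^\s*(?:(?:public|protected|private|static)\s+)*'
    r'function\s+(\w+)\s*\(([^)]*)\)',
    re.MULTILINE,
)

def extract_functions(source):
    """Return {function name: argument list} for declarations in source."""
    return {m.group(1): m.group(2).strip() for m in PHP_FUNC.finditer(source)}

sample = """<?php
class Title {
    public static function newFromText( $text, $defaultNamespace = NS_MAIN ) {
    }
    function getPrefixedText() {
    }
}
"""
print(sorted(extract_functions(sample)))  # -> ['getPrefixedText', 'newFromText']
```

A real index would also record the file and line of each declaration so the web interface can show the full function body on demand.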
The site is available at:
<http://mwref.werdn.us>
The source code for the Perl script, and the website, is available in
the 'mwref' directory on my git repository. Patches are most welcome!
<http://gitweb.werdn.us/?p=scripts/.git;a=summary> (web interface)
<git://werdn.us/scripts/> (git repository)
A database dump of the extracted function data is available to
download (~1 MB) from my server:
<http://werdn.us/~andrew/mw_reference.sql.gz>
Enjoy!
--
Andrew Garrett
Hi,
I was thinking last night that it might be cool if there was a kind of
"template repository" for people to copy templates from for their own
MediaWiki. Copying them from Wikipedia etc., while possible, tends to be
very nasty and/or ugly due to extreme template nesting.
Actually, with the Help: pages on mediawiki.org, I had an idea that
these would eventually ship by default with MediaWiki. Is that still
the plan? I think it's a good idea - plenty of people may never come
back to mediawiki.org after installing, and it's just so convenient to
already have those pages in your wiki. And the special:export
instructions are ok but kind of a pain (you have to upload the images
yourself? oh and enable SVGs? joy).
Anyway, would it be OK to set up a PD template repository on
mediawiki.org, and if so, what method(s) could be used for
distributing it?
1- shipped with MW by default
2- extension????
3- special:export
Could an extension avoid the need for manually uploading images? (Or,
is there any prospect that files will work properly with
special:export?)
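For the Special:Export route, at least fetching the wikitext can be scripted: the export page serves an XML dump of a page at a predictable URL. A small sketch that builds such a URL (the host and template name are examples, and this does not solve the image-upload problem):

```python
from urllib.parse import quote

def export_url(host, page):
    """Build the Special:Export URL for a page on a MediaWiki host."""
    # MediaWiki page titles use underscores instead of spaces; keep the
    # namespace colon unescaped.
    return "https://%s/wiki/Special:Export/%s" % (
        host, quote(page.replace(" ", "_"), safe=":"))

print(export_url("www.mediawiki.org", "Template:Tl"))
# -> https://www.mediawiki.org/wiki/Special:Export/Template:Tl
```

The resulting XML can then be fed to Special:Import (or importDump.php) on the destination wiki; images still have to be transferred separately.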
thanks
Brianna
--
They've just been waiting in a mountain for the right moment:
http://modernthings.org/
Hi,
Is it possible to modify the page title section without affecting the page
name itself? For example, having a tag whose content is placed above the
page title (below the tabs), or in front of it as an explanation, e.g. a page with
the name of "Wikipedia" but with the title "Wikipedia: the
free<http://en.wikipedia.org/wiki/Free_content>
encyclopedia <http://en.wikipedia.org/wiki/Encyclopedia>" (may contain
internal links).
If it is possible, how can it be done?
Best regards,
--
__ \ /_\\_-//_ Mohsen A. Momeni