Hi, I'm Tianhao Wang, from China.
I'll participate in GSoC this summer and help improve the
UploadWizard to be more friendly to book uploaders.
Here is my proposal:
https://www.mediawiki.org/wiki/User:Vvv214wth/UploadWizard
Any feedback or advice is welcome!
Hello,
I have drafted a proposal for my GSoC Project: jQuery.IME extensions for
Firefox and Chrome. I would love to hear what you think about it.
I would really appreciate any kind of feedback and suggestions. Please let
me know if I can improve it in any way.
My proposal can be found here:
http://www.mediawiki.org/wiki/User:Prageck/GSoC_2013_Application
Thanks,
Praveen Singh
Hi,
My name is Nazmul Chowdhury, and I would like to participate in GSoC 2013,
working on the UploadWizard: Book upload customization (potential mentor:
MarkTraceur). My proposal can be found here:
http://www.mediawiki.org/wiki/User:Rasel160. Any feedback is much
appreciated.
Thank You,
Nazmul Chowdhury
Because Faidon idly suggested that we should install VisualEditor on
wikitech as a way of dogfooding, I went ahead and did it.
https://wikitech.wikimedia.org/w/index.php?title=PowerDNS&diff=68284&oldid=…
is the first VE edit :)
Inside baseball note: this uses the Tampa Parsoid cluster, not the one
in eqiad, because the machine hosting wikitech (virt0) is in Tampa. If
and when things move to eqiad we can change that.
Roan
Hi all,
To resolve bug 41729, patch 49364 was recently merged. This means that
there's going to be a small design change for section edit links, and
probably more relevant for this list, the way the HTML for them is
generated will change. This is likely to break any custom gadgets,
scripts or styles, etc. You can already see the change on
MediaWiki.org and testwiki, if you're curious.
There's pretty comprehensive documentation about this change at:
https://meta.wikimedia.org/wiki/Change_to_section_edit_links
Thanks,
--
Steven Walling
https://wikimediafoundation.org/
This week we have a Tech Talk special edition:
Big Data Tools
by David Schoonover
May 1, 19:30 UTC / 12:30 PDT
Overview of the big data tools we have available at the Wikimedia
Foundation. We'll be writing real queries to explore real data from the
mobile site!
More details and timezones at
http://www.mediawiki.org/wiki/Meetings/2013-05-01
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
This is a notice that on Tuesday, April 30th between 20:00-21:00 UTC
(1-2pm PDT) Wikimedia Foundation will release security updates for
current and supported branches of the MediaWiki software. Downloads
and patches will be available at that time, with the git repositories
updated later that afternoon.
Hi guys,
I have an idea of my own for my GSoC project that I'd like to share with you.
It's not a perfect one, so please forgive any mistakes.
The project is related to the existing GSoC project *Incremental Data
dumps*, but is in no way a replacement for it.
*Offline Wikipedia*
For a long time, a lot of offline solutions for Wikipedia have sprung up on
the internet. All of these have been unofficial solutions with limitations.
A major problem is the *increasing size of the data dumps*, along with the
difficulty of *updating the local content*.
Consider the situation in a place where internet access is costly or
unavailable. (For the purpose of discussion, let's consider a school in a
third-world country.) Internet speeds are extremely slow, and accessing
Wikipedia directly from the web is out of the question.
Such a school would greatly benefit from an instance of Wikipedia on a
local server. Now up to here, the school can use any of the freely
available offline Wikipedia solutions to make a local instance. The problem
arises when the database in the local instance becomes obsolete. The client
is then required to download an entire new dump (approx. 10 GB in size) and
load it into the database.
Another problem is that most *third-party* programs *do not allow network
access*, so a new instance of the database (approx. 40 GB) is required for
each installation. For instance, in a school with around 50 desktops, each
desktop would require its own 40 GB database. Plus, *updating* them becomes
even more difficult.
So here's my *idea*:
Modify the existing MediaWiki software to add a few PHP/Python scripts
which will automatically update the database and run in the background.
(Details on how the update is done are described later.)
Initially, the modified MediaWiki will take an XML dump or SQL dump (SQL
dump preferred) as input and will create the local instance of Wikipedia.
Later on, the updates will be added to the database automatically by the
script.
The installation process is extremely easy: it just requires a server
package like XAMPP and the MediaWiki bundle.
Process of updating:
There will be two methods of updating the server. Both will be implemented
into the MediaWiki bundle. Method 2 requires the functionality of
incremental data dumps, so it can be completed only after the functionality
is available. Perhaps I can collaborate with the student selected for
incremental data dumps.
Method 1 (online update): A list of all pages is made and published by
Wikipedia. This can be in an XML format. The only information in the XML
file will be the page IDs and the last-touched date. This file will be
downloaded by the MediaWiki bundle, and the page IDs will be compared with
the pages of the existing local database.
Case 1: A new page ID in the XML file denotes a new page added.
Case 2: A page which is present in the local database is not among the page
IDs; this denotes a deleted page.
Case 3: A page in the local database has a different 'last touched'
compared to the one in the XML file; this denotes an edited page.
In each case, the change is made in the local database, and if the new page
data is required, the data is obtained using the MediaWiki API.
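The three cases above can be sketched as a simple diff between the published
index and the local database. This is only an illustrative sketch: the XML
element and attribute names (`page`, `id`, `touched`) are hypothetical, since
no such index format exists yet, and the function names are my own.

```python
# Sketch of the Method 1 comparison. The XML index format below is
# hypothetical; the real published index might look different.
import xml.etree.ElementTree as ET

SAMPLE_INDEX = """
<pages>
  <page id="1" touched="2013-04-28T10:00:00Z"/>
  <page id="2" touched="2013-04-29T12:30:00Z"/>
  <page id="4" touched="2013-04-30T09:15:00Z"/>
</pages>
"""

def diff_page_index(index_xml, local_pages):
    """Compare the published page index against the local database.

    local_pages maps page ID -> last-touched timestamp.
    Returns (new, deleted, edited) as sets of page IDs.
    """
    remote = {
        int(p.get("id")): p.get("touched")
        for p in ET.fromstring(index_xml).iter("page")
    }
    new = set(remote) - set(local_pages)                   # case 1
    deleted = set(local_pages) - set(remote)               # case 2
    edited = {pid for pid in set(remote) & set(local_pages)
              if remote[pid] != local_pages[pid]}          # case 3
    return new, deleted, edited

# Local database knows pages 1-3; page 2 has an older timestamp.
local = {
    1: "2013-04-28T10:00:00Z",
    2: "2013-04-20T08:00:00Z",
    3: "2013-04-25T11:45:00Z",
}
new, deleted, edited = diff_page_index(SAMPLE_INDEX, local)
print(new, deleted, edited)  # {4} {3} {2}
```

New and edited pages could then be fetched over the MediaWiki API (e.g. an
`action=query` request with `prop=revisions` for the affected page IDs),
while deleted pages are simply dropped from the local database.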
These offline instances of Wikipedia will only be used in cases where
internet speeds are very low, so they *won't cause much load on the
servers*.
Method 2 (offline update; requires the functionality of the existing
project "Incremental data dumps"):
In this case, the incremental data dumps are downloaded by the user (admin)
and fed to the MediaWiki installation the same way the original dump is fed
(as a normal file), and the corresponding changes are made by the bundle.
Since I'm not aware of the XML format used in incremental updates, I cannot
describe it now.
Advantages: An offline solution can be provided for regions where internet
access is a scarce resource. This would greatly benefit developing nations
and would help in making the world's information more free and openly
available to everyone.
All comments are welcome!
PS, about me: I'm a 2nd-year undergraduate student at the Indian Institute
of Technology, Patna. I code for fun.
Languages: C/C++, Python, PHP, etc.
Hobbies: CUDA programming, robotics, etc.
--
Kiran Mathew Koshy
Electrical Engineering,
IIT Patna,
Patna
Hi,
My idea for contributing to VisualEditor is in a very naive state, and I
don't know whether it will be in a full-featured state before this year's
GSoC deadline. Sorry for being late.
--
At present, we are focused on giving users a real-time, true 'visual'
experience of editing on Wikipedia. It is also necessary to make sure that
the editing speed and flexibility offered by the tools is maintained. If it
takes too much time to perform an edit, structure a paragraph, or switch
between tools, it would eventually hamper the editor's interest, especially
when writing long articles.
I had previously worked on a Vim-style text area which enabled users to
edit text in Vim mode (as in the popular text editor), thereby increasing
the speed of editing and maintaining the text. I was hoping for a similar
integration with VisualEditor. I hope a major improvement can be made to
this idea, one which can offer an enhancement to regular contributors of
Wikipedia. Also, if there is anything extra (and I believe there is) that
can be added to this very naive idea to create something universal and
better, or if there already exists an extension upon which it can be built,
please let us know.
Thanks!
--
Vivek Rai
B.Tech First Year Undergraduate
IIT Kharagpur
Contact - 8013291569
Hi
I am Harsh Kothari, an engineering student at L.D. College. I have been an
open source contributor for the last 2 years and have been contributing to
MediaWiki for the last 8 months. I am passionate about coding and enjoy it
too. I have also been participating in various open source events in
Gujarat as well as outside it.
I am submitting a proposal on the Language Coverage Matrix dashboard as my
GSoC 2013 project. It would help automate the information about language
support provided by the Language Engineering team, e.g. key maps, web
fonts, translation, the language selector, and i18n support for gender,
plurals, and grammar rules. The LCM would display this information as well
as provide visualization graphs of language coverage using various search
criteria such as tools or languages. It will be of tremendous use to the
Language Engineering Team.
The link below elaborates more on my proposal:
http://www.mediawiki.org/wiki/User:Harsh4101991/GSoC_2013
Bug filed on Bugzilla
https://bugzilla.wikimedia.org/show_bug.cgi?id=46651
Looking forward to suggestions and inputs.
Thanks
Harsh
--
Harsh Kothari
Engineering Trainee,
Physical Research Laboratory.
Follow Me : harshkothari410 <https://twitter.com/harshkothari410/>