Dear GSoC & OPW candidates,
The deadline for submissions is approaching fast. Here are some
tips to increase your chances:
* Draft your proposal publicly on your mediawiki.org user page. We
won't accept any proposal missing this basic requirement.
* Submit your draft proposal as soon as possible. Don't wait until the
last day. You can keep editing it until the deadline.
* Your submission must include links to your wiki page and the related
Bugzilla report.
* All technical discussion must happen on your wiki page or in the
related Bugzilla report; use private channels only for private questions.
* Show us your work. Prove your skills. Think of ways to earn our trust.
A Wikimedia / MediaWiki contribution is very tangible proof.
* Getting two co-mentors is better than getting one. We do our best to
find mentors, but your initiative in finding your own mentors is appreciated.
* GSoC students: keep your entry at
https://www.mediawiki.org/wiki/Summer_of_Code_2013#Students up to date.
(List of OPW candidates coming soon to
https://www.mediawiki.org/wiki/Outreach_Program_for_Women )
Best wishes to all candidates!
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi guys,
I've had an idea for a GSOC project that doesn't appear in the project
list. It's something I've been considering for a while, but before I write
a full, proper proposal with timelines and technical outlines, I want to
make sure this isn't just a random idea I'm alone in wanting to see happen :)
In general, my idea is to produce a physics-related image rendering engine,
similar to LaTeX, but for visual physics demonstrations. So, while LaTeX
takes text-based 'code' and renders it into an image of the equation, my
idea would take text-based 'code' and render it into a visual
representation of the equation.
The easiest idea to demonstrate is something like 'projectile motion' that
would produce a simple graph of the motion -- but this can also be used to
represent rotational movement with vector representations, collisions, or
electricity/magnetism images.
It is meant mostly for educators, tutors, schools and physics-related
articles, but it can be used by anyone who wants to produce an image
representation of an equation without editing that image in separate
software.
This can become a huge project eventually, if we include advanced physics
and many other features, but it can start out as a basic "physics 101"
image rendering engine with select subjects, and perhaps grow from there.
In terms of how to do this technically, I was thinking of using either
PHP imaging techniques (JPEG or SVG output) or JavaScript libraries like
Raphaël. The images are relatively basic (circles, arrows, boxes, etc.).
The main challenge would be to produce something that's easy to use and yet
robust enough to be useful -- as well as flexible so we can add more
equations/subjects to it in the future.
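To make the "text in, image out" pipeline concrete, here is a rough sketch in plain JavaScript (usable from Node or a browser; all function and parameter names are made up for illustration, not a proposed API). It samples ideal projectile motion and emits a minimal SVG polyline:

```javascript
// Sample (x, y) positions of ideal projectile motion until the
// projectile lands back at y = 0. Air resistance is ignored.
function projectilePoints(v0, angleDeg, g = 9.81, steps = 20) {
  const a = (angleDeg * Math.PI) / 180;
  const vx = v0 * Math.cos(a);
  const vy = v0 * Math.sin(a);
  const tFlight = (2 * vy) / g; // total time in the air
  const pts = [];
  for (let i = 0; i <= steps; i++) {
    const t = (i * tFlight) / steps;
    pts.push([vx * t, vy * t - 0.5 * g * t * t]);
  }
  return pts;
}

// Scale the trajectory into the viewport and render it as a
// bare-bones SVG polyline (SVG's y axis grows downward, so flip it).
function pointsToSvg(pts, width = 400, height = 200) {
  const maxX = Math.max(...pts.map(p => p[0])) || 1;
  const maxY = Math.max(...pts.map(p => p[1])) || 1;
  const path = pts
    .map(([x, y]) =>
      `${((x / maxX) * width).toFixed(1)},` +
      `${(height - (y / maxY) * height).toFixed(1)}`)
    .join(' ');
  return `<svg xmlns="http://www.w3.org/2000/svg" width="${width}" ` +
         `height="${height}"><polyline points="${path}" fill="none" ` +
         `stroke="black"/></svg>`;
}

const svg = pointsToSvg(projectilePoints(20, 45));
```

The same structure (parse a small spec, compute points, emit primitives) should carry over to vectors, collisions, and field diagrams.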
A rough draft of the idea with a brief summary is here:
http://www.mediawiki.org/wiki/User:Mooeypoo/GSOC_2013_Project
Please let me know what you think! Do you think it's feasible? Am I getting
myself in too deep? Is it something you would like to see or am I the only
one who'd use something like this?
Thank you very much for any feedback!
Moriel
(aka mooeypoo)
--
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly
inconvenienced during its transmission!
I'm seeing a few reports that people are running into trouble running
MediaWiki on Apache 2.4.
The reports seem focused on XAMPP, but I tried to run MW against the
Apache 2.4 package on Debian and didn't succeed. Granted, I tried only
for 5 minutes or so.
Is this something we should document?
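For what it's worth, one common breakage when moving existing installs to Apache 2.4 is the changed access control syntax in mod_authz_core; old 2.2-style directives fail unless mod_access_compat is loaded. The directory path below is just an example, not a claim about any particular setup:

```apache
# Apache 2.2 style -- no longer works in 2.4 unless
# mod_access_compat is loaded:
<Directory /var/www/mediawiki>
    Order allow,deny
    Allow from all
</Directory>

# Apache 2.4 equivalent:
<Directory /var/www/mediawiki>
    Require all granted
</Directory>
```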
--
http://hexmode.com/
Imagination does not breed insanity. Exactly what does breed insanity
is reason. Poets do not go mad; but chess-players do.
-- G.K. Chesterton
Anybody know if we have the REST API for Bugzilla enabled - and if so what
the URL is for it?
I came across some old docs on wikitech.wikimedia.org that suggest we do
[1] - but no indication of the URL for the service. I know I can use the
XML-RPC API for Bugzilla, but would rather use REST if it's available. Any
insight is appreciated!
[1] https://wikitech.wikimedia.org/wiki/Bugzilla_REST_API
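For reference, here is roughly what using such an endpoint would look like; note that the base URL below is deliberately a placeholder, since the real service URL is exactly the open question above:

```javascript
// Placeholder base URL -- which endpoint (if any) Wikimedia exposes
// is the question being asked here.
const BZAPI_BASE = 'https://bugzilla.example.org/rest';

// Build the BzAPI-style URL for a single bug, e.g. .../bug/12345.
function bugUrl(bugId, base = BZAPI_BASE) {
  return `${base}/bug/${Number(bugId)}`;
}
```

Fetching that URL would then be an ordinary JSON GET, rather than constructing and parsing XML-RPC envelopes.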
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
Hello,
I would like to work on the MediaWiki project.
I studied Computer Science, and I am currently a CRM analyst and
developer.
My skills are: PHP, JSP, Java, JavaScript, jQuery, REST, MySQL, etc.
Thank you
Hi folks,
I'd like to start a broader conversation about language support in MW
core, and the potential need to re-think some pretty fundamental
design decisions in MediaWiki if we want to move past the point of
diminishing returns in some language-related improvements.
In a nutshell, is it time to make MW aware of multiple content
languages in a single wiki? If so, how would we go about it?
Hypothesis: Because support for multiple languages existing in a
single wiki is mostly handled through JS hacks, templates, and manual
markup added to the content (such as <div>s indicating language
direction), we are providing an opaque, confusing and often
inconsistent user experience in our multilingual wikis, which is a
major impediment for growth of non-English content in those wikis, and
participation by contributors who are not English speakers.
Categories have long been called out as one of the biggest factors,
and they certainly are (since Commons categories are largely in
English, they are by definition excluding folks who don't speak the
language), but I'd like to focus on the non-category parts of the
problem for the purposes of this conversation.
Support for the hypothesis (please correct misconceptions or errors):
1) There's no consistent method by which multiple language editions of
the same page are surfaced for selection by the user. Different wikis
use different templates (often multiple variants and layouts in a
single wiki), different positioning, different rules, etc., leading to
inconsistent user experience. Consistency is offered by language
headers generated by the Translate extension, but these are used for
managing translations, while multilingual content existing in the same
wiki may often not take the form of 1:1 translations.
Moreover, language headers have to be manually updated and maintained;
consider the user-friendliness of something like the +/- link in the
language header on a page like
https://commons.wikimedia.org/wiki/Commons:Kooperationen
which leads to:
https://commons.wikimedia.org/w/index.php?title=Template:Lang-Partnerships&…
Chances are that many people who could provide a version (not
necessarily a translation) of the page in a given language will give up
even on doing so correctly.
2) There's no consistent method by which page name conflicts (which
may often occur in similar languages) are resolved, and users have to
manually disambiguate.
3) There are basic UX issues in the language selection tools offered
today. For example, after changing the language on Commons to German,
I will see the page I'm on (say English) with a German user interface,
even if there's an actual German content version of the page
available. This is because these language selection tools have no
awareness of the existence of content in relevant languages.
4) In order to ensure that content is rendered correctly irrespective
of the UI language set, we require content authors to manually add
<div>s around RTL content, even if that's all the page contains.
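For illustration (this is the kind of manual markup meant above, not a
prescribed template), an otherwise all-Arabic page currently needs
something like:

```html
<div lang="ar" dir="rtl">
...the page's Arabic content...
</div>
```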
5) It's impossible to restrict searches to a specific language. It's
impossible to restrict recent changes and similar tools to a specific
language.
I'll stop there - I'm sure you can think of other issues with the
current approach. For third party users, the effort of replicating
something like the semi-acceptable Commons or Meta user experience is
pretty significant, as well, due to the large number of templates and
local hacks employed.
This is a very tricky set of architectural issues to solve well, and
it would be easy to make the user experience worse by solving it
poorly. Still, as we grow our bench strength to take on hard problems,
I want to raise the temperature of this problem a bit again,
especially from the standpoint of future platform engineering
improvements.
Would it make sense to add a language property to pages, so it can be
used to solve a lot of the above issues, and provide appropriate and
consistent user experience built on them? (Keeping in mind that some
pages would be multilingual and would need to be identified as such.)
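As a purely hypothetical sketch of what such a property might look like
at the storage level (column name, type, and NULL semantics are all
assumptions on my part, not an agreed design):

```sql
-- Hypothetical: a nullable language code on the page table.
-- NULL would mean "inherit the wiki's content language";
-- multilingual pages would need a separate convention.
ALTER TABLE page
  ADD COLUMN page_lang varbinary(35) DEFAULT NULL;

CREATE INDEX page_lang ON page (page_lang);
```

With something like this in place, search, recent changes, and language
selection tools could filter or join on the page's language directly.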
If so, this seems like a major architectural undertaking that should
only be taken on as a partnership between domain experts (site and
platform architecture, language engineering, Visual Editor/Parsoid,
etc.).
I'm not suggesting this should be done in the very near term, but I'd
like to at least start talking about it, hear if I'm completely off
base (and if there are simpler ways to improve on current state), and
explore where it could fit in our longer term agenda.
Relevant existing code:
* https://www.mediawiki.org/wiki/Extension:Translate - awesome for
page and message translation, but I'm not clear that it can help for
the other multilingual content scenarios and problems
* Others: https://www.mediawiki.org/wiki/Category:Internationalization_extensions
Thanks,
Erik
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Wikipedia and our other projects reach more than 500 million people every
month. The world population is estimated to be >7 billion. Still a long
way to go. Support us. Join us. Share: https://wikimediafoundation.org/
Hello,
Welcome to your weekly Roadmap highlights email. This documents the
changing present and planned future of the work that the WMF Engineering
teams are doing.
== Highlights from the Roadmap updates ==
(As this is the last update to the roadmap for the month, there aren't
very many updates)
* Full details at:
https://docs.google.com/a/wikimedia.org/spreadsheet/ccc?key=0Aoizbfxc5g6KdE…
** shorturl: http://goo.gl/7611Q
=== Ops ===
Ceph deployment is planned for Monday (April 29th). This will enable
multi-write (writing to our current NFS '''as well as''' the Ceph-based
filesystem) while still reading only from NFS.
This is part of the Multimedia work:
http://www.mediawiki.org/wiki/Multimedia
=== Analytics ===
The team mostly worked on mobile in April; the initial Kraken base
cluster deployment has moved to May.
http://www.mediawiki.org/wiki/Analytics/Infrastructure
=== Features ===
Echo is deploying to English Wikipedia this month, just under the
wire on Tuesday, April 30th.
http://www.mediawiki.org/wiki/Echo_%28Notifications%29
EventLogging for UserLogin and AccountCreation is now in MW core, but it
is only accessible/in use when a specific URL parameter is passed.
http://www.mediawiki.org/wiki/Event_logging
=== Platform ===
The platform team will be partnering with the i18n team on the Chinese
category collation work, as it requires code review, UX work, etc. This
is expected in June.
See: https://bugzilla.wikimedia.org/show_bug.cgi?id=44667
=== i18n ===
Planning to deploy Universal Language Selector (ULS) in May.
http://www.mediawiki.org/wiki/Universal_Language_Selector
== Closing ==
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |