Hello,
I am summarizing here my proposal idea for OPW MediaWiki projects.
Objective: To test MediaWiki/Wikipedia websites across different browsers, making sure various website features are supported on different browser platforms. This is mainly functionality testing of website features. When a new feature is implemented, regression testing will be done to make sure existing features work fine. Once the tests pass, it can be confirmed that the tested feature supports the particular browsers.
Scope: To do functionality testing of selected features (e.g. the Print/Export PDF option) of the website on selected browsers (e.g. IE, FF, Chrome). This has nothing to do with performance testing or load testing of the website.
Plan:
* To select the features to be tested, and then select the browsers on which testing will be done
* To prepare test cases covering all functionality of the selected feature
* To prepare multiple test scripts for each of these test cases, covering different scenarios (negative/positive, etc.)
* To prepare test data covering various sets of data, e.g. edge conditions
* To execute the test scripts and analyse the results
* For failed results, bugs will be logged; the rest will be marked as Passed
* Bug lifecycle will be tracked and once the fix is available it will be retested.
* Test scripts will be reused for regression testing.
* Test scripts will be updated now and then to include new changes.
* New test scripts can be added to cover different scenarios of a bug fix.
* A test report will be presented summarizing the findings and highlighting the open defects.
Tools: A test automation tool will be used, either the Selenium suite or Cucumber. Bugzilla will be used to log defects. Test cases will be maintained in a spreadsheet or any version-controlled tool. The test report will be prepared in graphical format; any open source tool can be used.
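The selection steps above boil down to crossing every selected feature with every selected browser, so each combination becomes one scripted test case. A minimal sketch of building that matrix, assuming hypothetical feature and browser names (the actual selections would come from the first planning step):

```python
from itertools import product

# Hypothetical selections; the real lists would come from the planning step.
BROWSERS = ["firefox", "chrome", "ie"]
FEATURES = ["print_pdf", "export_pdf", "search"]

def build_test_matrix(browsers, features):
    """Cross every selected feature with every selected browser,
    so each combination becomes one test case to script and execute."""
    return [
        {"id": f"{feature}--{browser}", "feature": feature, "browser": browser}
        for feature, browser in product(features, browsers)
    ]

matrix = build_test_matrix(BROWSERS, FEATURES)
print(len(matrix))  # 3 features x 3 browsers = 9 test cases
```

Each entry in the matrix would then map to one Selenium (or Cucumber) script, which keeps the regression suite easy to extend when a new feature or browser is added.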
Thanks and Regards,
Indrani Sen
Msc Software Engineering
Queen Mary, University of London
London, United Kingdom
Hey,
I was wondering what the qualifications are for getting a Bugzilla section
for a MediaWiki extension, and how it can be set up? (It's difficult having
to memorize or write down on notepads what things need to be fixed, and I
don't want to have to set up my own Bugzilla instance.)
--
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerromeo(a)gmail.com
Hi Siebrand, moving your feedback about process to the list.
On 05/06/2013 01:12 PM, Siebrand Mazeland (WMF) wrote:
> Here is my ranking
(snip, thank you!)
> I would like to provide some feedback, too: The whole process of GSoC
> was very confusing to me. Students communicated on melange,
> mediawiki.org <http://mediawiki.org>, and mailing lists. Some also
> emailed me and others privately. This scattered communication made me
> feel I was not able to properly inform myself of the feedback cycles a
> proposal went through. Melange not having any capabilities to show
> differences between versions of proposals does not help - that's
> unfortunately not something we can directly influence. I hope that the
> number of communication platforms for GSoC communication and
> documentation can be reduced in the next iteration, to make the process
> easier to follow to those that are supposed to comment on, evaluate and
> rank the proposals.
Yes, I agree. If it was confusing for you, we can imagine how confusing
it has been for many students. In the next iteration we will still have
the same community channels + imperfect Google Melange, but we can do
better at focusing the discussion.
> I was very able and willing to follow all of your instructions from the
> below email and the ones on linked page, until I truly understood what
> following them would mean for me time wise. If I understand your request
> well, you are basically asking us to read all proposals, take 15
> criteria and all comments into account, and then rate all *subjects*,
> not the individual proposals. I estimate this is about a day of work to
> do well. I'm sorry, but this is too much effort for me with this short
> notice. I've done the best I can with the time available to me.
Mmm well, no. And I'm glad you invested maybe 1 hour instead of one day.
Saying "the project I mentor should go first!" is easy. I'm asking
mentors to tell us which proposals they think should be considered before
the ones they are willing to mentor. We have a pool of 38 smart people
directly involved in Wikimedia GSoC mentoring and I believe your personal
rankings will directly answer most of the questions.
Of course you could spend a whole day assessing each proposal in detail.
I believe you can go through the list pretty fast, though,
double-checking a few proposals that sound interesting but you are not
familiar with. This quicker method has more room for personal mistakes,
but if there are 37 other people playing the game I bet the consolidated
list cannot go too wrong.
Thank you for the participation and the feedback! We are trying many
things here as we go.
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi folks,
Short version: This mail is fishing for feedback on proposed work on
Gerrit-Bugzilla integration to replace code review tags.
Long version:
One feature of our old code review system was a tagging system
that made it quick and easy to assign a keyword to a revision at any
time. There were a number of uses we had for the system, which are
documented here:
http://www.mediawiki.org/wiki/Subversion/Code_review/tags
Examples of tags that we miss:
scaptrap - this change requires special care at deploy time. It may
be that it requires two components to be in sync that aren't generally
deployed in sync, or it requires a configuration change, or something
else.
fixme - this change introduces a bug (weirdly, not in our old
documentation - go figure). In Gerrit, this would be a -1 or -2 if
the code hasn't been merged yet, but post-merge, there's no uniform
way to attach that metadata.
backcompat - backwards compatibility breakers.
This has been a frequently requested feature of the upstream Gerrit
developers. However, they've been reluctant to implement such a
feature until after some unscheduled major architectural work is
completed, so we shouldn't hold our breath waiting for that.
With that in mind, we have bug 38239, assigned to Chad:
https://bugzilla.wikimedia.org/38239
Chad worked up a hacky version of tagging as a MediaWiki extension
earlier this week, which would have kept the tags in MediaWiki instead
of Gerrit. That's only one half of what might be an acceptable
solution, since the other half would need to be some JavaScript in the
Gerrit user interface to allow for insertion into the MediaWiki tag
database. I discouraged Chad from continuing on this because it
seemed to me that it would have been a little *too* hacky. I
preferred that if we were going to have our own hacky solution, it
should at least be implemented as a Gerrit plugin, so that it would at
least stand a chance of becoming a well-integrated solution.
The problem with tagging generally, though, is that it ended up being
this weird parallel workflow to other systems. "fixme" was frequently
used as a substitute for filing a bug report. "scaptrap" was a
substitute for proper deployment notes, and "backcompat" was a
solution for proper developer notes. That said, it was lightweight,
which meant that it actually got used, as opposed to many "proper"
solutions, which are frequently enough work that it's difficult to
expect uniform followthrough.
The solution that Chad and I discussed is an addition to the
Bugzilla-Gerrit plugin that Christian is already working on. The idea
would be that, for any given revision, there would be a "file bug
about this revision" link. Following that link would throw to the
standard Bugzilla bug filing page, with as many fields prepopulated
based on Gerrit context as could sensibly be filled (including, at the
very minimum, a link back to the Gerrit rev, but probably also with
the assignee set to the developer who introduced the issue and the
component set based on the repo).
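The prepopulation described above could be sketched as a link builder that fills standard Bugzilla enter_bug.cgi query fields from the Gerrit context. This is only an illustrative sketch: the helper name, the product/component values, and the example change URL are assumptions, not part of the actual plugin design.

```python
from urllib.parse import urlencode

BUGZILLA = "https://bugzilla.wikimedia.org/enter_bug.cgi"

def file_bug_url(change_url, component, owner_email):
    """Build an enter_bug.cgi link with fields prepopulated from
    Gerrit context: a back-link to the revision in the summary and
    comment, the component derived from the repo, and the change
    owner as assignee."""
    params = {
        "product": "MediaWiki",  # assumed product for this sketch
        "component": component,
        "short_desc": f"Follow-up to {change_url}",
        "comment": f"Issue introduced by {change_url}",
        "assigned_to": owner_email,
    }
    return BUGZILLA + "?" + urlencode(params)

url = file_bug_url("https://gerrit.wikimedia.org/r/12345",
                   "General", "dev@example.org")
print(url)
```

The "file bug about this revision" link in the Gerrit UI would then simply point at such a URL, so the reporter lands on the standard bug filing page with the fields already filled in.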
A Bugzilla-based solution would be an ideal replacement for "fixme",
since fixmes are basically bugs anyway. It would work reasonably well
for "scaptrap", since they generally imply something that needs to be
done prior to deployment. It would be an awkward replacement for
"backcompat" and others.
Still, the nice thing about a Bugzilla-based solution is
that it's general-purpose enough that it may very well find use
outside of Wikimedia-land. The BZ-Gerrit work is actually being done
as part of a larger issue tracker plugin that the GerritForge folks
have written to support Jira. Filing issues based on revisions is
likely a common request for people integrating Gerrit with their issue
tracking system.
Is a Bugzilla-based solution worthwhile enough for our purposes for a
modest (but probably not insignificant) investment in this area, or
should we prioritize other Gerrit work higher (say, for example, a
native Gerrit tagging plugin)? Assuming we move forward with
development, anything we need to consider?
I've put the bulk of this email on mediawiki.org here:
http://www.mediawiki.org/wiki/Git/Tagging
We should evolve that page into a spec for the work that Chad and
Christian will be doing.
Rob
Hi,
The Google Summer of Code student application phase is closed.
We have got a total of 69 applications from 60 students. The numbers so
far are:
* 52 valid applications from 51 students. We are still triaging some of
the last submissions. This number might decrease over the weekend.
* 11 ignored from 10 students: out of scope, clearly incomplete or
duplicate.
* 6 withdrawn by the students.
The table at https://www.mediawiki.org/wiki/Summer_of_Code_2013 should
match the list in the GSoC site.
Next milestone: May 6
Telling Google the min/max number of slots we want before May 6. Last
year we got 9 slots and by all measurements we need a lot more this
time. Both students and mentors need to work hard to convince Google
that we deserve them!
EVERYBODY
Remember that all community members can provide feedback about features
and proposals:
http://lists.wikimedia.org/pipermail/wikitech-l/2013-May/069040.html
STUDENTS
Complete your data in the public Students table. Get feedback and
improve your proposal in your wiki page. Contribute!
https://www.mediawiki.org/wiki/How_to_contribute
MENTORS
* Fight for your slot! Remember that Google is the first who needs to be
convinced. Wish-to-mentor ALL candidates for the features you volunteer
to mentor. Don't worry yet about selecting candidates.
* Specify in the private comments whether you consider the projects you
mentor ESSENTIAL or DESIRABLE. We will use this input to ask Google
min/max slots. (Language, VisualEditor and Wikidata teams have already
sent their team decisions on max/min slots).
* Be aware of the selection process:
https://www.mediawiki.org/wiki/Mentorship_programs/Selection_process#Select…
- http://lists.wikimedia.org/pipermail/wikitech-l/2013-April/068913.html
I will submit the min/max numbers for Google by Sunday night PDT.
Questions? Just ask.
Enjoy your weekend!
PS: I won't chase students or mentors during the selection process. The
principle is: if I have to do it now then I'll have to do it even more
during the program. B)
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Call for Submissions: Community Track at WikiSym + OpenSym 2013, the Joint
International Symposium on Open Collaboration
WikiSym, the 9th International Symposium on Wikis and Open Collaboration
OpenSym, the 2013 International Symposium on Open Collaboration
August 5-7, 2013 | Hong Kong, China
http://opensym.org/wsos2013
In-cooperation with ACM SIGWEB and ACM SIGSOFT. Archived in the ACM Digital
Library.
Community track submission deadlines:
* Regular deadline: May 17, 2013
The 2013 Joint International Symposium on Open Collaboration (WikiSym +
OpenSym 2013) is the premier conference on open collaboration research and
practice, including wikis and social media, Wikipedia, free, libre, and open
source software, open access, open data and open government research. WikiSym
is in its 9th year and will be complemented by OpenSym, a new conference on
open collaboration research and practice and an adjunct to the successful
WikiSym conference series. WikiSym + OpenSym 2013 is the first conference to
bring together the different strands of open collaboration research and
practice, seeking to create synergies and inspire new collaborations between
computer scientists, social scientists, legal scholars, and everyone
interested in understanding open collaboration and how it is changing the
world. Read more about the conference at http://opensym.org/wsos2013/about
CALL FOR SUBMISSIONS: COMMUNITY TRACK
The following types of papers can be submitted to the community track:
* Experience report long and short: A regular presentation slot (30min) will
be provided
* Workshop proposals: A workshop slot (half-day or full-day) is provided at
the conference
* Panel proposals: A session (90min) discussion slot for the panel will be
provided
* Demo proposals: Space and time is provided during the demo session (90min)
* Tutorial proposals: A tutorial slot (90min) is provided at the conference
Submissions are reviewed by the community track committee for their interest
to the WikiSym + OpenSym community in general. For questions about community
track submissions, please don't hesitate to get in touch with us:
http://opensym.org/wsos2013/about
Experience Reports
Experience reports are an integral part of the conference program. These are
opportunities to discuss how ideas that sound good on paper (and at
conferences!) work in real life projects and deployments. Many attendees want
to learn from people on the front lines what it is like to do things like
start a company wiki, use open collaboration tools in a classroom, or build a
political campaign around open collaboration systems.
Experience reports are not research papers; their goal is to present
experience and reflections on a particular case, and they are reviewed for
usefulness, clarity and reflection. Strong experience reports discuss both
benefits and drawbacks of the approaches used and clearly call out lessons
learned. Reports may focus on a particular aspect of technology usage and
practice, or describe broad project experiences.
Experience reports can be long (up to 10 pages) or short (up to 4 pages). A
long experience report will receive a regular 30 minute presentation slot, a
short experience report will receive a shorter presentation slot.
Workshops
Workshops provide an opportunity for researchers and practitioners to discuss
and learn about topics that require in-depth, extended engagement such as new
systems, research methods, standards, and formats.
Workshop proposals should describe what you intend to do and how your session
will meet the criteria described above. It should include a concise abstract,
proposed time frame (half-day or full-day), what you plan to do during the
workshop, and one-paragraph biographies of all organizers.
Workshop proposals will be reviewed and selected for their interest to the
community. Each accepted workshop will be provided with a meeting room for
either a half or full day. Organizers may also request technology and
materials (projector, flip pads, etc).
Panels
Panels provide an interactive forum for bringing together people with
interesting points of view to discuss compelling issues around open
collaboration. Panels involve participation from both the panelists and
audience members in a lively discussion. Proposals for panels should describe
the topics and goals and explain how the panel will be organized and how the
Wikisym + OpenSym community will benefit. It should include a concise abstract
and one-paragraph biographies of panelists and moderators. Panel submissions
will be reviewed and selected for their interest to the community. Each panel
will be given a 90-minute time slot.
Demos
No format is better suited for demonstrating the utility of new collaboration
technologies than showing and using them. Demonstrations give presenters an
opportunity to show running systems and gather feedback. Demo submissions
should provide a setup for the demo, a specific description of what you plan
to demo, what you hope to get out of demoing, and how the audience will
benefit. A short note of any special technical requirements should be included.
Demo submissions will be reviewed based on their relevance to the community.
All accepted demos will be given space at a joint demo session (90 minutes)
during the conference.
Tutorials
Tutorials are half-day classes, taught by experts, designed to help
professionals rapidly come up to speed on a specific technology or
methodology. Tutorials can be lecture-oriented or participatory. Tutorial
attendees deserve the highest standard of excellence in tutorial preparation
and delivery. Tutorial presenters are typically experts in their chosen topic
and experienced speakers skilled in preparing and delivering educational
presentations. When selecting tutorials, we will consider the presenter's
knowledge of the proposed topic and past success at teaching it.
SUBMISSION INFORMATION AND INSTRUCTIONS
There are two submission deadlines, an early and a regular one. The early
deadline is for those who need to know early that their community track
submission has been accepted. This mostly applies to workshops that require a
program committee and their own paper submission and review process (as
opposed, for example, to walk-in workshops). Also, some may need the
additional time to raise funds and acquire a visa.
Submissions should follow the standard ACM SIG proceedings format. For advice
and templates, please see
http://www.acm.org/sigs/publications/proceedings-templates. All papers must
conform at time of submission to the formatting instructions and must not
exceed the page limits, including all text, references, appendices and
figures. All submissions must be in PDF format.
All papers and proposals should be submitted electronically through EasyChair
using the following URL:
https://www.easychair.org/conferences/?conf=opensym2013community
SUBMISSION AND NOTIFICATION DEADLINES
* Early submission deadline: March 17, 2013
* Notification for early submissions: March 31, 2013
* Regular submission deadline: May 17, 2013
* Notification for regular submissions: May 31, 2013
* Camera-ready for both rounds: June 9, 2013
As long as it is May 17 somewhere on earth, your submission will be accepted.
COMMUNITY TRACK PROGRAM COMMITTEE
Chairs
Regis Barondeau (Université du Québec à Montréal)
Dirk Riehle (Friedrich-Alexander University Erlangen-Nürnberg)
--
Website: http://dirkriehle.com - Twitter: @dirkriehle
Ph (DE): +49-157-8153-4150 - Ph (US): +1-650-450-8550
http://blog.okfn.org/2013/05/07/okcon-2013-call-for-proposals-out-now/
OKCon is the annual conference for Open Knowledge (Foundation),
17th-18th September 2013, Geneva, Switzerland. It was called "OKFest"
last year. It's a well-attended and well-organized conference for anyone
interested in open knowledge, sharing, open hacking, etc.
Opportunities for Wikimedia lightning talks, workshops, etc.:
- Wikipedia Zero (see Open Development & Sustainability track)
- Analytics and open data (see Technology, Tools & Business)
(UserMetrics API? privacy? Limn?)
- SOPA/PIPA and related activities (see Evidence & Stories)
- Hack events: use their hackspace. Teach folks to make bots,
gadgets, apps, and Lua templates. Get user testing from other
open culture advocates and learn what tools they need.
This conference is eligible for subsidy of travel costs -- see
Participation Support
https://meta.wikimedia.org/wiki/Participation:Support to put in your
request.
Thanks to Sarah Stierch for the heads-up.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Hi, long email but only for GSoC mentors and curious minds alike.
WARNING: GSoC & common sense require absolute confidentiality about
discussions or resolutions regarding candidates. You can't share any
confidential information, no matter how evident it looks to you or how
well you get along with a student or anybody without mentor / org admin
access to Wikimedia GSoC. Google explicitly forbids any leakage of
information before they officially publish the results on May 27.
We can and we must discuss publicly our selection process, though.
Ideally GSoC mentors would have a private call and discuss until
deciding on a ranking of candidates. But with 38 people in different
timezones, 47 proposals and ? slots this clearly won't work.
Some organizations resolve this situation with votes, but I don't think
this is a good solution in our context and the Wikimedia community
favors consensus over voting anyway.
Your distributed feedback on essential/desirable features has been very
useful to make a first decision. Let's try a second round of distributed
feedback to solve the clear cases:
If you would be the only one deciding, how would you rank the proposals
received (see the list below)? Please send me a PRIVATE email (not to
this list!) with your ranking of features, ideally before the end of
tomorrow Tuesday.
* Read
https://www.mediawiki.org/wiki/Mentorship_programs/Selection_process
(just updated) and act accordingly.
* Rank at least the projects you would prioritize before the one(s) you
want to mentor. All the better if you rank more. Don't rank based on the
title alone. All proposals have mentor feedback by now.
* No need to decide on specific candidates for the features you are not
mentoring. It is enough to rank "Feature X", without you having to
decide which of the students proposing Feature X should be selected.
* But you do need to specify which student you select for the 1-2
projects you are co-mentoring. Agree on the names with your co-mentors. You
can't be in more than 2 projects, and ideally in just one.
Mentors are free to skip the ranking game and go directly for the call.
In that case their proposals will be ranked based on the feedback from
the rest of us.
I will consolidate sensibly all this feedback on Wednesday, in a private
document shared with the mentors. Hopefully some proposals will be clear
candidates to be accepted or declined. Then we will also know how many
slots we are getting from Google, and we can focus the discussion in one
call or more with the mentors of the unclear cases.
Should work. We'll see.
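The "consolidate sensibly" step could work like a simple Borda-style aggregation of the individual rankings. This is only an illustrative sketch of one possible scheme, not the actual method used; the sample rankings below reuse titles from the list but are made up:

```python
from collections import defaultdict

def consolidate(rankings):
    """Aggregate individual mentor rankings with a Borda-style count:
    in a list of n ranked proposals, the first gets n points, the
    last gets 1, and unranked proposals get 0. Incomplete rankings
    are fine, which matches letting mentors rank only part of the
    list. Highest total comes first; ties break alphabetically."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, proposal in enumerate(ranking):
            scores[proposal] += n - position
    return sorted(scores, key=lambda p: (-scores[p], p))

mentor_rankings = [
    ["Incremental data dumps", "MediaWiki API 2.0", "Wikidata features"],
    ["MediaWiki API 2.0", "Incremental data dumps"],
    ["Incremental data dumps", "Wikidata features"],
]
print(consolidate(mentor_rankings))
```

With 38 mentors each submitting a partial ranking, the individual mistakes tend to average out, which is the bet made above about the consolidated list not going too wrong.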
The list of projects to rank:
* Android app for MediaWiki translation
* Auto suggestion of categories
* Automatic category redirects
* Bayesan Spam Filter
* Centralized Search Engine
* Contribute to Wikimedia
* Curriculum Wiki
* Entity Suggester for Wikidata
* Improve support for book structures
* Improvement of glossary tools
* Incremental data dumps
* Incremental updates for Kiwix
* Internationalization and Right-To-Left Support in VisualEditor
* jQuery.IME extensions for Firefox and Chrome
* jQuery.IME next big release improvements
* Language Coverage Matrix Dashboard
* MediaWiki API 2.0
* MediaWiki-Moodle extension
* Mobilize Wikidata
* Pronunciation Recording Extension
* Prototyping inline comments
* Refactoring of Proofread Page extension
* Section handling in Semantic Forms
* UploadWizard: Book upload customization
* VisualEditor Math Equation Plugin
* VisualEditor plugin for source code editing
* VisualEditor plugins
* Wikidata features
* Wikidata language fallback and conversion
* Wikipedia - My Encyclopedia
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil