On meta.wikimedia I used to see the filter rule "user_emailconfirm". I
am not seeing it currently, and I am trying to work out why. Can
someone please confirm one of the following:
* user_emailconfirm is no longer a valid filter rule
OR
* meta.wikimedia will only show that field in the AF if that is their
"home" wiki
OR
* it only actually shows in the AF if the email is confirmed
Thanks. Regards, Billinghurst
Hi,
I am Yeongjin Jang, a Ph.D. Student at Georgia Tech.
In our lab (SSLab, https://sslab.gtisc.gatech.edu/),
we are working on a project called B2BWiki,
which enables users to share the contents of Wikipedia through WebRTC
(peer-to-peer sharing).
The website is here: http://b2bwiki.cc.gatech.edu/
The project aims to help Wikipedia by donating computing resources
from the community; users can donate their traffic (by P2P communication)
and storage (indexedDB) to reduce the load of Wikipedia servers.
Larger organizations, e.g. schools or companies with many local users,
can donate a mirror server, similar to GNU FTP mirrors, which can
bootstrap peer sharing.
The potential benefits we envision are the following:
1) Users can easily donate their resources to the community.
Just visit the website.
2) Users can see a performance benefit when a page is loaded from
multiple local peers or a local mirror (faster page load times!).
3) Wikipedia can reduce its server workload, network traffic, etc.
4) Local network operators can reduce transit traffic
(e.g. the cost of delivering traffic outside their network).
While we work on enhancing the implementation,
we would like to ask for opinions from the actual developers of Wikipedia.
For example, we want to know whether our direction is correct
(will it actually reduce server load?), and whether there are concerns
we have missed that could prevent the system from working as intended.
We really want to do meaningful work that actually helps run Wikipedia!
Please feel free to give us any suggestions, comments, etc.
If you want to express your opinion privately,
please contact sslab(a)cc.gatech.edu.
Thanks,
--- Appendix ---
Some more detailed information about B2BWiki follows.
# Accessing data
When accessing a page on B2BWiki, the browser queries peers first.
1) If peers holding the content exist, a peer-to-peer download happens.
2) Otherwise, if there is no peer, the client downloads the content
from the mirror server.
3) If the mirror server does not have the content, it downloads it from
the Wikipedia server (one access for the first download, plus updates).
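The three-step fallback above can be sketched as follows. This is a minimal Python illustration of the fetch order only, not the actual browser client (which runs over WebRTC and IndexedDB); `fetch_page` and the callable parameters are hypothetical names.

```python
def fetch_page(page_name, lookup_peers, mirror_fetch, wikipedia_fetch):
    """Try peers first, then the mirror server, then Wikipedia itself."""
    for peer in lookup_peers(page_name):     # 1) ask the lookup server for peers
        content = peer.get(page_name)        #    attempt a peer-to-peer download
        if content is not None:
            return content, "peer"
    content = mirror_fetch(page_name)        # 2) no peer held it: ask the mirror
    if content is not None:
        return content, "mirror"
    return wikipedia_fetch(page_name), "wikipedia"  # 3) fall back to Wikipedia
```

The point of the ordering is that Wikipedia's servers are only touched when both the peer swarm and the mirror miss.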
# Peer lookup
To enable content lookup for peers,
we run a lookup server that holds a page_name-to-peer map.
A client (a user's browser) can query the list of peers that
currently hold the content and select a peer by freshness:
each entry carries a hash/timestamp of the content and the top
2 octets of the peer's IP address (to determine whether it is a
local peer), etc.
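A possible selection policy matching the description above is "prefer local peers, then the freshest copy". The sketch below is an assumption about how that could look (`select_peer` and the entry fields are hypothetical names); the lookup server stores only the top 2 octets of each peer's address, so that is all the locality check compares.

```python
def same_top_octets(peer_ip, client_ip):
    """Compare the top 2 IP octets (all the lookup server records)."""
    return peer_ip.split(".")[:2] == client_ip.split(".")[:2]

def select_peer(peers, client_ip):
    """peers: list of dicts with 'ip' (top 2 octets), 'timestamp', 'sha1'.

    Prefer peers that look local, then pick the freshest timestamp.
    """
    if not peers:
        return None
    local = [p for p in peers if same_top_octets(p["ip"], client_ip)]
    candidates = local or peers
    return max(candidates, key=lambda p: p["timestamp"])
```

Preferring local peers keeps traffic inside the operator's network (benefit 4 above), while the timestamp tie-break bounds staleness.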
# Update and integrity check
The mirror server updates its content once per day
(configurable, e.g. to update every hour).
The update check uses an If-Modified-Since request to the Wikipedia server.
On retrieving content from Wikipedia, the mirror server stamps a timestamp
and a SHA-1 checksum to ensure the freshness and integrity of the data.
When a client looks up and downloads content from peers,
it compares the SHA-1 checksum of the data
with the checksum from the lookup server.
In this setting, users may receive older data
(they can configure how much staleness to tolerate,
e.g. 1 day, 3 days, or 1 week old), and
integrity is guaranteed by the mirror/lookup servers.
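The client-side check described above, combining the SHA-1 comparison with a configurable staleness tolerance, could be sketched like this (a minimal illustration; `verify_download` and its parameters are hypothetical names, and the expected checksum is the one returned by the lookup server):

```python
import hashlib
import time

def verify_download(data, expected_sha1, stamped_at,
                    max_age_seconds=86400, now=None):
    """Accept peer-supplied data only if its checksum matches and it is
    no staler than the configured tolerance (default: 1 day)."""
    now = time.time() if now is None else now
    if hashlib.sha1(data).hexdigest() != expected_sha1:
        return False      # integrity failure: reject the peer's copy
    if now - stamped_at > max_age_seconds:
        return False      # older than the tolerated staleness
    return True
```

Because the checksum comes from the mirror/lookup servers rather than from the peer, a peer cannot silently serve tampered content; at worst the client falls back to the mirror.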
More detailed information can be obtained from the following website.
http://goo.gl/pSNrjR
(URL redirects to SSLab@gatech website)
Please feel free to give us any suggestions, comments, etc.
Thanks,
--
Yeongjin Jang
Hi everyone,
The Community Tech team's Wishlist Survey is now open for voting; come on
over and upvote your favorites. We're looking for the most important
features and fixes that our team can work on to help the core contributors
on Wikimedia projects.
https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey
We've got more than 100 proposals to choose from, organized on
easy-to-browse category pages. As a taster, some of the contenders include:
* Improve diff compare screen
* Migrate dead links to the Wayback Machine
* Enhanced per-user, per-article blocking
* Cite : Share : Export tools
* Improve SVG rendering
* Cross-wiki watchlists
* Pageview Stats tool
And if anything on that list makes you excited, outraged or -- well,
honestly, any reaction besides a blank stare -- then you need to come and
vote for the ideas you like best. Once the voting is over, the prioritized
list becomes the Community Tech team's backlog of projects to investigate
and address.
We'll be posting invites to as many village pumps as we can find tomorrow,
to make sure that everyone has the chance to vote. All of the voting pages
are marked for translation, and we would welcome any volunteers to help
translate a proposal into a language of their choice.
Thanks for checking it out; I'm looking forward to seeing your votes.
Danny
Product Manager, WMF Community Tech
(I'm cross-posting this to Wikimedia-l, Wikitech-l, and the WMF staff list
-- apologies to people who subscribe to all three for the duplicate spam.)
Hi Community Metrics team,
This is your automatic monthly Phabricator statistics mail.
Number of accounts created in (2015-11): 255
Number of active users (any activity) in (2015-11): 818
Number of task authors in (2015-11): 461
Number of users who have closed tasks in (2015-11): 247
Number of projects which had at least one task moved from one column
to another on their workboard in (2015-11): 173
Number of tasks created in (2015-11): 2582
Number of tasks closed in (2015-11): 2139
Number of open and stalled tasks in total: 27146
Median age in days of open tasks by priority:
Unbreak now: 27
Needs Triage: 137
High: 174
Normal: 326
Low: 680
Lowest: 543
(How long tasks have been open, not how long they have had that priority)
TODO: Numbers which refer to closed tasks might not be correct, as described in T1003.
Yours sincerely,
Fab Rick Aytor
(via community_metrics.sh on iridium at Tue Dec 1 00:00:08 UTC 2015)