Thanks a lot Dmitriy! I think it's enough to get me started. Will email if I get stuck somewhere.
Noman
On Tue, Aug 3, 2010 at 7:59 PM, wikitech-l-request@lists.wikimedia.org wrote:
Send Wikitech-l mailing list submissions to wikitech-l@lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit https://lists.wikimedia.org/mailman/listinfo/wikitech-l or, via email, send a message with subject or body 'help' to wikitech-l-request@lists.wikimedia.org
You can reach the person managing the list at wikitech-l-owner@lists.wikimedia.org
When replying, please edit your Subject line so it is more specific than "Re: Contents of Wikitech-l digest..."
Today's Topics:
- Re: wikipedia is one of the slower sites on the web (Liangent)
- Re: wikipedia is one of the slower sites on the web (K. Peachey)
- Re: changing main page articles from drop down. help required (Dmitriy Sintsov)
- Re: wikipedia is one of the slower sites on the web (John Vandenberg)
- Re: wikipedia is one of the slower sites on the web (Platonides)
- Re: changing main page articles from drop down. help required (Platonides)
- Re: Debian packages (was MediaWiki version statistics) (David Gerard)
- Re: Showing bytes added/removed in each edit in "View history" and "User contributions" (Aryeh Gregor)
- Re: Showing bytes added/removed in each edit in "View history" and "User contributions" (soxred93)
Message: 1 Date: Tue, 3 Aug 2010 18:54:38 +0800 From: Liangent liangent@gmail.com Subject: Re: [Wikitech-l] wikipedia is one of the slower sites on the web To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: AANLkTikotqe0ptb1rcT3J10m+hgim69my1jzg6dBN-bv@mail.gmail.com
Content-Type: text/plain; charset=UTF-8
On 8/3/10, Lars Aronsson lars@aronsson.se wrote:
Couldn't you just tag every internal link with a separate class for the length of the target article, and then use different personal CSS to set the threshold? The generated page would be the same for all users:
So if a page is changed, all pages linking to it need to be parsed again. Will this cost even more?
Message: 2 Date: Tue, 3 Aug 2010 20:55:23 +1000 From: "K. Peachey" p858snake@yahoo.com.au Subject: Re: [Wikitech-l] wikipedia is one of the slower sites on the web To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: AANLkTi=23L1UzmH1DpmeDt4vWj-w8nFkNxqOb9OP0R7j@mail.gmail.com Content-Type: text/plain; charset=UTF-8
Would something like what is shown below get it even further down?
a { color: blue } a.1_byte_article, a.2_byte_article, a.3_byte_article, a.4_byte_article, a.5_byte_article, a.6_byte_article, a.7_byte_article, a.8_byte_article, a.9_byte_article, a.10_byte_article,a.11_byte_article, a.12_byte_article, a.13_byte_article, a.14_byte_article, a.15_byte_article, a.16_byte_article, a.17_byte_article, a.18_byte_article, a.19_byte_article, a.20_byte_article, a.21_byte_article, a.22_byte_article, a.23_byte_article, a.24_byte_article, a.25_byte_article, a.26_byte_article, a.27_byte_article, a.28_byte_article, a.29_byte_article, a.30_byte_article, a.31_byte_article, a.32_byte_article, a.33_byte_article, a.34_byte_article, a.35_byte_article, a.36_byte_article, a.37_byte_article, a.38_byte_article, a.39_byte_article, a.40_byte_article, a.41_byte_article, a.42_byte_article, a.43_byte_article, a.44_byte_article, a.45_byte_article, a.46_byte_article, a.47_byte_article, a.48_byte_article, a.49_byte_article, a.50_byte_article, a.51_byte_article, a.52_byte_article, a.53_byte_article, a.54_byte_article, a.55_byte_article, a.56_byte_article, a.57_byte_article, a.58_byte_article, a.59_byte_article, a.60_byte_article, a.61_byte_article, a.62_byte_article, a.63_byte_article, a.64_byte_article, a.65_byte_article, a.66_byte_article, a.67_byte_article, a.68_byte_article, a.69_byte_article, a.70_byte_article, a.71_byte_article, a.72_byte_article, a.73_byte_article, a.74_byte_article, a.75_byte_article, a.76_byte_article, a.77_byte_article, a.78_byte_article, a.79_byte_article, a.80_byte_article, a.81_byte_article, a.82_byte_article, a.83_byte_article, a.84_byte_article, a.85_byte_article, a.86_byte_article, a.87_byte_article, a.88_byte_article, a.89_byte_article, a.90_byte_article, a.91_byte_article, a.92_byte_article, a.93_byte_article, a.94_byte_article, a.95_byte_article { color: red }
Message: 3 Date: Tue, 03 Aug 2010 15:09:24 +0400 From: Dmitriy Sintsov questpc@rambler.ru Subject: Re: [Wikitech-l] changing main page articles from drop down. help required To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: 353001217.1280833764.142604888.40626@mcgi-wr-7.rambler.ru Content-Type: text/plain; charset="us-ascii"; format="flowed"
- Noman nomang@gmail.com [Tue, 3 Aug 2010 12:04:31 +0500]:
Thanks Dmitriy, I'm looking for a div solution, as an iframe will show scrollbars if the content is large, which is not wanted. I was unable to find a step-by-step approach to developing an extension.
If you have anything / an example, I'll be waiting.
Maybe Extension:HTMLets will suit your needs. Otherwise, you'll have to study the MediaWiki developers' site:
- Perhaps one would set up a parser XML tag hook to generate the proper form/select/option and four corresponding divs: http://www.mediawiki.org/wiki/Manual:Tag_extensions
From the XML tag attributes one would generate the full HTML required to select titles and to place their content into the divs (a sketch follows after this list).
- Perhaps one would use the API to retrieve the pages whose titles are taken from option.value via JavaScript, then place these into div.innerHTML, again via JavaScript: http://www.mediawiki.org/wiki/API:Expanding_templates_and_rendering
Another possibility is to use the Title and Article classes and do your own AJAX handler (also sketched below): http://www.mediawiki.org/wiki/Manual:Ajax http://www.mediawiki.org/wiki/Manual:Title.php http://www.mediawiki.org/wiki/Manual:Article.php
However, that's probably "reinventing the wheel".
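A minimal sketch of the tag-hook approach (1.16-era hook style; the <mainpagesections> tag and the function names are made up for illustration, not an existing extension):

    $wgHooks['ParserFirstCallInit'][] = 'wfMainPageSectionsInit';

    function wfMainPageSectionsInit( $parser ) {
        $parser->setHook( 'mainpagesections', 'wfMainPageSectionsRender' );
        return true;
    }

    function wfMainPageSectionsRender( $input, $args, $parser, $frame ) {
        // Titles come from a tag attribute, e.g.
        // <mainpagesections pages="News|Featured|Did you know|On this day"/>
        $pages = isset( $args['pages'] ) ? explode( '|', $args['pages'] ) : array();
        $html = '<select id="mp-section-select">';
        foreach ( $pages as $page ) {
            $html .= '<option value="' . htmlspecialchars( $page ) . '">'
                . htmlspecialchars( $page ) . '</option>';
        }
        $html .= '</select>';
        // Four empty boxes, filled client-side (see the AJAX sketch below).
        for ( $i = 1; $i <= 4; $i++ ) {
            $html .= '<div id="mp-box-' . $i . '" class="mp-box"></div>';
        }
        return $html;
    }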
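And a rough sketch of the AJAX-handler variant using the Title and Article classes (assumes $wgUseAjax = true; the function name is hypothetical; it would be called as index.php?action=ajax&rs=wfMainPageFetch&rsargs[]=Some_page):

    $wgAjaxExportList[] = 'wfMainPageFetch';

    function wfMainPageFetch( $titleText ) {
        global $wgParser;
        $title = Title::newFromText( $titleText );
        if ( !$title || !$title->exists() ) {
            return '<div class="error">No such page</div>';
        }
        // Fetch the wikitext and parse it to HTML.
        $article = new Article( $title );
        $parsed = $wgParser->parse( $article->getContent(), $title, new ParserOptions() );
        return $parsed->getText();
    }

The JavaScript side then only has to drop the returned HTML into the matching div's innerHTML.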
Sorry for not being able to provide a full example - I am not a rich guy and am busy with projects to feed my family. Also, I am not the fastest coder out there. Dmitriy
Message: 4 Date: Tue, 3 Aug 2010 21:24:03 +1000 From: John Vandenberg jayvdb@gmail.com Subject: Re: [Wikitech-l] wikipedia is one of the slower sites on the web To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: AANLkTi=aJ13U8URrCk+r+L91Owqwvak3VTup57nRFvMB@mail.gmail.com
Content-Type: text/plain; charset=ISO-8859-1
On Tue, Aug 3, 2010 at 8:55 PM, K. Peachey p858snake@yahoo.com.au wrote:
Would something like what is shown below get it even further down?
a { color: blue } a.1_byte_article, a.2_byte_article, a.3_byte_article, ...
Using an abbreviation like <x>ba (e.g. 134ba) would also help.
Limiting the user pref to intervals of 10 bytes would also help.
Also, as this piece of CSS is being dynamically generated, it only needs to include the variations that occur in the body of the article's HTML.
Or the CSS can be generated by JS on the client side, which is what Aryeh has been suggesting all along (I think).
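To illustrate the server-side variant (a hypothetical helper using the <x>ba abbreviation; it emits rules only for byte counts that actually occur among the page's links and fall under the user's threshold):

    function wfStubLinkCss( array $byteCountsOnPage, $threshold ) {
        $selectors = array();
        foreach ( array_unique( $byteCountsOnPage ) as $bytes ) {
            if ( $bytes < $threshold ) {
                // A link to a 134-byte article would carry class "134ba".
                // (Strictly, CSS class selectors may not start with a digit,
                // so a prefix such as "ba134" would be safer.)
                $selectors[] = 'a.' . intval( $bytes ) . 'ba';
            }
        }
        return $selectors ? implode( ', ', $selectors ) . ' { color: #a55; }' : '';
    }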
btw, I thought Domas was kidding. I got a chuckle out of it, at least.
-- John Vandenberg
Message: 5 Date: Tue, 03 Aug 2010 13:55:13 +0200 From: Platonides Platonides@gmail.com Subject: Re: [Wikitech-l] wikipedia is one of the slower sites on the web To: wikitech-l@lists.wikimedia.org Message-ID: i38vt6$o9s$1@dough.gmane.org Content-Type: text/plain; charset=ISO-8859-1
Lars Aronsson wrote:
On 08/01/2010 10:55 PM, Aryeh Gregor wrote:
One easy hack to reduce this problem is just to only provide a few options for stub threshold, as we do with thumbnail size. Although this is only useful if we cache pages with nonzero stub threshold . . . why don't we do that? Too much fragmentation due to the excessive range of options?
Couldn't you just tag every internal link with a separate class for the length of the target article, and then use different personal CSS to set the threshold? The generated page would be the same for all users:
<a href="My_Article" class="134_byte_article">My Article</a>
That would be workable, e.g. one class for articles smaller than 50 bytes, another for 100, 200, 250, 300, 400, 500, 600, 700, 800, 1000, 2000, 2500, 5000, 10000, if it weren't for having to update all those classes whenever the page changes.
It would work to add it as a separate stylesheet for stubs, though.
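Something along these lines, perhaps (a sketch assuming the existing stubthreshold user preference and a BeforePageDisplay hook; the size buckets and class names are made up):

    $wgHooks['BeforePageDisplay'][] = 'wfAddStubStylesheet';

    function wfAddStubStylesheet( &$out, &$skin ) {
        global $wgUser;
        $threshold = intval( $wgUser->getOption( 'stubthreshold' ) );
        if ( $threshold > 0 ) {
            // One rule per size bucket below the threshold, so the parser
            // output itself stays identical for all users.
            $css = '';
            foreach ( array( 50, 100, 200, 250, 300, 400, 500, 1000, 2500 ) as $size ) {
                if ( $size < $threshold ) {
                    $css .= "a.under_{$size}_bytes { color: #772233; } ";
                }
            }
            $out->addHeadItem( 'stub-threshold-css',
                '<style type="text/css">' . $css . '</style>' );
        }
        return true;
    }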
Message: 6 Date: Tue, 03 Aug 2010 13:59:06 +0200 From: Platonides Platonides@gmail.com Subject: Re: [Wikitech-l] changing main page articles from drop down. help required To: wikitech-l@lists.wikimedia.org Message-ID: i39049$o9s$2@dough.gmane.org Content-Type: text/plain; charset=ISO-8859-1
Noman wrote:
Hi, I've installed MediaWiki for a wiki project. We now have 4 sections on the main page, like those on the Wikipedia main page.
As on Wikipedia, these 4 boxes are tables and are updated on a date criterion.
What I want is to provide some kind of navigation bar, like a drop-down or paging, so that when the user selects a page from the drop-down, all 4 boxes are updated with the relevant articles. To be clearer: how can the four parts of the table be updated, and how do I add a combo box (already filled with page numbers / article topics) so that when the user selects any page from the drop-down, all 4 table rows are updated?
One way would be to have different pages. You seem to already have lists of which four things should appear when selecting X, so that would fit.
Another approach that may suit you is the one at: http://ca.wikipedia.org/wiki/Portada
Message: 7 Date: Tue, 3 Aug 2010 14:05:00 +0100 From: David Gerard dgerard@gmail.com Subject: Re: [Wikitech-l] Debian packages (was MediaWiki version statistics) To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: AANLkTiksk6CQHfFWB8==ZBvuKcSSMWtZb60f7PJhy7eJ@mail.gmail.com Content-Type: text/plain; charset=UTF-8
On 3 August 2010 00:17, Edward Z. Yang ezyang@mit.edu wrote:
2. Distributors roll patches without telling upstream developers, who would happily accept them into the mainline.
Has anyone reported the following as Debian bugs?
- Package maintainer not sending patches back upstream
- Package maintainer not visible and active in MediaWiki development
- Package maintainer not visible and active in MediaWiki community support, leaving support of his packages to the upstream
- d.
Message: 8 Date: Tue, 3 Aug 2010 10:53:57 -0400 From: Aryeh Gregor Simetrical+wikilist@gmail.com Subject: Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions" To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: AANLkTinAoOWu=60dwcM3+EQ+X-oL2mdpMdMTP6QfuqOs@mail.gmail.com
Content-Type: text/plain; charset=UTF-8
On Tue, Aug 3, 2010 at 1:14 AM, Liangent liangent@gmail.com wrote:
Byte count is used. For example in Chinese Wikipedia, one of the criteria of "Did you know" articles is ">= 3000 bytes".
I mean, is byte count used for anything where character count couldn't be used just about as well? Like is there some code that uses rev_len to figure out whether an article can fit into a field limited to X bytes, or whatever? (That's probably unsafe anyway.)
On Tue, Aug 3, 2010 at 3:48 AM, Robert Ullmann rlullmann@gmail.com wrote:
The revision size (and page size, meaning that of last revision) in bytes, is available in the API. If you change the definition there is no telling what you will break.
The same could be said of practically any user-visible change. I mean, maybe if we add a new special page we'll break some script that was screen-scraping Special:SpecialPages. We can either freeze MediaWiki and never change anything for fear that we'll break something, or we can evaluate each potential change on the basis of how likely it is to break anything. I can't see anything breaking too badly if rev_len is reported in characters instead of bytes -- the only place it's likely to be useful is in heuristics, and by their nature, those won't break too badly if the numbers they're based on change somewhat.
Message: 9 Date: Tue, 3 Aug 2010 10:59:12 -0400 From: soxred93 soxred93@gmail.com Subject: Re: [Wikitech-l] Showing bytes added/removed in each edit in "View history" and "User contributions" To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: 8EA35838-BF71-43BE-8BC2-DDB20416EE62@gmail.com Content-Type: text/plain; charset=US-ASCII; delsp=yes; format=flowed
Just butting in here: if I recall correctly, both the PHP-native mb_strlen() and the MediaWiki fallback mb_strlen() functions are considerably slower than strlen() (1.5 to 5 times as slow). Unless there's another way to count characters in multibyte UTF-8 strings, this would not be a feasible idea.
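For reference, a quick micro-benchmark sketch (actual numbers vary by PHP build and whether the mbstring extension is loaded):

    // 100,000 three-byte UTF-8 characters.
    $s = str_repeat( "\xE4\xB8\xAD", 100000 );

    $t = microtime( true );
    for ( $i = 0; $i < 1000; $i++ ) {
        strlen( $s );             // counts bytes: O(1), the length is stored
    }
    echo 'strlen:    ', ( microtime( true ) - $t ), "s\n";

    $t = microtime( true );
    for ( $i = 0; $i < 1000; $i++ ) {
        mb_strlen( $s, 'UTF-8' ); // counts characters: must scan the string
    }
    echo 'mb_strlen: ', ( microtime( true ) - $t ), "s\n";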
-X!
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
End of Wikitech-l Digest, Vol 85, Issue 11