Hi, I would like to request custom dump(s) of all (local) images on these
wikis: enwiktionary, eswiktionary, frwiktionary, dewiktionary, enwikiquote,
eswikiquote, frwikiquote, dewikiquote, enwikibooks, eswikibooks,
frwikibooks, dewikibooks. Also, if possible, it would be helpful (but not
required) to also have a dump of just the commons images that are used on
those wikis.
For context on this e-mail, see:
https://meta.wikimedia.org/wiki/Wikimedia_Forum#Help_with_old_dumps and
https://meta.wikimedia.org/wiki/User_talk:Xaosflux#Images
Hi,
I am interested in performing analysis on recently created pages on English
Wikipedia.
One way to find recently created pages is to download a meta-history dump
file for English Wikipedia and filter through the XML, looking for pages
whose oldest revision falls within the desired timespan.
Since this requires a library to parse the XML data, I would imagine this
is much slower than a database query. Is page revision data available in
one of the SQL dumps that I could query for this use case?
Looking at the exported tables list
<https://meta.wikimedia.org/wiki/Data_dumps/What%27s_available_for_download#…>,
it does not look like it is. Maybe this is intentional?
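For what it's worth, the XML-filtering approach can be sketched with the
standard library alone. This is a minimal sketch, assuming a decompressed
meta-history file with the usual <page>/<revision>/<timestamp> layout; the
namespace URI varies by dump schema version, so adjust it to the file at hand.

```python
# Sketch: stream a (decompressed) meta-history XML dump and yield
# pages whose earliest revision falls inside a given window.
# The namespace URI is an assumption; check the dump's root element.
import xml.etree.ElementTree as ET

NS = "{http://www.mediawiki.org/xml/export-0.10/}"  # varies by dump version

def recently_created(xml_file, start, end):
    """Yield (title, first_timestamp) for pages created in [start, end)."""
    for _, elem in ET.iterparse(xml_file):
        if elem.tag != NS + "page":
            continue
        title = elem.findtext(NS + "title")
        stamps = [r.findtext(NS + "timestamp")
                  for r in elem.iter(NS + "revision")]
        first = min(stamps) if stamps else None
        if first and start <= first < end:
            yield title, first
        elem.clear()  # free memory as we stream
```

Since MediaWiki timestamps are ISO 8601 strings, they compare correctly
lexically, so no date parsing is needed for the range check.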
Thanks,
Eric Andrew Lewis
ericandrewlewis.com
+1 610 715 8560
Hello,
On 23 June 2015, at 16:35, I received kind help here with this task, and
Ariel T. Glenn created
http://tools.wmflabs.org/betacommand-dev/reports/commonswiki_svg_list.txt.7z
for me.
(That old file looked like:
img_name
!xoo_orthography.svg
"12_Wikivoyage_compass2_definitivo.svg
"12_Wikivoyage_compass_definitivo.svg
"12_World_fly.svg
"Bookmark".svg
"Bow-tie"_diagram_of_components_in_a_directed_network_SVG.svg
"Coalition"Japanese_castle_Tenshu_layout_format.svg
"Composite"Japanese_castle_Tenshu_layout_format.svg
"Consolidation"Japanese_castle_Tenshu_layout_format.svg
"Countless_squares".svg
[... and so on, 916,585 lines, last line empty]
)
Could I get an updated version of this list, please?
And, if possible, could I also get a link to a compressed 7z/zip/rar
archive of the SVG files themselves? Preferably split into many small
parts, as I would not be able to download such a huge amount of data in
one go, and resuming downloads mostly, but not always, works.
Is it possible for me to create such a list or compressed collection of
all SVG files myself? (Though besides this one task I have no other tasks
from time to time.)
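For the list itself, one self-service option is the public MediaWiki API:
list=allimages accepts a MIME filter, so SVG names can be enumerated without
a custom dump. Below is a minimal sketch (the Commons endpoint URL is an
assumption about which wiki you want; with roughly 900,000 files this takes
thousands of requests, so a Toolforge replica SQL query may be more practical):

```python
# Sketch: list SVG file names on a wiki via the MediaWiki API
# (list=allimages with a MIME filter), following 'continue' tokens.
import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"

def api_fetch(params):
    """Perform one GET against the API and decode the JSON response."""
    url = API + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def svg_names(fetch=api_fetch):
    """Yield SVG file names, paging until the API stops sending 'continue'."""
    params = {"action": "query", "list": "allimages",
              "aimime": "image/svg+xml", "ailimit": "max", "format": "json"}
    while True:
        data = fetch(params)
        for img in data["query"]["allimages"]:
            yield img["name"]
        if "continue" not in data:
            return
        params = {**params, **data["continue"]}
```

The fetch function is injectable so the paging logic can be exercised
without network access; calling svg_names() with no argument hits the API.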
With best regards
D. Hansen
Hello -
I’m interested in hosting/mirroring the monthly XML dumps and additional data files.
I sent an email to ops-dumps@wikimedia.org, but haven’t received a response.
Is anyone aware if that email is still active or if there’s still interest in finding additional mirrors?
Thanks
Ryan
Greetings XML Dump users and contributors!
This is your automatic monthly Dumps FAQ update email. This update
contains figures for the 20221201 full revision history content run.
We are currently dumping 957 projects in total.
---------------------
Stats for frwiki on date 20221201
Total size of page content dump files for articles, current content only:
25,493,373,226 bytes
Total size of page content dump files for all pages, current content only:
43,271,175,277 bytes
Total size of page content dump files for all pages, all revisions:
4,317,640,984,437 bytes
---------------------
Stats for enwiki on date 20221201
Total size of page content dump files for articles, current content only:
91,972,406,092 bytes
Total size of page content dump files for all pages, current content only:
193,217,452,675 bytes
Total size of page content dump files for all pages, all revisions:
25,820,837,484,338 bytes
---------------------
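The sizes above are raw byte counts; a small helper converts them to
human-readable binary units (a sketch, assuming powers of 1024 are wanted):

```python
# Convert the byte counts reported above into human-readable units.
def human(n):
    """Format a byte count using binary (power-of-1024) units."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if n < 1024:
            return f"{n:.1f} {unit}"
        n /= 1024
    return f"{n:.1f} EiB"

# e.g. the enwiki full-history figure:
# human(25820837484338) -> "23.5 TiB"
```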
Sincerely,
Your friendly Wikimedia Dump Info Collector