Hi,
as I understand it, you should request a quota increase on Phabricator,
following the instructions at https://phabricator.wikimedia.org/project/view/2880/ .

But I can't guarantee that this will be accepted.

Best regards,
Zoran.

On Sun, 29 Dec 2019 at 14:28, Dirk Hünniger via Cloud <cloud@lists.wikimedia.org> wrote:
Hi,

I am running the mediawiki2latex web service for converting wiki
articles to downloadable formats.

https://mediawiki2latex.wmflabs.org/

The output files generated aim for high-quality printing and thus
contain high-resolution images (300 dpi). This makes creating them
computationally expensive. For single articles the effect is usually
acceptable, but for collections of Wikipedia articles, such as those
found in the Wikipedia "Book:" namespace, the rendering time of a
single book is often a few hours. It has been suggested to cache the
created files and make them available for instant download. I have
started to generate PDF files of all books in the Wikipedia "Book:"
namespace on an old dual-core laptop (two at a time). So far I have
generated more than 1000 PDFs, using 100 GByte of disk space after one
month of compute time. Since there are about 6000 such books,
everything will safely fit into 1 TByte. I would like to ask for one
TByte of space that can be mounted into the mediawiki2latex VM, so I
can make the files available from the web service for instant
download. What is the administrative process I need to follow for this
request? Alternatively, I could also host the files at home or at an
external provider, but I doubt that this is actually an option.
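For reference, the extrapolation above can be checked with a few lines of
arithmetic. This is only a sketch using the round figures quoted in the
message (1000 PDFs in 100 GByte, about 6000 books in total):

```python
# Rough storage projection from the figures stated in the message.
pdfs_so_far = 1000       # PDFs generated so far
space_so_far_gb = 100    # disk space they occupy, in GByte
total_books = 6000       # approximate number of books in the "Book:" namespace

# Average size per PDF, then scale up to the full set of books.
avg_size_gb = space_so_far_gb / pdfs_so_far    # ~0.1 GByte per PDF
projected_gb = total_books * avg_size_gb       # projected total space

print(f"Average PDF size: {avg_size_gb * 1000:.0f} MByte")
print(f"Projected total:  {projected_gb:.0f} GByte")

# The projection stays well under the requested 1 TByte.
assert projected_gb <= 1000
```

Under these assumptions the full set would come to roughly 600 GByte,
which is consistent with the claim that 1 TByte safely suffices.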

Yours Dirk Hünniger


_______________________________________________
Wikimedia Cloud Services mailing list
Cloud@lists.wikimedia.org (formerly labs-l@lists.wikimedia.org)
https://lists.wikimedia.org/mailman/listinfo/cloud