Given a wiki and a list of article titles, we can make an offline
snapshot easily. To me the question is: what kind of selection? Based
on which approach? Using which data exactly? ...and then find a way
to apply that selection and get the list of article titles.
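For the snapshot step itself, one common route is mwoffliner, the tool behind Kiwix ZIM builds, which takes a wiki URL plus a one-title-per-line article list. A minimal sketch follows; the titles are hypothetical placeholders, and the mwoffliner flags are given as I recall them, so check the tool's own help before relying on them:

```shell
# Write a one-title-per-line article list (the input format mwoffliner expects).
# These two titles are just placeholder examples.
cat > articleList.txt <<'EOF'
Public_bookcase
Little_Free_Library
EOF

# Hypothetical invocation (needs network access, so shown but not run here).
# It should produce a self-contained .zim snapshot of just those articles:
# mwoffliner --mwUrl=https://en.wikipedia.org/ \
#            --adminEmail=you@example.org \
#            --articleList=articleList.txt \
#            --outputDirectory=./zim
```

The nice property of this split is that the hard question from above (which selection, based on which data) reduces to producing articleList.txt; the snapshotting itself is then mechanical.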
On 06.07.22 11:50, Željko Blaće wrote:
Dear fellows - I would like to host offline
a minimal set of wiki pages (Wikipedias, Wikispore, Commons)
and a minimal set of media (images, video and audio)
that all branch out from a single Reasonator
(or Portal.toolforge.org) search...
...and that keep global links beyond 2 degrees
(so that one can use them to continue online).
For now I am thinking of just downloading the pages
and hosting them as static copies of the mobile wiki view,
using wget with manual corrections.
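The wget approach described above can be sketched roughly as follows. The flags are standard GNU wget options for offline mirroring; the function name and the example URL are placeholders, and real MediaWiki pages will still need the manual corrections mentioned (skins, lazy-loaded media, etc.):

```shell
# Sketch: fetch one article plus everything needed to render it offline.
mirror_page () {
  # --page-requisites pulls the images/CSS the page needs to render,
  # --convert-links rewrites hrefs so they work when browsed locally,
  # --adjust-extension appends .html so static servers serve the right type.
  wget --page-requisites --convert-links --adjust-extension \
       --no-parent --directory-prefix=offline "$1"
}

# Example (not run here, needs network):
# mirror_page "https://en.m.wikipedia.org/wiki/Little_Free_Library"
```

Pointing at the mobile (m.) domain, as suggested above, tends to give simpler markup to post-process than the desktop skin.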
I would love to do this for monuments and
tiny libraries like this:
https://w.wiki/5QWn
so maybe have a library-like single-file wiki
(like the https://tiddlywiki.com system)
so that users could connect to it
and leave notices.
Does anyone have an idea if and how to do this better
and package it in the most elegant way?
Anyone interested in collaborating?
Best, Z. Blace
_______________________________________________
Offline-l mailing list -- offline-l(a)lists.wikimedia.org
To unsubscribe send an email to offline-l-leave(a)lists.wikimedia.org