On Tue, Aug 2, 2016 at 5:57 PM, Jefsey <jefsey(a)jefsey.com> wrote:
> At 02:44 02/08/2016, John wrote:
>>
>> For mass imports, use importDump.php - see
>> <http://www.mediawiki.org/wiki/Manual:Importing_XML_dumps>
>> for details.
>
>
> Dear John,
>
> Thank you for your response. But I am dumb myself and I am still lost.
>
> I am not sure I made my need clear enough. I had wikis under MySQL that I
> dumped to XML. So I have n XML files to create n separate SQLite wikis: one
> file, one SQLite wiki (n is around 35).
>
> When I try to read these files with an XML viewer, it tells me that some
> character around line 295 is invalid and it cannot read them, so I do not
> really know how they are structured.
>
> Therefore, my question is about how to create the n SQLite wikis I need and
> import into each the pages from the XML dumped from MySQL.
>
> Also, further on, I understand that the manual covers MySQL dumps, not
> SQLite dumps? Does that mean that I should consider backing them up with
> SQLite tools rather than MediaWiki tools? Hence my question about a
> MediaWiki SQLite section/mailing list?
>
> Sorry, I am pretty new to this and need to take care of a lot of small
> Libre/Citizen/locally oriented wikis, with more to come. So I try to figure
> out the best way to manage all this.
>
> Thank you !
>
> jfc
>
>
Note that there is an XML dump format used by MediaWiki, but there is
also an XML dump format used by mysqldump. It's unclear from your email
which one of those you have. The previous people in the thread are
assuming you're talking about a MediaWiki XML dump file, but if you
are talking about a mysqldump XML file, it is probably going to
be much harder to convert to SQLite.
--
brian
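The distinction above can be checked by peeking at the file's root element. A rough sketch (the function name and the 2000-byte window are mine, not from the thread):

```python
# Rough sketch: guess which kind of XML dump a file is by peeking at its
# root element. A MediaWiki export opens with <mediawiki ...>, while a
# `mysqldump --xml` file opens with <mysqldump ...>.
def classify_dump(path):
    with open(path, "rb") as f:
        head = f.read(2000)          # the root element appears early
    if b"<mediawiki" in head:
        return "mediawiki"
    if b"<mysqldump" in head:
        return "mysqldump"
    return "unknown"
```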
XML dumps are XML dumps regardless of the backend used. The importDump.php
tool that I referenced takes care of the minor differences between
versions. Go ahead and follow the process for importing XML dumps.
On Tue, Aug 2, 2016 at 1:57 PM, Jefsey <jefsey(a)jefsey.com> wrote:
> At 02:44 02/08/2016, John wrote:
>
> For mass imports, use importDump.php - see <
> http://www.mediawiki.org/wiki/Manual:Importing_XML_dumps> for details.
>
>
> Dear John,
>
> Thank you for your response. But I am dumb myself and I am still lost.
>
> I am not sure I made my need clear enough. I had wikis under MySQL that I
> dumped to XML. So I have n XML files to create n separate SQLite wikis: one
> file, one SQLite wiki (n is around 35).
>
> When I try to read these files with an XML viewer, it tells me that some
> character around line 295 is invalid and it cannot read them, so I do not
> really know how they are structured.
>
> Therefore, my question is about how to create the n SQLite wikis I need and
> import into each the pages from the XML dumped from MySQL.
>
> Also, further on, I understand that the manual covers MySQL dumps, not
> SQLite dumps? Does that mean that I should consider backing them up with
> SQLite tools rather than MediaWiki tools? Hence my question about a
> MediaWiki SQLite section/mailing list?
>
> Sorry, I am pretty new to this and need to take care of a lot of small
> Libre/Citizen/locally oriented wikis, with more to come. So I try to figure
> out the best way to manage all this.
>
> Thank you !
>
> jfc
>
>
> On Mon, Aug 1, 2016 at 8:37 PM, Jefsey <jefsey(a)jefsey.com> wrote:
> I am not familiar with databases. I have old MySQL-based wiki sites I
> cannot access anymore due to a change in PHP and MySQL versions. I have old
> XML dumps. Is it possible to reload them into SQLite wikis? These were
> working group wikis: we are only interested in restoring the texts. We have
> the images. We are not interested in the access rights: we will have to
> rebuild them anyway.
>
> Thank you for the help !
> jefsey
>
> PS. We are dedicated to light wikis, which are fine under SQLite. Would
> there be a dedicated list for SQLite management (and further on, development)?
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
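If the files are MediaWiki XML exports, the per-wiki setup and import could be scripted along these lines. This is only a sketch: install.php, importDump.php and rebuildrecentchanges.php are real MediaWiki maintenance scripts, but the paths, wiki names and password below are illustrative, and running ~35 wikis from one code tree needs a wiki-family configuration not shown here.

```python
# Sketch: for each XML dump, create a fresh SQLite-backed wiki and import
# the pages into it. This builds the maintenance-script command lines;
# run them with subprocess.run() from the MediaWiki root directory.
from pathlib import Path

def import_commands(dump, data_dir="/var/data/sqlite"):
    """Command lines to create one SQLite wiki and import one dump."""
    name = Path(dump).stem               # e.g. dumps/groupA.xml -> groupA
    return [
        # create the wiki (writes LocalSettings.php; password is a placeholder)
        ["php", "maintenance/install.php", "--dbtype=sqlite",
         f"--dbpath={data_dir}", "--pass=changeme", name, "admin"],
        # import the pages from the XML dump
        ["php", "maintenance/importDump.php", str(dump)],
        # rebuild derived data after a mass import
        ["php", "maintenance/rebuildrecentchanges.php"],
    ]
```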
At 02:44 02/08/2016, John wrote:
>For mass imports, use importDump.php - see
><http://www.mediawiki.org/wiki/Manual:Importing_XML_dumps>
>for details.
Dear John,
Thank you for your response. But I am dumb myself and I am still lost.
I am not sure I made my need clear enough. I had wikis under MySQL that I
dumped to XML. So I have n XML files to create n separate SQLite
wikis: one file, one SQLite wiki (n is around 35).
When I try to read these files with an XML viewer, it tells me that
some character around line 295 is invalid and it cannot read them, so
I do not really know how they are structured.
Therefore, my question is about how to create the n SQLite wikis I need
and import into each the pages from the XML dumped from MySQL.
Also, further on, I understand that the manual covers MySQL
dumps, not SQLite dumps? Does that mean that I should consider
backing them up with SQLite tools rather than MediaWiki tools?
Hence my question about a MediaWiki SQLite section/mailing list?
Sorry, I am pretty new to this and need to take care of a lot of small
Libre/Citizen/locally oriented wikis, with more to come. So I try to
figure out the best way to manage all this.
Thank you !
jfc
>On Mon, Aug 1, 2016 at 8:37 PM, Jefsey <jefsey(a)jefsey.com> wrote:
>I am not familiar with databases. I have old MySQL-based wiki sites
>I cannot access anymore due to a change in PHP and MySQL versions. I
>have old XML dumps. Is it possible to reload them into SQLite
>wikis? These were working group wikis: we are only interested in
>restoring the texts. We have the images. We are not interested in the
>access rights: we will have to rebuild them anyway.
>
>Thank you for the help !
>jefsey
>
>PS. We are dedicated to light wikis, which are fine under SQLite. Would
>there be a dedicated list for SQLite management (and further on, development)?
>
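On the "wrong character around line 295": XML viewers usually choke on control characters that XML 1.0 forbids, which old exports sometimes contain. A hedged sketch for stripping them before import (the regex and function name are mine; inspect the output before trusting it):

```python
import re

# Characters that XML 1.0 forbids: control characters other than
# tab/newline/CR, plus surrogates and the noncharacters U+FFFE/U+FFFF.
_ILLEGAL_XML = re.compile(
    "[\x00-\x08\x0b\x0c\x0e-\x1f\ud800-\udfff\ufffe\uffff]"
)

def scrub_xml(text):
    """Remove characters that make XML parsers reject a dump."""
    return _ILLEGAL_XML.sub("", text)
```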
Hi all,
With some help from Brandon, I've changed deployment-prep to use Let's
Encrypt instead of the self-signed cert I added last year (to get HTTPS
working - albeit improperly-signed - instead of nothing, and nginx/puppet
working on the Varnish instances again).
It should now behave much more like production - TLS redirects are enabled
in Varnish, and you shouldn't have to ignore cert warnings to use it now.
Details for HTTPS in deployment-prep are spread out over various tickets,
but the main one now is https://phabricator.wikimedia.org/T50501
The puppetisation still needs some work, but it's cherry-picked on
deployment-puppetmaster and seems to be working reliably.
Pages with images may need to be null-edited to make MediaWiki generate
HTTPS URLs for them so browsers don't block the images.
Please let me know if you find any beta.wmflabs.org domains that aren't
covered by the cert or aren't redirecting HTTP to HTTPS in Varnish.
--
Alex Monk
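The null-edit step mentioned above can also be done with an API purge using forcelinkupdate, which re-parses the page without needing an edit token. A sketch against the standard MediaWiki action API; the example endpoint URL is illustrative:

```python
import json
from urllib import parse, request

def purge_payload(titles):
    """POST body for an action=purge call that re-parses the given pages."""
    return parse.urlencode({
        "action": "purge",
        "forcelinkupdate": "1",   # re-render links/images, like a null edit
        "titles": "|".join(titles),
        "format": "json",
    }).encode()

def purge(api_url, titles):
    req = request.Request(api_url, data=purge_payload(titles))
    with request.urlopen(req) as resp:
        return json.load(resp)

# e.g. purge("https://en.wikipedia.beta.wmflabs.org/w/api.php", ["Main Page"])
```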
For mass imports, use importDump.php - see <
http://www.mediawiki.org/wiki/Manual:Importing_XML_dumps> for details.
On Mon, Aug 1, 2016 at 8:37 PM, Jefsey <jefsey(a)jefsey.com> wrote:
> I am not familiar with databases. I have old MySQL-based wiki sites I
> cannot access anymore due to a change in PHP and MySQL versions. I have old
> XML dumps. Is it possible to reload them into SQLite wikis? These were
> working group wikis: we are only interested in restoring the texts. We have
> the images. We are not interested in the access rights: we will have to
> rebuild them anyway.
>
> Thank you for the help !
> jefsey
>
> PS. We are dedicated to light wikis, which are fine under SQLite. Would
> there be a dedicated list for SQLite management (and further on, development)?
>
I am not familiar with databases. I have old MySQL-based wiki sites
I cannot access anymore due to a change in PHP and MySQL versions. I
have old XML dumps. Is it possible to reload them into SQLite
wikis? These were working group wikis: we are only interested in
restoring the texts. We have the images. We are not interested in the
access rights: we will have to rebuild them anyway.
Thank you for the help !
jefsey
PS. We are dedicated to light wikis, which are fine under SQLite. Would
there be a dedicated list for SQLite management (and further on, development)?
WikiConference North America will take place October 7 through 10 in San
Diego.
The session tracks are:
1. Community
2. Advocacy & Outreach
3. Technology & Infrastructure
4. Health care and science
5. GLAM
6. Education and Academic Engagement
Please submit proposals here: https://wikiconference.org/wiki/Submissions
The submission deadline is August 31st.
Pine
Hi Community Metrics team,
This is your automatic monthly Phabricator statistics mail.
Accounts created in (2016-07): 288
Active users (any activity) in (2016-07): 844
Task authors in (2016-07): 459
Users who have closed tasks in (2016-07): 261
Projects which had at least one task moved from one column to another on
their workboard in (2016-07): 229
Tasks created in (2016-07): 2616
Tasks closed in (2016-07): 2517
Open and stalled tasks in total: 30718
Median age in days of open tasks by priority:
Unbreak now: 18
Needs Triage: 186
High: 325
Normal: 475
Low: 774
Lowest: 605
(How long tasks have been open, not how long they have had that priority)
TODO: Numbers which refer to closed tasks might not be correct, as
described in https://phabricator.wikimedia.org/T1003 .
Yours sincerely,
Fab Rick Aytor
(via community_metrics.sh on iridium at Mon Aug 1 00:00:14 UTC 2016)