My company has a large website (FreeEnergyNews.com) with directory pages that we want to migrate over to our wiki site (PESWiki.com) so they can be collaboratively developed. It is very tedious to copy the text and links, save the images locally, and upload them again at PESWiki. It would be nice if there were a way to import the images as a set, for example. Alternatively, we could just call the images from FreeEnergyNews.com. I like to right-justify an image next to a bullet item. I suppose we could create a table, putting the image in the right-hand cell and the text in the left, but that gets pretty tedious too.
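[Editor's note: for the alignment question, MediaWiki's built-in image syntax can float an image to the right of nearby text without a table. A minimal sketch, with a hypothetical file name:]

[[Image:ExampleDevice.jpg|right|thumb|120px|Optional caption]]
* First bullet item, which wraps to the left of the floated image
* Second bullet item

[The "right" keyword floats the image, so the bullets flow beside it with no table markup.]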
Might there be a faster way to import the content from pages, including the images?
FreeEnergyNews is administered via FrontPage. PESWiki uses the mediawiki set-up.
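[Editor's note: on the batch-import side, MediaWiki ships a maintenance script, maintenance/importImages.php, that can upload a whole directory of files in one pass. A rough sketch, assuming shell access to the PESWiki server; the directory path is hypothetical:]

# Run from the MediaWiki installation root; uploads every image
# found in the given directory in one pass.
php maintenance/importImages.php /home/user/fen_images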
Any suggestions would be appreciated.
Thanks
| Sterling D. Allan
|
| New Energy Congress
| http://NewEnergyCongress.org
|
| PES Network, Inc, Executive Director
| http://PureEnergySystems.com
| http://FreeEnergyNews.com
| http://PESN.com
| http://PESWiki.com
|
| "The best news and directory service on the net
| regarding cutting edge energy technologies."
|
| newsletter: fe_updates-subscribe(a)yahoogroups.com
|
| home office: 1-801-407-1292
| Eagle Mountain, Utah, USA
I have searched Google and mediawiki.org and had no success finding
information on creating a template that would list the 10 most recently
created pages. Anyone know of a tutorial or info on how to do this?
Chris
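[Editor's note: one low-effort approach, assuming your MediaWiki version allows transcluding Special:Newpages (many 1.x releases do; verify on your install), is to embed the special page directly in a template or page:]

{{Special:Newpages/10}}

[For finer control over namespaces and ordering, the DynamicPageList extension is a common alternative, but it must be installed separately.]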
G'day,
How would I go about adding memcached configuration to an existing
MediaWiki 1.8 installation? I have scrounged through config/index.php,
but it seems I haven't found it (even though it was probably right in
front of my face the whole time).
TIA,
Jashank
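[Editor's note: memcached is normally wired up in LocalSettings.php rather than through config/index.php. A minimal sketch, assuming a memcached daemon is already running on the default port; the host and port are placeholders:]

## In LocalSettings.php -- point MediaWiki's main object cache at memcached.
$wgMainCacheType    = CACHE_MEMCACHED;
$wgParserCacheType  = CACHE_MEMCACHED;
$wgMemCachedServers = array( '127.0.0.1:11211' );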
I'm trying to write a 'foreach' parser function. Here is the code:
public function foreach2Hook( &$parser, $text = '', $pattern = '', $replacement = '', $insep = ',', $outsep = ',' ) {
    // Split the input text into items on the input separator.
    $text = trim( $text );
    $variables = explode( trim( $insep ), $text );
    // Apply the regex replacement to every item.
    $return = preg_replace( trim( $pattern ), trim( $replacement ), $variables );
    // Re-join the transformed items with the output separator.
    return implode( trim( $outsep ), $return );
}
My problem is that (I think) $text contains unprocessed wikitext. I want the text after the wiki engine has gone over it, or a way to make it do so (giving me the resulting HTML).
thanks,
ittay
--
===================================
Ittay Dror,
Chief architect, openQRM group leader,
R&D, Qlusters Inc.
ittayd(a)qlusters.com
+972-3-6081994 Fax: +972-3-6081841
www.openqrm.org - Data Center Provisioning
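[Editor's note: if the goal is to have the engine expand the input first, the Parser object passed to the hook exposes recursiveTagParse() in MediaWiki releases of this vintage. A hedged sketch; method availability depends on your exact version:]

// Ask the parser to process the raw wikitext before splitting it;
// recursiveTagParse() returns the (partially) parsed HTML.
$text = $parser->recursiveTagParse( trim( $text ) );
$variables = explode( trim( $insep ), $text );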
Does MediaWiki HAVE to use the PEAR::Mail functions? The forum software on the same server doesn't require PEAR::Mail; it uses simple SMTP stuff.
Is there any way to get the e-mail functions to work without using PEAR::Mail?
thanks
Erik
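[Editor's note: MediaWiki of this era only pulls in PEAR::Mail when $wgSMTP is configured as an array; with the default setting it falls back to PHP's built-in mail(). A sketch of the relevant LocalSettings.php line; verify against your version's UserMailer:]

## Leave $wgSMTP false (the default) so MediaWiki uses PHP's mail()
## instead of PEAR::Mail over SMTP.
$wgSMTP = false;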