You're not the first on the list to ask for help with that domain. It is registered through GoDaddy; a whois lookup at GoDaddy gives the following. Best of luck with it:
Domain Name: WIKIWHERE.NET
Registrar URL: http://www.godaddy.com
Updated Date: 2013-02-23 04:43:12
Creation Date: 2012-02-22 02:38:12
Registrar Expiration Date: 2014-02-22 02:38:12
Registrar: GoDaddy.com, LLC
Domain Status: clientDeleteProhibited
Domain Status: clientRenewProhibited
Domain Status: clientTransferProhibited
Domain Status: clientUpdateProhibited
Registrant Name: xu zhan
Registrant Organization:
Registrant Street: Nanjing Ti Yu Ju,Zhong Shan Dong Lu 145#
Registrant City: Nanjing
Registrant State/Province: Jiangsu
Registrant Postal Code: 210002
Registrant Country: China
Admin Name: xu zhan
Admin Organization:
Admin Street: Nanjing Ti Yu Ju,Zhong Shan Dong Lu 145#
Admin City: Nanjing
Admin State/Province: Jiangsu
Admin Postal Code: 210002
Admin Country: China
Admin Phone: 1-318-294-9591
Admin Fax:
Admin Email: xzwxb98@gmail.com
Tech Name: xu zhan
Tech Organization:
Tech Street: Nanjing Ti Yu Ju,Zhong Shan Dong Lu 145#
Tech City: Nanjing
Tech State/Province: Jiangsu
Tech Postal Code: 210002
Tech Country: China
Tech Phone: 1-318-294-9591
Tech Fax:
Tech Email: xzwxb98@gmail.com
Name Server: NS1.MEDIATEMPLE.NET
Name Server: NS2.MEDIATEMPLE.NET
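For anyone who wants to repeat the lookup from a command line, a minimal sketch (the exact field layout varies by WHOIS server):

    whois wikiwhere.net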
On 8/7/13 10:19 PM, wlbjiab wrote:
Dear Officer,
This is Mr. King Wei from Wuhan Line Power Transmission Equipment Co., Ltd. We found that our company's founding year on http://supplier.wikiwhere.net/Wuhan_Line_Power_Transmission_Equipment_Co.,_L.... is incorrect; we would like to change 2008 to 2000. Please let me know which editor could modify this article.
We appreciate your instructions and look forward to your reply.
Best regards
King Wei
Import & Export Manager
Wuhan Line Power Transmission Equipment Co., Ltd.
No.5 Yangguang Road, Miaoshan, Wuhan (430223), Hubei, China
Tel.: +86 27 81319015 ext. 8032
Fax.: +86 27 81319175
Mob.: +86 15827280627
www.whlinepower.com
Skype: wlbjiab
From: mediawiki-l-request
Date: 2013-08-07 20:00
To: mediawiki-l
Subject: MediaWiki-l Digest, Vol 119, Issue 4

Send MediaWiki-l mailing list submissions to mediawiki-l@lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit https://lists.wikimedia.org/mailman/listinfo/mediawiki-l or, via email, send a message with subject or body 'help' to mediawiki-l-request@lists.wikimedia.org
You can reach the person managing the list at mediawiki-l-owner@lists.wikimedia.org
When replying, please edit your Subject line so it is more specific than "Re: Contents of MediaWiki-l digest..."
Today's Topics:
- Page edit question (John W. Foster)
- dumpBackup.php (John W. Foster)
Message: 1
Date: Tue, 06 Aug 2013 13:30:56 -0500
From: "John W. Foster" <jfoster81747@gmail.com>
To: Mediawiki <mediawiki-l@lists.wikimedia.org>
Subject: [MediaWiki-l] Page edit question
Message-ID: <1375813856.14457.1.camel@beast.home>
Content-Type: text/plain; charset=UTF-8
I'm trying to edit the main page on a new site and I'm getting the error below. I'm not sure why, since the file named in the error is installed where it is expected. Any help is appreciated. I'm new to Semantic MediaWiki.
Warning: Missing argument 2 for ParamProcessor\ParamDefinition::__construct(), called in /home/content/07/11469707/html/extensions/SemanticMediaWiki/includes/SMW_QueryProcessor.php on line 589 and defined in /home/content/07/11469707/html/extensions/Validator/includes/ParamProcessor/ParamDefinition.php on line 175
Fatal error: Call to a member function setPrintRequests() on a non-object in /home/content/07/11469707/html/extensions/SemanticMediaWiki/includes/SMW_QueryProcessor.php on line 79
john
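A minimal sketch of a first check for a trace like this, assuming (and it is only an assumption) that the SemanticMediaWiki and Validator copies named in the warning come from incompatible releases; the path is taken from the trace above, and the extension entry-point file names are assumptions:

    # confirm which releases of the two extensions are installed
    cd /home/content/07/11469707/html
    grep -i "version" extensions/SemanticMediaWiki/SemanticMediaWiki.php | head -n 5
    grep -i "version" extensions/Validator/Validator.php | head -n 5
    # after installing matching releases, rerun the schema updater
    php maintenance/update.php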
Message: 2
Date: Tue, 06 Aug 2013 21:08:37 -0500
From: "John W. Foster" <jfoster81747@gmail.com>
To: mediawiki-list <mediawiki-l@lists.wikimedia.org>
Subject: [MediaWiki-l] dumpBackup.php
Message-ID: <1375841317.23926.3.camel@beast.home>
Content-Type: text/plain; charset=UTF-8
I've used this to dump what I hoped would be a complete backup of my locally hosted MediaWiki, the purpose being to import the .xml file into a new working server. The script to import it did the job; however, the dumped .xml file did not contain all the articles. It imported 476 of the 5391 articles the site contains. Just wondering why. I've done it 3 times with no different results.

john
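For comparison, a minimal sketch of the dump/import cycle as it is usually run; the /path/to placeholders are assumptions, and --full exports every revision of every page while --current keeps only the latest:

    # on the source server
    cd /path/to/old/wiki
    php maintenance/dumpBackup.php --full > dump.xml

    # on the destination server
    cd /path/to/new/wiki
    php maintenance/importDump.php < dump.xml
    php maintenance/rebuildrecentchanges.php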
MediaWiki-l mailing list
MediaWiki-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
End of MediaWiki-l Digest, Vol 119, Issue 4