User:WikiMaster/The Dingo Ate My Wiki

From PdxWikiWednesday
Revision as of 10:07, 3 November 2013 by WikiMaster (Talk | contribs) (Update thumbnail errors.)


Server Getting Hammered?

If your wiki is crawling or frequently inaccessible due to too much traffic and too few server resources, you can try caching pages, slowing down bots and/or restricting their access, and so on. Below are some of the techniques tried by PortlandWiki admins.


Basic info on robots.txt settings for MediaWiki wikis:
The manual page Manual:Short URL/Prevent bots from crawling index.php covers how to keep bots away from index.php.

If you are using short URLs (see note below), you can make sure that search engines only index actual wiki pages, without indexing action views (such as edit or history pages, with URLs in the form index.php?title=Main_page&action=edit).

Create a file named robots.txt in the root of your MediaWiki installation with the following content.

User-agent: *
Disallow: /index.php

Note: Creating a robots.txt file in the root of your MediaWiki installation with Disallow: /index.php, without setting up short URLs first, will block all pages from being indexed. Without short URLs, every page URL still contains index.php, so the Disallow rule matches (and blocks) every page.
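For reference, a minimal sketch of the short-URL setup the note assumes, in LocalSettings.php (the /wiki/ article path is a common convention, not a requirement):

```php
// With an article path that omits index.php, normal page URLs become
// e.g. /wiki/Main_Page, so robots.txt's "Disallow: /index.php" only
// blocks action views (edit, history, etc.).
$wgScriptPath  = "";
$wgArticlePath = "/wiki/$1";
```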

Thumbnail Errors

Automatic thumbnail generation stopped working on this wiki. This addition to LocalSettings.php fixed it:

$wgMaxShellMemory = 202400;

Source: Error creating thumbnail command not found

An additional confusion came about when using the handy MediaWiki ShortURL Builder. Along with the .htaccess output it provides is a suggestion to insert the following thumbnail setting into the LocalSettings.php file:

$wgGenerateThumbnailOnParse = false;

Unfortunately, thumbnail rendering stopped working until the setting was changed to:

$wgGenerateThumbnailOnParse = true;
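Taken together, the thumbnail-related lines that ended up in this wiki's LocalSettings.php look like this (values from the fixes above; tune for your host):

```php
// Raise the shell memory limit so ImageMagick can build thumbnails
// without being killed.
$wgMaxShellMemory = 202400;
// Render thumbnails at parse time; setting this to false broke
// thumbnailing on this wiki.
$wgGenerateThumbnailOnParse = true;
```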


PHP Version Error

After upgrading PortlandWiki to MediaWiki 1.21.2 and attempting to run php update.php, we received the following error:

MediaWiki Update: You are using PHP version 5.2.17 but MediaWiki 1.21 needs PHP 5.3.2 or higher.
ABORTING. Check if you have a newer php executable with a different name, such as php5.

The problem wasn't that PortlandWiki was running an outdated version of PHP. Instead, the error arose because of previous changes made to PortlandWiki's php.ini settings. Those customizations changed the location of the PHP binary PortlandWiki relies on.

This command invokes update.php with the correct PHP binary:
/usr/local/php53/bin/php update.php

See original DreamHost post here.
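The failing check is just a version comparison; a quick shell sketch (php_ok is a hypothetical helper name) shows why 5.2.17 trips MediaWiki 1.21's 5.3.2 minimum:

```shell
# Compare an installed PHP version against MediaWiki 1.21's minimum.
# sort -V does the version ordering; the lower of the two versions
# must be the required one for the check to pass.
php_ok() {
  required="5.3.2"
  [ "$(printf '%s\n' "$required" "$1" | sort -V | head -n1)" = "$required" ]
}

php_ok 5.2.17 && echo "ok" || echo "too old"   # prints "too old"
php_ok 5.3.28 && echo "ok" || echo "too old"   # prints "ok"
```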

Login error


Login error
PortlandWiki uses cookies to log in users. You have cookies disabled. Please enable them and try again.

On Thursday, June 14, 2012 anyone trying to log into PortlandWiki and two other wikis on the same DreamHost VPS server began receiving the above login error. The error prevented folks with valid accounts from logging in, even though cookies were not disabled.

(One other wiki, hosted on a DreamHost shared server rather than the VPS, didn't experience the login issue.)

On Wednesday, October 23rd, 2013 Kotra found this fix: How can I fix the mediawiki error “Wiki uses cookies to log in users. You have cookies disabled. Please enable them and try again.”?

Four easy steps
  1. Open LocalSettings.php.
  2. Go to the bottom of the file and add the following line: session_save_path("tmp");
  3. Next create a directory called tmp in the folder where you have MediaWiki installed.
  4. Done!
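Steps 2 and 3 can also be done from a shell in the MediaWiki root (a sketch; the relative "tmp" path is the one the fix uses, and PHP must be able to write to it):

```shell
# Step 3: create the session directory next to LocalSettings.php.
mkdir -p tmp
# Step 2: point PHP's session storage at it from LocalSettings.php.
echo 'session_save_path("tmp");' >> LocalSettings.php
```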

As of 10:30, 27 October 2013 (PDT) it has been applied to PortlandWiki and Telecafe Wiki, and appears to be working.


Internal error (The Dingo Ate Wiki Wednesday Skin)

(Internal error after installing MW 1.18.2.)

Error Date: 11:49, 24 April 2012 (PDT)

 Global default 'PDXWW' is invalid for field skin


#0 /home/wikiwednesday/ Preferences::getPreferences(Object(User))
#1 /home/wikiwednesday/ Preferences::getFormObject(Object(User))
#2 /home/wikiwednesday/ SpecialPreferences->execute(NULL)
#3 /home/wikiwednesday/ SpecialPageFactory::executePath(Object(Title), Object(RequestContext))
#4 /home/wikiwednesday/ MediaWiki->performRequest()
#5 /home/wikiwednesday/ MediaWiki->main()
#6 /home/wikiwednesday/ MediaWiki->run()
#7 {main}
RE - The Dingo Ate Wiki Wednesday Skin

Tue, Apr 24, 2012 at 10:05 PM - Kotra Says:

yeah, I think I ran into the same problems when I created the skin the first time. I think I ended up doing the same bad-coder "fix" I did this time: just modify monobook.

So we're using "monobook" as the default now. But it's actually the PDXWW skin, it's just called monobook because monobook has a lot of pointers to and from it that I don't fully comprehend. I backed up the real monobook skin this time, so it may be less confusing next time around.

Probably we'll run into this issue every time we upgrade MediaWiki. But the process is simple: copy all the files in the skins/PDXWW folder to the skins/monobook folder.
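The copy step Kotra describes can be sketched as a small shell helper (the function name and the backup path are assumptions; run from the MediaWiki root):

```shell
# Back up the stock Monobook skin, then overwrite it with the PDXWW
# files, so the "monobook" skin entry actually serves PDXWW (per the
# workaround described above).
restore_pdxww_skin() {
  cp -a skins/monobook skins/monobook.orig   # keep the real Monobook
  cp -a skins/PDXWW/. skins/monobook/        # PDXWW files take over
}
```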

LocalSettings.php & MW 1.18.0rc1

## The protocol and server name to use in fully-qualified URLs
// 25 November 2011 -- Dave Myers commenting out. Both variants shown below appear to force the URL path to the wiki directory to automatically redirect back to the domain root.
// $wgServer           = "";
// $wgServer           = "";
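For reference, $wgServer normally carries just the protocol and host name (example.org below is a placeholder, not the actual value that was commented out):

```php
// A typical value; the path to the wiki directory belongs in
// $wgScriptPath, not here.
$wgServer = "";
```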

export / import templates

Copied two versions of Template:USSFwikiNews using MediaWiki's export/import instructions.

The MediaWiki instructions recommended UNCHECKING the "include only the current revision" box before exporting the template. Doing so, however, produces a much larger file. An attempt to import that larger file resulted in a browser and/or server timeout error.

To get a smaller file, make sure the "include only the current revision" box remains CHECKED. The file from the example above imported well, although the image had to be uploaded separately.

.htaccess mod_rewrite woes

Affected Wiki: PortlandWiki

Started noticing issues with page titles containing "/" characters. URLs containing those characters began resolving to 404 errors. Later, similar errors began affecting articles with titles containing "." and "&" characters.

Fix Applied
Made adjustments to the .htaccess file:

RewriteEngine On

# Re-encode a literal "&" in the request path as %26 so it reaches
# MediaWiki as part of the page title instead of splitting the query string.
RewriteCond %{REQUEST_URI} ^/.*&.*
RewriteRule ^/?(.*)&(.*)$ /index.php?title=$1\%26$2 [L,NE]

# Hand anything that is not a real file, directory, or symlink to
# index.php as a page title (the standard short-URL rewrite).
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule ^(.+)$ index.php?title=$1 [L,QSA]
