User:WikiMaster/The Dingo Ate My Wiki

Latest revision as of 08:37, 4 November 2013

[Image: Dingo-ate-me.png]

Server Getting Hammered?

If your wiki is crawling or frequently inaccessible due to too much traffic and too few server resources, you can try caching pages, slowing down bots and/or restricting their access, etc. Below are some of the techniques tried by PortlandWiki admins.

Bots

  • Manual:robots.txt
    Basic info on robots.txt settings for MediaWiki wikis.
  • Manual:Short URL/Prevent bots from crawling index.php
    Explains how to prevent bots from crawling index.php.

If you are using short URLs (see note below), you can make sure that search engines only index actual wiki pages, without indexing action views (such as edit or history pages, with URLs in the form index.php?title=Main_page&action=edit).

Create a file named robots.txt in the root of your MediaWiki installation with the following content.

User-agent: *
Disallow: /index.php

Note: Creating a robots.txt file in the root of your MediaWiki install with Disallow: /index.php without setting up a Short URL first will block all pages from being indexed. This is because every page URL will still contain index.php, which the robots.txt rule disallows.
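If short URLs are not set up yet, the LocalSettings.php side of that setup looks roughly like the sketch below. This is an illustration only: the /w and /wiki paths are assumptions and will differ per install, and the matching Apache rewrite rules from the Short URL manual on mediawiki.org are not shown here.

// Short URL sketch for LocalSettings.php (illustrative; the paths below are assumptions, adjust to your install).
$wgScriptPath  = "/w";          // where index.php and the rest of the software actually live
$wgArticlePath = "/wiki/$1";    // pretty article URLs that robots.txt will leave crawlable
$wgUsePathInfo = true;          // let MediaWiki read the page title from the URL path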

Cache

  • XCache
    The PHP accelerator used on the DreamHost VPS that PortlandWiki resides on.
  • APC (Alternative PHP Cache)
    The PHP accelerator used on the DreamHost shared server used by this wiki (the one this wiki page is on).
    Helpful step-by-step installation tutorial: Setup APC (Alternative PHP Cache) on Dreamhost Shared Hosting (http://www.brandonmartinez.com/2013/03/16/setup-apc-alternative-php-cache-on-dreamhost-shared-hosting/)
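Once an accelerator like APC or XCache is installed, MediaWiki can also be pointed at it for object and parser caching. A minimal sketch follows; it is an illustration, not necessarily the exact settings used on these wikis.

// LocalSettings.php cache sketch (assumes APC or XCache is installed and enabled in PHP).
$wgMainCacheType    = CACHE_ACCEL;  // use the PHP accelerator for MediaWiki's main object cache
$wgParserCacheType  = CACHE_ACCEL;  // keep parsed page output there as well
$wgMemCachedServers = array();      // no memcached servers in this setup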

Thumbnail Errors

Automatic thumbnail generation stopped working on this wiki. This addition to LocalSettings.php fixed it:

$wgMaxShellMemory = 202400;

Source: "Error creating thumbnail command not found" on Manual talk:Image administration at mediawiki.org

An additional confusion came about when using Redwerks.org's handy MediaWiki ShortURL Builder (http://shorturls.redwerks.org/). Along with the .htaccess output it provides, the tool suggests inserting the following thumbnail setting into the LocalSettings.php file:

$wgGenerateThumbnailOnParse = false;

Unfortunately, thumbnail rendering stopped working until the setting was changed to:

$wgGenerateThumbnailOnParse = true;
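For reference, the two thumbnail-related settings above end up in LocalSettings.php together, roughly like this:

// LocalSettings.php thumbnail settings, consolidating the two values above.
$wgMaxShellMemory           = 202400;  // raise the memory limit for shell commands so the image converter can run
$wgGenerateThumbnailOnParse = true;    // render thumbnails at parse time; false broke rendering on this setup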

Update.php

After upgrading PortlandWiki to MediaWiki 1.21.2 and attempting to run php update.php, received the following error:

MediaWiki Update: You are using PHP version 5.2.17 but MediaWiki 1.21 needs PHP 5.3.2 or higher.
ABORTING. Check if you have a newer php executable with a different name, such as php5.

The problem wasn't that PortlandWiki was running an outdated version of PHP. Instead, the error came from earlier changes to PortlandWiki's php.ini settings, which altered the location of the PHP binary PortlandWiki relies on.

This command invokes update.php with the PHP 5.3 binary:
/usr/local/php53/bin/php update.php

See original DreamHost post here.
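A quick way to confirm which PHP version a given binary provides is to run a small version check through it. The snippet below is an illustrative sketch, not part of the original notes.

<?php
// version-check.php: run it with the binary you plan to use for update.php, e.g.
//   /usr/local/php53/bin/php version-check.php
if ( version_compare( PHP_VERSION, '5.3.2', '<' ) ) {
    echo "PHP " . PHP_VERSION . " is too old for MediaWiki 1.21\n";
} else {
    echo "PHP " . PHP_VERSION . " meets MediaWiki 1.21's requirement\n";
}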

Login error

[Image: Login-Error-Capture.png, showing the login cookies error on Telecafe Wiki.]

Login error
PortlandWiki uses cookies to log in users. You have cookies disabled. Please enable them and try again.

On Thursday, June 14, 2012, anyone trying to log into PortlandWiki and two other wikis on the same DreamHost VPS server began receiving the above login error. The error prevented folks with valid accounts from logging in, even though cookies were not disabled.

(This wiki--pdx.wiki.org--is on a DreamHost shared server, and didn't experience the login issue like the others did.)

On Wednesday, October 23rd, 2013 Kotra found this fix: How can I fix the mediawiki error “Wiki uses cookies to log in users. You have cookies disabled. Please enable them and try again.”?

Four easy steps (see the sketch below):
  1. Open LocalSettings.php.
  2. At the bottom of the file, add the following line: session_save_path("tmp");
  3. Create a directory called tmp in the folder where MediaWiki is installed.
  4. Done!
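Put together, the tail end of LocalSettings.php looks roughly like this (a sketch; the "tmp" path is relative to the MediaWiki installation directory, matching the steps above):

// LocalSettings.php (end of file): store PHP session files inside the wiki's own tmp directory.
// The tmp directory must exist in the MediaWiki install folder and be writable by the web server.
session_save_path( "tmp" );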

As of 10:30, 27 October 2013 (PDT) it has been applied to PortlandWiki and Telecafe Wiki, and appears to be working.

Various help resources:

The fix Kotra found that worked:
  • http://stackoverflow.com/questions/16127882/how-can-i-fix-the-mediawiki-error-wiki-uses-cookies-to-log-in-users-you-have-c

Other suggested fixes:
  • http://wiki.dreamhost.com/Server_Moves
  • http://stackoverflow.com/questions/1148583/problem-with-mediawiki-cookies
  • http://www.mediawiki.org/wiki/Thread:Project:Support_desk/Login_Problem
  • http://www.mediawiki.org/wiki/Thread:Project:Support_desk/Login_Problem/reply_%2812%29

Back when the error first appeared, making the recommended "fix" (see contents of the above links) on only the /tmp directory in the PortlandWiki installation left PortlandWiki and the other two wikis inaccessible.

Solution at the time: Restarted the web and database servers, which seemed to clear up the problem. All three wikis came back to life and allowed account holders to log in.

Internal error (The Dingo Ate Wiki Wednesday Skin)

(Internal error on pdx.wiki.org after installing MW 1.18.2.)

Error Date: 11:49, 24 April 2012 (PDT)

 Global default 'PDXWW' is invalid for field skin

Backtrace:

#0 /home/wikiwednesday/pdx.wiki.org/includes/Preferences.php(1221): Preferences::getPreferences(Object(User))
#1 /home/wikiwednesday/pdx.wiki.org/includes/specials/SpecialPreferences.php(69): Preferences::getFormObject(Object(User))
#2 /home/wikiwednesday/pdx.wiki.org/includes/SpecialPageFactory.php(458): SpecialPreferences->execute(NULL)
#3 /home/wikiwednesday/pdx.wiki.org/includes/Wiki.php(240): SpecialPageFactory::executePath(Object(Title), Object(RequestContext))
#4 /home/wikiwednesday/pdx.wiki.org/includes/Wiki.php(640): MediaWiki->performRequest()
#5 /home/wikiwednesday/pdx.wiki.org/includes/Wiki.php(547): MediaWiki->main()
#6 /home/wikiwednesday/pdx.wiki.org/index.php(57): MediaWiki->run()
#7 {main}
RE - The Dingo Ate Wiki Wednesday Skin

Tue, Apr 24, 2012 at 10:05 PM - Kotra Says:

yeah, I think I ran into the same problems when I created the skin the first time. I think I ended up doing the same bad-coder "fix" I did this time: just modify monobook.

So we're using "monobook" as the default now. But it's actually the PDXWW skin, it's just called monobook because monobook has a lot of pointers to and from it that I don't fully comprehend. I backed up the real monobook skin this time, so it may be less confusing next time around.

Probably we'll run into this issue every time we upgrade mediawiki on pdx.wiki.org. But the process is simple: copy all the files in the skins/PDXWW folder to the skins/monobook folder.
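For context (this is not part of Kotra's note above), the LocalSettings.php side of this arrangement is just the default-skin setting. A sketch, assuming the PDXWW files have been copied into skins/monobook as described:

// LocalSettings.php sketch: the wiki keeps pointing its default skin at "monobook",
// which now actually contains the PDXWW skin files.
$wgDefaultSkin = "monobook";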

LocalSettings.php & MW 1.18.0rc1

## The protocol and server name to use in fully-qualified URLs
// 25 November 2011 -- Dave Myers commenting out. Both variants shown below appear to force URL path to wiki directory (http://organizedpower.org/smw) to automatically redirect back to domain root.
// $wgServer           = "http://organizedpower.org";
// $wgServer           = "http://organizedpower.org/smw";

export / import templates

Copied two versions of the Template:USSFwikiNews using instructions found here:

http://www.mediawiki.org/wiki/Help:Templates#Copying_from_one_wiki_to_another

The MediaWiki instructions recommended UNCHECKING the "include only the current revision" box before exporting the template. Doing so, however, produces a much larger file. An attempt to import that larger file resulted in a browser and/or server timeout error.

To get a smaller file, make sure the "include only the current revision" box remains CHECKED. The file from the example shown above imported well, although the image had to be uploaded separately.
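A current-revision-only export can also be fetched directly from Special:Export, which is equivalent to leaving the box checked. The sketch below is illustrative only; the source wiki URL is a placeholder and assumes allow_url_fopen is enabled.

<?php
// fetch-template.php: an illustrative sketch, not part of the original notes.
// Special:Export/<page> returns only the current revision, i.e. the smaller file.
// The source wiki URL below is a placeholder; substitute the wiki you are exporting from.
$url = 'http://sourcewiki.example.org/wiki/Special:Export/Template:USSFwikiNews';
file_put_contents( 'USSFwikiNews.xml', file_get_contents( $url ) );
// Import the resulting XML on the destination wiki via Special:Import;
// any images still have to be uploaded separately.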

.htaccess mod_rewrite woes

Summary
Affected Wiki: PortlandWiki

Started noticing issues with page titles containing "/" characters. URLs containing those characters began resolving to 404 errors. Later, similar errors began affecting articles with titles containing "." and "&" characters.

Articles Affected (Among Others)
Sources Consulted
Fix Applied
Made adjustments to the .htaccess file:
RewriteEngine On

# Titles containing "&" were being cut off at the ampersand; re-encode the "&" as %26
# so the full title reaches index.php ([NE] stops Apache from re-escaping the %26).
RewriteCond %{REQUEST_URI} ^/.*&.*
RewriteRule ^/?(.*)&(.*)$ /index.php?title=$1\%26$2 [L,NE]

# Anything that isn't a real file, directory, or symlink is treated as a wiki page title.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule ^(.+)$ index.php?title=$1 [L,QSA]

.htaccess Tools