Robots.txt for phpBB 2.x.x

From LinuxReviews

phpBB is a very popular PHP-based forum. It does not ship with a good default Robots Exclusion Standard (robots.txt) file, so web crawlers will download duplicate content and pages with no content from phpBB installations. The following robots.txt prevents that. The paths assume the forum is installed under /forums/; adjust them to match your installation.

File: robots.txt
User-agent: *
Disallow: /forums/admin/
Disallow: /forums/images/
Disallow: /forums/includes/
Disallow: /forums/language/
Disallow: /forums/templates/
Disallow: /forums/common.php
Disallow: /forums/groupcp.php
Disallow: /forums/memberlist.php
Disallow: /forums/modcp.php
Disallow: /forums/posting.php
Disallow: /forums/login.php
Disallow: /forums/profile.php
Disallow: /forums/privmsg.php
Disallow: /forums/search.php
Disallow: /forums/viewonline.php
Disallow: /forums/faq.php
Disallow: /forums/updates-topic
Disallow: /forums/stop-updates-topic
Disallow: /forums/ptopic
Disallow: /forums/ntopic
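You can check that the rules behave as intended with Python's standard urllib.robotparser module. This is a minimal sketch using a shortened copy of the rules above and a hypothetical example.com host; it confirms that a disallowed script is blocked while a normal topic page stays crawlable:

```python
from urllib.robotparser import RobotFileParser

# Shortened copy of the robots.txt rules above.
rules = """\
User-agent: *
Disallow: /forums/admin/
Disallow: /forums/posting.php
Disallow: /forums/search.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed paths are blocked for all user agents ("*").
print(rp.can_fetch("*", "http://example.com/forums/posting.php"))   # False
# Paths not listed remain crawlable.
print(rp.can_fetch("*", "http://example.com/forums/viewtopic.php")) # True
```

The same check works against the full rule set; just paste the whole file into the `rules` string.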