How to Install and Uninstall perl-WWW-RobotRules Package on openSUSE Tumbleweed
Last updated: November 26, 2024
1. Install "perl-WWW-RobotRules" package
Please follow the guidelines below to install perl-WWW-RobotRules on openSUSE Tumbleweed:
$ sudo zypper refresh
$ sudo zypper install perl-WWW-RobotRules
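Once the installation finishes, you can optionally confirm the package is present by querying the RPM database (a quick check, not part of the commands above):
$ rpm -q perl-WWW-RobotRules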
2. Uninstall "perl-WWW-RobotRules" package
Learn how to uninstall perl-WWW-RobotRules on openSUSE Tumbleweed:
$ sudo zypper remove perl-WWW-RobotRules
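If you also want zypper to drop dependencies that were pulled in only for this package, the remove command accepts the --clean-deps option (an optional variant of the command above):
$ sudo zypper remove --clean-deps perl-WWW-RobotRules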
3. Information about the perl-WWW-RobotRules package on openSUSE Tumbleweed
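The details shown below are the output of zypper's package information query, which you can run yourself:
$ zypper info perl-WWW-RobotRules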
Information for package perl-WWW-RobotRules:
--------------------------------------------
Repository : openSUSE-Tumbleweed-Oss
Name : perl-WWW-RobotRules
Version : 6.02-9.29
Arch : noarch
Vendor : openSUSE
Installed Size : 24.3 KiB
Installed : No
Status : not installed
Source package : perl-WWW-RobotRules-6.02-9.29.src
Upstream URL : http://search.cpan.org/dist/WWW-RobotRules/
Summary : database of robots.txt-derived permissions
Description :
This module parses _/robots.txt_ files as specified in "A Standard for
Robot Exclusion". Webmasters
can use the _/robots.txt_ file to forbid conforming robots from accessing
parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed _/robots.txt_
files on any number of hosts.
The following methods are provided:
* $rules = WWW::RobotRules->new($robot_name)
This is the constructor for WWW::RobotRules objects. The first argument
given to new() is the name of the robot.
* $rules->parse($robot_txt_url, $content, $fresh_until)
The parse() method takes as arguments the URL that was used to retrieve
the _/robots.txt_ file, and the contents of the file.
* $rules->allowed($uri)
Returns TRUE if this robot is allowed to retrieve this URL.
* $rules->agent([$name])
Get/set the agent name. NOTE: Changing the agent name will clear the
robots.txt rules and expire times out of the cache.
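Putting these methods together, a minimal Perl sketch of how the module is typically used might look like this. The robot name, the example.com URLs, and the use of LWP::Simple to fetch robots.txt are illustrative assumptions, not taken from the package description above:

use strict;
use warnings;
use WWW::RobotRules;
use LWP::Simple qw(get);

# "MyBot/1.0" is a hypothetical robot name chosen for this sketch.
my $rules = WWW::RobotRules->new('MyBot/1.0');

# example.com is a placeholder host; fetch its robots.txt and hand it to parse().
my $robots_url = 'http://example.com/robots.txt';
my $robots_txt = get($robots_url);
$rules->parse($robots_url, $robots_txt) if defined $robots_txt;

# allowed() checks a URL against the rules parsed for that host.
my $page_url = 'http://example.com/some/page.html';
if ($rules->allowed($page_url)) {
    my $content = get($page_url);
    # ... process $content here ...
}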