How to Install and Uninstall libwww-robotrules-perl Package on Kali Linux
Last updated: December 23, 2024
1. Install "libwww-robotrules-perl" package
Please follow the steps below to install libwww-robotrules-perl on Kali Linux:
$ sudo apt update
$ sudo apt install libwww-robotrules-perl
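After installation you can optionally verify that the package is present and that Perl can load the module; the one-liner below simply prints the installed module version (6.02 for this package):
$ dpkg -s libwww-robotrules-perl
$ perl -MWWW::RobotRules -e 'print "$WWW::RobotRules::VERSION\n"'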
2. Uninstall "libwww-robotrules-perl" package
Please follow the steps below to uninstall libwww-robotrules-perl on Kali Linux:
$ sudo apt remove libwww-robotrules-perl
$ sudo apt autoclean && sudo apt autoremove
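Note that apt remove keeps any configuration files a package has installed. If you also want those deleted, you can use purge instead of remove:
$ sudo apt purge libwww-robotrules-perl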
3. Information about the libwww-robotrules-perl package on Kali Linux
Package: libwww-robotrules-perl
Version: 6.02-1
Installed-Size: 36
Maintainer: Debian Perl Group
Architecture: all
Replaces: libwww-perl (<< 6.00)
Depends: perl, liburi-perl
Breaks: libwww-perl (<< 6.00)
Size: 12892
SHA256: be69cda8c2a860e64c43396bf2ff1c7145259cb85753ded14e0434f15ed647a0
SHA1: fe400de7bb2b05482bedd440a12a21fdd71b3a87
MD5sum: 1b8324d0e25d6bd149d897f628a8c4d5
Description: database of robots.txt-derived permissions
WWW::RobotRules parses /robots.txt files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
can use the /robots.txt file to forbid conforming robots from accessing parts
of their web site.
.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed /robots.txt files
on any number of hosts.
Homepage: https://metacpan.org/release/WWW-RobotRules
Tag: devel::lang:perl, devel::library, implemented-in::perl, role::shared-lib
Section: perl
Priority: optional
Filename: pool/main/libw/libwww-robotrules-perl/libwww-robotrules-perl_6.02-1_all.deb
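The description above outlines the WWW::RobotRules API. Here is a minimal sketch of how the module is typically used; the bot name, the example.com URLs, and the inline robots.txt body are placeholders for illustration:
use strict;
use warnings;
use WWW::RobotRules;

# "MyBot/1.0" is a placeholder user-agent name.
my $rules = WWW::RobotRules->new('MyBot/1.0');

# Normally the robots.txt body would be fetched from the host;
# an inline sample keeps this example self-contained.
my $robots_txt = <<'ROBOTS';
User-agent: *
Disallow: /private/
ROBOTS

$rules->parse('http://example.com/robots.txt', $robots_txt);

# allowed() returns true if the agent may visit the given URL.
print $rules->allowed('http://example.com/index.html') ? "ok\n" : "blocked\n";          # ok
print $rules->allowed('http://example.com/private/data.html') ? "ok\n" : "blocked\n";   # blocked
As the description notes, the same $rules object can accumulate rules from several hosts: call parse once per fetched robots.txt, then query allowed for any URL.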