How to Install and Uninstall libwww-robotrules-perl Package on Ubuntu 16.04 LTS (Xenial Xerus)
Last updated: November 26, 2024
1. Install "libwww-robotrules-perl" package
This guide shows how to install libwww-robotrules-perl on Ubuntu 16.04 LTS (Xenial Xerus). First refresh the package index, then install the package:
$ sudo apt update
$ sudo apt install libwww-robotrules-perl
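To confirm the installation succeeded, you can ask dpkg for the package status (an optional sanity check; it should report "Status: install ok installed"):
$ dpkg -s libwww-robotrules-perl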
2. Uninstall "libwww-robotrules-perl" package
Follow the steps below to uninstall libwww-robotrules-perl on Ubuntu 16.04 LTS (Xenial Xerus). First remove the package itself:
$ sudo apt remove libwww-robotrules-perl
Then clear out obsolete package files from the local cache and remove any automatically installed dependencies that are no longer needed:
$ sudo apt autoclean && sudo apt autoremove
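Note that apt remove leaves any system-wide configuration files in place. If you also want those deleted, you can purge the package instead (for a library module like this one, purge usually makes no practical difference):
$ sudo apt purge libwww-robotrules-perl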
3. Information about the libwww-robotrules-perl package on Ubuntu 16.04 LTS (Xenial Xerus)
Package: libwww-robotrules-perl
Priority: optional
Section: perl
Installed-Size: 76
Maintainer: Ubuntu Developers
Original-Maintainer: Debian Perl Group
Architecture: all
Version: 6.01-1
Replaces: libwww-perl (<< 6.00)
Depends: perl, liburi-perl
Breaks: libwww-perl (<< 6.00)
Filename: pool/main/libw/libwww-robotrules-perl/libwww-robotrules-perl_6.01-1_all.deb
Size: 14074
MD5sum: 95112b59013c9e643ada5ddddb62d9b8
SHA1: 39693be3d45fcfd0852e2de9070abc36cb7242a5
SHA256: 5516cd8881af8032ccbeb1fb678c51d2f4b4f98128eccbb126fc0e07571ed830
Description-en: database of robots.txt-derived permissions
WWW::RobotRules parses /robots.txt files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
can use the /robots.txt file to forbid conforming robots from accessing parts
of their web site.
.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed /robots.txt files
on any number of hosts.
Description-md5: c1793eba30f9ab7256b5a25ded8e1664
Homepage: http://search.cpan.org/dist/WWW-RobotRules/
Bugs: https://bugs.launchpad.net/ubuntu/+filebug
Origin: Ubuntu
Supported: 5y
Task: ubuntu-desktop, ubuntu-usb, kubuntu-desktop, kubuntu-full, edubuntu-desktop, edubuntu-usb, xubuntu-core, xubuntu-desktop, mythbuntu-frontend, mythbuntu-desktop, mythbuntu-backend-slave, mythbuntu-backend-master, lubuntu-core, ubuntustudio-desktop-core, ubuntustudio-desktop, ubuntustudio-fonts, ubuntu-gnome-desktop, ubuntu-sdk-libs-tools, ubuntu-sdk, ubuntukylin-desktop, ubuntu-mate-core, ubuntu-mate-desktop, ubuntu-mate-cloudtop
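The description above summarizes the WWW::RobotRules API: parse() loads a robots.txt body for a given URL, and allowed() checks whether a given URL may be fetched. As a minimal sketch (the robot name, host, and rules below are invented for illustration), you can exercise the module from the shell once the package is installed:
$ perl -MWWW::RobotRules -e 'my $rules = WWW::RobotRules->new("ExampleBot/1.0"); $rules->parse("http://example.com/robots.txt", "User-agent: *\nDisallow: /private/\n"); print $rules->allowed("http://example.com/private/page.html") ? "allowed\n" : "blocked\n";'
This should print "blocked", because the sample rules disallow /private/ for every user agent.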