How to Install and Uninstall parsero Package on Ubuntu 21.10 (Impish Indri)
Last updated: November 07, 2024
1. Install "parsero" package
Learn how to install parsero on Ubuntu 21.10 (Impish Indri):
$ sudo apt update
$ sudo apt install parsero
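After installation you can confirm the package is present and take a first look at the tool. The dpkg query below is a standard command; the parsero invocations are a sketch, and the -u (target URL) flag is an assumption you should verify against the tool's own help output.
$ dpkg -s parsero          # confirm the package is registered with dpkg
$ parsero -h               # print parsero's built-in help to see the exact options
$ parsero -u www.example.com   # illustrative run against a host you are authorized to test (flag assumed)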
2. Uninstall "parsero" package
Learn how to uninstall parsero on Ubuntu 21.10 (Impish Indri):
$ sudo apt remove parsero
$ sudo apt autoclean && sudo apt autoremove
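To confirm the removal took effect, query dpkg again; once parsero has been removed it is reported as not installed. If you also want to drop any leftover configuration files, the standard apt purge command shown below does that.
$ dpkg -s parsero          # reports "is not installed" after removal
$ sudo apt purge parsero   # optionally remove leftover configuration files as well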
3. Information about the parsero package on Ubuntu 21.10 (Impish Indri)
Package: parsero
Architecture: all
Version: 0.0+git20140929.e5b585a-4
Priority: optional
Section: universe/net
Origin: Ubuntu
Maintainer: Ubuntu Developers
Original-Maintainer: Debian Security Tools
Bugs: https://bugs.launchpad.net/ubuntu/+filebug
Installed-Size: 41
Depends: python3-bs4, python3-pip, python3-urllib3, python3:any, python3-pkg-resources
Filename: pool/universe/p/parsero/parsero_0.0+git20140929.e5b585a-4_all.deb
Size: 11824
MD5sum: 7d0671b59e84559fb19e5642e1be9b1e
SHA1: d2708e505478b4ee8e93fe17f30db8493176bc4d
SHA256: b75b225779e860b36e9d8d497f3408094e6204ee39bd32a310ac19f18426670e
SHA512: 67b80779d130614e46407bd9cb925e5f8de60c69d495e1a8c54aeec25c6363795da570cf08ccbeac15bed2c7095260c8268b948d2ca6216bb3c9f06ed23ca287
Homepage: https://github.com/behindthefirewalls/Parsero
Description-en: Audit tool for robots.txt of a site
Parsero is a free script written in Python which reads the Robots.txt file
of a web server through the network and looks at the Disallow entries. The
Disallow entries tell the search engines what directories or files hosted
on a web server must not be indexed. For example, "Disallow: /portal/login"
means that the content at www.example.com/portal/login is not allowed to be
indexed by crawlers such as Google, Bing or Yahoo. This is how an
administrator avoids sharing sensitive or private information with the
search engines.
.
Parsero is useful for pentesters, ethical hackers and forensics experts.
It can also be used for security tests.
Description-md5: a2b6e3ec22a2d33737a5182668ace747
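As a concrete illustration of what parsero inspects, the sketch below fetches a robots.txt and then audits it. The hostname and Disallow paths are invented for this example, and the -u flag is an assumption; confirm the real options with parsero -h.
$ curl -s https://www.example.com/robots.txt   # fetch the robots.txt directly
# Typical contents:
#   User-agent: *
#   Disallow: /portal/login
#   Disallow: /backup/
$ parsero -u www.example.com   # parsero requests each Disallowed path and reports which ones respond 200 OK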